[pymvpa] Rapid ER design

Michael Hanke michael.hanke at gmail.com
Sun Mar 29 08:47:43 UTC 2015


Hi,

On Fri, Mar 27, 2015 at 11:55 PM, Jeffrey Bloch <jeffreybloch1 at gmail.com>
wrote:

> Dear All,
>
> Hello!  I am in the middle of setting up a rapid event-related design
> analysis, and was hoping I could ask a few questions.  I've been working
> through this for a while, but to no avail.  Thanks in advance!
>
> My main confusion(s) stem from the fact that my design has events that are
> not multiples of the TR.  Namely, there are 110 events (of 3s duration),
> but my TR is 2s.  So, technically, any volume could have multiple
> conditions, and/or conditions can spread over more than one TR.  What is
> the best way of dealing with this when setting up the analysis?
>

The HRF-modeling approach seems like a good choice.

> 1.  PyMVPA knows that my nifti file has 179 volumes, so it keeps getting
> mad when I try to create an attribute that has fewer items (say, a
> condition for each of the 110 events).  How can I make the software forget
> about the volumes and just focus on the timepoints of my events?  The
> original attribute file (with just chunks and condition) has to be 179
> items long, but 110 doesn't fit into 179 very nicely!  :)
>

You cannot make it forget, but you do not have to: instead of working with
the original time series samples directly, model your events and use the
resulting parameter estimates for the analysis.
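
A minimal sketch of what I mean (the labels, onsets, and run assignment
below are placeholders for your actual design, and PyMVPA's HRF fitting
additionally requires NiPy to be installed):

  # one dict per event; onsets and durations are specified in seconds
  # and are completely independent of the 2 s TR of the acquired volumes
  conditions = ['faceA', 'houseB'] * 55   # hypothetical labels, 110 events
  runs = [0] * 55 + [1] * 55              # hypothetical chunk assignment
  events = [dict(onset=i * 5.0, duration=3.0, targets=c, chunks=r)
            for i, (c, r) in enumerate(zip(conditions, runs))]

Such an event list is all the "attribute" information the model fitting
needs -- it never has to be squeezed into 179 per-volume attribute values.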

> 2. I am trying to get GLM estimates (per event/volume) using
> 'eventrelated_design' with 'hrf'.  I understand that it is possible to
> bring in betas from outside PyMVPA, but I was hoping to keep it all "in
> house."  Even if I make my event list by multiplying by TR (as in the
> tutorial), the original dataset still has 179 volumes, so the events
> and attributes don't line up.
>

HRF modeling does not require event timing to be expressed in multiples of
the TR. It would be best if you could share the relevant parts of your code,
so I can get a better picture of what you are doing.
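
To illustrate the first point (a sketch, assuming the 'events' list from
above and a dataset 'ds' that carries a proper 'time_coords' sample
attribute; see also below):

  from mvpa2.datasets.eventrelated import fit_event_hrf_model

  # one HRF-convolved regressor per condition (per chunk); the design
  # matrix is evaluated at the volume acquisition times given in
  # ds.sa.time_coords, so 3 s events with a 2 s TR are not a problem
  evds = fit_event_hrf_model(ds,
                             events,
                             time_attr='time_coords',
                             condition_attr=('targets', 'chunks'))
  # 'evds' contains one sample (map of parameter estimates) per
  # condition per chunk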

> 3.  I find that my "time_coords" are always blank (all zeros) when I create
> my datasets, but several functions seem to require this information.
> Similarly, the time_indices are always 179 due to the nifti volumes, but
> that's not very useful for me (again, 110 events, sorry to be repetitive).
>

Please also share the parts of your code that create your datasets. Also,
please be aware that you can simply add a time_coords attribute to a
dataset at any time.
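
For example (assuming volumes were acquired back-to-back with a 2 s TR and
no discarded scans -- adjust to your actual acquisition):

  import numpy as np

  TR = 2.0
  # one acquisition time stamp per volume, in seconds
  ds.sa['time_coords'] = np.arange(ds.nsamples) * TR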


> Lastly (I hope!), is it possible using eventrelated_dataset to get
> parameters for each event (instead of each target)?  I was hoping to be
> able to tease apart relationships on an event-by-event basis.
>

Yes, you can do that. But of course per-event modeling tends to be noisier
than a coarser, per-condition model.
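
One way to do it (again just a sketch): make the attribute combination
passed to condition_attr unique per event, e.g. by including the onset, so
that every event gets its own regressor and hence its own parameter
estimate:

  evds = fit_event_hrf_model(ds,
                             events,
                             time_attr='time_coords',
                             condition_attr=('onset', 'targets'))
  # 'evds' now has one sample per individual event (110 in your case),
  # identified by its 'onset' and 'targets' sample attributes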

Michael