[pymvpa] Using masks with different dimensions than the functional volumes

Derek Huffman derek.huffman23 at gmail.com
Wed Sep 4 20:45:40 UTC 2013


Hi Shane,

I agree with Nick: I would use AFNI's 3dresample command. Make sure to use
the -master flag followed by the name of your fMRI data, e.g.,

3dresample -master bold.nii -prefix rois_in_fmri_space -inset your_ROI_filename_here.nii

(you may get an error with gzipped data; if so, decompress it first with gzip -d bold.nii.gz)
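
(And if you want to see the mismatch for yourself first, nibabel -- which PyMVPA already uses to read NIFTI files -- makes for a quick check. A minimal sketch, assuming the filenames from the example above:

import nibabel as nib
print(nib.load('bold.nii').shape)                    # e.g. (96, 96, 37, n_volumes)
print(nib.load('your_ROI_filename_here.nii').shape)  # e.g. (91, 109, 91)

The first three numbers are the voxel grid; they need to agree before PyMVPA will accept the mask.)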

Then AFNI's 3dAFNItoNIFTI command will convert the resulting file to NIFTI
format for you:

3dAFNItoNIFTI rois_in_fmri_space+orig.
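
Once the ROI file is on the same grid as your functional data, the tutorial-style PyMVPA call should go through without the FlattenMapper error. A rough sketch, assuming the converted mask came out as rois_in_fmri_space.nii and lives next to your bold data (adjust names and paths to whatever you actually have):

import os
from mvpa2.datasets.mri import fmri_dataset

path = '/path/to/your/data'  # wherever your files actually live
ds = fmri_dataset(os.path.join(path, 'bold.nii'),
                  mask=os.path.join(path, 'rois_in_fmri_space.nii'))
print(ds.shape)  # (number of volumes, number of voxels inside the mask)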

If you cannot install AFNI on your local machine, you could always install
the NeuroDebian virtual machine: http://neuro.debian.net/vm.html

Cheers,
Derek


On Sat, Aug 31, 2013 at 8:46 PM, Shane Hoversten <shanusmagnus at gmail.com> wrote:

> Howdy -
>
> Let me preface this msg by saying that I only _barely_ know what I'm
> doing.  This will probably become evident immediately.
>
> In an experiment whose data I'm trying to analyze, I'd like to
> localize the analysis based on the results of various masks, including
> Brodmann masks created with the WFU PickAtlas tool
> (http://fmri.wfubmc.edu/software/PickAtlas).  I can use these masks in
> the course of univariate analyses with SPM despite the fact that the
> masks have different numbers of voxels than do our acquired volumes --
> the NIFTI headers apparently contain enough info to match up the mask
> and the volumes in physical space, and SPM just does the right thing,
> which was a pleasant surprise.
>
> But now I'd like to use these masks with PyMVPA.  I've been working
> through the tutorials with the Haxby dataset, and part 2 of the
> tutorial demonstrates loading the Haxby BOLD data with a mask, like
> so:
>
> ds = fmri_dataset(os.path.join(path, 'bold.nii.gz'),
>                   mask=os.path.join(path, 'mask_vt.nii.gz'))
>
> As it happens, the shapes of the voxel matrices for this BOLD data and
> for this mask are the same.  That's a luxury we don't have in our
> dataset, but I crossed my fingers and tried to load one of our volumes
> with the mask anyway, to no avail.  I got this error:
>
>
> /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/mvpa2/mappers/flatten.pyc in _forward_data(self, data)
>     103             raise ValueError("FlattenMapper has not been trained for data "
>     104                              "shape '%s' (known only '%s')."
> --> 105                              % (str(sshape), str(oshape)))
>     106         ## input matches the shape of a single sample
>     107         #if sshape == oshape:
>
> ValueError: FlattenMapper has not been trained for data shape '(91, 109, 91)' (known only '(96, 96, 37)').
>
> The (91, 109, 91) are the dimensions of the masks produced by the WFU
> PickAtlas; our data are (96, 96, 37).  So my question is: how does one
> deal with these situations?  I've googled as best I'm able but haven't
> found the issue addressed.  Can anyone point me in a good direction?
>
> Thanks very much,
>
> S
>

