[pymvpa] 3D images with NiftiDataset?

Michael Hanke michael.hanke at gmail.com
Thu Sep 18 07:33:53 UTC 2008

Hi Dylan,

On Thu, Sep 18, 2008 at 01:27:21AM -0400, Dylan David Wagner wrote:
> Hey list,
> I've been mucking about trying to get NiftiDataset to grab my images (3d 
> maps of beta weights from a GLM) and mask but keep running into this error:
> ValueError: The mask dataspace shape [(54, 65, 53)] is not compatible 
> with the shape of the provided data samples [(65, 53)].
> The mask and data are in the same space (dimensions, voxel size, etc.). 
> I also tried just using the same file as both mask and sample just to 
> make sure, and still get the error. If I'm reading the API reference 
> correctly (which isn't even remotely within the vicinity of being a safe 
> bet as I only started learning python this morning) it seems 
> NiftiDataset is expecting 4d images for samples (i.e. line 114 of module 
> mvpa.datasets.nifti and other places). I believe what is happening is 
> it's treating the first dimension it finds as 'time' and then using the 
> remaining dimensions to define the data. The mask, obviously, stays in 3d.

Your analysis is correct. NiftiDataset assumes that the first axis
separates the samples (not necessarily a time axis, though!). Therefore,
to be able to use NiftiDataset in its current form with your data, you
have to combine your samples first. Here is an outline (it should run
out of the box, but I haven't tested it heavily):

# imports: NiftiImage comes from PyNIfTI, NiftiDataset from PyMVPA
import numpy as N
from nifti import NiftiImage
from mvpa.datasets.nifti import NiftiDataset

# compile a list of filenames with the beta weights for each subject
subjects = ['ga14', 'fz11', 'vx76']
# assumes FSL-like structure, YMMV
glm_filenames = [i + '.feat/stats/mycontrast.nii.gz' for i in subjects]

# now merge all beta maps into one NiftiImage, while taking the header
# properties from a single image 'common.nii.gz'. This file could be
# any of the beta maps or just a mask file in the same space as the beta
# maps
merged_betas = NiftiImage(
    N.array([NiftiImage(i).asarray() for i in glm_filenames]),
    NiftiImage('common.nii.gz').header)

# now create the NiftiDataset directly from the NiftiImage object
ds = NiftiDataset(samples=merged_betas, labels=..., ...)

You can create the labels vector using the 'subjects' list, or whatever
defines the groups of subjects in your study.
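As a purely illustrative sketch of that last step (the group assignment
below is made up; substitute whatever actually defines your groups), the
labels vector can be derived from the 'subjects' list like this:

```python
# Hypothetical example: build one label per sample (i.e. per beta map),
# in the same order as the filenames used to construct the merged image.
subjects = ['ga14', 'fz11', 'vx76']

# hypothetical subject -> group lookup; replace with your real grouping
group_of = {'ga14': 0, 'fz11': 0, 'vx76': 1}

labels = [group_of[s] for s in subjects]
print(labels)  # -> [0, 0, 1]
```

The only thing that matters is that the order of 'labels' matches the
order of samples along the first axis of the merged NiftiImage.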

> So! Is there a way to import 3d images?  (fyi: I'm looking to classify 
> subjects not events, in a fashion similar to this paper which used 
> anatomical images: 
> http://archpsyc.ama-assn.org/cgi/content/abstract/62/11/1218)
I'm sitting in a train right now, so I can't check the paper, but the
above should do what you need.

> p.s. while I have your attention, any chance of making available the 
> tutorial/example dataset that the manual refers to? And any plans to 
> release windows binaries for 0.3.1?
Anything that the manual refers to is part of the source tarball (look
into data/). The manual and the examples are part of PyMVPA's test
battery, i.e. they are checked to confirm they still do what they should
whenever PyMVPA itself is tested. The whole testsuite is self-contained
(tests + data) and is part of the source tarball.

At the moment there are no further example datasets -- maybe
in the (near) future...

Regarding Windows binaries: sure, I just haven't managed to get to a
Windows box so far -- that should happen either today or tomorrow.



GPG key:  1024D/3144BE0F Michael Hanke
ICQ: 48230050

More information about the Pkg-ExpPsy-PyMVPA mailing list