[pymvpa] load raw 'not nifti' data

Michael Waskom mwaskom at stanford.edu
Thu Oct 6 18:02:39 UTC 2011


Right, but Freesurfer has routines that can represent the 2D vector in
several "slices" that fit within the Nifti dimensions.  It does require
that the number of vertices factors evenly with the largest factor <
2**15, which tends not to be the case for single-subject surfaces but
is the case for fsaverage; I wonder if they build those constraints
into the make_average_surface algorithm.
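
To make the constraint concrete, here is a minimal sketch of the
reshaping idea in Python (assuming numpy; find_reshape_dims is a
hypothetical helper for illustration, not a Freesurfer or PySurfer
function):

  import numpy as np

  # fsaverage has 163842 vertices per hemisphere
  N_VERTICES = 163842

  def find_reshape_dims(n_vertices, max_dim=2 ** 15 - 1):
      # Split n_vertices into two factors that both fit under the
      # Nifti per-dimension limit, or fail if no such split exists.
      for a in range(int(np.sqrt(n_vertices)), 1, -1):
          if n_vertices % a == 0 and n_vertices // a <= max_dim:
              return a, n_vertices // a
      raise ValueError("no factorization fits the Nifti dimensions")

  surf_vec = np.zeros(N_VERTICES)       # stand-in for one scalar map
  a, b = find_reshape_dims(N_VERTICES)  # (329, 498) for fsaverage
  vol = surf_vec.reshape(a, b, 1)       # "slices" that fit in a Nifti

Since 163842 = 2 * 3 * 7 * 47 * 83, a fitting split always exists for
fsaverage; an awkward single-subject vertex count may not have one.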

Try running:

mri_surf2surf --sval lh.yourimage.mgz --tval lh.newimage.nii.gz \
    --reshape --s subject_id --hemi lh

And you may or may not get a warning about compatibility with other
programs.
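
If the reshape succeeds, flattening each subject's map back to a
vector with nibabel is straightforward. A minimal sketch for stacking
all 60 subjects into a samples-by-features matrix (the filename
pattern is hypothetical; substitute your actual reshaped outputs):

  import numpy as np
  import nibabel as nib

  fnames = ['lh.thickness.sub%02d.nii.gz' % i for i in range(1, 61)]

  def load_flat(fname):
      # Flatten one reshaped volume back into a vector of vertex
      # values. Caveat: the ravel order must match how mri_surf2surf
      # packed the vertices; spot-check a known vertex if in doubt.
      return np.asarray(nib.load(fname).get_data()).ravel()

  # 60 x ~200,000 samples-by-features matrix for PyMVPA
  samples = np.vstack([load_flat(f) for f in fnames])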

Michael

On Wed, Oct 5, 2011 at 11:51 PM, Gregory Kirk <gkirk at wisc.edu> wrote:

> no, it cannot be fit into a nifti; it is only cortical surface values,
> not volume information (2D information). There are n_subjects x
> n_cortical_thickness_values, in this case 60 x ~200,000.
>
> it needs to be imported into an object with 60 samples, each with
> 200,000 features.
>
> greg
>
> On 10/05/11, Michael Waskom wrote:
> > Does the make_average_subject routine guarantee that the surface can
> > be reshaped to fit into Nifti?  If so, it's probably easiest to just
> > stick the time series together in a reshaped nifti and read that in
> > with Nibabel.
> >
> > Or feel free to just steal the read_scalar_data function from the
> > PySurfer source.  It's quite simple :)
> >
> > https://github.com/nipy/PySurfer/blob/master/surfer/io.py#L108
> >
> > Best,
> > Michael
> >
> > On Wed, Oct 5, 2011 at 6:59 PM, Gregory Kirk <gkirk at wisc.edu> wrote:
> > > i use the 'old way' in freesurfer, so i made an average surface with
> > > make_average_subject, then resampled all my subjects onto it with
> > > mris_preproc, and the resulting file is in .mgh format.
> > >
> > > i know i can load the data into a matlab matrix with a freesurfer
> > > matlab script. i am forced to use the redhat-derived scientific
> > > linux, as that is what is installed by the IT guys for us. so
> > > earlier today i had an e-mail with Yarik about the C++ compile
> > > errors i was getting, and he was going to get back to me.
> > >
> > > so at the moment installing the PySurfer module may not be as easy
> > > for me, as i can't just apt-get install and be sure it will go
> > > smoothly. so for a first crack i'm hoping the load-matlab-mat method
> > > that i got in an earlier e-mail from the list may be simpler,
> > > although as i get rolling the viewer you suggest sounds interesting.
> > >
> > > thanks, i'll be in touch once i get the basic PyMVPA running.
> > >
> > > cheers,
> > > Greg
> > >
> > > On 10/05/11, Michael Waskom wrote:
> > > > As I understand the question, you want to use morphometric data
> > > > that's been transformed into some standard space (i.e. you've run
> > > > recon-all -qcache), so actually you wouldn't use read_morph_data
> > > > (which only applies to the morphometric data files in the format
> > > > that gets spit out of recon-all) but rather read_scalar_data,
> > > > which will read in either Freesurfer .mg{hz} files (which is
> > > > probably the format your data are in) or anything that's readable
> > > > by nibabel.  Note that it will always just return a vector of
> > > > datapoints matching the vertices of the surface the scalar data
> > > > file describes, even if the underlying data is a "reshaped" nifti
> > > > file, which is probably what you want anyway if you're sticking
> > > > it together into a dataset sample matrix.
> > > >
> > > > The other upside of installing PySurfer is that you can then
> > > > display pretty MVPA results with a lot more flexibility than
> > > > tksurfer offers you :)
> > > >
> > > > Best,
> > > > Michael
> > > >
> > > > On Tue, Oct 4, 2011 at 7:02 PM, Yaroslav Halchenko
> > > > <debian at onerussian.com> wrote:
> > > > > another possible (haven't tried... unfortunately I am still
> > > > > ignorant in FreeSurfer ;) ) way is
> > > > >
> > > > >   sudo apt-get install python-surfer
> > > > >
> > > > > and then in python
> > > > >
> > > > >   from surfer.io import read_morph_data
> > > > >   x = read_morph_data(filename)
> > > > >
> > > > > and then combine them into a dataset as Per mentioned.
> > > > >
> > > > > On Tue, 04 Oct 2011, Per B. Sederberg wrote:
> > > > > > Hi Greg:
> > > > > >
> > > > > > You can save out that big matrix to a mat file from matlab
> > > > > > (say the matrix is named mymat and you save it to
> > > > > > mymatrix.mat) and then read it in with:
> > > > > >
> > > > > > import scipy.io
> > > > > > x = scipy.io.loadmat('mymatrix.mat')['mymat']
> > > > > >
> > > > > > Then in pymvpa (depending on the version you are using) you
> > > > > > can get it ready for business with:
> > > > > >
> > > > > > from mvpa.suite import *
> > > > > > dat = dataset_wizard(x, targets=cov_of_interest,
> > > > > >                      chunks=range(len(x)))
> > > > > >
> > > > > > Then you can start analyzing away...
> > > > > >
> > > > > > Best,
> > > > > > Per
> > > > >
> > > > > --
> > > > > =------------------------------------------------------------------=
> > > > > Keep in touch                                     www.onerussian.com
> > > > > Yaroslav Halchenko                 www.ohloh.net/accounts/yarikoptic
>
> _______________________________________________
> Pkg-ExpPsy-PyMVPA mailing list
> Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa

