[pymvpa] normalization: zscore by example?

Mike E. Klein michaeleklein at gmail.com
Mon Nov 7 23:09:51 UTC 2011


Hi,

Thanks for getting back to me. I'm a bit confused by the response, which is
pretty much the norm for me.

- My reason for wanting to do this is the relative paucity of examples
(9 per run, 3 categories, 9 runs), which I plan to reduce further by some
averaging. It seems (to me) that the zscoring would be more accurate when
computed across thousands of voxels (per example) rather than across only a
handful of examples (per run).

- I guess I don't see why zscoring this way would render a searchlight
invalid. I'm looking at the excerpt from the Pereira paper that was referenced
recently on the listserv: "In the example study, we normalized each example
(row) to have mean 0 and standard deviation 1. The idea in this case is to
reduce the effect of large, image-wide signal changes. Another possibility
would be to normalize each feature (column) to have mean 0 and standard
deviation 1, either across the entire experiment or within examples coming
from the same run."

On Mon, Nov 7, 2011 at 5:29 PM, Yaroslav Halchenko <debian at onerussian.com> wrote:

> nothing is impossible... but why would you like to do that? quick
> situation/answer pairs:
>
> * I care only about classification performance and don't do any feature
>  (voxel) diagnosticity analysis such as searchlight
>
>  well -- then zscoring is legit... but I don't think we have a builtin
>  mapper/function for it -- if you don't have any groups of voxels you
>  care about zscoring independently, then it is as simple to do in pure
>  Python as
>
>  import numpy as np
>  # normalize each sample (row) to mean 0, sd 1 across all features (voxels)
>  ds.samples -= np.mean(ds.samples, axis=1)[:, None]
>  ds.samples /= np.std(ds.samples, axis=1)[:, None]
>
> * I do care about per-voxel estimates of diagnosticity
>
>  DO NOT ZSCORE ACROSS FEATURES (VOXELS)! ;)
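>
>  in that case keep zscoring per feature (and per chunk), which is roughly
>  what the builtin zscore() already does. In pure numpy the idea is
>  something like this (untested sketch, assuming an mvpa2-style dataset
>  with ds.sa.chunks):
>
>  import numpy as np
>  # mean 0, sd 1 per voxel (column), computed within each chunk (run)
>  for chunk in np.unique(ds.sa.chunks):
>      m = ds.sa.chunks == chunk
>      ds.samples[m] -= np.mean(ds.samples[m], axis=0)
>      ds.samples[m] /= np.std(ds.samples[m], axis=0)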
>
> On Mon, 07 Nov 2011, Mike E. Klein wrote:
>
> >    Hi all,
> >    I realize now (after a boneheaded mistake) that PyMVPA's zscore()
> >    function is normalizing all the values across a chunk (looking at each
> >    voxel individually). I'm wondering if it's possible to instead find the
> >    zscores for all the voxels in a single volume (against each voxel's
> >    resting state), considering each 3d volume separately.
> >    Thanks,
> >    Mike
>
>
> --
> =------------------------------------------------------------------=
> Keep in touch                                     www.onerussian.com
> Yaroslav Halchenko                 www.ohloh.net/accounts/yarikoptic
>
> _______________________________________________
> Pkg-ExpPsy-PyMVPA mailing list
> Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa

