[pymvpa] IndexError from gnbsearchlight analysis
Zhen Zonglei
zonglei.fsl at gmail.com
Tue Jul 26 01:41:15 UTC 2011
Thanks very much, Yarik.
The following is the summary of the dataset. For the HDF5 file, please see the attachments.
Zonglei Zhen
---------------------------------------------------------------------------------------
In [4]: print dataset.summary()
Dataset: 72x821 at float32, <sa: chunks,time_indices,targets,time_coords>, <fa:
voxel_indices>, <a: mapper,voxel_eldim,voxel_dim,imghdr>
stats: mean=-5.2752e-08 std=0.999996 var=0.999993 min=-3.02555 max=3.05246
Counts of targets in each chunk:
  chunks\targets  attf  attm
                   ---   ---
       0.0          6     6
       1.0          6     6
       2.0          6     6
       3.0          6     6
       4.0          6     6
       5.0          6     6

Summary for targets across chunks
  targets  mean  std  min  max  #chunks
   attf      6    0    6    6      6
   attm      6    0    6    6      6

Summary for chunks across targets
  chunks  mean  std  min  max  #targets
    0       6    0    6    6      2
    1       6    0    6    6      2
    2       6    0    6    6      2
    3       6    0    6    6      2
    4       6    0    6    6      2
    5       6    0    6    6      2
Sequence statistics for 72 entries from set ['attf', 'attm']
Counter-balance table for orders up to 2:
  Targets/Order   O1     |   O2     |
      attf:      30  6   |  24 12   |
      attm:       5 30   |  10 24   |
Correlations: min=-1 max=1 mean=-0.014 sum(abs)=35
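The per-chunk balance shown in the summary above can also be checked directly from the sample attributes. A minimal NumPy sketch; the `chunks` and `targets` arrays here are synthetic stand-ins that mirror the summary (6 chunks, 6 'attf' and 6 'attm' each), not the real dataset:

```python
import numpy as np

# Synthetic stand-ins for dataset.sa.chunks and dataset.sa.targets:
# 6 chunks x 12 samples, 6 'attf' and 6 'attm' per chunk.
chunks = np.repeat(np.arange(6), 12)
targets = np.tile(np.repeat(np.array(['attf', 'attm']), 6), 6)

# Count samples of each target within each chunk to confirm balance.
for c in np.unique(chunks):
    labels, counts = np.unique(targets[chunks == c], return_counts=True)
    print(c, dict(zip(labels, counts)))
```

If any chunk were missing one of the labels, the corresponding count would be absent here, which is exactly the imbalance Yarik asks about below.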
On Mon, Jul 25, 2011 at 11:17 PM, Yaroslav Halchenko
<debian at onerussian.com> wrote:
> my blunt guess is that it is due to an unbalanced number of samples per
> chunk, i.e. some chunk having no samples of some label.
>
> Could you share
>
> print dataset.summary()
>
> ?
>
> and ideally, if more help needed (;-)), dataset itself (h5save it)
> so we could reproduce/fix it.
>
> Cheers
>
> On Mon, 25 Jul 2011, Zhen Zonglei wrote:
>
> > Hi guys,
>
> > I used the following code to do a GNB searchlight analysis:
>
> > # GNB searchlight analysis
> > dataset = alldata[timeselect, spatialselect]
> > sl = sphere_gnbsearchlight(GNB(), NFoldPartitioner(), radius=3,
> >                            postproc=mean_sample())
> > res = sl(dataset)
> > sphere_errors = res.samples[0]
>
> > But errors were reported (see below). The version I used is
> > PyMVPA 0.6.
>
> > In addition, with similar code, the general searchlight analysis
> > runs successfully.
>
> > # General searchlight analysis
> > cv = CrossValidation(LinearCSVMC(), NFoldPartitioner(),
> >                      errorfx=lambda p, t: np.mean(p == t),
> >                      enable_ca=['stats'])
> > sl = sphere_searchlight(cv, radius=3, postproc=mean_sample())
> > res = sl(dataset)
> > sphere_errors = res.samples[0]
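As an aside, the `errorfx=lambda p, t: np.mean(p == t)` passed above is really an accuracy function: the fraction of predictions matching targets. A standalone sketch with made-up prediction and target arrays (not taken from the real analysis):

```python
import numpy as np

# The errorfx used in the searchlight above: fraction of
# predictions (p) equal to targets (t).
accuracy = lambda p, t: np.mean(p == t)

# Made-up predictions and targets for illustration only.
p = np.array(['attf', 'attm', 'attm', 'attf'])
t = np.array(['attf', 'attm', 'attf', 'attf'])
print(accuracy(p, t))  # 0.75
```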
>
>
> > So, what happened in GNB searchlight analysis?
>
>
>
> > Zonglei Zhen
>
> > ----------------- IndexError from GNB searchlight analysis ------------------
>
> > In [2]: from gnbsearchlight import *
> > Warning: divide by zero encountered in log
> > ERROR: An unexpected error occurred while tokenizing input
> > The following traceback may be corrupted or invalid
> > The error message is: ('EOF in multi-line statement', (43, 0))
>
> > ---------------------------------------------------------------------------
> > IndexError                                Traceback (most recent call last)
>
> > //mystudy/code/python/<ipython console> in <module>()
> > //mystudy/code/python/gnbsearchlight.py in <module>()
> >      70 # GNB searchlight
> >      71 sl = sphere_gnbsearchlight(GNB(), NFoldPartitioner(), radius=3, postproc=mean_sample())
> > ---> 72 res = sl(dataset)
> >      73 sphere_errors = res.samples[0]
> >      74
>
> > /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/base/learner.pyc in __call__(self, ds)
> >     220                     "used and auto training is disabled."
> >     221                     % str(self))
> > --> 222         return super(Learner, self).__call__(ds)
> >     223
> >     224
>
> > /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/base/node.pyc in __call__(self, ds)
> >      74
> >      75         self._precall(ds)
> > ---> 76         result = self._call(ds)
> >      77         result = self._postcall(ds, result)
> >      78
>
> > /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/measures/searchlight.pyc in _call(self, dataset)
> >     108
> >     109         # pass to subclass
> > --> 110         results, roi_sizes = self._sl_call(dataset, roi_ids, nproc)
> >     111
> >     112         if not roi_sizes is None:
>
> > /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/measures/gnbsearchlight.pyc in _sl_call(self, dataset, roi_ids, nproc)
> >     459         norm_weight = -0.5 * np.log(2*np.pi*variances)
> >     460         # last added dimension would be for ROIs
> > --> 461         logpriors = np.log(priors[:, np.newaxis, np.newaxis])
> >     462
> >     463         if __debug__:
> >
> > IndexError: 0-d arrays can only use a single () or a list of newaxes (and a single ...) as an index
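The IndexError itself is easy to reproduce: it is NumPy refusing slice-style indexing of a 0-d array, which suggests `priors` somehow collapsed to a scalar. A minimal sketch (the exact error wording varies across NumPy versions):

```python
import numpy as np

# With a 1-d priors array, the indexing done in gnbsearchlight works
# and adds two trailing axes:
priors = np.array([0.5, 0.5])
print(priors[:, np.newaxis, np.newaxis].shape)  # (2, 1, 1)

# But a 0-d array (a scalar wrapped as an array) rejects the same
# index expression, which is what the traceback above reports:
scalar_priors = np.array(0.5)
try:
    scalar_priors[:, np.newaxis, np.newaxis]
except IndexError as e:
    print('IndexError:', e)
```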
>
> > _______________________________________________
> > Pkg-ExpPsy-PyMVPA mailing list
> > Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
> >
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa
>
>
> --
> =------------------------------------------------------------------=
> Keep in touch www.onerussian.com
> Yaroslav Halchenko www.ohloh.net/accounts/yarikoptic
>
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: att.gzipped.hdf5
Type: application/octet-stream
Size: 835065 bytes
Desc: not available
URL: <http://lists.alioth.debian.org/pipermail/pkg-exppsy-pymvpa/attachments/20110726/4cc9add5/attachment-0001.obj>