[pymvpa] IndexError from gnbsearchlight analysis
Yaroslav Halchenko
debian at onerussian.com
Mon Jul 25 15:17:21 UTC 2011
My blunt guess is that it is due to an unbalanced number of samples per
chunk, i.e. some data chunk having no samples of some label.
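A quick, untested sketch to check that, assuming your dataset carries the
usual 'targets' and 'chunks' sample attributes:

import numpy as np
# count samples of each target within each chunk; a zero count anywhere
# would support the unbalanced-chunks guess
targets = np.unique(dataset.sa.targets)
for chunk in np.unique(dataset.sa.chunks):
    in_chunk = dataset.sa.chunks == chunk
    print chunk, [(t, np.sum(dataset.sa.targets[in_chunk] == t)) for t in targets]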
Could you share
print dataset.summary()
?
and ideally, if more help is needed (;-)), the dataset itself (h5save it),
so we could reproduce/fix it.
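Something along these lines should do for dumping it (assuming h5py is
installed; the filename is just an example):

from mvpa.suite import h5save
# serialize the dataset to an HDF5 file that can be attached/shared
h5save('dataset_for_debugging.hdf5', dataset)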
Cheers
On Mon, 25 Jul 2011, Zhen Zonglei wrote:
> Hi, guys,
> I used the following code to do a GNB searchlight analysis:
> # GNB searchlight analysis
> dataset = alldata[timeselect,spatialslelect]
> sl = sphere_gnbsearchlight(GNB(),NFoldPartitioner(),radius=3,postproc=mean_sample())
> res = sl(dataset)
> sphere_errors = res.samples[0]
> But the errors below were reported. The version I used is PyMVPA 0.6.
> In addition, with similar code, a general searchlight analysis runs
> successfully.
> # General searchlight analysis
> cv = CrossValidation(LinearCSVMC(),NFoldPartitioner(),errorfx=lambda p, t: np.mean(p == t),enable_ca=['stats'])
> sl = sphere_searchlight(cv,radius=3,postproc=mean_sample())
> res = sl(dataset)
> sphere_errors = res.samples[0]
> So, what happened in the GNB searchlight analysis?
> Zonglei Zhen
> -----------------IndexError from GNB searchlight analysis------------------------
> In [2]: from gnbsearchlight import *
> Warning: divide by zero encountered in log
> ERROR: An unexpected error occurred while tokenizing input
> The following traceback may be corrupted or invalid
> The error message is: ('EOF in multi-line statement', (43, 0))
> ---------------------------------------------------------------------------
> IndexError                                Traceback (most recent call last)
> //mystudy/code/python/<ipython console> in <module>()
> //mystudy/code/python/gnbsearchlight.py in <module>()
>      70 #GNB searchlight
>      71 sl = sphere_gnbsearchlight(GNB(),NFoldPartitioner(),radius=3,postproc=mean_sample())
> ---> 72 res = sl(dataset)
>      73 sphere_errors = res.samples[0]
>      74
> /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/base/learner.pyc in __call__(self, ds)
>     220                              "used and auto training is disabled."
>     221                              % str(self))
> --> 222         return super(Learner, self).__call__(ds)
>     223
>     224
> /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/base/node.pyc in __call__(self, ds)
>      74
>      75         self._precall(ds)
> ---> 76         result = self._call(ds)
>      77         result = self._postcall(ds, result)
>      78
> /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/measures/searchlight.pyc in _call(self, dataset)
>     108
>     109         # pass to subclass
> --> 110         results, roi_sizes = self._sl_call(dataset, roi_ids, nproc)
>     111
>     112         if not roi_sizes is None:
> /usr/local/neurosoft/lib/python2.6/site-packages/mvpa/measures/gnbsearchlight.pyc in _sl_call(self, dataset, roi_ids, nproc)
>     459         norm_weight = -0.5 * np.log(2*np.pi*variances)
>     460         # last added dimension would be for ROIs
> --> 461         logpriors = np.log(priors[:, np.newaxis, np.newaxis])
>     462
>     463         if __debug__:
> IndexError: 0-d arrays can only use a single () or a list of newaxes (and a single ...) as an index
> _______________________________________________
> Pkg-ExpPsy-PyMVPA mailing list
> Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa
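For what it's worth, the IndexError at the bottom is what numpy raises when
that priors array ends up 0-d instead of holding one prior per label; a
minimal reproduction in plain numpy (not PyMVPA code):

import numpy as np
# one prior per label: the indexing used in GNB works, shape (2, 1, 1)
priors = np.array([0.5, 0.5])
print np.log(priors[:, np.newaxis, np.newaxis]).shape
# a 0-d "array of priors" triggers the same IndexError as in the traceback
priors0d = np.array(0.5)
try:
    priors0d[:, np.newaxis, np.newaxis]
except IndexError as e:
    print e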
--
=------------------------------------------------------------------=
Keep in touch www.onerussian.com
Yaroslav Halchenko www.ohloh.net/accounts/yarikoptic