[pymvpa] Returning trained classifiers generated during cross-validation
Tyson Aflalo
tyson.aflalo at gmail.com
Mon Jan 9 03:13:41 UTC 2012
No oddities. I just thought I would pass along some concrete usage that
might expose a mistake I was unaware of... nothing is worse than a
mistake that returns reasonable but incorrect results.
Thanks again
On Sun, Jan 8, 2012 at 7:03 PM, Yaroslav Halchenko <debian at onerussian.com>wrote:
>
> On Sun, 08 Jan 2012, Tyson Aflalo wrote:
> > I happen to be using libsvm, so I am attempting to use option 2. From
> > what I understand, SplitClassifier is a meta-classifier, so I can
> > simply feed my previous classifier to SplitClassifier and feed that to
> > CrossValidation. SplitClassifier then just provides a layer that can
> > save stuff out over the folds... I have a tenuous grasp, but hopefully
> > this is basically correct.
>
> seems to be 100% identical to my comprehension of that beast ;)
>
> > Can you glance at the couple of lines below to verify
> > that I am using SplitClassifier correctly?
>
> I think it looks all right -- have you spotted some oddity which led
> you to ask this question?
>
> > Thanks for the help!
>
> > baseclf = LinearCSVMC()
>
> > svdmapper = SVDMapper()
>
> > get_SVD_sliced = lambda x: ChainMapper([svdmapper,
> >                                         StaticFeatureSelection(x)])
>
> > metaclf = MappedClassifier(baseclf, get_SVD_sliced(slice(0, 15)))
>
> > sc = SplitClassifier(metaclf, enable_ca=['stats'])
>
> > cv = CrossValidation(sc, NFoldPartitioner(),
> >                      errorfx=mean_mismatch_error,
> >                      enable_ca=['stats', 'datasets'])
>
> > err = cv(ds)
>
> > # now to test the novel dataset on an example classifier
>
> > mean(sc.clfs[1].predict(ds2.samples) == ds2.targets)
>
> > On Sun, Jan 8, 2012 at 4:14 PM, Yaroslav Halchenko
> > <debian at onerussian.com> wrote:
>
> > there are 2 ways:
>
> > 1. [available only in mvpa2]
> >    any RepeatedMeasure (including CrossValidation) takes the argument
> >    'callback':
>
> >    callback : functor
> >      Optional callback to extract information from inside the main
> >      loop of the measure. The callback is called with the input
> >      'data', the 'node' instance that is evaluated repeatedly, and
> >      the 'result' of a single evaluation -- passed as named arguments
> >      (see labels in quotes) for every iteration, directly after
> >      evaluating the node.
>
> > so there you could access anything you care about in the 'node',
> > which is the classifier in this case
>
> > BUT because the same classifier instance gets reused through the
> > iterations, you can't just "store" the classifier. You can deepcopy
> > some of them, but the ones relying on SWIG-ed APIs, like libsvm,
> > would not be deepcopy-able.
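That reuse pitfall can be sketched with a toy stand-in (plain Python, not PyMVPA -- the ToyClassifier class here is hypothetical): because the same instance is retrained on every fold, a callback that merely stores a reference ends up with N aliases of the last fold's state, whereas deepcopy snapshots each fold:

```python
from copy import deepcopy

# Hypothetical stand-in for a classifier that gets reused across folds.
class ToyClassifier:
    def __init__(self):
        self.trained_on = None  # records which fold it was last trained on

    def train(self, fold):
        self.trained_on = fold

clf = ToyClassifier()
by_reference = []  # stores the same object repeatedly -- all aliases
by_deepcopy = []   # stores an independent snapshot per fold

for fold in range(3):  # stands in for the cross-validation loop
    clf.train(fold)
    by_reference.append(clf)
    by_deepcopy.append(deepcopy(clf))

print([c.trained_on for c in by_reference])  # [2, 2, 2] -- last fold only
print([c.trained_on for c in by_deepcopy])   # [0, 1, 2] -- one per fold
```

This is also why classifiers that cannot be deepcopied (e.g. those wrapping SWIG-ed APIs) are a problem for option 1: without the copy, every stored entry is the same object.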
>
> > 2. SplitClassifier
>
> >    That one behaves similarly to cross-validation (just access its
> >    .ca.stats to get the results of cross-validation), but it also
> >    operates on copies of the originally provided classifier, so you
> >    can access all of them via the .clfs attribute.
>
> > Helps?
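The idea behind option 2 can be sketched the same way (toy code, not the actual PyMVPA implementation -- both classes here are hypothetical): a SplitClassifier-like wrapper copies the base classifier before training it on each fold, so every trained copy stays reachable afterwards through a .clfs list:

```python
from copy import deepcopy

# Hypothetical stand-ins; the real SplitClassifier API differs.
class ToyClassifier:
    def __init__(self):
        self.trained_on = None

    def train(self, fold):
        self.trained_on = fold

class ToySplitClassifier:
    def __init__(self, base):
        self.base = base
        self.clfs = []  # one trained copy per fold, kept around

    def train(self, folds):
        for fold in folds:
            c = deepcopy(self.base)  # copy BEFORE training
            c.train(fold)
            self.clfs.append(c)

sc = ToySplitClassifier(ToyClassifier())
sc.train(range(3))
print([c.trained_on for c in sc.clfs])  # [0, 1, 2]
print(sc.base.trained_on)               # None -- the base stays untrained
```

Copying up front is the design choice that makes per-fold classifiers retrievable afterwards, which is exactly what the callback approach of option 1 cannot guarantee for non-copyable classifiers.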
> > On Sun, 08 Jan 2012, Tyson Aflalo wrote:
>
> > > Is there a means of accessing each trained classifier that is
> > > generated as part of a cross-validation analysis?
>
> > > Thanks,
>
> > > tyson
>
> > > _______________________________________________
> > > Pkg-ExpPsy-PyMVPA mailing list
> > > Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
> > > http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa
> --
> =------------------------------------------------------------------=
> Keep in touch www.onerussian.com
> Yaroslav Halchenko www.ohloh.net/accounts/yarikoptic
>