[pymvpa] Using FeatureSelectionClassifier for feature elimination

Yaroslav Halchenko debian at onerussian.com
Thu Jul 17 02:13:27 UTC 2008

> so when we call clf.train(dataset_1) where clf is a SplitClassifier,  
> it selects features and trains several classifiers using the splits?   
hm... not exactly. SplitClassifier just takes care of the splits. If its
slave classifier is a FeatureSelectionClassifier, then for each of those
splits it selects the features. So there is a clean separation:
SplitClassifier -- splits dataset_1 and trains a clone of the slave
classifier on each split
FeatureSelectionClassifier -- selects features first and then trains its
slave classifier on those features

If the slave classifier of the SplitClassifier is in turn a
FeatureSelectionClassifier, then indeed your sentence is correct ;-)
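The separation of concerns described above can be sketched in a few lines of plain Python. This is a toy illustration, not PyMVPA's actual implementation: the class names, the nearest-mean slave classifier, and the highest-variance selection criterion are all simplifications invented for this sketch; in PyMVPA the selection would typically come from something like RFE or a sensitivity-based selector.

```python
class NearestMeanClf:
    """Toy slave classifier: predicts the class whose feature-mean is closest."""
    def train(self, X, y):
        self.classes = sorted(set(y))
        self.means = {}
        for c in self.classes:
            rows = [x for x, lbl in zip(X, y) if lbl == c]
            self.means[c] = [sum(col) / len(rows) for col in zip(*rows)]

    def predict(self, X):
        def dist(a, b):
            return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return [min(self.classes, key=lambda c: dist(x, self.means[c]))
                for x in X]


class FeatureSelectionClf:
    """Meta-classifier: selects features first, then trains its slave on them."""
    def __init__(self, slave, nkeep=1):
        self.slave, self.nkeep = slave, nkeep

    def train(self, X, y):
        # Hypothetical criterion: keep the nkeep highest-variance features.
        def var(j):
            col = [x[j] for x in X]
            m = sum(col) / len(col)
            return sum((v - m) ** 2 for v in col)
        self.selected = sorted(range(len(X[0])), key=var)[-self.nkeep:]
        self.slave.train([[x[j] for j in self.selected] for x in X], y)

    def predict(self, X):
        # Only the features chosen during training are used at prediction time.
        return self.slave.predict([[x[j] for j in self.selected] for x in X])


class SplitClf:
    """Meta-classifier: splits the dataset, trains a slave clone per split."""
    def __init__(self, slave_factory, nsplits=2):
        self.slave_factory, self.nsplits = slave_factory, nsplits

    def train(self, X, y):
        self.clfs = []
        step = len(X) // self.nsplits
        for i in range(self.nsplits):
            lo = i * step
            hi = (i + 1) * step if i < self.nsplits - 1 else len(X)
            clf = self.slave_factory()     # fresh clone of the slave
            clf.train(X[lo:hi], y[lo:hi])  # feature selection (if any) runs
            self.clfs.append(clf)          # independently for each split

    def predict(self, X):
        # Majority vote across the per-split classifiers.
        votes = [clf.predict(X) for clf in self.clfs]
        return [max(set(col), key=list(col).count) for col in zip(*votes)]


# Features 0 and 1 are constant noise; feature 2 separates the classes,
# so each split's FeatureSelectionClf ends up selecting feature 2.
X = [[1.0, 2.0, 0.0], [1.0, 2.0, 10.0]] * 4
y = [0, 1] * 4
clf = SplitClf(lambda: FeatureSelectionClf(NearestMeanClf(), nkeep=1))
clf.train(X, y)
print(clf.predict(X))  # -> [0, 1, 0, 1, 0, 1, 0, 1]
```

Nesting the FeatureSelectionClf inside the SplitClf is exactly the configuration from the question: the splitting classifier clones its slave per split, and each clone performs its own feature selection before training.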

> When we then call clf.predict(dataset_test.samples), it uses only  
> those features selected during training?
yeap (once again, only if it is a FeatureSelectionClassifier; otherwise
it uses all features)
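In other words, the indices of the selected features are stored at training time, and predict() reduces each test sample to exactly those indices before handing it to the slave classifier. A minimal sketch (the `selected` indices here are hypothetical, standing in for whatever a selection procedure such as RFE would have produced):

```python
# Indices hypothetically chosen during training (e.g. by RFE).
selected = [0, 3, 7]

# Two test samples with the full set of 10 original features.
test_samples = [list(range(10)), list(range(10, 20))]

# predict() first reduces each test sample to the training-time selection...
reduced = [[s[j] for j in selected] for s in test_samples]
print(reduced)  # -> [[0, 3, 7], [10, 13, 17]]
# ...and only then calls the slave classifier's predict() on `reduced`.
```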

> Thanks for the email -- generally it cleared up almost all of my  
> questions.  I'd be happy to write up the docs for RFE based on what  
> you wrote, once I get a better hang of using the package.
sure -- feel free to ask more questions. I know that peculiarities such
as all those meta-classifiers are worth multiple clarifications ;)

Yaroslav Halchenko
Research Assistant, Psychology Department, Rutgers-Newark
Student  Ph.D. @ CS Dept. NJIT
Office: (973) 353-5440x263 | FWD: 82823 | Fax: (973) 353-1171
        101 Warren Str, Smith Hall, Rm 4-105, Newark NJ 07102
WWW:     http://www.linkedin.com/in/yarik        

More information about the Pkg-ExpPsy-PyMVPA mailing list