[pymvpa] Feature selection for multiple classifiers

Richard Dinga dinga92 at gmail.com
Tue Jul 29 22:24:02 UTC 2014


Hi all,

I have a question regarding feature selection when more than one
classifier is involved because there are more than two classes. If I
understand correctly, in a multi-class problem PyMVPA trains a
classifier for every possible pair of classes and decides the result by
vote. So if I select the 100 best voxels beforehand with an ANOVA, all
the classifiers are trained on the same subset, and that subset may not
contain informative voxels for every pair of classes. How can I set it
up so that every classifier chooses the best voxels for its own pair of
classes?
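To make concrete what I mean by "per-pair" selection, here is a plain-NumPy sketch (PyMVPA left out for brevity; the function names and the use of an ANOVA F-score as the ranking criterion are my own choices, not anything from the PyMVPA API): for every pair of classes, the F-scores are computed on that pair's samples only, so each pairwise classifier would get its own subset of voxels.

```python
import itertools
import numpy as np

def anova_f_scores(X, y):
    """One-way ANOVA F-score per feature (column of X) for labels y."""
    classes = np.unique(y)
    grand_mean = X.mean(axis=0)
    ss_between = np.zeros(X.shape[1])
    ss_within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        class_mean = Xc.mean(axis=0)
        ss_between += len(Xc) * (class_mean - grand_mean) ** 2
        ss_within += ((Xc - class_mean) ** 2).sum(axis=0)
    df_between = len(classes) - 1
    df_within = len(y) - len(classes)
    return (ss_between / df_between) / (ss_within / df_within)

def per_pair_selection(X, y, n_keep):
    """For every pair of classes, keep the n_keep features with the
    highest F-score computed on that pair's samples only."""
    selected = {}
    for a, b in itertools.combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        scores = anova_f_scores(X[mask], y[mask])
        selected[(a, b)] = np.argsort(scores)[::-1][:n_keep]
    return selected
```

Each pairwise classifier would then be trained only on `X[:, selected[(a, b)]]`, which is what I am hoping PyMVPA can do internally.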

And a related question: let's say I have 8 classes and I created a tree
like this:
clf = TreeClassifier(FeatureSelectionClassifier(LinearCSVMC(), fsel),
                     {'a': ((1,2), LinearCSVMC()),
                      'b': ((3,4), LinearCSVMC()),
                      'c': ((5,6), LinearCSVMC()),
                      'd': ((7,8), LinearCSVMC())})

Would the first classifier select the best voxels for dividing into 8
classes or into the 4 groups? And on which voxels would the secondary
classifiers be trained?
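What I would expect (or hope) the tree to do is sketched below in plain NumPy, without PyMVPA; the selection criterion here is just the spread of per-class means, a stand-in for whatever sensitivity `fsel` computes, and the grouping dict mirrors the tree above. The root selects features on the relabelled 4-group problem over all samples, and each second-level node selects its own features from its own two classes only.

```python
import numpy as np

def top_features(X, y, n_keep):
    """Rank features by the spread of per-class means -- a crude
    stand-in for the ANOVA sensitivity PyMVPA would use."""
    classes = np.unique(y)
    means = np.stack([X[y == c].mean(axis=0) for c in classes])
    score = means.max(axis=0) - means.min(axis=0)
    return np.argsort(score)[::-1][:n_keep]

# same grouping as in the TreeClassifier above
groups = {'a': (1, 2), 'b': (3, 4), 'c': (5, 6), 'd': (7, 8)}

def fit_tree_selection(X, y, n_keep):
    # Root: relabel every sample with its group and select features
    # that separate the four groups.
    y_group = np.array([next(g for g, cls in groups.items() if c in cls)
                        for c in y])
    root_feats = top_features(X, y_group, n_keep)
    # Second level: each node sees only its own two classes and makes
    # its own selection, independent of the root's choice.
    node_feats = {g: top_features(X[np.isin(y, cls)],
                                  y[np.isin(y, cls)], n_keep)
                  for g, cls in groups.items()}
    return root_feats, node_feats
```

Whether the `TreeClassifier` above actually behaves this way is exactly my question.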

Thank you,
Richard


More information about the Pkg-ExpPsy-PyMVPA mailing list