[pymvpa] PCA transformation prior to SVM classification

Yaroslav Halchenko debian at onerussian.com
Mon Nov 29 13:58:01 UTC 2010


On Mon, 29 Nov 2010, Jakob Scherer wrote:
> > actually it depends... e.g. if the underlying classifier's regularization is
> > invariant to the transformation (e.g. margin width), then yep -- there should
> > be no effect.  But if it is sensitive to it (e.g. feature selection, as in
> > SMLR), then you might get an advantage, since, as in the case of SMLR,
> > ending up with fewer important features might come with higher
> > generalization.
> A follow-up question: is the inverse also true -- can having fewer
> important features lead to higher generalization?
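
(a quick aside to make the quoted invariance point concrete -- a minimal
sketch, using scikit-learn and synthetic data purely as stand-ins for
PyMVPA and real data; LinearSVC, PCA, and all parameters below are
illustrative choices, not anything from this thread:)

    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Toy data: 50 features, only a handful informative.
    X, y = make_classification(n_samples=200, n_features=50,
                               n_informative=5, random_state=0)

    svm = LinearSVC(C=1.0, max_iter=10000)

    # Full-rank PCA (all 50 components kept, no whitening) is just
    # centering plus an orthogonal rotation; the L2-regularized margin
    # of a linear SVM is invariant to such a rotation, so the
    # cross-validated accuracy should stay essentially the same.
    rotated = make_pipeline(PCA(n_components=50, whiten=False),
                            LinearSVC(C=1.0, max_iter=10000))

    print("raw features:", cross_val_score(svm, X, y, cv=5).mean())
    print("after PCA   :", cross_val_score(rotated, X, y, cv=5).mean())

on such data the two printed scores should agree to within
cross-validation noise, while a sparsity-driven classifier (SMLR-like,
e.g. L1-penalized logistic regression) could genuinely change after the
rotation.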

As for the follow-up question -- if you are asking:

* "can having fewer important features among a bulk of irrelevant
  features..." -- then I guess the answer is "No"

* "can having fewer features (just the important ones)..." -- then the
  answer is "oh Yes" -- that is the goal of feature selection
  procedures: to distill the feature set so that only the important
  ones are left (sketched right below)
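
for that second case, a minimal sketch of the idea -- again scikit-learn
and synthetic data as stand-ins, with SelectKBest (univariate ANOVA
scores) playing the role of whatever feature selection procedure one
would actually use; none of these names come from this thread:

    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Few informative features buried among many irrelevant ones.
    X, y = make_classification(n_samples=100, n_features=500,
                               n_informative=5, n_redundant=0,
                               random_state=0)

    clf = LinearSVC(C=1.0, max_iter=10000)

    # The selection step lives inside the pipeline, so it is re-fit on
    # each training fold and never peeks at the test data -- otherwise
    # the generalization estimate would be optimistically biased.
    distilled = make_pipeline(SelectKBest(f_classif, k=5),
                              LinearSVC(C=1.0, max_iter=10000))

    print("all 500 features:", cross_val_score(clf, X, y, cv=5).mean())
    print("top 5 features  :",
          cross_val_score(distilled, X, y, cv=5).mean())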

or did I misunderstand entirely?
-- 
=------------------------------------------------------------------=
Keep in touch                                     www.onerussian.com
Yaroslav Halchenko                 www.ohloh.net/accounts/yarikoptic


