[pymvpa] getting simple regression working
Yaroslav Halchenko
debian at onerussian.com
Tue Feb 9 17:39:55 UTC 2010
The major "problem" was that dv was actually 2D, with a degenerate 2nd
dimension... so my version of your code is:
from mvpa.suite import *
import numpy as n

# independent variable: 5 samples, 1 feature
iv = n.random.normal(0, 1, (5, 1))
# dependent variable (labels) must be 1D, hence the squeeze()
dv = (2.0 * iv).squeeze()
mydata = Dataset(samples=iv, labels=dv)

# GPR with a linear kernel
cc = GPR(regression=True, kernel=KernelLinear())
cc.train(mydata)
print cc.predict(mydata.samples) - dv

# nu-SVR with an RBF kernel; regression=True is needed here too (see below)
cc = SVM(svm_impl='NU_SVR', kernel_type='rbf', regression=True)
cc.train(mydata)
print cc.predict(mydata.samples) - dv
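
To see why the squeeze() matters, here is a quick numpy-only illustration
of the shapes involved (nothing PyMVPA-specific, just mirroring the snippet
above):

import numpy as n
iv = n.random.normal(0, 1, (5, 1))
print (2.0 * iv).shape            # (5, 1) -- degenerate 2nd dimension
print (2.0 * iv).squeeze().shape  # (5,)   -- 1D vector suitable as labels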
Note that regression=True is needed for SVM as well, since by default in
0.4.x all learners are treated as classifiers (that was changed in 0.5,
where regressions are regressions ;))
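
If you want to sanity-check that you really got a regression and not an
implicit classification over the label values, one rough check (my own
suggestion, not from the original thread) is that predictions on new inputs
should not be restricted to the training label values:

# a classifier over the label values would only ever return one of dv's
# entries; a regression will generally produce values outside that set
new_iv = n.random.normal(0, 1, (3, 1))
preds = cc.predict(new_iv)
print preds
print [p in dv for p in preds]   # a classifier would give all True here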
P.S. Also, your snippet managed to hit yet another bug, which will get
fixed and pushed into maint/0.4 to be included in the 0.4.5 bugfix release ;)
On Tue, 09 Feb 2010, kimberg at mail.med.upenn.edu wrote:
> Incidentally, is there an easy way to retrieve the model parameters?
Unfortunately we do not have a simple uniform way (yet?), but for kernel
GPR the model is described by the original training samples (._train_fv)
and the ._alpha vector.
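
A minimal sketch of pulling those out after training, assuming the
underscored attributes named above are plain attributes set during train()
(they are internal, so names may differ between versions):

cc = GPR(regression=True, kernel=KernelLinear())
cc.train(mydata)
print cc._train_fv   # training samples the kernel model is anchored on
print cc._alpha      # weights over those samples; the predicted mean is
                     # roughly K(new_samples, _train_fv) dot _alpha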
-- 
                                  .-.
=------------------------------   /v\  ----------------------------=
Keep in touch                    // \\     (yoh@|www.)onerussian.com
Yaroslav Halchenko              /(   )\               ICQ#: 60653192
Linux User    ^^-^^    [175555]