[pymvpa] getting simple regression working

Emanuele Olivetti emanuele at relativita.com
Wed Feb 10 09:20:05 UTC 2010


kimberg at mail.med.upenn.edu wrote:
> ...
>
> My intuition is that this is pretty good for GPR with just five samples, and GPR does give me smaller errors (O(1e-10)) if I give it a lot more samples.  Simple linear regression should obviously produce even smaller errors for this degenerate case, but I couldn't find it in the list of classifiers.  Incidentally, is there an easy way to retrieve the model parameters?
>
>   

GPR with a linear kernel is linear regression. More precisely, it is
Bayesian linear regression, as shown in chapter 2.1 of
http://www.gaussianprocess.org/gpml/chapters/RW2.pdf
(the full book is at http://www.gaussianprocess.org/gpml/chapters/ ;
I recommend it).
When the prior is non-informative (i.e. sigma_p grows very large, see
LinearKernel in kernel.py) you get "standard" linear regression.
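For instance, here is a minimal NumPy sketch (not PyMVPA code; the toy
data and the gpr_linear_predict helper are just illustrative) showing
that the GPR posterior mean with a linear kernel approaches the
ordinary least-squares fit as sigma_p grows:

import numpy as np

# Toy 1-D regression data with a bias column (illustrative only).
rng = np.random.RandomState(0)
x = np.linspace(0.0, 1.0, 20)
y = 3.0 * x + 1.0 + 0.01 * rng.randn(x.size)
X = np.column_stack([x, np.ones_like(x)])   # features: [x, bias]

# Ordinary least-squares fit, for reference.
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# GP regression with linear kernel k(a, b) = sigma_p**2 * a.dot(b)
# and noise variance sigma_n**2 (Rasmussen & Williams, ch. 2.1).
def gpr_linear_predict(X_train, y_train, X_test, sigma_p, sigma_n=1e-3):
    K = sigma_p**2 * X_train @ X_train.T
    K_star = sigma_p**2 * X_test @ X_train.T
    alpha = np.linalg.solve(K + sigma_n**2 * np.eye(len(y_train)), y_train)
    return K_star @ alpha

X_test = np.column_stack([np.array([0.25, 0.75]), np.ones(2)])
for sigma_p in (0.1, 1.0, 100.0):
    pred = gpr_linear_predict(X, y, X_test, sigma_p)
    print(sigma_p, pred, X_test @ w_ols)

As sigma_p increases, the GPR predictions printed above converge to the
OLS predictions in the last column, which is the "non-informative prior"
limit mentioned above.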

But you are right: it would be nice if PyMVPA offered/exposed more of
the simple models alongside the complex ones, e.g. linear regression,
logistic regression, and so on.
I plan to add them, but as last year showed, I'm terribly swamped by
too many tasks.
Anyway, they are on my 2010 todo/wish list, and you are warmly invited
to participate! :-)

Best,

Emanuele



