[pymvpa] Optimising hyper parameters for Gaussian processes

Emanuele Olivetti emanuele at relativita.com
Fri Jun 3 08:37:54 UTC 2011


I worked on this problem long ago and attempted a solution within PyMVPA,
but my commitment was not enough to implement generic hyperparameter optimization,
so the GPR approach of minimizing the negative log marginal likelihood never got its proper place.
I'm the one to blame for this part of PyMVPA not being fully developed ;-)

Anyway, what was done is in mvpa/clfs/model_selector.py,
which is imported in gpr.py and used by the class GPRWeights(). My goal
at the time was exactly to implement a Python version of what is in the GPML
book. Yarik rearranged that part in later evolutions of PyMVPA, so my suggestion
is to dig into that part of the code. Yarik might want to add more comments on this.
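For reference, the GPML-style objective itself is straightforward to sketch directly with NumPy/SciPy, independently of PyMVPA. This is a minimal sketch, assuming a squared-exponential kernel with lengthscale, signal and noise parameters optimized in log space; all names here are illustrative and not PyMVPA's API:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(log_params, X, y):
    """Negative log marginal likelihood of a GP with an SE kernel.

    log_params = [log lengthscale, log signal std, log noise std].
    """
    ell, sf, sn = np.exp(log_params)
    # Squared-exponential (RBF) covariance matrix over the inputs.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = sf**2 * np.exp(-0.5 * d2 / ell**2)
    Ky = K + sn**2 * np.eye(len(X))
    # Cholesky-based evaluation of GPML eq. 5.8:
    #   -log p(y|X) = 0.5 y^T Ky^-1 y + sum_i log L_ii + (n/2) log 2*pi
    L = np.linalg.cholesky(Ky)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * y @ alpha
            + np.log(np.diag(L)).sum()
            + 0.5 * len(X) * np.log(2 * np.pi))

# Toy 1-D regression problem with known noise level 0.1.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

nll0 = neg_log_marginal_likelihood(np.zeros(3), X, y)
res = minimize(neg_log_marginal_likelihood, x0=np.zeros(3),
               args=(X, y), method='L-BFGS-B',
               bounds=[(-5.0, 5.0)] * 3)
ell, sf, sn = np.exp(res.x)
```

Gradient ascent on the analytic gradient (GPML eq. 5.9) would be faster, but L-BFGS-B with numerical gradients is usually enough for a handful of hyperparameters, and each optimization is an independent job, so parallelizing over machines is trivial.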

Unfortunately I have not had much opportunity to work on it since then.



On 05/31/2011 01:00 PM, Martin Fergie wrote:
> Hi,
> I've been looking into using PyMVPA recently for performing Gaussian process regression. 
> I can't seem to find a method from gpr.py or the examples of minimizing the log marginal 
> likelihood with respect to the kernel hyper parameters.
> Is there a recommended way of doing this? Or would I have to implement some sort of 
> wrapper to combine gpr.py with a gradient ascent routine?
> I currently use the GPML matlab package, however I'd like to replace it with a python 
> solution so I can easily parallelize training multiple Gaussian processes over multiple 
> machines.
> Thanks for your help,
> Martin
> _______________________________________________
> Pkg-ExpPsy-PyMVPA mailing list
> Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa

