[pymvpa] Optimising hyper parameters for Gaussian processes
Martin Fergie
mfergie at cs.man.ac.uk
Tue May 31 11:00:43 UTC 2011
Hi,
I've recently been looking into using PyMVPA for Gaussian process
regression. I can't seem to find a method in gpr.py, or among the examples,
for maximizing the log marginal likelihood with respect to the kernel
hyperparameters.
Is there a recommended way of doing this, or would I have to write some sort
of wrapper that combines gpr.py with a gradient ascent routine?
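For concreteness, here is roughly the kind of wrapper I have in mind: a
minimal sketch that fits the hyperparameters of a squared-exponential kernel
by minimizing the negative log marginal likelihood with SciPy's L-BFGS-B
optimizer. Everything below (sqexp_kernel, neg_log_marginal_likelihood, the
toy data) is my own NumPy/SciPy illustration, not code from gpr.py:

import numpy as np
from scipy.linalg import cholesky, cho_solve
from scipy.optimize import minimize


def sqexp_kernel(X1, X2, lengthscale, signal_var):
    """Squared-exponential (RBF) covariance between two sets of inputs."""
    sqdist = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
              - 2.0 * X1 @ X2.T)
    return signal_var * np.exp(-0.5 * sqdist / lengthscale**2)


def neg_log_marginal_likelihood(log_params, X, y):
    """-log p(y | X, theta) for a zero-mean GP with Gaussian noise."""
    lengthscale, signal_var, noise_var = np.exp(log_params)
    n = X.shape[0]
    K = sqexp_kernel(X, X, lengthscale, signal_var) + noise_var * np.eye(n)
    L = cholesky(K, lower=True)          # K = L L^T
    alpha = cho_solve((L, True), y)      # alpha = K^{-1} y
    return (0.5 * y @ alpha
            + np.sum(np.log(np.diag(L)))  # 0.5 * log|K|
            + 0.5 * n * np.log(2.0 * np.pi))


# Toy data: noisy sine observations.
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.randn(40)

# Optimize in log-space so the hyperparameters stay positive.
res = minimize(neg_log_marginal_likelihood, x0=np.zeros(3), args=(X, y),
               method='L-BFGS-B')
lengthscale, signal_var, noise_var = np.exp(res.x)
print("lengthscale=%.3f  signal_var=%.3f  noise_var=%.3f"
      % (lengthscale, signal_var, noise_var))

Supplying the analytic gradient would obviously be faster than the numerical
differences L-BFGS-B falls back on here, but something along these lines
would already let me drop the MATLAB dependency.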
I currently use the GPML MATLAB package, but I'd like to replace it with a
Python solution so that I can easily parallelize training multiple Gaussian
processes across multiple machines.
Thanks for your help,
Martin