[pymvpa] Optimising hyper parameters for Gaussian processes

Yaroslav Halchenko debian at onerussian.com
Fri Jun 3 12:57:42 UTC 2011

Thank you Emanuele, and sorry for not following up in a timely manner myself:

* due to unfinished refactoring, such model selection currently works only
  in the 0.4.x series

* indeed GPRWeights would be the example to follow; depending on your goal
  and on whether you need to tune the model-selection parameters (e.g.
  starting point, optimizer, etc.), you might want to simply try GPRWeights
  as a helper to obtain an optimized GPR instance. E.g. if you simply do

        # assuming the usual "from mvpa.suite import *" (0.4.x series)
        k = GeneralizedLinearKernel()
        clf = GPR(k, enable_states=['log_marginal_likelihood'])
        sa_ms = clf.getSensitivityAnalyzer(flavor='model_select')  # with model selection

  then your clf (== sa_ms.clf) would be the GPR with optimized
  hyperparameters.

On Fri, 03 Jun 2011, Emanuele Olivetti wrote:

>    Hi,
>    I worked on this problem long ago and attempted a solution within
>    PyMVPA, but my commitment was not enough to get generic hyperparameter
>    optimization in place, so the GPR way of maximizing the marginal
>    likelihood never found its proper home.
>    I'm the one to blame for this part of PyMVPA not being fully
>    developed ;-)
>    Anyway, what was done is in mvpa/clfs/model_selector.py, which is
>    imported in gpr.py and used by the class GPRWeights(). My goal at the
>    time was exactly to implement a Python version of what is in the GPML
>    book. Yarik rearranged that part in later evolutions of PyMVPA, so my
>    suggestion is to dig into that part of the code. Yarik might want to
>    add more comments on this.
>    Unfortunately I have not had much opportunity to work on it since that
>    attempt.
>    Best,
>    Emanuele
Keep in touch                                     www.onerussian.com
Yaroslav Halchenko                 www.ohloh.net/accounts/yarikoptic

More information about the Pkg-ExpPsy-PyMVPA mailing list