[pymvpa] WARNING: evil (stupid?) changes by yarik in yoh/master -- please review if you like them

Emanuele Olivetti emanuele at relativita.com
Mon May 26 10:52:15 UTC 2008


Yaroslav Halchenko wrote:
> Dear Co-developers,
>
> I've done another evil change and I have mixed feelings about it --
> maybe I don't foresee some problems/inflexibility. In two words: I
> think we should define all parameters of classifiers as 'Parameter'
> or 'KernelParameter', and subclass from Parametrized (for now it is
> a separate beast, but I think if you like it, we just get
> ParametrizedClassifier as the base class for all our non-meta
> classifiers).
>
> Pros:
>
>   
...
> * validation of values to be assigned to a parameter if min/max/choices
>   were specified, which later on could be used by 'OptimizedClassifier'
>   ;)
>
>   

I'm mostly interested in this aspect, as you can guess. In Gaussian
Process regression and classification (i.e., the part to which I'm
contributing right now) the optimization of hyperparameters is of the
greatest interest to me. Shortly I'll push some code on the
squared-exponential "ARD" kernel, which assigns a hyperparameter to
each feature/variable of the dataset. This means that finding
better/best hyperparameters amounts to weighting the features (or
sensitivity analysis, as you call it). A standard task in my job is
to compute the weight (i.e. the importance) of each voxel of fMRI
scans w.r.t. the set of stimuli given to a subject. Recently I have
been doing this with GPR-ARD. In order to exploit all available prior
knowledge (e.g. spatial relations between voxels), I'm trying to push
complex constraints on the kernel during the hyperparameter
optimization step. The optimization is then done by OpenOpt, which is
able to handle many kinds of constraints (all I have needed until
now). So anything that helps classifiers/regressors talk easily to
OpenOpt is very much appreciated :).
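
To make the ARD part concrete, here is roughly what such a kernel
looks like (a quick numpy sketch, not the actual code I will push;
all names below are mine):

import numpy as np

def sq_exp_ard(X, Y, sigma_f=1.0, lengthscales=None):
    """Squared-exponential ARD kernel with one length-scale per feature.

    X is (n, d), Y is (m, d); returns the (n, m) kernel matrix.
    """
    if lengthscales is None:
        lengthscales = np.ones(X.shape[1])
    Xs = X / lengthscales        # scale each feature by its length-scale
    Ys = Y / lengthscales
    # squared Euclidean distances between all pairs of scaled rows
    d2 = (Xs ** 2).sum(1)[:, None] + (Ys ** 2).sum(1)[None, :] \
         - 2.0 * np.dot(Xs, Ys.T)
    d2 = np.maximum(d2, 0.0)     # guard against tiny negative round-off
    return sigma_f ** 2 * np.exp(-0.5 * d2)

The point for feature weighting: a large length-scale effectively
switches the corresponding feature off, so e.g. 1/lengthscales**2
gives a natural relevance weight per feature once the hyperparameters
have been optimized.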

Currently I use simple constraints, like "this hyperparameter should
NOT be negative", which agrees with what you did. The next step is to
add constraints like "optimize this hyperparameter, but NOT that one,
which stays fixed", etc.; a sketch of what I mean follows.
When it comes to optimization/selection of a model, I think the
optimizer is the right place to express such constraints. But the
validation step that you suggested is useful as well.
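
To be sure we mean the same thing by validation, a minimal sketch of
it (names are hypothetical, not the actual PyMVPA Parameter API):

class Parameter(object):
    """Sketch of a parameter that validates assigned values."""

    def __init__(self, default, min=None, max=None, choices=None):
        self.min, self.max, self.choices = min, max, choices
        self.value = default            # the default gets validated too

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, v):
        if self.min is not None and v < self.min:
            raise ValueError("%r is below the minimum %r" % (v, self.min))
        if self.max is not None and v > self.max:
            raise ValueError("%r is above the maximum %r" % (v, self.max))
        if self.choices is not None and v not in self.choices:
            raise ValueError("%r is not among %r" % (v, self.choices))
        self._value = v

sigma = Parameter(1.0, min=0.0)         # a non-negative hyperparameter
sigma.value = 2.5                       # fine
# sigma.value = -1.0                    # would raise ValueError

An optimizer could then read min/max off the Parameter and turn them
into box constraints automatically.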

OK, this is not really an opinion on your new code, but just an
attempt to let you know what lies ahead of me and what is relevant to
this aspect of PyMVPA. I hope this helps the future development of
Parameter/Parametrized.


Bye,

Emanuele
