[pymvpa] one class again

Nick Oosterhof n.n.oosterhof at googlemail.com
Sun Feb 22 13:02:42 UTC 2015

On 20 Feb 2015, at 22:24, basile pinsard <basile.pinsard at gmail.com> wrote:

> I have questions regarding the one-class svm implementation:
> SVM(svm_impl='ONE_CLASS')
> when training on a dataset of small size (where there might not be any structure) the call to train often hangs forever.

It is possible that libSVM takes a very long time to converge [1]. 

> When I create a random dataset with the same dimensions, or if I replace the content of the sample array with random values, it runs smoothly.
> I checked in my dataset and there are no nans or infs.
> Could this be just the distribution of the data that causes this to hang?

Yes, if there is very little structure in the data, this could happen. You could use another classifier (e.g. LDA). 

Alternatively, it may be possible to change some of the parameters used for SVM (I have very little experience with this).
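As a rough illustration of the kind of parameter tweaking meant above: one option is to cap the number of optimizer iterations so that training cannot hang indefinitely on hard-to-separate data. The sketch below uses scikit-learn's OneClassSVM (which wraps the same libSVM solver) rather than PyMVPA's SVM wrapper, purely as a stand-in; the specific parameter values (nu, max_iter) are illustrative assumptions, not recommendations.

```python
import numpy as np
from sklearn.svm import OneClassSVM  # wraps the libSVM one-class solver

rng = np.random.RandomState(0)
X = rng.randn(20, 50)  # a small dataset: 20 samples, 50 features

# max_iter bounds the libSVM optimizer, so a non-converging fit stops
# instead of hanging; nu and gamma here are arbitrary example values.
clf = OneClassSVM(nu=0.5, kernel='rbf', gamma='scale', max_iter=10000)
clf.fit(X)

pred = clf.predict(X)  # +1 for inliers, -1 for outliers
```

With a bounded max_iter, a fit that would otherwise run forever raises a convergence warning and returns, which at least makes the problem visible rather than silent.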

> I know that this problem is due to LibSVM but I wanted to confirm that it is the latest version that is included in PyMVPA.

The version included in PyMVPA appears to be 2.89 [2], which is not the latest one; the current release is 3.20 [3].

[1] http://www.csie.ntu.edu.tw/~cjlin/libsvm/faq.html#f412
[2] https://github.com/PyMVPA/PyMVPA/tree/master/3rd/libsvm
[3] http://www.csie.ntu.edu.tw/~cjlin/libsvm/

More information about the Pkg-ExpPsy-PyMVPA mailing list