[pymvpa] classification based on individual parameter estimates from FSL

Meng Liang meng.liang at hotmail.co.uk
Fri Aug 8 10:49:51 UTC 2014


Hi David,
In your case, with contrasts defined as 1000, 0100, etc., each COPE should be identical to the corresponding PE, so it should not make any difference whether you use PEs or COPEs. But I don't really understand why you say the PEs would not be independent. Could you explain that a bit more?
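
Just to make that concrete, here is a tiny numpy sketch (the numbers are made up) showing that a contrast vector which simply picks out one regressor returns that regressor's PE as the COPE:

    import numpy as np

    # hypothetical PEs (betas) for 4 EVs at a single voxel
    pe = np.array([2.3, -0.7, 1.1, 0.4])

    # "identity" contrasts, i.e. 1000, 0100, 0010, 0001
    contrasts = np.eye(4)

    # each COPE is the contrast vector dotted with the PE vector
    copes = contrasts.dot(pe)
    assert np.allclose(copes, pe)   # the COPEs equal the PEs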
Best,
Meng

Date: Tue, 5 Aug 2014 16:40:39 +0100
From: d.soto.b at gmail.com
To: pkg-exppsy-pymvpa at lists.alioth.debian.org
Subject: Re: [pymvpa] classification based on individual parameter estimates from FSL

Hi Michael (and all), just a quick clarification on your previous response to my query about classification based on individual parameter estimates (PEs). You mentioned that I could use the PEs associated with the temporal derivative, or even the PEs associated with a set of basis functions. However, I wonder whether those PEs would really be independent (in the way that PEs obtained from different runs would be).

Would it be okay to use those PEs anyway?


A second, related point is that I have not been using the PEs exactly, but rather the contrasts of PEs (i.e. COPEs in FSL) associated with each EV. I have 16 EVs (8 per class), and hence obtained COPEs with contrasts such as

1000
0100
0010
0001
etc 


I don't see why it would make any difference to work with COPEs rather than PEs, except that only with the latter could I boost my dataset by using the temporal derivatives or basis functions....
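
For reference, loading either kind of image into PyMVPA looks the same; a minimal sketch, with hypothetical file names, mask, and label order:

    # one 3D image per EV (PE or COPE), e.g. cope1.nii.gz ... cope16.nii.gz
    from mvpa2.datasets.mri import fmri_dataset

    fnames = ['cope%d.nii.gz' % i for i in range(1, 17)]
    targets = ['a', 'b'] * 8        # 8 EVs per class; order is illustrative

    ds = fmri_dataset(samples=fnames, targets=targets,
                      mask='brain_mask.nii.gz')
    print(ds.shape)                 # (16, n_voxels): one sample per COPE/PE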


cheers
ds



On Fri, Jul 4, 2014 at 2:33 PM, Michael Hanke <mih at debian.org> wrote:

Hi,

On Tue, Jul 01, 2014 at 12:25:40AM +0100, David Soto wrote:
> Hi Michael, indeed ..well done for germany today! :).
> Thanks for the reply and the suggestion on KNN
> I should have been more clear that for each subject I have the
> following *block sequences*
> ababbaabbaabbaba in TASK 1
> ababbaabbaabbaba in TASK 2
>
> this explains that I have 8 a-betas and 8 b-betas for each task
> AND for each subject..so if i concatenate & normalize all the beta data
> across subjects I will have 8 x 19 (subjects) = 152 beta images for class a
> and the same for class b

Ah, I guess you model each task with two regressors (hrf + derivative?).
You can also use a basis function set and get even more betas...

> then could I use SVM searchlight trained to discriminate a from b in task1
> betas and tested in the task2 betas?

yes, no problem.
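
A rough sketch of how such a cross-task searchlight could look in PyMVPA, assuming a dataset ds (loaded as above) with sample attributes 'targets' ('a'/'b') and a hypothetical 'task' attribute marking task 1 vs task 2; the radius is illustrative:

    from mvpa2.clfs.svm import LinearCSVMC
    from mvpa2.generators.partition import HalfPartitioner
    from mvpa2.measures.base import CrossValidation
    from mvpa2.measures.searchlight import sphere_searchlight
    from mvpa2.misc.errorfx import mean_match_accuracy
    from mvpa2.mappers.fx import mean_sample

    clf = LinearCSVMC()
    # train on one task and test on the other (and vice versa),
    # splitting the samples by the 'task' attribute
    cv = CrossValidation(clf,
                         HalfPartitioner(attr='task'),
                         errorfx=mean_match_accuracy)
    sl = sphere_searchlight(cv, radius=3, postproc=mean_sample())
    res = sl(ds)   # one accuracy map, averaged across the two directions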



Cheers,

Michael

PS: Off to enjoy the quarter finals ... ;-)

--
Michael Hanke
http://mih.voxindeserto.de



--
http://www1.imperial.ac.uk/medicine/people/d.soto/




_______________________________________________
Pkg-ExpPsy-PyMVPA mailing list
Pkg-ExpPsy-PyMVPA at lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/pkg-exppsy-pymvpa

