[pymvpa] MemoryError

Yaroslav Halchenko debian at onerussian.com
Wed Jun 30 02:52:23 UTC 2010


ok -- quick update:

 As far as I can see, the issue boils down to a lack of support for
 such large data "files" in zlib (the compression library used by
 libnifti).  I am not sure how quickly it will get resolved, since a
 fix requires breaking the library's ABI (and maybe more).  I've
 emailed the authors and am waiting for a reply.
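
 A minimal workaround sketch in the meantime (the filename is
 hypothetical): decompress the .nii.gz once with Python's own gzip
 module, which streams large files without trouble, and read the plain
 .nii afterwards:

   import gzip
   import shutil

   # Stream-decompress in chunks so the full volume never sits in RAM
   with gzip.open('bold_large.nii.gz', 'rb') as src:
       with open('bold_large.nii', 'wb') as dst:
           shutil.copyfileobj(src, dst, length=16 * 1024 * 1024)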

 On the good side: there is a pure-Python reimplementation of pynifti
 by Matthew Brett and Michael Hanke (yes yes -- the same PyMVPA
 Michael).  It is slated to replace the current pynifti and is already
 supported in our development (0.5) branch of PyMVPA.  In my quick
 trials, nibabel (and the new mri_dataset) seems to handle large files
 without any problem ;)
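
 For illustration, a quick sketch of such a trial (the filename is
 hypothetical, and the dataset-constructor import in the comment is
 recalled from memory, so treat it as an assumption):

   import nibabel as nib

   # nibabel handles the gzip layer in pure Python, so large .nii.gz
   # files are not a problem
   img = nib.load('bold_large.nii.gz')
   print(img.shape)            # reads the header only, cheap
   data = img.get_fdata()      # pulls the full array into memory

   # In the 0.5 branch something along these lines builds a dataset:
   #   from mvpa.datasets.mri import fmri_dataset
   #   ds = fmri_dataset('bold_large.nii.gz')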

 Both nibabel and a development snapshot of PyMVPA are available from
 our NeuroDebian repository if you are using a Debian-derived system,
 so you could give them a try if you desperately need to work with
 large compressed NIfTI files.  Be warned, though, that the development
 version of PyMVPA, although quite usable and somewhat stable, is still
 in a fluid state and its API might change (not to mention that it is
 already different from the 0.4.x versions of PyMVPA).  You can use
 dev.pymvpa.org as the ultimate source of documentation and
 inspiration.
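
 Once installed, a quick sanity check that the intended versions are
 picked up (nibabel exposes __version__; whether the mvpa module does
 is an assumption, hence the guarded lookup):

   import nibabel
   import mvpa

   print(nibabel.__version__)
   print(getattr(mvpa, '__version__', 'unknown'))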

With best regards,
Yarik

On Sun, 27 Jun 2010, McKell Carter wrote:

> Wow, very quick work tracking it down! When I load the uncompressed
> file I'm able to access the full array. I'm running a couple of tests
> with PyMVPA now to make sure, but that seems like a good way to get
> things working for now. Please do let me know of any updates.

> Cheers,
> McKell
-- 
                                  .-.
=------------------------------   /v\  ----------------------------=
Keep in touch                    // \\     (yoh@|www.)onerussian.com
Yaroslav Halchenko              /(   )\               ICQ#: 60653192
                   Linux User    ^^-^^    [175555]