Bug#944192: python3-h5py: 'import h5py' produces Open MPI message to stderr
Thibaut Paumard
thibaut at debian.org
Thu Nov 7 11:59:03 GMT 2019
Dear Jamie,
On 06/11/2019 at 19:15, Jameson Graef Rollins wrote:
> On Wed, Nov 06 2019, Thibaut Paumard <thibaut at debian.org> wrote:
>> I believe:
>> - the issue is not very serious, as it will not prevent your code from
>> running fine and efficiently (it's only an informative message).
>
> It's true that it does not make it completely unusable, but it's
> certainly not usable in any command line application without hacking the
> import to suppress this message, which is actually quite problematic.
> So I don't think we can simply dismiss this issue...
The warning contains a hint on how to suppress it:
"NOTE: You can disable this warning by setting the MCA parameter
btl_base_warn_component_unused to 0."
There are various ways to do it, such as setting the corresponding
environment variable, e.g. in bash:
export OMPI_MCA_btl_base_warn_component_unused=0
That could be done from within Python as well.
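For instance (an untested sketch; it assumes the variable only needs to
be in the environment before h5py, and thus Open MPI, is first
imported):

    import os

    # Set the MCA parameter via its environment variable *before* the
    # import below, since Open MPI reads MCA parameters from the
    # environment when it is initialized.
    os.environ["OMPI_MCA_btl_base_warn_component_unused"] = "0"

    import h5py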
More options to set an MCA parameter here:
https://www.open-mpi.org/faq/?category=tuning#setting-mca-params
>> - it is not in python3-h5py but somewhere in the MPI stack (I have no
>> time right now for investigating, but I have seen this recently in
>> another MPI code: gyoto).
>
> It sounds like this issue should then be forwarded to whichever
> underlying library is producing the message. It's not ok for libraries
> to print warnings to stderr/out, unless they can be easily suppressed by
> the caller. But this message can only be suppressed from Python with
> some very nonstandard redirection of stderr at import time.
As shown above, the message can be suppressed easily.
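For comparison, the stderr redirection mentioned above would have to
happen at the file-descriptor level, since the warning is printed by
the Open MPI C code directly to fd 2 rather than through sys.stderr.
An untested sketch of what that would look like (clearly more involved
than setting the environment variable):

    import os

    # Temporarily point fd 2 (stderr) at /dev/null around the import,
    # then restore it.
    saved_fd = os.dup(2)
    devnull = os.open(os.devnull, os.O_WRONLY)
    os.dup2(devnull, 2)
    try:
        import h5py
    finally:
        os.dup2(saved_fd, 2)
        os.close(devnull)
        os.close(saved_fd)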
>> - downgrading only solves your issue because version 2.8.0 was not
>> linked with MPI (and therefore may be less efficient, depending on
>> hardware).
>
> Well, in fact, importing the MPI version (2.10.0) is about 7x slower
> than the non-MPI version (2.8.0), taking almost a full second for just
> the import, so not really a clear performance improvement at all:
I will let the h5py maintainer comment on that, but I assume there's a
judgment call to be made between enabling MPI parallelization for
high-performance computing and optimizing the serial use case.
In any case, I'm just passing on some information I know; I'm not an
active developer of h5py, libhdf5 or Open MPI.
Regards, Thibaut.