Bug#966999: python-hdf5storage: FTBFS: File "/usr/lib/python3/dist-packages/numpydoc/docscrape.py", line 324, in _parse_see_also ; raise ParseError("%s is not a item name" % line)

Lucas Nussbaum lucas at debian.org
Mon Aug 3 10:05:48 BST 2020


Source: python-hdf5storage
Version: 0.1.15-2
Severity: serious
Justification: FTBFS on amd64
Tags: bullseye sid ftbfs
Usertags: ftbfs-20200802 ftbfs-bullseye

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build
> I: pybuild base:217: /usr/bin/python3 setup.py build 
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_hdf5storage/build/hdf5storage
> copying hdf5storage/utilities.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_hdf5storage/build/hdf5storage
> copying hdf5storage/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_hdf5storage/build/hdf5storage
> copying hdf5storage/Marshallers.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_hdf5storage/build/hdf5storage
> copying hdf5storage/lowlevel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.8_hdf5storage/build/hdf5storage
> PYTHONPATH=. http_proxy='127.0.0.1:9' \
> 	python3 -m sphinx -N -b html doc/source build/html
> Running Sphinx v2.4.3
> making output directory... done
> WARNING: html_static_path entry '_static' does not exist
> loading intersphinx inventory from /usr/share/doc/python3/html/objects.inv...
> loading intersphinx inventory from /usr/share/doc/python-numpy-doc/html/objects.inv...
> loading intersphinx inventory from /usr/share/doc/python-scipy-doc/html/objects.inv...
> loading intersphinx inventory from /usr/share/doc/python-h5py-doc/html/objects.inv...
> [autosummary] generating autosummary for: api.rst, compression.rst, development.rst, hdf5storage.Marshallers.rst, hdf5storage.lowlevel.rst, hdf5storage.rst, hdf5storage.utilities.rst, index.rst, information.rst, introduction.rst, storage_format.rst
> building [mo]: targets for 0 po files that are out of date
> building [html]: targets for 11 source files that are out of date
> updating environment: [new config] 11 added, 0 changed, 0 removed
> reading sources... [  9%] api
> reading sources... [ 18%] compression
> reading sources... [ 27%] development
> reading sources... [ 36%] hdf5storage
> WARNING: [numpydoc] While processing docstring for 'hdf5storage.Options.compression_algorithm'
> 
> Exception occurred:
>   File "/usr/lib/python3/dist-packages/numpydoc/docscrape.py", line 324, in _parse_see_also
>     raise ParseError("%s is not a item name" % line)
> numpydoc.docscrape.ParseError: http://www.hdfgroup.org/doc_resource/SZIP/Commercial_szip.html is not a item name in "Algorithm to use for compression.\n\n{'gzip', 'lzf', 'szip'}\n\nCompression algorithm to use When the ``compress`` option is set\nand a python object is larger than ``compress_size_threshold``.\n``'gzip'`` is the only MATLAB compatible option.\n\n``'gzip'`` is also known as the Deflate algorithm, which is the\ndefault compression algorithm of ZIP files and is a common\ncompression algorithm used on tarballs. It is the most\ncompatible option. It has good compression and is reasonably\nfast. Its compression level is set with the\n``gzip_compression_level`` option, which is an integer between 0\nand 9 inclusive.\n\n``'lzf'`` is a very fast but low to moderate compression\nalgorithm. It is less commonly used than gzip/Deflate, but\ndoesn't have any patent or license issues.\n\n``'szip'`` is a compression algorithm that has some patents and\nlicense restrictions. It is not always available.\n\nSee Also\n--------\ncompress\ncompress_size_threshold\nh5py.Group.create_dataset\nhttp://www.hdfgroup.org/doc_resource/SZIP/Commercial_szip.html\n\n"
> The full traceback has been saved in /tmp/sphinx-err-j401k2uv.log, if you want to report the issue to the developers.
> Please also report this if it was a user error, so that a better error message can be provided next time.
> A bug report can be filed in the tracker at <https://github.com/sphinx-doc/sphinx/issues>. Thanks!
> make[1]: *** [debian/rules:13: override_dh_auto_build] Error 2
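For context on the failure: newer numpydoc versions only accept *item names* (dotted identifiers, optionally followed by a description) in a ``See Also`` section, so the bare URL in the ``compression_algorithm`` docstring is rejected at doc-build time. The sketch below illustrates the pattern and one plausible upstream fix (moving the URL into a ``References`` section); the ``see_also_entries``/``is_item_name`` helpers are naive approximations written for this report, not numpydoc's actual parser:

```python
# Broken pattern from hdf5storage.Options.compression_algorithm:
# numpydoc's See Also parser expects ``name`` or ``name : description``,
# so a bare URL raises ParseError.
broken_see_also = """\
See Also
--------
compress
compress_size_threshold
h5py.Group.create_dataset
http://www.hdfgroup.org/doc_resource/SZIP/Commercial_szip.html
"""

# One possible fix: keep only item names in See Also and move the URL
# into a References section (or into the prose body).
fixed_docstring = """\
See Also
--------
compress
compress_size_threshold
h5py.Group.create_dataset

References
----------
.. [1] http://www.hdfgroup.org/doc_resource/SZIP/Commercial_szip.html
"""

def see_also_entries(doc):
    """Collect the entry lines of a See Also section (naive helper)."""
    lines = doc.splitlines()
    start = lines.index("See Also") + 2  # skip heading and underline
    entries = []
    for line in lines[start:]:
        if not line.strip():
            break  # section ends at the first blank line
        entries.append(line.strip())
    return entries

def is_item_name(entry):
    """Rough stand-in for numpydoc's check: a dotted identifier,
    optionally followed by ' : description'."""
    name = entry.split(" : ", 1)[0].strip()
    return all(part.isidentifier() for part in name.split("."))

print(all(is_item_name(e) for e in see_also_entries(broken_see_also)))   # False
print(all(is_item_name(e) for e in see_also_entries(fixed_docstring)))   # True
```

A patch along these lines (dropping or relocating the URL) in the package's docstrings should let the Sphinx/numpydoc build proceed.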

The full build log is available from:
   http://qa-logs.debian.net/2020/08/02/python-hdf5storage_0.1.15-2_unstable.log

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

About the archive rebuild: The rebuild was done on EC2 VM instances from
Amazon Web Services, using a clean, minimal and up-to-date chroot. Every
failed build was retried once to eliminate random failures.



More information about the debian-science-maintainers mailing list