[med-svn] [Git][med-team/nibabel][upstream] New upstream version 5.2.1
Étienne Mollier (@emollier)
gitlab@salsa.debian.org
Wed Feb 28 13:15:49 GMT 2024
Étienne Mollier pushed to branch upstream at Debian Med / nibabel
Commits:
4fe9f580 by Étienne Mollier at 2024-02-28T13:46:06+01:00
New upstream version 5.2.1
- - - - -
22 changed files:
- .git_archival.txt
- .github/workflows/test.yml
- .gitmodules
- Changelog
- doc/source/conf.py
- nibabel/gifti/gifti.py
- nibabel/gifti/parse_gifti_fast.py
- + nibabel/gifti/tests/data/ascii_flat_data.gii
- nibabel/gifti/tests/test_parse_gifti_fast.py
- nibabel/nicom/dicomwrappers.py
- nibabel/nicom/tests/test_dicomwrappers.py
- nibabel/pkg_info.py
- nibabel/testing/__init__.py
- nibabel/tests/test_image_api.py
- nibabel/tests/test_image_load_save.py
- nibabel/tests/test_loadsave.py
- nibabel/tests/test_onetime.py
- nibabel/tests/test_orientations.py
- nibabel/tests/test_spatialimages.py
- pyproject.toml
- + tools/markdown_release_notes.py
- tox.ini
Changes:
=====================================
.git_archival.txt
=====================================
@@ -1,4 +1,4 @@
-node: 70795b063c48c2a04edbfcb2e97d5429b4bc31c3
-node-date: 2023-12-11T14:48:26-05:00
-describe-name: 5.2.0
-ref-names: tag: 5.2.0, refs/pull/1278/head, maint/5.2.x
+node: 1df3b610e6e501d6aa000a8076ec23a21701dafe
+node-date: 2024-02-26T22:49:46-05:00
+describe-name: 5.2.1
+ref-names: tag: 5.2.1, maint/5.2.x
=====================================
.github/workflows/test.yml
=====================================
@@ -44,7 +44,7 @@ jobs:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- - uses: actions/setup-python@v4
+ - uses: actions/setup-python@v5
with:
python-version: 3
- run: pip install --upgrade build twine
@@ -54,12 +54,12 @@ jobs:
- name: Build git archive
run: mkdir archive && git archive -v -o archive/nibabel-archive.tgz HEAD
- name: Upload sdist and wheel artifacts
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: dist
path: dist/
- name: Upload git archive artifact
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: archive
path: archive/
@@ -73,17 +73,17 @@ jobs:
steps:
- name: Download sdist and wheel artifacts
if: matrix.package != 'archive'
- uses: actions/download-artifact@v3
+ uses: actions/download-artifact@v4
with:
name: dist
path: dist/
- name: Download git archive artifact
if: matrix.package == 'archive'
- uses: actions/download-artifact@v3
+ uses: actions/download-artifact@v4
with:
name: archive
path: archive/
- - uses: actions/setup-python@v4
+ - uses: actions/setup-python@v5
with:
python-version: 3
- name: Display Python version
@@ -147,7 +147,7 @@ jobs:
submodules: recursive
fetch-depth: 0
- name: Set up Python ${{ matrix.python-version }}
- uses: actions/setup-python@v4
+ uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
architecture: ${{ matrix.architecture }}
@@ -162,14 +162,15 @@ jobs:
run: tox c
- name: Run tox
run: tox -v --exit-and-dump-after 1200
- - uses: codecov/codecov-action@v3
+ - uses: codecov/codecov-action@v4
if: ${{ always() }}
with:
files: cov.xml
+ token: ${{ secrets.CODECOV_TOKEN }}
- name: Upload pytest test results
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
- name: pytest-results-${{ matrix.os }}-${{ matrix.python-version }}
+ name: pytest-results-${{ matrix.os }}-${{ matrix.python-version }}-${{ matrix.dependencies }}-${{ matrix.architecture }}
path: test-results.xml
if: ${{ always() }}
@@ -183,7 +184,7 @@ jobs:
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
- uses: actions/setup-python@v4
+ uses: actions/setup-python@v5
with:
python-version: 3
- name: Display Python version
@@ -204,7 +205,7 @@ jobs:
id-token: write
if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags/')
steps:
- - uses: actions/download-artifact@v3
+ - uses: actions/download-artifact@v4
with:
name: dist
path: dist/
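
Note: codecov-action v4 switched to the Codecov CLI uploader, which generally needs a repository upload token, hence the new secrets.CODECOV_TOKEN input alongside the version bump. The longer artifact name reflects that upload-artifact v4 makes artifacts immutable, so each matrix job must now upload under a unique name.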
=====================================
.gitmodules
=====================================
@@ -19,3 +19,6 @@
[submodule "nibabel-data/nitest-dicom"]
path = nibabel-data/nitest-dicom
url = https://github.com/effigies/nitest-dicom
+[submodule "nibabel-data/dcm_qa_xa30"]
+ path = nibabel-data/dcm_qa_xa30
+ url = https://github.com/neurolabusc/dcm_qa_xa30.git
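
Note: the new test data can be fetched locally with the standard submodule invocation, git submodule update --init nibabel-data/dcm_qa_xa30; the Siemens XA30 DICOM test added further below is skipped when these data are absent.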
=====================================
Changelog
=====================================
@@ -25,6 +25,28 @@ Eric Larson (EL), Demian Wassermann, Stephan Gerhard and Ross Markello (RM).
References like "pr/298" refer to github pull request numbers.
+5.2.1 (Monday 26 February 2024)
+===============================
+
+Bug-fix release in the 5.2.x series.
+
+Enhancements
+------------
+* Support "flat" ASCII-encoded GIFTI DataArrays (pr/1298) (PM, reviewed by CM)
+
+Bug fixes
+---------
+* Tolerate missing ``git`` when reporting version info (pr/1286) (CM, reviewed by
+ Yuri Victorovich)
+* Handle Siemens XA30 derived DWI DICOMs (pr/1296) (CM, reviewed by YOH and
+ Mathias Goncalves)
+
+Maintenance
+-----------
+* Add tool for generating GitHub-friendly release notes (pr/1284) (CM)
+* Accommodate pytest 8 changes (pr/1297) (CM)
+
+
5.2.0 (Monday 11 December 2023)
===============================
@@ -36,15 +58,15 @@ tested up to Python 3.12 and NumPy 1.26.
New features
------------
* Add generic :class:`~nibabel.pointset.Pointset` and regularly spaced
- :class:`~nibabel.pointset.NDGrid` data structures in preparation for coordinate
+ :class:`~nibabel.pointset.Grid` data structures in preparation for coordinate
transformation and resampling (pr/1251) (CM, reviewed by Oscar Esteban)
Enhancements
------------
* Add :meth:`~nibabel.arrayproxy.ArrayProxy.copy` method to
:class:`~nibabel.arrayproxy.ArrayProxy` (pr/1255) (CM, reviewed by Paul McCarthy)
-* Permit :meth:`~nibabel.xmlutils.XmlSerializable.to_xml` to pass keyword
- arguments to :meth:`~xml.etree.ElementTree.ElementTree.tostring` (pr/1258)
+* Permit :meth:`~nibabel.xmlutils.XmlSerializable.to_xml` methods to pass keyword
+ arguments to :func:`xml.etree.ElementTree.tostring` (pr/1258)
(CM)
* Allow user expansion (e.g., ``~/...``) in strings passed to functions that
accept paths (pr/1260) (Reinder Vos de Wael, reviewed by CM)
@@ -54,7 +76,7 @@ Enhancements
``affine=None`` argument (pr/1253) (Blake Dewey, reviewed by CM)
* Warn on invalid MINC2 spacing declarations, treat as missing (pr/1237)
(Peter Suter, reviewed by CM)
-* Refactor :func:`~nibabel.nicom.utils.find_private_element` for improved
+* Refactor :func:`~nibabel.nicom.utils.find_private_section` for improved
readability and maintainability (pr/1228) (MB, reviewed by CM)
Bug fixes
=====================================
doc/source/conf.py
=====================================
@@ -280,7 +280,12 @@ latex_documents = [('index', 'nibabel.tex', 'NiBabel Documentation', 'NiBabel Au
# Example configuration for intersphinx: refer to the Python standard library.
-intersphinx_mapping = {'https://docs.python.org/3/': None}
+intersphinx_mapping = {
+ 'python': ('https://docs.python.org/3', None),
+ 'numpy': ('https://numpy.org/doc/stable', None),
+ 'scipy': ('https://docs.scipy.org/doc/scipy', None),
+ 'matplotlib': ('https://matplotlib.org/stable', None),
+}
# Config of plot_directive
plot_include_source = True
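
Note: newer Sphinx releases expect named intersphinx_mapping entries rather than the bare-URL form, and the new numpy, scipy, and matplotlib inventories let cross-reference roles in the docs resolve against upstream documentation.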
=====================================
nibabel/gifti/gifti.py
=====================================
@@ -745,7 +745,7 @@ class GiftiImage(xml.XmlSerializable, SerializableImage):
>>> triangles_2 = surf_img.agg_data('triangle')
>>> triangles_3 = surf_img.agg_data(1009) # Numeric code for pointset
>>> print(np.array2string(triangles))
- [0 1 2]
+ [[0 1 2]]
>>> np.array_equal(triangles, triangles_2)
True
>>> np.array_equal(triangles, triangles_3)
=====================================
nibabel/gifti/parse_gifti_fast.py
=====================================
@@ -68,17 +68,21 @@ def read_data_block(darray, fname, data, mmap):
if mmap is True:
mmap = 'c'
enclabel = gifti_encoding_codes.label[darray.encoding]
- dtype = data_type_codes.type[darray.datatype]
+ if enclabel not in ('ASCII', 'B64BIN', 'B64GZ', 'External'):
+ raise GiftiParseError(f'Unknown encoding {darray.encoding}')
+
+ # Encode the endianness in the dtype
+ byteorder = gifti_endian_codes.byteorder[darray.endian]
+ dtype = data_type_codes.dtype[darray.datatype].newbyteorder(byteorder)
+
+ shape = tuple(darray.dims)
+ order = array_index_order_codes.npcode[darray.ind_ord]
+
+ # GIFTI_ENCODING_ASCII
if enclabel == 'ASCII':
- # GIFTI_ENCODING_ASCII
- c = StringIO(data)
- da = np.loadtxt(c, dtype=dtype)
- return da # independent of the endianness
- elif enclabel not in ('B64BIN', 'B64GZ', 'External'):
- return 0
-
- # GIFTI_ENCODING_EXTBIN
+ return np.loadtxt(StringIO(data), dtype=dtype, ndmin=1).reshape(shape, order=order)
+
# We assume that the external data file is raw uncompressed binary, with
# the data type/endianness/ordering specified by the other DataArray
# attributes
@@ -94,12 +98,13 @@ def read_data_block(darray, fname, data, mmap):
newarr = None
if mmap:
try:
- newarr = np.memmap(
+ return np.memmap(
ext_fname,
dtype=dtype,
mode=mmap,
offset=darray.ext_offset,
- shape=tuple(darray.dims),
+ shape=shape,
+ order=order,
)
# If the memmap fails, we ignore the error and load the data into
# memory below
@@ -107,13 +112,12 @@ def read_data_block(darray, fname, data, mmap):
pass
# mmap=False or np.memmap failed
if newarr is None:
- # We can replace this with a call to np.fromfile in numpy>=1.17,
- # as an "offset" parameter was added in that version.
- with open(ext_fname, 'rb') as f:
- f.seek(darray.ext_offset)
- nbytes = np.prod(darray.dims) * dtype().itemsize
- buff = f.read(nbytes)
- newarr = np.frombuffer(buff, dtype=dtype)
+ return np.fromfile(
+ ext_fname,
+ dtype=dtype,
+ count=np.prod(darray.dims),
+ offset=darray.ext_offset,
+ ).reshape(shape, order=order)
# Numpy arrays created from bytes objects are read-only.
# Neither b64decode nor decompress will return bytearrays, and there
@@ -121,26 +125,14 @@ def read_data_block(darray, fname, data, mmap):
# is not a simple way to avoid making copies.
# If this becomes a problem, we should write a decoding interface with
# a tunable chunk size.
+ dec = base64.b64decode(data.encode('ascii'))
+ if enclabel == 'B64BIN':
+ buff = bytearray(dec)
else:
- dec = base64.b64decode(data.encode('ascii'))
- if enclabel == 'B64BIN':
- # GIFTI_ENCODING_B64BIN
- buff = bytearray(dec)
- else:
- # GIFTI_ENCODING_B64GZ
- buff = bytearray(zlib.decompress(dec))
- del dec
- newarr = np.frombuffer(buff, dtype=dtype)
-
- sh = tuple(darray.dims)
- if len(newarr.shape) != len(sh):
- newarr = newarr.reshape(sh, order=array_index_order_codes.npcode[darray.ind_ord])
-
- # check if we need to byteswap
- required_byteorder = gifti_endian_codes.byteorder[darray.endian]
- if required_byteorder in ('big', 'little') and required_byteorder != sys.byteorder:
- newarr = newarr.byteswap()
- return newarr
+ # GIFTI_ENCODING_B64GZ
+ buff = bytearray(zlib.decompress(dec))
+ del dec
+ return np.frombuffer(buff, dtype=dtype).reshape(shape, order=order)
def _str2int(in_str):
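
The refactor above computes the dtype (with endianness folded in), shape, and index order once, then applies them to every encoding; previously the ASCII branch returned whatever shape np.loadtxt produced, so a flat single-row array came back 1-D. A minimal sketch of the new ASCII path, with stand-in values in place of the DataArray attributes:

    from io import StringIO

    import numpy as np

    data = '0 1 2'           # flat ASCII payload, one row of a declared 2-D array
    dtype = np.dtype('<i4')  # datatype with byte order already applied
    shape = (1, 3)           # tuple(darray.dims)
    order = 'C'              # RowMajorOrder

    # ndmin=1 guarantees at least a 1-D result, so the reshape always succeeds
    arr = np.loadtxt(StringIO(data), dtype=dtype, ndmin=1).reshape(shape, order=order)
    print(arr)               # [[0 1 2]]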
=====================================
nibabel/gifti/tests/data/ascii_flat_data.gii
=====================================
@@ -0,0 +1,76 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE GIFTI SYSTEM "http://www.nitrc.org/frs/download.php/115/gifti.dtd">
+<GIFTI Version="1.0" NumberOfDataArrays="2">
+ <MetaData>
+ <MD>
+ <Name><![CDATA[Caret-Version]]></Name>
+ <Value><![CDATA[5.512]]></Value>
+ </MD>
+ <MD>
+ <Name><![CDATA[date]]></Name>
+ <Value><![CDATA[Thu Dec 27 14:27:43 2007]]></Value>
+ </MD>
+ <MD>
+ <Name><![CDATA[encoding]]></Name>
+ <Value><![CDATA[XML]]></Value>
+ </MD>
+ </MetaData>
+ <LabelTable/>
+ <DataArray Intent="NIFTI_INTENT_POINTSET"
+ DataType="NIFTI_TYPE_FLOAT32"
+ ArrayIndexingOrder="RowMajorOrder"
+ Dimensionality="2"
+ Dim0="10"
+ Dim1="3"
+ Encoding="ASCII"
+ Endian="LittleEndian"
+ ExternalFileName=""
+ ExternalFileOffset="">
+ <MetaData>
+ <MD>
+ <Name><![CDATA[AnatomicalStructurePrimary]]></Name>
+ <Value><![CDATA[CortexLeft]]></Value>
+ </MD>
+ <MD>
+ <Name><![CDATA[AnatomicalStructureSecondary]]></Name>
+ <Value><![CDATA[Pial]]></Value>
+ </MD>
+ <MD>
+ <Name><![CDATA[GeometricType]]></Name>
+ <Value><![CDATA[Anatomical]]></Value>
+ </MD>
+ <MD>
+ <Name><![CDATA[UniqueID]]></Name>
+ <Value><![CDATA[{70e032e9-4123-47ee-965d-5b29107cbd83}]]></Value>
+ </MD>
+ </MetaData>
+ <CoordinateSystemTransformMatrix>
+ <DataSpace><![CDATA[NIFTI_XFORM_TALAIRACH]]></DataSpace>
+ <TransformedSpace><![CDATA[NIFTI_XFORM_TALAIRACH]]></TransformedSpace>
+ <MatrixData>1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0.000000 0.000000 1.000000</MatrixData>
+ </CoordinateSystemTransformMatrix>
+ <Data>155.17539978 135.58103943 98.30715179 140.33973694 190.0491333 73.24776459 157.3598938 196.97969055 83.65809631 171.46174622 137.43661499 78.4709549 148.54592896 97.06752777 65.96373749 123.45701599 111.46841431 66.3571167 135.30892944 202.28720093 36.38148499 178.28155518 162.59469604 37.75128937 178.11087036 115.28820038 57.17986679 142.81582642 82.82115173 31.02205276</Data>
+ </DataArray>
+ <DataArray Intent="NIFTI_INTENT_TRIANGLE"
+ DataType="NIFTI_TYPE_INT32"
+ ArrayIndexingOrder="RowMajorOrder"
+ Dimensionality="2"
+ Dim0="10"
+ Dim1="3"
+ Encoding="ASCII"
+ Endian="LittleEndian"
+ ExternalFileName=""
+ ExternalFileOffset="">
+ <MetaData>
+ <MD>
+ <Name><![CDATA[TopologicalType]]></Name>
+ <Value><![CDATA[CLOSED]]></Value>
+ </MD>
+ <MD>
+ <Name><![CDATA[UniqueID]]></Name>
+ <Value><![CDATA[{747d8015-455b-43ad-82ac-dcfb7606004a}]]></Value>
+ </MD>
+ </MetaData>
+ <Data>6402 17923 25602 14085 25602 17923 25602 14085 4483 17923 1602 14085 4483 25603 25602 25604 25602 25603 25602 25604 6402 25603 3525 25604 1123 17922 12168 25604 12168 17922 </Data>
+ </DataArray>
+</GIFTI>
=====================================
nibabel/gifti/tests/test_parse_gifti_fast.py
=====================================
@@ -39,9 +39,19 @@ DATA_FILE4 = pjoin(IO_DATA_PATH, 'rh.shape.curv.gii')
DATA_FILE5 = pjoin(IO_DATA_PATH, 'base64bin.gii')
DATA_FILE6 = pjoin(IO_DATA_PATH, 'rh.aparc.annot.gii')
DATA_FILE7 = pjoin(IO_DATA_PATH, 'external.gii')
-
-datafiles = [DATA_FILE1, DATA_FILE2, DATA_FILE3, DATA_FILE4, DATA_FILE5, DATA_FILE6, DATA_FILE7]
-numDA = [2, 1, 1, 1, 2, 1, 2]
+DATA_FILE8 = pjoin(IO_DATA_PATH, 'ascii_flat_data.gii')
+
+datafiles = [
+ DATA_FILE1,
+ DATA_FILE2,
+ DATA_FILE3,
+ DATA_FILE4,
+ DATA_FILE5,
+ DATA_FILE6,
+ DATA_FILE7,
+ DATA_FILE8,
+]
+numDA = [2, 1, 1, 1, 2, 1, 2, 2]
DATA_FILE1_darr1 = np.array(
[
@@ -50,7 +60,7 @@ DATA_FILE1_darr1 = np.array(
[-17.614349, -65.401642, 21.071466],
]
)
-DATA_FILE1_darr2 = np.array([0, 1, 2])
+DATA_FILE1_darr2 = np.array([[0, 1, 2]])
DATA_FILE2_darr1 = np.array(
[
@@ -152,6 +162,10 @@ DATA_FILE7_darr2 = np.array(
dtype=np.int32,
)
+DATA_FILE8_darr1 = np.copy(DATA_FILE5_darr1)
+
+DATA_FILE8_darr2 = np.copy(DATA_FILE5_darr2)
+
def assert_default_types(loaded):
default = loaded.__class__()
@@ -448,3 +462,9 @@ def test_load_compressed():
img7 = load(fn)
assert_array_almost_equal(img7.darrays[0].data, DATA_FILE7_darr1)
assert_array_almost_equal(img7.darrays[1].data, DATA_FILE7_darr2)
+
+
+def test_load_flat_ascii_data():
+ img = load(DATA_FILE8)
+ assert_array_almost_equal(img.darrays[0].data, DATA_FILE8_darr1)
+ assert_array_almost_equal(img.darrays[1].data, DATA_FILE8_darr2)
=====================================
nibabel/nicom/dicomwrappers.py
=====================================
@@ -509,11 +509,14 @@ class MultiframeWrapper(Wrapper):
if hasattr(first_frame, 'get') and first_frame.get([0x18, 0x9117]):
# DWI image may include derived isotropic, ADC or trace volume
try:
- self.frames = pydicom.Sequence(
+ anisotropic = pydicom.Sequence(
frame
for frame in self.frames
if frame.MRDiffusionSequence[0].DiffusionDirectionality != 'ISOTROPIC'
)
+ # Image contains DWI volumes followed by derived images; remove derived images
+ if len(anisotropic) != 0:
+ self.frames = anisotropic
except IndexError:
# Sequence tag is found but missing items!
raise WrapperError('Diffusion file missing information')
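
The new length guard is the substance of the XA30 fix: a standalone trace series is entirely ISOTROPIC, so filtering unconditionally would delete every frame. A rough sketch of the selection rule, abstracted away from pydicom (the directionality attribute is a stand-in for frame.MRDiffusionSequence[0].DiffusionDirectionality):

    def select_frames(frames):
        # Derived ADC/trace volumes are tagged ISOTROPIC; prefer the
        # directional DWI frames when any are present.
        anisotropic = [f for f in frames if f.directionality != 'ISOTROPIC']
        # A standalone trace series has no directional frames at all;
        # keep it intact rather than returning an empty list.
        return anisotropic if anisotropic else list(frames)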
=====================================
nibabel/nicom/tests/test_dicomwrappers.py
=====================================
@@ -35,6 +35,11 @@ DATA_FILE_4D = pjoin(IO_DATA_PATH, '4d_multiframe_test.dcm')
DATA_FILE_EMPTY_ST = pjoin(IO_DATA_PATH, 'slicethickness_empty_string.dcm')
DATA_FILE_4D_DERIVED = pjoin(get_nibabel_data(), 'nitest-dicom', '4d_multiframe_with_derived.dcm')
DATA_FILE_CT = pjoin(get_nibabel_data(), 'nitest-dicom', 'siemens_ct_header_csa.dcm')
+DATA_FILE_SIEMENS_TRACE = pjoin(
+ get_nibabel_data(),
+ 'dcm_qa_xa30',
+ 'In/20_DWI_dir80_AP/0001_1.3.12.2.1107.5.2.43.67093.2022071112140611403312307.dcm',
+)
# This affine from our converted image was shown to match our image spatially
# with an image from SPM DICOM conversion. We checked the matching with SPM
@@ -656,6 +661,13 @@ class TestMultiFrameWrapper(TestCase):
with pytest.warns(UserWarning, match='Derived images found and removed'):
assert dw.image_shape == (96, 96, 60, 33)
+ @dicom_test
+ @needs_nibabel_data('dcm_qa_xa30')
+ def test_data_trace(self):
+ # Test that a standalone trace volume is found and not dropped
+ dw = didw.wrapper_from_file(DATA_FILE_SIEMENS_TRACE)
+ assert dw.image_shape == (72, 72, 39, 1)
+
@dicom_test
@needs_nibabel_data('nitest-dicom')
def test_data_unreadable_private_headers(self):
=====================================
nibabel/pkg_info.py
=====================================
@@ -1,6 +1,7 @@
from __future__ import annotations
import sys
+from contextlib import suppress
from subprocess import run
from packaging.version import Version
@@ -11,7 +12,7 @@ except ImportError:
__version__ = '0+unknown'
-COMMIT_HASH = '70795b063c'
+COMMIT_HASH = '1df3b610e6'
def _cmp(a: Version, b: Version) -> int:
@@ -102,14 +103,16 @@ def pkg_commit_hash(pkg_path: str | None = None) -> tuple[str, str]:
ver = Version(__version__)
if ver.local is not None and ver.local.startswith('g'):
return 'installation', ver.local[1:8]
- # maybe we are in a repository
- proc = run(
- ('git', 'rev-parse', '--short', 'HEAD'),
- capture_output=True,
- cwd=pkg_path,
- )
- if proc.stdout:
- return 'repository', proc.stdout.decode().strip()
+ # maybe we are in a repository, but consider that we may not have git
+ with suppress(FileNotFoundError):
+ proc = run(
+ ('git', 'rev-parse', '--short', 'HEAD'),
+ capture_output=True,
+ cwd=pkg_path,
+ )
+ if proc.stdout:
+ return 'repository', proc.stdout.decode().strip()
+
return '(none found)', '<not found>'
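
If git is not installed, subprocess.run() raises FileNotFoundError before producing any output, and contextlib.suppress turns that into a quiet fall-through to the '(none found)' return. The pattern in isolation, as a hypothetical helper:

    from contextlib import suppress
    from subprocess import run

    def short_commit_hash(path='.'):
        # A missing `git` binary raises FileNotFoundError; suppress()
        # skips the rest of the block instead of propagating it.
        with suppress(FileNotFoundError):
            proc = run(('git', 'rev-parse', '--short', 'HEAD'),
                       capture_output=True, cwd=path)
            if proc.stdout:
                return proc.stdout.decode().strip()
        return None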
=====================================
nibabel/testing/__init__.py
=====================================
@@ -233,3 +233,15 @@ def expires(version):
return lambda x: x
return pytest.mark.xfail(raises=ExpiredDeprecationError)
+
+
+def deprecated_to(version):
+ """Context manager to expect DeprecationWarnings until a given version"""
+ from packaging.version import Version
+
+ from nibabel import __version__ as nbver
+
+ if Version(nbver) < Version(version):
+ return pytest.deprecated_call()
+
+ return nullcontext()
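
deprecated_to() refines the blanket pytest.deprecated_call() used in the tests below: the DeprecationWarning is only demanded while the installed nibabel predates the given version; at or past it the context is a no-op and the @expires xfail absorbs the ExpiredDeprecationError from the removed API. A sketch of the intended usage, with a hypothetical warning function standing in for a deprecated nibabel call:

    import warnings

    from nibabel.testing import deprecated_to, expires

    def old_helper():
        # Stand-in for a deprecated API that still emits a warning.
        warnings.warn('old_helper is deprecated', DeprecationWarning, stacklevel=2)
        return 42

    @expires('5.0.0')  # xfail(raises=ExpiredDeprecationError) once expired
    def test_old_helper():
        # Warning required only while nibabel < 5.0.0; nullcontext afterwards.
        with deprecated_to('5.0.0'):
            assert old_helper() == 42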
=====================================
nibabel/tests/test_image_api.py
=====================================
@@ -48,6 +48,7 @@ from nibabel.testing import (
bytesio_filemap,
bytesio_round_trip,
clear_and_catch_warnings,
+ deprecated_to,
expires,
nullcontext,
)
@@ -80,10 +81,6 @@ from .test_minc2 import EXAMPLE_IMAGES as MINC2_EXAMPLE_IMAGES
from .test_parrec import EXAMPLE_IMAGES as PARREC_EXAMPLE_IMAGES
-def maybe_deprecated(meth_name):
- return pytest.deprecated_call() if meth_name == 'get_data' else nullcontext()
-
-
class GenericImageAPI(ValidateAPI):
"""General image validation API"""
@@ -194,7 +191,7 @@ class GenericImageAPI(ValidateAPI):
@expires('5.0.0')
def validate_get_data_deprecated(self, imaker, params):
img = imaker()
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
data = img.get_data()
assert_array_equal(np.asanyarray(img.dataobj), data)
@@ -246,14 +243,12 @@ class DataInterfaceMixin(GetSetDtypeMixin):
self._check_array_interface(imaker, meth_name)
method = getattr(img, meth_name)
# Data shape is same as image shape
- with maybe_deprecated(meth_name):
- assert img.shape == method().shape
+ assert img.shape == method().shape
# Data ndim is same as image ndim
- with maybe_deprecated(meth_name):
- assert img.ndim == method().ndim
+ assert img.ndim == method().ndim
# Values to get_data caching parameter must be 'fill' or
# 'unchanged'
- with maybe_deprecated(meth_name), pytest.raises(ValueError):
+ with pytest.raises(ValueError):
method(caching='something')
# dataobj is read only
fake_data = np.zeros(img.shape, dtype=img.get_data_dtype())
@@ -277,13 +272,11 @@ class DataInterfaceMixin(GetSetDtypeMixin):
assert not img.in_memory
# Load with caching='unchanged'
method = getattr(img, meth_name)
- with maybe_deprecated(meth_name):
- data = method(caching='unchanged')
+ data = method(caching='unchanged')
# Still not cached
assert not img.in_memory
# Default load, does caching
- with maybe_deprecated(meth_name):
- data = method()
+ data = method()
# Data now cached. in_memory is True if either of the get_data
# or get_fdata caches are not-None
assert img.in_memory
@@ -295,36 +288,30 @@ class DataInterfaceMixin(GetSetDtypeMixin):
# integers, but lets assume that's not true here.
assert_array_equal(proxy_data, data)
# Now caching='unchanged' does nothing, returns cached version
- with maybe_deprecated(meth_name):
- data_again = method(caching='unchanged')
+ data_again = method(caching='unchanged')
assert data is data_again
# caching='fill' does nothing because the cache is already full
- with maybe_deprecated(meth_name):
- data_yet_again = method(caching='fill')
+ data_yet_again = method(caching='fill')
assert data is data_yet_again
# changing array data does not change proxy data, or reloaded
# data
data[:] = 42
assert_array_equal(proxy_data, proxy_copy)
assert_array_equal(np.asarray(img.dataobj), proxy_copy)
- # It does change the result of get_data
- with maybe_deprecated(meth_name):
- assert_array_equal(method(), 42)
+ # It does change the result of get_fdata
+ assert_array_equal(method(), 42)
# until we uncache
img.uncache()
# Which unsets in_memory
assert not img.in_memory
- with maybe_deprecated(meth_name):
- assert_array_equal(method(), proxy_copy)
+ assert_array_equal(method(), proxy_copy)
# Check caching='fill' does cache data
img = imaker()
method = getattr(img, meth_name)
assert not img.in_memory
- with maybe_deprecated(meth_name):
- data = method(caching='fill')
+ data = method(caching='fill')
assert img.in_memory
- with maybe_deprecated(meth_name):
- data_again = method()
+ data_again = method()
assert data is data_again
# Check that caching refreshes for new floating point type.
img.uncache()
@@ -368,8 +355,7 @@ class DataInterfaceMixin(GetSetDtypeMixin):
get_data_func = method if caching is None else partial(method, caching=caching)
assert isinstance(img.dataobj, np.ndarray)
assert img.in_memory
- with maybe_deprecated(meth_name):
- data = get_data_func()
+ data = get_data_func()
# Returned data same object as underlying dataobj if using
# old ``get_data`` method, or using newer ``get_fdata``
# method, where original array was float64.
@@ -377,8 +363,7 @@ class DataInterfaceMixin(GetSetDtypeMixin):
dataobj_is_data = arr_dtype == np.float64 or method == img.get_data
# Set something to the output array.
data[:] = 42
- with maybe_deprecated(meth_name):
- get_result_changed = np.all(get_data_func() == 42)
+ get_result_changed = np.all(get_data_func() == 42)
assert get_result_changed == (dataobj_is_data or caching != 'unchanged')
if dataobj_is_data:
assert data is img.dataobj
@@ -387,15 +372,13 @@ class DataInterfaceMixin(GetSetDtypeMixin):
assert_array_equal(np.asarray(img.dataobj), 42)
# Uncache has no effect
img.uncache()
- with maybe_deprecated(meth_name):
- assert_array_equal(get_data_func(), 42)
+ assert_array_equal(get_data_func(), 42)
else:
assert not data is img.dataobj
assert not np.all(np.asarray(img.dataobj) == 42)
# Uncache does have an effect
img.uncache()
- with maybe_deprecated(meth_name):
- assert not np.all(get_data_func() == 42)
+ assert not np.all(get_data_func() == 42)
# in_memory is always true for array images, regardless of
# cache state.
img.uncache()
@@ -408,8 +391,7 @@ class DataInterfaceMixin(GetSetDtypeMixin):
if arr_dtype not in float_types:
return
for float_type in float_types:
- with maybe_deprecated(meth_name):
- data = get_data_func(dtype=float_type)
+ data = get_data_func(dtype=float_type)
assert (data is img.dataobj) == (arr_dtype == float_type)
def validate_shape(self, imaker, params):
=====================================
nibabel/tests/test_image_load_save.py
=====================================
@@ -40,7 +40,7 @@ from .. import spm2analyze as spm2
from .. import spm99analyze as spm99
from ..optpkg import optional_package
from ..spatialimages import SpatialImage
-from ..testing import expires
+from ..testing import deprecated_to, expires
from ..tmpdirs import InTemporaryDirectory
from ..volumeutils import native_code, swapped_code
@@ -285,7 +285,7 @@ def test_filename_save():
@expires('5.0.0')
def test_guessed_image_type():
# Test whether we can guess the image type from example files
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert nils.guessed_image_type(pjoin(DATA_PATH, 'example4d.nii.gz')) == Nifti1Image
assert nils.guessed_image_type(pjoin(DATA_PATH, 'nifti1.hdr')) == Nifti1Pair
assert nils.guessed_image_type(pjoin(DATA_PATH, 'example_nifti2.nii.gz')) == Nifti2Image
=====================================
nibabel/tests/test_loadsave.py
=====================================
@@ -21,7 +21,7 @@ from ..filebasedimages import ImageFileError
from ..loadsave import _signature_matches_extension, load, read_img_data
from ..openers import Opener
from ..optpkg import optional_package
-from ..testing import expires
+from ..testing import deprecated_to, expires
from ..tmpdirs import InTemporaryDirectory
_, have_scipy, _ = optional_package('scipy')
@@ -50,14 +50,14 @@ def test_read_img_data():
fpath = pathlib.Path(fpath)
img = load(fpath)
data = img.get_fdata()
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
data2 = read_img_data(img)
assert_array_equal(data, data2)
# These examples have null scaling - assert prefer=unscaled is the same
dao = img.dataobj
if hasattr(dao, 'slope') and hasattr(img.header, 'raw_data_from_fileobj'):
assert (dao.slope, dao.inter) == (1, 0)
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(read_img_data(img, prefer='unscaled'), data)
# Assert all caps filename works as well
with TemporaryDirectory() as tmpdir:
@@ -140,21 +140,21 @@ def test_read_img_data_nifti():
img = img_class(data, np.eye(4))
img.set_data_dtype(out_dtype)
# No filemap => error
- with pytest.deprecated_call(), pytest.raises(ImageFileError):
+ with deprecated_to('5.0.0'), pytest.raises(ImageFileError):
read_img_data(img)
# Make a filemap
froot = f'an_image_{i}'
img.file_map = img.filespec_to_file_map(froot)
# Trying to read from this filemap will generate an error because
# we are going to read from files that do not exist
- with pytest.deprecated_call(), pytest.raises(OSError):
+ with deprecated_to('5.0.0'), pytest.raises(OSError):
read_img_data(img)
img.to_file_map()
# Load - now the scaling and offset correctly applied
img_fname = img.file_map['image'].filename
img_back = load(img_fname)
data_back = img_back.get_fdata()
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(data_back, read_img_data(img_back))
# This is the same as if we loaded the image and header separately
hdr_fname = img.file_map['header'].filename if 'header' in img.file_map else img_fname
@@ -166,16 +166,16 @@ def test_read_img_data_nifti():
# Unscaled is the same as returned from raw_data_from_fileobj
with open(img_fname, 'rb') as fobj:
unscaled_back = hdr_back.raw_data_from_fileobj(fobj)
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(unscaled_back, read_img_data(img_back, prefer='unscaled'))
# If we futz with the scaling in the header, the result changes
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(data_back, read_img_data(img_back))
has_inter = hdr_back.has_data_intercept
old_slope = hdr_back['scl_slope']
old_inter = hdr_back['scl_inter'] if has_inter else 0
est_unscaled = (data_back - old_inter) / old_slope
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
actual_unscaled = read_img_data(img_back, prefer='unscaled')
assert_almost_equal(est_unscaled, actual_unscaled)
img_back.header['scl_slope'] = 2.1
@@ -185,10 +185,10 @@ def test_read_img_data_nifti():
else:
new_inter = 0
# scaled scaling comes from new parameters in header
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert np.allclose(actual_unscaled * 2.1 + new_inter, read_img_data(img_back))
# Unscaled array didn't change
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(actual_unscaled, read_img_data(img_back, prefer='unscaled'))
# Check the offset too
img.header.set_data_offset(1024)
@@ -200,14 +200,14 @@ def test_read_img_data_nifti():
fobj.write(b'\x00\x00')
img_back = load(img_fname)
data_back = img_back.get_fdata()
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(data_back, read_img_data(img_back))
img_back.header.set_data_offset(1026)
# Check we pick up new offset
exp_offset = np.zeros((data.size,), data.dtype) + old_inter
exp_offset[:-1] = np.ravel(data_back, order='F')[1:]
exp_offset = np.reshape(exp_offset, shape, order='F')
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert_array_equal(exp_offset, read_img_data(img_back))
# Delete stuff that might hold onto file references
del img, img_back, data_back
=====================================
nibabel/tests/test_onetime.py
=====================================
@@ -1,12 +1,12 @@
import pytest
from nibabel.onetime import auto_attr, setattr_on_read
-from nibabel.testing import expires
+from nibabel.testing import deprecated_to, expires
@expires('5.0.0')
def test_setattr_on_read():
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
class MagicProp:
@setattr_on_read
=====================================
nibabel/tests/test_orientations.py
=====================================
@@ -26,7 +26,7 @@ from ..orientations import (
ornt2axcodes,
ornt_transform,
)
-from ..testing import expires
+from ..testing import deprecated_to, expires
IN_ARRS = [
np.eye(4),
@@ -407,6 +407,6 @@ def test_inv_ornt_aff():
def test_flip_axis_deprecation():
a = np.arange(24).reshape((2, 3, 4))
axis = 1
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
a_flipped = flip_axis(a, axis)
assert_array_equal(a_flipped, np.flip(a, axis))
=====================================
nibabel/tests/test_spatialimages.py
=====================================
@@ -18,7 +18,7 @@ from numpy.testing import assert_array_almost_equal
from .. import load as top_load
from ..imageclasses import spatial_axes_first
from ..spatialimages import HeaderDataError, SpatialHeader, SpatialImage
-from ..testing import bytesio_round_trip, expires, memmap_after_ufunc
+from ..testing import bytesio_round_trip, deprecated_to, expires, memmap_after_ufunc
from ..tmpdirs import InTemporaryDirectory
@@ -368,7 +368,7 @@ class TestSpatialImage:
in_data = in_data_template.copy()
img = img_klass(in_data, None)
assert in_data is img.dataobj
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
out_data = img.get_data()
assert in_data is out_data
# and that uncache has no effect
@@ -381,18 +381,18 @@ class TestSpatialImage:
rt_img = bytesio_round_trip(img)
assert in_data is not rt_img.dataobj
assert (rt_img.dataobj == in_data).all()
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
out_data = rt_img.get_data()
assert (out_data == in_data).all()
assert rt_img.dataobj is not out_data
# cache
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert rt_img.get_data() is out_data
out_data[:] = 42
rt_img.uncache()
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert rt_img.get_data() is not out_data
- with pytest.deprecated_call():
+ with deprecated_to('5.0.0'):
assert (rt_img.get_data() == in_data).all()
def test_slicer(self):
=====================================
pyproject.toml
=====================================
@@ -26,6 +26,7 @@ classifiers = [
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
+ "Programming Language :: Python :: 3.12",
"Topic :: Scientific/Engineering",
]
# Version from setuptools_scm
=====================================
tools/markdown_release_notes.py
=====================================
@@ -0,0 +1,94 @@
+#!/usr/bin/env python
+import re
+import sys
+from pathlib import Path
+
+CHANGELOG = Path(__file__).parent.parent / 'Changelog'
+
+# Match release lines like "5.2.0 (Monday 11 December 2023)"
+RELEASE_REGEX = re.compile(r"""((?:\d+)\.(?:\d+)\.(?:\d+)) \(\w+ \d{1,2} \w+ \d{4}\)$""")
+
+
+def main():
+ version = sys.argv[1]
+ output = sys.argv[2]
+ if output == '-':
+ output = sys.stdout
+ else:
+ output = open(output, 'w')
+
+ release_notes = []
+ in_release_notes = False
+
+ with open(CHANGELOG) as f:
+ for line in f:
+ match = RELEASE_REGEX.match(line)
+ if match:
+ if in_release_notes:
+ break
+ in_release_notes = match.group(1) == version
+ next(f) # Skip the underline
+ continue
+
+ if in_release_notes:
+ release_notes.append(line)
+
+ # Drop empty lines at start and end
+ while release_notes and not release_notes[0].strip():
+ release_notes.pop(0)
+ while release_notes and not release_notes[-1].strip():
+ release_notes.pop()
+
+ # Join lines
+ release_notes = ''.join(release_notes)
+
+ # Remove line breaks when they are followed by a space
+ release_notes = re.sub(r'\n +', ' ', release_notes)
+
+ # Replace pr/<number> with #<number> for GitHub
+ release_notes = re.sub(r'\(pr/(\d+)\)', r'(#\1)', release_notes)
+
+ # Replace :mod:`package.X` with [package.X](...)
+ release_notes = re.sub(
+ r':mod:`nibabel\.(.*)`',
+ r'[nibabel.\1](https://nipy.org/nibabel/reference/nibabel.\1.html)',
+ release_notes,
+ )
+ # Replace :class/func/attr:`package.module.X` with [package.module.X](...)
+ release_notes = re.sub(
+ r':(?:class|func|attr):`(nibabel\.\w*)(\.[\w.]*)?\.(\w+)`',
+ r'[\1\2.\3](https://nipy.org/nibabel/reference/\1.html#\1\2.\3)',
+ release_notes,
+ )
+ release_notes = re.sub(
+ r':(?:class|func|attr):`~(nibabel\.\w*)(\.[\w.]*)?\.(\w+)`',
+ r'[\3](https://nipy.org/nibabel/reference/\1.html#\1\2.\3)',
+ release_notes,
+ )
+ # Replace :meth:`package.module.class.X` with [package.module.class.X](...)
+ release_notes = re.sub(
+ r':meth:`(nibabel\.[\w.]*)\.(\w+)\.(\w+)`',
+ r'[\1.\2.\3](https://nipy.org/nibabel/reference/\1.html#\1.\2.\3)',
+ release_notes,
+ )
+ release_notes = re.sub(
+ r':meth:`~(nibabel\.[\w.]*)\.(\w+)\.(\w+)`',
+ r'[\3](https://nipy.org/nibabel/reference/\1.html#\1.\2.\3)',
+ release_notes,
+ )
+
+ def python_doc(match):
+ module = match.group(1)
+ name = match.group(2)
+ return f'[{name}](https://docs.python.org/3/library/{module.lower()}.html#{module}.{name})'
+
+ release_notes = re.sub(r':meth:`~([\w.]+)\.(\w+)`', python_doc, release_notes)
+
+ output.write('## Release notes\n\n')
+ output.write(release_notes)
+
+ output.close()
+
+
+if __name__ == '__main__':
+ main()
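
Judging from main(), the script takes a version and an output path, with '-' standing for stdout, so a release run would look like python tools/markdown_release_notes.py 5.2.1 - and would emit a '## Release notes' section with pr/NNN references and Sphinx roles rewritten as GitHub-flavoured Markdown links.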
=====================================
tox.ini
=====================================
@@ -141,7 +141,8 @@ labels = check
deps =
flake8
blue
- isort[colors]
+ # Broken extras, remove when fix is released
+ isort[colors]!=5.13.1
skip_install = true
commands =
blue --check --diff --color nibabel
@@ -153,7 +154,7 @@ description = Auto-apply style guide to the extent possible
labels = pre-release
deps =
blue
- isort[colors]
+ isort
skip_install = true
commands =
blue nibabel
View it on GitLab: https://salsa.debian.org/med-team/nibabel/-/commit/4fe9f580edf1a2a055b183d5ab120c69beb034e9