[Git][debian-gis-team/netcdf4-python][upstream] New upstream version 1.6.5

Bas Couwenberg (@sebastic) gitlab at salsa.debian.org
Wed Oct 25 04:32:44 BST 2023



Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf4-python


Commits:
b5422974 by Bas Couwenberg at 2023-10-25T05:25:19+02:00
New upstream version 1.6.5
- - - - -


11 changed files:

- .github/workflows/miniconda.yml
- Changelog
- README.md
- create_docs.sh
- docs/index.html
- pyproject.toml
- src/netCDF4/__init__.py
- src/netCDF4/_netCDF4.pyx
- test/tst_masked.py
- test/tst_multifile.py
- test/tst_multifile2.py


Changes:

=====================================
.github/workflows/miniconda.yml
=====================================
@@ -12,7 +12,7 @@ jobs:
     #  NO_NET: 1
     strategy:
       matrix:
-        python-version: [ "3.7", "3.8", "3.9", "3.10", "3.11" ]
+        python-version: [ "3.8", "3.9", "3.10", "3.11", "3.12" ]
         os: [windows-latest, ubuntu-latest, macos-latest]
         platform: [x64, x32]
         exclude:
@@ -79,8 +79,8 @@ jobs:
         export PATH="${CONDA_PREFIX}/bin:${CONDA_PREFIX}/Library/bin:$PATH" 
         which mpirun
         mpirun --version
-        mpirun -np 4 --oversubscribe python mpi_example.py # for openmpi
-        #mpirun -np 4 python mpi_example.py
+        #mpirun -np 4 --oversubscribe python mpi_example.py # for openmpi
+        mpirun -np 4 python mpi_example.py
         if [ $? -ne 0 ] ; then
           echo "hdf5 mpi test failed!"
           exit 1
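
Note on the workflow change above: the test matrix drops Python 3.7 and adds
3.12, and the MPI smoke test now runs without OpenMPI's --oversubscribe flag.
The mpi_example.py script itself is not part of this diff; as a rough sketch,
a minimal parallel-write test along these lines would look like the following
(file, dimension and variable names here are illustrative, and this assumes
netcdf-c/HDF5 built with MPI support):

    # Hypothetical stand-in for mpi_example.py: every MPI rank opens the same
    # file in parallel mode and writes one element of a shared variable.
    from mpi4py import MPI
    import numpy as np
    from netCDF4 import Dataset

    rank = MPI.COMM_WORLD.rank          # this process's rank under mpirun -np N

    nc = Dataset('parallel_test.nc', 'w', parallel=True,
                 comm=MPI.COMM_WORLD, info=MPI.Info())
    nc.createDimension('dim', MPI.COMM_WORLD.size)
    var = nc.createVariable('var', np.int64, ('dim',))
    var[rank] = rank                    # each rank writes its own slot
    nc.close()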


=====================================
Changelog
=====================================
@@ -1,3 +1,8 @@
+ version 1.6.5 (not yet released)
+===============================
+ * fix for issue #1271 (mask ignored if bool MA assigned to uint8 var)
+ * include information on specific object when reporting errors from netcdf-c
+
  version 1.6.4 (tag v1.6.4rel)
 ===============================
  * set path to SSL certificates internally, so https DAP URLs work with wheels
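
Note: to make the #1271 entry concrete, here is a minimal sketch of the fixed
behaviour, mirroring the new test added to test/tst_masked.py further down
(the file name is illustrative):

    import numpy as np
    import numpy.ma as ma
    import netCDF4

    ds = netCDF4.Dataset("mask_demo.nc", "w")     # illustrative file name
    ds.createDimension("t", 4)
    var = ds.createVariable("flag", np.uint8, ("t",), fill_value=240)
    # boolean masked array: first element masked, the rest are data
    var[:] = ma.array([True, True, True, True], mask=[True, False, False, False])
    ds.close()

    ds = netCDF4.Dataset("mask_demo.nc")
    # with 1.6.5 the mask survives the bool -> uint8 cast; previously the
    # masked element could be written as data instead of the fill value
    assert np.count_nonzero(ds["flag"][:].mask) == 1
    ds.close()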


=====================================
README.md
=====================================
@@ -10,9 +10,13 @@
 ## News
 For details on the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog).
 
+10/20/2023: Version [1.6.5](https://pypi.python.org/pypi/netCDF4/1.6.5) released. 
+Fix for issue #1271 (mask ignored if bool MA assigned to uint8 var), support for Python 3.12, more
+informative error messages.
+
 6/4/2023:  Version [1.6.4](https://pypi.python.org/pypi/netCDF4/1.6.4) released.  Now requires 
 [certifi](https://github.com/certifi/python-certifi) to locate SSL certificates - this allows 
-OpenDAP https URLs to work with linux wheels (issue [#1246](https://github.com/Unidata/netcdf4-python/issues/1246).
+OpenDAP https URLs to work with linux wheels (issue [#1246](https://github.com/Unidata/netcdf4-python/issues/1246)).
 
 3/3/2023:  Version [1.6.3](https://pypi.python.org/pypi/netCDF4/1.6.3) released.
 


=====================================
create_docs.sh
=====================================
@@ -1,10 +1,3 @@
-# Uses pdoc (https://github.com/mitmproxy/pdoc)
-# to create html docs from docstrings in Cython source.
-pdoc -o 'docs' netCDF4 
-# use resulting docs/netCDF4/_netCDF4.html
-cp docs/netCDF4.html docs/index.html
-sed -i -e 's!href="../netCDF4.html!href="./index.html!g' docs/index.html
-sed -i -e 's!/../netCDF4.html!/index.html!g' docs/index.html
-sed -i -e 's!._netCDF4 API! API!g' docs/index.html
-sed -i -e 's!netCDF4</a>._netCDF4</h1>!netCDF4</a></h1>!g' docs/index.html
-
+# use pdoc (https://pdoc3.github.io/pdoc/) to generate API docs
+pdoc3 --html --config show_source_code=False --force -o 'docs' netCDF4
+/bin/cp -f docs/netCDF4/index.html docs/index.html


=====================================
docs/index.html
=====================================
The diff for this file was not included because it is too large.

=====================================
pyproject.toml
=====================================
@@ -9,14 +9,6 @@ build-backend = "setuptools.build_meta"
 [project]
 name = "netCDF4"
 description = "Provides an object-oriented python interface to the netCDF version 4 library"
-readme = {text = """\
-netCDF version 4 has many features not found in earlier versions of the library,
-such as hierarchical groups, zlib compression, multiple unlimited dimensions,
-and new data types.  It is implemented on top of HDF5.  This module implements
-most of the new features, and can read and write netCDF files compatible with
-older versions of the library.  The API is modelled after Scientific.IO.NetCDF,
-and should be familiar to users of that module.
-""", content-type = "text/x-rst"}
 authors = [
   {name = "Jeff Whitaker", email = "jeffrey.s.whitaker@noaa.gov"},
 ]
@@ -47,6 +39,25 @@ dependencies = [
 ]
 dynamic = ["version"]
 
+[project.optional-dependencies]
+tests = [
+  "Cython",
+  "packaging",
+  "pytest",
+]
+
+
+[project.readme]
+text = """\
+netCDF version 4 has many features not found in earlier versions of the library,
+such as hierarchical groups, zlib compression, multiple unlimited dimensions,
+and new data types.  It is implemented on top of HDF5.  This module implements
+most of the new features, and can read and write netCDF files compatible with
+older versions of the library.  The API is modelled after Scientific.IO.NetCDF,
+and should be familiar to users of that module.
+"""
+content-type = "text/x-rst"
+
 [project.scripts]
 nc3tonc4 = "netCDF4.utils:nc3tonc4"
 nc4tonc3 = "netCDF4.utils:nc4tonc3"
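
Note: with the new [project.optional-dependencies] table, the test
requirements can be installed as an extra from a source checkout, e.g.
pip install ".[tests]".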


=====================================
src/netCDF4/__init__.py
=====================================
@@ -14,6 +14,9 @@ from ._netCDF4 import (__version__, __netcdf4libversion__, __hdf5libversion__,
 import os
 __all__ =\
 ['Dataset','Variable','Dimension','Group','MFDataset','MFTime','CompoundType','VLType','date2num','num2date','date2index','stringtochar','chartostring','stringtoarr','getlibversion','EnumType','get_chunk_cache','set_chunk_cache','set_alignment','get_alignment']
+__pdoc__ = {
+    'utils': False,
+}
 # if HDF5_PLUGIN_PATH not set, point to package path if plugins live there
 pluginpath = os.path.join(__path__[0],'plugins')
 if 'HDF5_PLUGIN_PATH' not in os.environ and\
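
Note: the __pdoc__ dict is pdoc's documented override hook; mapping 'utils'
to False excludes the netCDF4.utils module from the generated API docs,
matching the switch to pdoc3 in create_docs.sh above.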


=====================================
src/netCDF4/_netCDF4.pyx
=====================================
@@ -1,5 +1,5 @@
 """
-Version 1.6.4
+Version 1.6.5
 -------------
 
 # Introduction
@@ -64,8 +64,7 @@ types) are not supported.
    so the extra compression algorithms available in netcdf-c >= 4.9.0 will automatically
    be available.  Otherwise, the user will have to set `HDF5_PLUGIN_PATH` explicitly
    to have access to the extra compression plugins.
- - run `python setup.py build`, then `python setup.py install` (as root if
-   necessary).
+ - run `pip install -v .` (as root if necessary)
  - run the tests in the 'test' directory by running `python run_all.py`.
 
 # Tutorial
@@ -1228,7 +1227,7 @@ from .utils import (_StartCountStride, _quantize, _find_dim, _walk_grps,
 import sys
 import functools
 
-__version__ = "1.6.4"
+__version__ = "1.6.5"
 
 # Initialize numpy
 import posixpath
@@ -2017,16 +2016,22 @@ cdef _get_vars(group):
         free(varids) # free pointer holding variable ids.
     return variables
 
-cdef _ensure_nc_success(ierr, err_cls=RuntimeError, filename=None):
+cdef _ensure_nc_success(ierr, err_cls=RuntimeError, filename=None, extra_msg=None):
     # print netcdf error message, raise error.
-    if ierr != NC_NOERR:
-        err_str = (<char *>nc_strerror(ierr)).decode('ascii')
-        if issubclass(err_cls, OSError):
-            if isinstance(filename, bytes):
-                filename = filename.decode()
-            raise err_cls(ierr, err_str, filename)
-        else:
-            raise err_cls(err_str)
+    if ierr == NC_NOERR:
+        return
+
+    err_str = (<char *>nc_strerror(ierr)).decode('ascii')
+    if issubclass(err_cls, OSError):
+        if isinstance(filename, bytes):
+            filename = filename.decode()
+        raise err_cls(ierr, err_str, filename)
+
+    if extra_msg:
+        if isinstance(extra_msg, bytes):
+            extra_msg = extra_msg.decode()
+        err_str = f"{err_str}: {extra_msg}"
+    raise err_cls(err_str)
 
 # these are class attributes that
 # only exist at the python level (not in the netCDF file).
@@ -4017,6 +4022,10 @@ behavior is similar to Fortran or Matlab, but different than numpy.
         cdef size_t sizep, nelemsp
         cdef size_t *chunksizesp
         cdef float preemptionp
+
+        # Extra information for more helpful error messages
+        error_info = f"(variable '{name}', group '{grp.name}')"
+
         # flag to indicate that orthogonal indexing is supported
         self.__orthogonal_indexing__ = True
         # For backwards compatibility, deprecated zlib kwarg takes
@@ -4060,7 +4069,7 @@ behavior is similar to Fortran or Matlab, but different than numpy.
             compression = None # if compression evaluates to False, set to None.
             pass
         else:
-            raise ValueError("Unsupported value for compression kwarg")
+            raise ValueError(f"Unsupported value for compression kwarg {error_info}")
         self._grpid = grp._grpid
         # make a weakref to group to avoid circular ref (issue 218)
         # keep strong reference the default behaviour (issue 251)
@@ -4118,14 +4127,15 @@ behavior is similar to Fortran or Matlab, but different than numpy.
                         'Variable length strings are only supported for the '
                         'NETCDF4 format. For other formats, consider using '
                         'netCDF4.stringtochar to convert string arrays into '
-                        'character arrays with an additional dimension.')
+                        'character arrays with an additional dimension.'
+                        f' {error_info}')
                 datatype = VLType(self._grp, str, None)
                 self._vltype = datatype
             xtype = datatype._nc_type
             # make sure this a valid user defined datatype defined in this Group
             with nogil:
                 ierr = nc_inq_type(self._grpid, xtype, namstring, NULL)
-            _ensure_nc_success(ierr)
+            _ensure_nc_success(ierr, extra_msg=error_info)
             # dtype variable attribute is a numpy datatype object.
             self.dtype = datatype.dtype
         elif datatype.str[1:] in _supportedtypes:
@@ -4136,7 +4146,7 @@ behavior is similar to Fortran or Matlab, but different than numpy.
             # dtype variable attribute is a numpy datatype object.
             self.dtype = datatype
         else:
-            raise TypeError('illegal primitive data type, must be one of %s, got %s' % (_supportedtypes,datatype))
+            raise TypeError(f'Illegal primitive data type, must be one of {_supportedtypes}, got {datatype} {error_info}')
         if 'id' in kwargs:
             self._varid = kwargs['id']
         else:
@@ -4162,6 +4172,12 @@ behavior is similar to Fortran or Matlab, but different than numpy.
                 with nogil:
                     ierr = nc_def_var(self._grpid, varname, xtype, ndims,
                                       NULL, &self._varid)
+
+            if ierr != NC_NOERR:
+                if grp.data_model != 'NETCDF4':
+                    grp._enddef()
+                _ensure_nc_success(ierr, extra_msg=error_info)
+
             # set chunk cache size if desired
             # default is 1mb per var, can cause problems when many (1000's)
             # of vars are created.  This change only lasts as long as file is
@@ -4170,16 +4186,14 @@ behavior is similar to Fortran or Matlab, but different than numpy.
                 with nogil:
                     ierr = nc_get_var_chunk_cache(self._grpid, self._varid, &sizep,
                            &nelemsp, &preemptionp)
-                _ensure_nc_success(ierr)
+                _ensure_nc_success(ierr, extra_msg=error_info)
                 # reset chunk cache size, leave other parameters unchanged.
                 sizep = chunk_cache
                 with nogil:
                     ierr = nc_set_var_chunk_cache(self._grpid, self._varid, sizep,
                            nelemsp, preemptionp)
-                _ensure_nc_success(ierr)
-            if ierr != NC_NOERR:
-                if grp.data_model != 'NETCDF4': grp._enddef()
-                _ensure_nc_success(ierr)
+                _ensure_nc_success(ierr, extra_msg=error_info)
+
             # set compression, shuffle, chunking, fletcher32 and endian
             # variable settings.
             # don't bother for NETCDF3* formats.
@@ -4199,7 +4213,7 @@ behavior is similar to Fortran or Matlab, but different than numpy.
                                 ierr = nc_def_var_deflate(self._grpid, self._varid, 0, 1, icomplevel)
                         if ierr != NC_NOERR:
                             if grp.data_model != 'NETCDF4': grp._enddef()
-                            _ensure_nc_success(ierr)
+                            _ensure_nc_success(ierr, extra_msg=error_info)
                     if szip:
                         IF HAS_SZIP_SUPPORT:
                             try:
@@ -4212,7 +4226,7 @@ behavior is similar to Fortran or Matlab, but different than numpy.
                                 ierr = nc_def_var_szip(self._grpid, self._varid, iszip_coding, iszip_pixels_per_block)
                             if ierr != NC_NOERR:
                                 if grp.data_model != 'NETCDF4': grp._enddef()
-                                _ensure_nc_success(ierr)
+                                _ensure_nc_success(ierr, extra_msg=error_info)
                         ELSE:
                             msg = """
 compression='szip' only works if linked version of hdf5 has szip functionality enabled"""
@@ -4224,7 +4238,7 @@ compression='szip' only works if linked version of hdf5 has szip functionality e
                                 ierr = nc_def_var_zstandard(self._grpid, self._varid, icomplevel)
                             if ierr != NC_NOERR:
                                 if grp.data_model != 'NETCDF4': grp._enddef()
-                                _ensure_nc_success(ierr)
+                                _ensure_nc_success(ierr, extra_msg=error_info)
                         ELSE:
                             msg = """
 compression='zstd' only works with netcdf-c >= 4.9.0.  To enable, install Cython, make sure you have
@@ -4237,7 +4251,7 @@ version 4.9.0 or higher netcdf-c with zstandard support, and rebuild netcdf4-pyt
                                 ierr = nc_def_var_bzip2(self._grpid, self._varid, icomplevel)
                             if ierr != NC_NOERR:
                                 if grp.data_model != 'NETCDF4': grp._enddef()
-                                _ensure_nc_success(ierr)
+                                _ensure_nc_success(ierr, extra_msg=error_info)
                         ELSE:
                             msg = """
 compression='bzip2' only works with netcdf-c >= 4.9.0.  To enable, install Cython, make sure you have
@@ -4256,7 +4270,7 @@ version 4.9.0 or higher netcdf-c with bzip2 support, and rebuild netcdf4-python.
                                     iblosc_shuffle)
                             if ierr != NC_NOERR:
                                 if grp.data_model != 'NETCDF4': grp._enddef()
-                                _ensure_nc_success(ierr)
+                                _ensure_nc_success(ierr, extra_msg=error_info)
                         ELSE:
                             msg = """
 compression='blosc_*' only works with netcdf-c >= 4.9.0.  To enable, install Cython, make sure you have
@@ -4268,7 +4282,7 @@ version 4.9.0 or higher netcdf-c with blosc support, and rebuild netcdf4-python.
                         ierr = nc_def_var_fletcher32(self._grpid, self._varid, 1)
                     if ierr != NC_NOERR:
                         if grp.data_model != 'NETCDF4': grp._enddef()
-                        _ensure_nc_success(ierr)
+                        _ensure_nc_success(ierr, extra_msg=error_info)
                 # set chunking stuff.
                 if ndims: # don't bother for scalar variable.
                     if contiguous:
@@ -4296,7 +4310,7 @@ version 4.9.0 or higher netcdf-c with blosc support, and rebuild netcdf4-python.
                         free(chunksizesp)
                         if ierr != NC_NOERR:
                             if grp.data_model != 'NETCDF4': grp._enddef()
-                            _ensure_nc_success(ierr)
+                            _ensure_nc_success(ierr, extra_msg=error_info)
                 # set endian-ness of variable
                 if endian == 'little':
                     with nogil:
@@ -4328,17 +4342,17 @@ version 4.9.0 or higher netcdf-c with blosc support, and rebuild netcdf4-python.
 
                 ELSE:
                     if significant_digits is not None:
-                        msg = """
+                        msg = f"""
 significant_digits kwarg only works with netcdf-c >= 4.9.0.  To enable, install Cython, make sure you have
 version 4.9.0 or higher netcdf-c, and rebuild netcdf4-python. Otherwise, use least_significant_digit
-kwarg for quantization."""
+kwarg for quantization. {error_info}"""
                         raise ValueError(msg)
                 if ierr != NC_NOERR:
                     if grp.data_model != 'NETCDF4': grp._enddef()
-                    _ensure_nc_success(ierr)
+                    _ensure_nc_success(ierr, extra_msg=error_info)
             else:
                 if endian != 'native':
-                    msg="only endian='native' allowed for NETCDF3 files"
+                    msg=f"only endian='native' allowed for NETCDF3 files {error_info}"
                     raise RuntimeError(msg)
             # set a fill value for this variable if fill_value keyword
             # given.  This avoids the HDF5 overhead of deleting and
@@ -4355,7 +4369,7 @@ kwarg for quantization."""
                             ierr = nc_def_var_fill(self._grpid, self._varid, 1, NULL)
                     if ierr != NC_NOERR:
                         if grp.data_model != 'NETCDF4': grp._enddef()
-                        _ensure_nc_success(ierr)
+                        _ensure_nc_success(ierr, extra_msg=error_info)
                 else:
                     if self._isprimitive or self._isenum or \
                        (self._isvlen and self.dtype == str):
@@ -4380,7 +4394,7 @@ kwarg for quantization."""
         # set ndim attribute (number of dimensions).
         with nogil:
             ierr = nc_inq_varndims(self._grpid, self._varid, &numdims)
-        _ensure_nc_success(ierr)
+        _ensure_nc_success(ierr, extra_msg=error_info)
         self.ndim = numdims
         self._name = name
         # default for automatically applying scale_factor and
@@ -5234,6 +5248,8 @@ rename a `Variable` attribute named `oldname` to `newname`."""
         elif hasattr(self, 'add_offset'):
             data = data - self.add_offset
             if self.dtype.kind in 'iu': data = numpy.around(data)
+        if self.dtype != data.dtype:
+            data = data.astype(self.dtype) # cast data to var type, if necessary.
         if ma.isMA(data):
             # if underlying data in masked regions of masked array
             # corresponds to missing values, don't fill masked array -
@@ -5264,8 +5280,6 @@ rename a `Variable` attribute named `oldname` to `newname`."""
                     data = numpy.array([fillval],self.dtype)
                 else:
                     data = data.filled(fill_value=fillval)
-        if self.dtype != data.dtype:
-            data = data.astype(self.dtype) # cast data to var type, if necessary.
         return data
 
     def _assign_vlen(self, elem, data):
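
Note: two threads run through the _netCDF4.pyx changes above. First,
_ensure_nc_success gains an extra_msg argument, and Variable.__init__ builds
an error_info string so errors from netcdf-c name the offending variable and
group. Second, the dtype cast in the masked-write path now happens before the
mask/fill handling rather than after it, which is the actual fix for #1271.
A rough illustration of the first change (illustrative file name; the message
format comes from the error_info f-string in the diff):

    import netCDF4

    ds = netCDF4.Dataset("err_demo.nc", "w")    # illustrative file name
    ds.createDimension("x", 3)
    try:
        # an unsupported compression kwarg now reports variable and group
        ds.createVariable("v", "f8", ("x",), compression="not-a-codec")
    except ValueError as err:
        print(err)  # e.g.: Unsupported value for compression kwarg (variable 'v', group '/')
    ds.close()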


=====================================
test/tst_masked.py
=====================================
@@ -15,6 +15,7 @@ from numpy.ma import masked_all
 # create an n1dim by n2dim random ranarr.
 FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
 FILE_NAME2 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
+FILE_NAME3 = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
 ndim = 10
 ranarr = 100.*uniform(size=(ndim))
 ranarr2 = 100.*uniform(size=(ndim))
@@ -41,6 +42,7 @@ class PrimitiveTypesTestCase(unittest.TestCase):
     def setUp(self):
         self.file = FILE_NAME
         self.file2 = FILE_NAME2
+        self.file3 = FILE_NAME3
         file = netCDF4.Dataset(self.file,'w')
         file.createDimension('n', ndim)
         foo = file.createVariable('maskeddata', 'f8', ('n',))
@@ -93,6 +95,19 @@ class PrimitiveTypesTestCase(unittest.TestCase):
         data = dataset['v'][:]
         dataset.close()
 
+        # issue #1271 (mask is ignored when assigning bool array to uint8 var)
+        ds = netCDF4.Dataset(self.file3, "w")
+        dim = ds.createDimension('time', 48)
+        var = ds.createVariable('blaat', np.uint8, ('time',),
+              zlib=True, complevel=4, shuffle=True, fletcher32=True,
+              fill_value=240)
+        mask = np.full((48,), False)
+        for x in range(30):
+            mask[x] = True
+        mama = ma.array(np.full((48,), True), mask=mask)
+        var[:] = mama
+        ds.close()
+
     def tearDown(self):
         # Remove the temporary files
         os.remove(self.file)
@@ -148,6 +163,11 @@ class PrimitiveTypesTestCase(unittest.TestCase):
         assert var1[:].mask.all()
         assert var2[:].mask.any() == False
         dataset.close()
+        # issue #1271
+        ds = netCDF4.Dataset(self.file3,"r")
+        var = ds['blaat']
+        assert np.count_nonzero(var[:].mask) == 30
+        ds.close()
 
 if __name__ == '__main__':
     unittest.main()


=====================================
test/tst_multifile.py
=====================================
@@ -5,7 +5,7 @@ from numpy.testing import assert_array_equal, assert_equal
 from numpy import ma
 import tempfile, unittest, os, datetime
 import cftime
-from pkg_resources import parse_version
+from packaging.version import Version
 
 nx=100; ydim=5; zdim=10
 nfiles = 10
@@ -138,7 +138,7 @@ class NonuniformTimeTestCase(unittest.TestCase):
         assert_equal(T.typecode(), t.typecode())
         # skip this until cftime pull request #55 is in a released
         # version (1.0.1?). Otherwise, fix for issue #808 breaks this
-        if parse_version(cftime.__version__) >= parse_version('1.0.1'):
+        if Version(cftime.__version__) >= Version('1.0.1'):
             assert_array_equal(cftime.num2date(T[:], T.units, T.calendar), dates)
         assert_equal(cftime.date2index(datetime.datetime(1980, 1, 2), T), 366)
         f.close()
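
Note: packaging.version.Version replaces the deprecated
pkg_resources.parse_version here and in tst_multifile2.py below; the
comparison semantics (PEP 440) are the same, e.g.:

    from packaging.version import Version

    assert Version("1.0.1") >= Version("1.0.1")
    assert Version("1.10") > Version("1.9")    # numeric, not lexicographic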


=====================================
test/tst_multifile2.py
=====================================
@@ -5,7 +5,7 @@ from numpy.testing import assert_array_equal, assert_equal
 from numpy import ma
 import tempfile, unittest, os, datetime
 import cftime
-from pkg_resources import parse_version
+from packaging.version import Version
 
 nx=100; ydim=5; zdim=10
 nfiles = 10
@@ -106,7 +106,7 @@ class NonuniformTimeTestCase(unittest.TestCase):
         # Get the real dates
         # skip this until cftime pull request #55 is in a released
         # version (1.0.1?). Otherwise, fix for issue #808 breaks this
-        if parse_version(cftime.__version__) >= parse_version('1.0.1'):
+        if Version(cftime.__version__) >= Version('1.0.1'):
             dates = []
             for file in self.files:
                 f = Dataset(file)
@@ -126,7 +126,7 @@ class NonuniformTimeTestCase(unittest.TestCase):
         assert_equal(T.typecode(), t.typecode())
         # skip this until cftime pull request #55 is in a released
         # version (1.0.1?). Otherwise, fix for issue #808 breaks this
-        if parse_version(cftime.__version__) >= parse_version('1.0.1'):
+        if Version(cftime.__version__) >= Version('1.0.1'):
             assert_array_equal(cftime.num2date(T[:], T.units, T.calendar), dates)
         assert_equal(cftime.date2index(datetime.datetime(1980, 1, 2), T), 366)
         f.close()



View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/-/commit/b5422974561958d6f4ab8a30ece5767b040d0d8a
