[Git][debian-gis-team/netcdf4-python][upstream] New upstream version 1.6.3

Bas Couwenberg (@sebastic) gitlab at salsa.debian.org
Sat Mar 4 07:56:51 GMT 2023



Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf4-python


Commits:
8c905f7f by Bas Couwenberg at 2023-03-04T08:11:30+01:00
New upstream version 1.6.3
- - - - -


22 changed files:

- .github/workflows/build_latest.yml
- .github/workflows/build_master.yml
- .github/workflows/build_old.yml
- .github/workflows/miniconda.yml
- Changelog
- README.md
- docs/index.html
- examples/bench_compress3.py
- examples/bench_compress4.py
- examples/threaded_read.py
- pyproject.toml
- setup.py
- src/netCDF4/_netCDF4.pyx
- src/netCDF4/utils.py
- test/tst_Unsigned.py
- test/tst_atts.py
- test/tst_cdl.py
- test/tst_dap.py
- test/tst_filepath.py
- test/tst_open_mem.py
- test/tst_utils.py
- test/ubyte.nc


Changes:

=====================================
.github/workflows/build_latest.yml
=====================================
@@ -6,14 +6,14 @@ jobs:
     runs-on: ubuntu-latest
     env:
       PNETCDF_VERSION: 1.12.1
-      NETCDF_VERSION: 4.9.0
+      NETCDF_VERSION: 4.9.1
       NETCDF_DIR: ${{ github.workspace }}/..
       NETCDF_EXTRA_CONFIG: --enable-pnetcdf
       CC: mpicc.mpich
       #NO_NET: 1
     strategy:
       matrix:
-        python-version: ["3.10"]
+        python-version: ["3.11"]
     steps:
 
    - uses: actions/checkout@v2


=====================================
.github/workflows/build_master.yml
=====================================
@@ -10,7 +10,7 @@ jobs:
       #NO_NET: 1
     strategy:
       matrix:
-        python-version: ["3.10"]
+        python-version: ["3.11"]
     steps:
 
    - uses: actions/checkout@v2


=====================================
.github/workflows/build_old.yml
=====================================
@@ -13,7 +13,7 @@ jobs:
       #NO_NET: 1
     strategy:
       matrix:
-        python-version: ["3.10"]
+        python-version: ["3.11"]
     steps:
 
    - uses: actions/checkout@v2


=====================================
.github/workflows/miniconda.yml
=====================================
@@ -53,7 +53,7 @@ jobs:
     runs-on: ${{ matrix.os }}
     strategy:
       matrix:
-        python-version: [ "3.10" ]
+        python-version: [ "3.11" ]
         os: [ubuntu-latest]
         platform: [x64]
     steps:


=====================================
Changelog
=====================================
@@ -1,3 +1,10 @@
+ version 1.6.3 (tag v1.6.3rel)
+==============================
+ * Use ``nc_put_vars`` for strided writes for netcdf-c >= 4.6.2 (issue #1222).
+ * _Unsigned="false" should be same as not having _Unsigned set (issue #1232). 
+   _Unsigned now must be set to "true" or "True" for variable to be interpreted
+   as unsigned, instead of just having _Unsigned be set (to anything).
+
  version 1.6.2 (tag v1.6.2rel)
 ==============================
  * Added ``netCDF4.__has_set_alignment__`` property to help identify if the
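
For readers tracking the _Unsigned change, a minimal sketch of the new
semantics (file and variable names are illustrative, not from the commit;
assumes netCDF4 1.6.3 with default auto-scaling):

    import netCDF4
    import numpy as np

    with netCDF4.Dataset("unsigned_demo.nc", "w", format="NETCDF3_CLASSIC") as nc:
        nc.createDimension("d", 2)
        ub = nc.createVariable("ub", "i1", ("d",))
        ub._Unsigned = "true"   # interpreted as unsigned on read
        sb = nc.createVariable("sb", "i1", ("d",))
        sb._Unsigned = "false"  # as of 1.6.3: same as not setting _Unsigned at all
        ub[:] = np.array([0, -56], "i1")
        sb[:] = np.array([0, -56], "i1")

    with netCDF4.Dataset("unsigned_demo.nc") as nc:
        print(nc["ub"][:])  # [  0 200] -- returned with a uint8 view
        print(nc["sb"][:])  # [  0 -56] -- stays int8 (1.6.2 would have given uint8)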


=====================================
README.md
=====================================
@@ -10,6 +10,8 @@
 ## News
 For details on the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog).
 
+3/3/2023:  Version [1.6.3](https://pypi.python.org/pypi/netCDF4/1.6.3) released.
+
 11/15/2022:  Version [1.6.2](https://pypi.python.org/pypi/netCDF4/1.6.2) released. Fix for
 compilation with netcdf-c < 4.9.0 (issue [#1209](https://github.com/Unidata/netcdf4-python/issues/1209)).  
 Slicing multi-dimensional variables with an all False boolean index array
@@ -247,7 +249,7 @@ conda install -c conda-forge netCDF4
 * Clone GitHub repository (`git clone https://github.com/Unidata/netcdf4-python.git`)
 
 * Make sure [numpy](http://www.numpy.org/) and [Cython](http://cython.org/) are
-  installed and you have [Python](https://www.python.org) 3.6 or newer.
+  installed and you have [Python](https://www.python.org) 3.7 or newer.
 
 * Make sure [HDF5](http://www.h5py.org/) and netcdf-4 are installed, 
   and the `nc-config` utility is in your Unix PATH.


=====================================
docs/index.html
=====================================
The diff for this file was not included because it is too large.

=====================================
examples/bench_compress3.py
=====================================
@@ -1,4 +1,3 @@
-from __future__ import print_function
 # benchmark reads and writes, with and without compression.
 # tests all four supported file formats.
 from numpy.random.mtrand import uniform


=====================================
examples/bench_compress4.py
=====================================
@@ -1,4 +1,3 @@
-from __future__ import print_function
 # benchmark reads and writes, with and without compression.
 # tests all four supported file formats.
 from numpy.random.mtrand import uniform


=====================================
examples/threaded_read.py
=====================================
@@ -1,4 +1,3 @@
-from __future__ import print_function
 from netCDF4 import Dataset
 from numpy.testing import assert_array_equal, assert_array_almost_equal
 import numpy as np


=====================================
pyproject.toml
=====================================
@@ -1,3 +1,62 @@
 [build-system]
-requires = ["setuptools>=41.2", "cython>=0.19", "oldest-supported-numpy"]
+requires = [
+    "Cython>=0.29",
+    "oldest-supported-numpy",
+    "setuptools>=61",
+]
 build-backend = "setuptools.build_meta"
+
+[project]
+name = "netCDF4"
+description = "Provides an object-oriented python interface to the netCDF version 4 library"
+readme = {text = """\
+netCDF version 4 has many features not found in earlier versions of the library,
+such as hierarchical groups, zlib compression, multiple unlimited dimensions,
+and new data types.  It is implemented on top of HDF5.  This module implements
+most of the new features, and can read and write netCDF files compatible with
+older versions of the library.  The API is modelled after Scientific.IO.NetCDF,
+and should be familiar to users of that module.
+""", content-type = "text/x-rst"}
+authors = [
+  {name = "Jeff Whitaker", email = "jeffrey.s.whitaker at noaa.gov"},
+]
+requires-python = ">=3.7"
+keywords = [
+    "numpy", "netcdf", "data", "science", "network", "oceanography",
+    "meteorology", "climate",
+]
+license = {text = "MIT"}
+classifiers = [
+    "Development Status :: 3 - Alpha",
+    "Programming Language :: Python :: 3",
+    "Programming Language :: Python :: 3.7",
+    "Programming Language :: Python :: 3.8",
+    "Programming Language :: Python :: 3.9",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Intended Audience :: Science/Research",
+    "License :: OSI Approved :: MIT License",
+    "Topic :: Software Development :: Libraries :: Python Modules",
+    "Topic :: System :: Archiving :: Compression",
+    "Operating System :: OS Independent",
+]
+dependencies = [
+    "cftime",
+    "numpy",
+]
+dynamic = ["version"]
+
+[project.scripts]
+nc3tonc4 = "netCDF4.utils:nc3tonc4"
+nc4tonc3 = "netCDF4.utils:nc4tonc3"
+ncinfo = "netCDF4.utils:ncinfo"
+
+[project.urls]
+Documentation = "https://unidata.github.io/netcdf4-python/"
+Repository = "https://github.com/Unidata/netcdf4-python"
+
+[tool.setuptools.packages.find]
+where = ["src"]
+
+[tool.setuptools.package-data]
+"netCDF4.plugins" = ["lib__nc*"]


=====================================
setup.py
=====================================
@@ -2,27 +2,16 @@ import os, sys, subprocess, glob
 import os.path as osp
 import shutil
 import configparser
-from setuptools import setup, Extension, find_namespace_packages
+from setuptools import setup, Extension
 from setuptools.dist import Distribution
 
-setuptools_extra_kwargs = {
-    "install_requires": ["numpy>=1.9","cftime"],
-    "entry_points": {
-        'console_scripts': [
-            'ncinfo = netCDF4.utils:ncinfo',
-            'nc4tonc3 = netCDF4.utils:nc4tonc3',
-            'nc3tonc4 = netCDF4.utils:nc3tonc4',
-        ]
-    },
-}
-
 open_kwargs = {'encoding': 'utf-8'}
 
 
 def check_hdf5version(hdf5_includedir):
     try:
         f = open(os.path.join(hdf5_includedir, 'H5public.h'), **open_kwargs)
-    except IOError:
+    except OSError:
         return None
     hdf5_version = None
     for line in f:
@@ -46,7 +35,7 @@ def get_hdf5_version(direc):
 def check_ifnetcdf4(netcdf4_includedir):
     try:
         f = open(os.path.join(netcdf4_includedir, 'netcdf.h'), **open_kwargs)
-    except IOError:
+    except OSError:
         return False
     isnetcdf4 = False
     for line in f:
@@ -54,7 +43,6 @@ def check_ifnetcdf4(netcdf4_includedir):
             isnetcdf4 = True
     return isnetcdf4
 
-
 def check_api(inc_dirs,netcdf_lib_version):
     has_rename_grp = False
     has_nc_inq_path = False
@@ -76,7 +64,7 @@ def check_api(inc_dirs,netcdf_lib_version):
     for d in inc_dirs:
         try:
             f = open(os.path.join(d, 'netcdf.h'), **open_kwargs)
-        except IOError:
+        except OSError:
             continue
 
         has_nc_open_mem = os.path.exists(os.path.join(d, 'netcdf_mem.h'))
@@ -99,7 +87,7 @@ def check_api(inc_dirs,netcdf_lib_version):
         if has_nc_open_mem:
             try:
                 f = open(os.path.join(d, 'netcdf_mem.h'), **open_kwargs)
-            except IOError:
+            except OSError:
                 continue
             for line in f:
                 if line.startswith('EXTERNL int nc_create_mem'):
@@ -108,7 +96,7 @@ def check_api(inc_dirs,netcdf_lib_version):
         if has_nc_filter:
             try:
                 f = open(os.path.join(d, 'netcdf_filter.h'), **open_kwargs)
-            except IOError:
+            except OSError:
                 continue
             for line in f:
                 if line.startswith('EXTERNL int nc_def_var_zstandard'):
@@ -124,15 +112,31 @@ def check_api(inc_dirs,netcdf_lib_version):
         if os.path.exists(ncmetapath):
             for line in open(ncmetapath):
                 if line.startswith('#define NC_HAS_CDF5'):
-                    has_cdf5_format = bool(int(line.split()[2]))
+                    try:
+                        has_cdf5_format = bool(int(line.split()[2]))
+                    except ValueError:
+                        pass  # keep default False if value cannot be parsed
                 if line.startswith('#define NC_HAS_PARALLEL'):
-                    has_parallel_support = bool(int(line.split()[2]))
+                    try:
+                        has_parallel_support = bool(int(line.split()[2]))
+                    except ValueError:
+                        pass
                 if line.startswith('#define NC_HAS_PARALLEL4'):
-                    has_parallel4_support = bool(int(line.split()[2]))
+                    try:
+                        has_parallel4_support = bool(int(line.split()[2]))
+                    except ValueError:
+                        pass
                 if line.startswith('#define NC_HAS_PNETCDF'):
-                    has_pnetcdf_support = bool(int(line.split()[2]))
+                    try:
+                        has_pnetcdf_support = bool(int(line.split()[2]))
+                    except ValueError:
+                        pass
                 if line.startswith('#define NC_HAS_SZIP_WRITE'):
-                    has_szip_support = bool(int(line.split()[2]))
+                    try:
+                        has_szip_support = bool(int(line.split()[2]))
+                    except ValueError:
+                        pass
+
         # NC_HAS_PARALLEL4 missing in 4.6.1 (issue #964)
         if not has_parallel4_support and has_parallel_support and not has_pnetcdf_support:
             has_parallel4_support = True
@@ -726,35 +730,12 @@ else:
     sys.stdout.write('NETCDF_PLUGIN_DIR not set, no netcdf compression plugins installed\n')
     data_files = []
 
-setup(name="netCDF4",
-      cmdclass=cmdclass,
-      version=extract_version(netcdf4_src_pyx),
-      long_description="netCDF version 4 has many features not found in earlier versions of the library, such as hierarchical groups, zlib compression, multiple unlimited dimensions, and new data types.  It is implemented on top of HDF5.  This module implements most of the new features, and can read and write netCDF files compatible with older versions of the library.  The API is modelled after Scientific.IO.NetCDF, and should be familiar to users of that module.\n\nThis project is hosted on a `GitHub repository <https://github.com/Unidata/netcdf4-python>`_ where you may access the most up-to-date source.",
-      author="Jeff Whitaker",
-      author_email="jeffrey.s.whitaker at noaa.gov",
-      url="http://github.com/Unidata/netcdf4-python",
-      download_url="http://python.org/pypi/netCDF4",
-      platforms=["any"],
-      license='License :: OSI Approved :: MIT License',
-      description="Provides an object-oriented python interface to the netCDF version 4 library.",
-      keywords=['numpy', 'netcdf', 'data', 'science', 'network', 'oceanography',
-                'meteorology', 'climate'],
-      classifiers=["Development Status :: 3 - Alpha",
-                   "Programming Language :: Python :: 3",
-                   "Programming Language :: Python :: 3.6",
-                   "Programming Language :: Python :: 3.7",
-                   "Programming Language :: Python :: 3.8",
-                   "Intended Audience :: Science/Research",
-                   "License :: OSI Approved :: MIT License",
-                   "Topic :: Software Development :: Libraries :: Python Modules",
-                   "Topic :: System :: Archiving :: Compression",
-                   "Operating System :: OS Independent"],
-      packages=find_namespace_packages(where="src"),
-      package_dir={'':'src'},
-      package_data={"netCDF4.plugins": ["lib__nc*"]},
-      ext_modules=ext_modules,
-      python_requires=">=3.6",
-      **setuptools_extra_kwargs)
+# See pyproject.toml for project metadata
+setup(
+    name="netCDF4",  # need by GitHub dependency graph
+    version=extract_version(netcdf4_src_pyx),
+    ext_modules=ext_modules,
+)
 
 # remove plugin files copied from outside source tree
 if copied_plugins:
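
The new ValueError guards around the netcdf_meta.h feature flags reduce to the
pattern below (standalone sketch; the helper name and the malformed sample line
are ours, not from setup.py):

    def parse_feature_flag(line, default=False):
        # "#define NC_HAS_CDF5 1" -> True; keep the default when the third
        # token is not a plain integer. Matches the diff: only ValueError
        # is caught, so a line with fewer than three tokens still raises.
        try:
            return bool(int(line.split()[2]))
        except ValueError:
            return default

    print(parse_feature_flag("#define NC_HAS_CDF5 1"))           # True
    print(parse_feature_flag("#define NC_HAS_PARALLEL 0"))       # False
    print(parse_feature_flag("#define NC_HAS_FOO not_a_number")) # False (hypothetical line)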


=====================================
src/netCDF4/_netCDF4.pyx
=====================================
@@ -1,5 +1,5 @@
 """
-Version 1.6.2
+Version 1.6.3
 -------------
 
 # Introduction
@@ -32,7 +32,7 @@ types) are not supported.
 
  - Clone the
    [github repository](http://github.com/Unidata/netcdf4-python).
- - Make sure the dependencies are satisfied (Python 3.6 or later,
+ - Make sure the dependencies are satisfied (Python 3.7 or later,
    [numpy](http://numpy.scipy.org), 
    [Cython](http://cython.org),
    [cftime](https://github.com/Unidata/cftime),
@@ -1226,16 +1226,14 @@ from cpython.bytes cimport PyBytes_FromStringAndSize
 from .utils import (_StartCountStride, _quantize, _find_dim, _walk_grps,
                     _out_array_shape, _sortbylist, _tostr, _safecast, _is_int)
 import sys
-if sys.version_info[0:2] < (3, 7):
-    # Python 3.7+ guarantees order; older versions need OrderedDict
-    from collections import OrderedDict
 
-__version__ = "1.6.2"
+__version__ = "1.6.3"
 
 # Initialize numpy
 import posixpath
 from cftime import date2num, num2date, date2index
 import numpy
+cimport numpy
 import weakref
 import warnings
 import subprocess
@@ -1245,7 +1243,7 @@ from glob import glob
 from numpy import ma
 from libc.string cimport memcpy, memset
 from libc.stdlib cimport malloc, free
-import_array()
+numpy.import_array()
 include "constants.pyx"
 include "membuf.pyx"
 include "netCDF4.pxi"
@@ -1471,8 +1469,8 @@ default_fillvals = {#'S1':NC_FILL_CHAR,
                      'f8':NC_FILL_DOUBLE}
 
 # logical for native endian type.
-is_native_little = numpy.dtype('<f4').byteorder == '='
-is_native_big = numpy.dtype('>f4').byteorder == '='
+is_native_little = numpy.dtype('<f4').byteorder == c'='
+is_native_big = numpy.dtype('>f4').byteorder == c'='
 
 # hard code these here, instead of importing from netcdf.h
 # so it will compile with versions <= 4.2.
@@ -1715,7 +1713,7 @@ be raised in the next release."""
             # don't allow string array attributes in NETCDF3 files.
             if is_netcdf3 and N > 1:
                 msg='array string attributes can only be written with NETCDF4'
-                raise IOError(msg)
+                raise OSError(msg)
             if not value_arr.shape:
                 dats = _strencode(value_arr.item())
             else:
@@ -1775,14 +1773,9 @@ cdef _get_types(group):
             ierr = nc_inq_typeids(_grpid, &ntypes, typeids)
         _ensure_nc_success(ierr)
     # create empty dictionary for CompoundType instances.
-    if sys.version_info[0:2] < (3, 7):
-        cmptypes = OrderedDict()
-        vltypes = OrderedDict()
-        enumtypes = OrderedDict()
-    else:
-        cmptypes = dict()
-        vltypes = dict()
-        enumtypes = dict()
+    cmptypes = dict()
+    vltypes = dict()
+    enumtypes = dict()
 
     if ntypes > 0:
         for n from 0 <= n < ntypes:
@@ -1839,10 +1832,7 @@ cdef _get_dims(group):
         ierr = nc_inq_ndims(_grpid, &numdims)
     _ensure_nc_success(ierr)
     # create empty dictionary for dimensions.
-    if sys.version_info[0:2] < (3, 7):
-        dimensions = OrderedDict()
-    else:
-        dimensions = dict()
+    dimensions = dict()
     if numdims > 0:
         dimids = <int *>malloc(sizeof(int) * numdims)
         if group.data_model == 'NETCDF4':
@@ -1873,10 +1863,7 @@ cdef _get_grps(group):
         ierr = nc_inq_grps(_grpid, &numgrps, NULL)
     _ensure_nc_success(ierr)
     # create dictionary containing `Group` instances for groups in this group
-    if sys.version_info[0:2] < (3, 7):
-        groups = OrderedDict()
-    else:
-        groups = dict()
+    groups = dict()
     if numgrps > 0:
         grpids = <int *>malloc(sizeof(int) * numgrps)
         with nogil:
@@ -1906,10 +1893,7 @@ cdef _get_vars(group):
         ierr = nc_inq_nvars(_grpid, &numvars)
     _ensure_nc_success(ierr, err_cls=AttributeError)
     # create empty dictionary for variables.
-    if sys.version_info[0:2] < (3, 7):
-        variables = OrderedDict()
-    else:
-        variables = dict()
+    variables = dict()
     if numvars > 0:
         # get variable ids.
         varids = <int *>malloc(sizeof(int) * numvars)
@@ -2022,7 +2006,9 @@ cdef _ensure_nc_success(ierr, err_cls=RuntimeError, filename=None):
     # print netcdf error message, raise error.
     if ierr != NC_NOERR:
         err_str = (<char *>nc_strerror(ierr)).decode('ascii')
-        if issubclass(err_cls, EnvironmentError):
+        if issubclass(err_cls, OSError):
+            if isinstance(filename, bytes):
+                filename = filename.decode()
             raise err_cls(ierr, err_str, filename)
         else:
             raise err_cls(err_str)
@@ -2460,7 +2446,7 @@ strings.
         else:
             raise ValueError("mode must be 'w', 'x', 'r', 'a' or 'r+', got '%s'" % mode)
 
-        _ensure_nc_success(ierr, err_cls=IOError, filename=path)
+        _ensure_nc_success(ierr, err_cls=OSError, filename=path)
 
         # data model and file format attributes
         self.data_model = _get_format(grpid)
@@ -2488,10 +2474,7 @@ strings.
         if self.data_model == 'NETCDF4':
             self.groups = _get_grps(self)
         else:
-            if sys.version_info[0:2] < (3, 7):
-                self.groups = OrderedDict()
-            else:
-                self.groups = dict()
+            self.groups = dict()
 
     # these allow Dataset objects to be used via a "with" statement.
     def __enter__(self):
@@ -3053,7 +3036,7 @@ Use if you need to ensure that a netCDF attribute is created with type
         xtype=-99
         if self.data_model != 'NETCDF4':
             msg='file format does not support NC_STRING attributes'
-            raise IOError(msg)
+            raise OSError(msg)
         _set_att(self, NC_GLOBAL, name, value, xtype=xtype, force_ncstring=True)
 
     def setncatts(self,attdict):
@@ -3128,11 +3111,7 @@ attributes."""
                 values = []
                 for name in names:
                     values.append(_get_att(self, NC_GLOBAL, name))
-                gen = zip(names, values)
-                if sys.version_info[0:2] < (3, 7):
-                    return OrderedDict(gen)
-                else:
-                    return dict(gen)
+                return dict(zip(names, values))
             else:
                 raise AttributeError
         elif name in _private_atts:
@@ -3621,20 +3600,12 @@ Additional read-only class variables:
             with nogil:
                 ierr = nc_def_grp(grpid, groupname, &self._grpid)
             _ensure_nc_success(ierr)
-            if sys.version_info[0:2] < (3, 7):
-                self.cmptypes = OrderedDict()
-                self.vltypes = OrderedDict()
-                self.enumtypes = OrderedDict()
-                self.dimensions = OrderedDict()
-                self.variables = OrderedDict()
-                self.groups = OrderedDict()
-            else:
-                self.cmptypes = dict()
-                self.vltypes = dict()
-                self.enumtypes = dict()
-                self.dimensions = dict()
-                self.variables = dict()
-                self.groups = dict()
+            self.cmptypes = dict()
+            self.vltypes = dict()
+            self.enumtypes = dict()
+            self.dimensions = dict()
+            self.variables = dict()
+            self.groups = dict()
 
 
     def close(self):
@@ -3642,8 +3613,8 @@ Additional read-only class variables:
 **`close(self)`**
 
 overrides `Dataset` close method which does not apply to `Group`
-instances, raises IOError."""
-        raise IOError('cannot close a `Group` (only applies to Dataset)')
+instances, raises OSError."""
+        raise OSError('cannot close a `Group` (only applies to Dataset)')
 
 
 cdef class Dimension:
@@ -3826,7 +3797,7 @@ variable's data type.
 
 **`scale`**: If True, `scale_factor` and `add_offset` are
 applied, and signed integer data is automatically converted to
-unsigned integer data if the `_Unsigned` attribute is set.
+unsigned integer data if the `_Unsigned` attribute is set to "true" or "True".
 Default is `True`, can be reset using `Variable.set_auto_scale` and
 `Variable.set_auto_maskandscale` methods.
 
@@ -4613,7 +4584,7 @@ Use if you need to set an attribute to an array of variable-length strings."""
         xtype=-99
         if self._grp.data_model != 'NETCDF4':
             msg='file format does not support NC_STRING attributes'
-            raise IOError(msg)
+            raise OSError(msg)
         _set_att(self._grp, self._varid, name, value, xtype=xtype, force_ncstring=True)
 
     def setncatts(self,attdict):
@@ -4910,11 +4881,7 @@ details."""
                 values = []
                 for name in names:
                     values.append(_get_att(self._grp, self._varid, name))
-                gen = zip(names, values)
-                if sys.version_info[0:2] < (3, 7):
-                    return OrderedDict(gen)
-                else:
-                    return dict(gen)
+                return dict(zip(names, values))
 
             else:
                 raise AttributeError
@@ -5011,10 +4978,10 @@ rename a `Variable` attribute named `oldname` to `newname`."""
         if self.mask and (self._isprimitive or self._isenum):\
             data = self._toma(data)
         else:
-            # if attribute _Unsigned is True, and variable has signed integer
+            # if attribute _Unsigned is "true", and variable has signed integer
             # dtype, return view with corresponding unsigned dtype (issue #656)
             if self.scale:  # only do this if autoscale option is on.
-                is_unsigned = getattr(self, '_Unsigned', False)
+                is_unsigned = getattr(self, '_Unsigned', False) in ["true","True"]
                 if is_unsigned and data.dtype.kind == 'i':
                     data=data.view('%su%s'%(data.dtype.byteorder,data.dtype.itemsize))
 
@@ -5066,10 +5033,11 @@ rename a `Variable` attribute named `oldname` to `newname`."""
 
     def _toma(self,data):
         cdef int ierr, no_fill
-        # if attribute _Unsigned is True, and variable has signed integer
+        # if attribute _Unsigned is "true", and variable has signed integer
         # dtype, return view with corresponding unsigned dtype (issues #656,
         # #794)
-        is_unsigned = getattr(self, '_Unsigned', False)
+        # _Unsigned attribute must be "true" or "True" (string). Issue #1232.
+        is_unsigned = getattr(self, '_Unsigned', False) in ["True","true"]
         is_unsigned_int = is_unsigned and data.dtype.kind == 'i'
         if self.scale and is_unsigned_int:  # only do this if autoscale option is on.
             dtype_unsigned_int='%su%s' % (data.dtype.byteorder,data.dtype.itemsize)
@@ -5474,7 +5442,7 @@ cannot be safely cast to variable data type""" % attname
                 raise ValueError(msg)
 
         start, count, stride, put_ind =\
-        _StartCountStride(elem,self.shape,self.dimensions,self._grp,datashape=data.shape,put=True)
+        _StartCountStride(elem,self.shape,self.dimensions,self._grp,datashape=data.shape,put=True,use_get_vars=self._use_get_vars)
         datashape = _out_array_shape(count)
 
         # if a numpy scalar, create an array of the right size
@@ -5589,7 +5557,7 @@ turn on or off automatic conversion of variable data to and
 from masked arrays, automatic packing/unpacking of variable
 data using `scale_factor` and `add_offset` attributes and
 automatic conversion of signed integer data to unsigned integer
-data if the `_Unsigned` attribute exists.
+data if the `_Unsigned` attribute exists and is set to "true" (or "True").
 
 If `maskandscale` is set to `True`, when data is read from a variable
 it is converted to a masked array if any of the values are exactly
@@ -5624,7 +5592,7 @@ used to provide simple compression, see the
 [PSL metadata conventions](http://www.esrl.noaa.gov/psl/data/gridded/conventions/cdc_netcdf_standard.shtml).
 
 In addition, if `maskandscale` is set to `True`, and if the variable has an
-attribute `_Unsigned` set, and the variable has a signed integer data type,
+attribute `_Unsigned` set to "true", and the variable has a signed integer data type,
 a view to the data is returned with the corresponding unsigned integer data type.
 This convention is used by the netcdf-java library to save unsigned integer
 data in `NETCDF3` or `NETCDF4_CLASSIC` files (since the `NETCDF3`
@@ -5643,7 +5611,7 @@ turn on or off automatic packing/unpacking of variable
 data using `scale_factor` and `add_offset` attributes.
 Also turns on and off automatic conversion of signed integer data
 to unsigned integer data if the variable has an `_Unsigned`
-attribute.
+attribute set to "true" or "True".
 
 If `scale` is set to `True`, and the variable has a
 `scale_factor` or an `add_offset` attribute, then data read
@@ -5663,7 +5631,7 @@ used to provide simple compression, see the
 [PSL metadata conventions](http://www.esrl.noaa.gov/psl/data/gridded/conventions/cdc_netcdf_standard.shtml).
 
 In addition, if `scale` is set to `True`, and if the variable has an
-attribute `_Unsigned` set, and the variable has a signed integer data type,
+attribute `_Unsigned` set to "true", and the variable has a signed integer data type,
 a view to the data is returned with the corresponding unsigned integer datatype.
 This convention is used by the netcdf-java library to save unsigned integer
 data in `NETCDF3` or `NETCDF4_CLASSIC` files (since the `NETCDF3`
@@ -6778,7 +6746,7 @@ Example usage (See `MFDataset.__init__` for more details):
 
         if not files:
            msg='no files specified (file list is empty)'
-           raise IOError(msg)
+           raise OSError(msg)
 
         if master_file is not None:
             if master_file not in files:
@@ -6808,7 +6776,7 @@ Example usage (See `MFDataset.__init__` for more details):
                     aggDimId = dim
                     aggDimName = dimname
         if aggDimId is None:
-            raise IOError("master dataset %s does not have a aggregation dimension" % master)
+            raise OSError("master dataset %s does not have a aggregation dimension" % master)
 
         # Get info on all aggregation variables defined in the master.
         # Make sure the master defines at least one aggregation variable.
@@ -6824,7 +6792,7 @@ Example usage (See `MFDataset.__init__` for more details):
             if (len(dims) > 0 and aggDimName == dims[0]):
                 masterRecVar[vName] = (dims, shape, dtype)
         if len(masterRecVar) == 0:
-            raise IOError("master dataset %s does not have any variables to aggregate" % master)
+            raise OSError("master dataset %s does not have any variables to aggregate" % master)
 
         # Create the following:
         #   cdf       list of Dataset instances
@@ -6854,7 +6822,7 @@ Example usage (See `MFDataset.__init__` for more details):
                 if check:
                     # Make sure master rec var is also defined here.
                     if v not in varInfo.keys():
-                        raise IOError("aggregation variable %s not defined in %s" % (v, f))
+                        raise OSError("aggregation variable %s not defined in %s" % (v, f))
 
                     #if not vInst.dimensions[0] != aggDimName:
 
@@ -6864,7 +6832,7 @@ Example usage (See `MFDataset.__init__` for more details):
                     extType = varInfo[v].dtype
                     # Check that dimension names are identical.
                     if masterDims != extDims:
-                        raise IOError("variable %s : dimensions mismatch between "
+                        raise OSError("variable %s : dimensions mismatch between "
                                        "master %s (%s) and extension %s (%s)" %
                                        (v, master, masterDims, f, extDims))
 
@@ -6872,17 +6840,17 @@ Example usage (See `MFDataset.__init__` for more details):
                     # identical (except for that of the unlimited dimension, which of
                     # course may vary.
                     if len(masterShape) != len(extShape):
-                        raise IOError("variable %s : rank mismatch between "
+                        raise OSError("variable %s : rank mismatch between "
                                        "master %s (%s) and extension %s (%s)" %
                                        (v, master, len(masterShape), f, len(extShape)))
                     if masterShape[1:] != extShape[1:]:
-                        raise IOError("variable %s : shape mismatch between "
+                        raise OSError("variable %s : shape mismatch between "
                                        "master %s (%s) and extension %s (%s)" %
                                        (v, master, masterShape, f, extShape))
 
                     # Check that the data types are identical.
                     if masterType != extType:
-                        raise IOError("variable %s : data type mismatch between "
+                        raise OSError("variable %s : data type mismatch between "
                                        "master %s (%s) and extension %s (%s)" %
                                        (v, master, masterType, f, extType))
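
A side note on the IOError -> OSError sweep above: on Python 3 this is purely
cosmetic, since IOError has been an alias of OSError since Python 3.3, so
existing ``except IOError`` handlers keep working:

    print(IOError is OSError)  # True on Python 3

    try:
        raise OSError('cannot close a `Group` (only applies to Dataset)')
    except IOError as err:  # still caught via the alias
        print(type(err).__name__, err)  # OSError cannot close a `Group` ...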
 


=====================================
src/netCDF4/utils.py
=====================================
@@ -1,5 +1,3 @@
-from __future__ import print_function
-
 import sys
 import numpy as np
 from numpy import ma
@@ -81,7 +79,7 @@ least_significant_digit=1, bits will be 4.
         return datout
 
 def _StartCountStride(elem, shape, dimensions=None, grp=None, datashape=None,\
-        put=False, use_get_vars = False):
+        put=False, use_get_vars = True):
     """Return start, count, stride and indices needed to store/extract data
     into/from a netCDF variable.
 
@@ -578,7 +576,7 @@ def _nc4tonc3(filename4,filename3,clobber=False,nchunk=10,quiet=False,format='NE
 
     ncfile4 = Dataset(filename4,'r')
     if ncfile4.file_format != 'NETCDF4_CLASSIC':
-        raise IOError('input file must be in NETCDF4_CLASSIC format')
+        raise OSError('input file must be in NETCDF4_CLASSIC format')
     ncfile3 = Dataset(filename3,'w',clobber=clobber,format=format)
     # create dimensions. Check for unlimited dim.
     unlimdimname = False
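
The flipped use_get_vars default routes strided accesses through the
nc_get_vars/nc_put_vars family by default, in line with the Changelog entry on
strided writes (issue #1222). A small sketch of a read that takes this path on
a sufficiently recent netcdf-c (file and variable names are illustrative):

    import netCDF4
    import numpy as np

    with netCDF4.Dataset("strided_demo.nc", "w") as nc:
        nc.createDimension("x", 10)
        v = nc.createVariable("v", "f4", ("x",))
        v[:] = np.arange(10, dtype="f4")

    with netCDF4.Dataset("strided_demo.nc") as nc:
        print(nc["v"][::3])  # [0. 3. 6. 9.] -- one strided (vars) library call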


=====================================
test/tst_Unsigned.py
=====================================
@@ -21,6 +21,11 @@ class Test_Unsigned(unittest.TestCase):
         data2 = f['ub'][:]
         assert data2.dtype.str[1:] == 'i1'
         assert_array_equal(data2,np.array([0,-1],np.int8))
+        data = f['sb'][:]
+        assert data.dtype.str[1:] == 'i1'
+        # issue #1232 _Unsigned='false' is same as not having _Unsigned set.
+        data = f['sb2'][:]
+        assert data.dtype.str[1:] == 'i1'
         f.close()
         # issue 671
         f = netCDF4.Dataset('issue671.nc')


=====================================
test/tst_atts.py
=====================================
@@ -115,7 +115,7 @@ class VariablesTestCase(unittest.TestCase):
             f.charatt = 'foo' # will be written as NC_CHAR
             f.setncattr_string('stringatt','bar') # NC_STRING
             f.cafe = 'caf\xe9' # NC_STRING
-            f.batt = 'caf\xe9'.encode('utf-8') #NC_CHAR
+            f.batt = 'caf\xe9'.encode() #NC_CHAR
             v.setncattr_string('stringatt','bar') # NC_STRING
             # issue #882 - provide an option to always string attribute
             # as NC_STRINGs. Testing various approaches to setting text attributes...
@@ -123,7 +123,7 @@ class VariablesTestCase(unittest.TestCase):
             f.stringatt_ncstr = 'foo' # will now be written as NC_STRING
             f.setncattr_string('stringatt_ncstr','bar') # NC_STRING anyway
             f.caf_ncstr = 'caf\xe9' # NC_STRING anyway
-            f.bat_ncstr = 'caf\xe9'.encode('utf-8') # now NC_STRING
+            f.bat_ncstr = 'caf\xe9'.encode() # now NC_STRING
             g.stratt_ncstr = STRATT # now NC_STRING
             #g.renameAttribute('stratt_tmp','stratt_ncstr')
             v.setncattr_string('stringatt_ncstr','bar') # NC_STRING anyway
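
The ``.encode()`` simplification is safe because UTF-8 is the default codec for
str.encode on Python 3:

    s = 'caf\xe9'
    print(s.encode() == s.encode('utf-8'))  # True
    print(s.encode())                       # b'caf\xc3\xa9' -- stored as NC_CHAR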


=====================================
test/tst_cdl.py
=====================================
@@ -9,6 +9,8 @@ variables:
 	byte ub(d) ;
 		ub:_Unsigned = "true" ;
 	byte sb(d) ;
+	byte sb2(d) ;
+		sb2:_Unsigned = "false" ;
 
 // global attributes:
 		:_Format = "classic" ;
@@ -21,6 +23,8 @@ variables:
 	byte ub(d) ;
 		ub:_Unsigned = "true" ;
 	byte sb(d) ;
+	byte sb2(d) ;
+		sb2:_Unsigned = "false" ;
 
 // global attributes:
 		:_Format = "classic" ;
@@ -29,6 +33,8 @@ data:
  ub = 0, -1 ;
 
  sb = -128, 127 ;
+
+ sb2 = -127, -127 ;
 }
 """
 


=====================================
test/tst_dap.py
=====================================
@@ -1,15 +1,17 @@
 import unittest
 import netCDF4
+import numpy as np
+from datetime import datetime, timedelta
 from numpy.testing import assert_array_almost_equal
 
 # test accessing data over http with opendap.
 
-URL = "http://remotetest.unidata.ucar.edu/thredds/dodsC/testdods/testData.nc"
+yesterday = datetime.utcnow() - timedelta(days=1)
+URL = "http://nomads.ncep.noaa.gov/dods/gfs_1p00/gfs%s/gfs_1p00_00z" % yesterday.strftime('%Y%m%d')
 URL_https = 'https://podaac-opendap.jpl.nasa.gov/opendap/allData/modis/L3/aqua/11um/v2019.0/4km/daily/2017/365/AQUA_MODIS.20171231.L3m.DAY.NSST.sst.4km.nc'
-varname = 'Z_sfc'
-varmin = 0
-varmax = 3292
-varshape = (1,95,135)
+varname = 'hgtsfc'
+data_min = -40; data_max = 5900
+varshape = (181, 360)
 
 class DapTestCase(unittest.TestCase):
 
@@ -24,10 +26,10 @@ class DapTestCase(unittest.TestCase):
         ncfile = netCDF4.Dataset(URL)
         assert varname in ncfile.variables.keys()
         var = ncfile.variables[varname]
-        assert var.shape == varshape
-        data = var[:]
-        assert_array_almost_equal(data.min(),varmin)
-        assert_array_almost_equal(data.max(),varmax)
+        data = var[0,...]
+        assert data.shape == varshape
+        assert(np.abs(data.min()-data_min) < 10)
+        assert(np.abs(data.max()-data_max) < 100)
         ncfile.close()
         # test https support (linked curl lib must built with openssl support)
         ncfile = netCDF4.Dataset(URL_https)


=====================================
test/tst_filepath.py
=====================================
@@ -26,7 +26,7 @@ class test_filepath(unittest.TestCase):
 
     def test_no_such_file_raises(self):
         fname = 'not_a_nc_file.nc'
-        with self.assertRaisesRegex(IOError, fname):
+        with self.assertRaisesRegex(OSError, fname):
             netCDF4.Dataset(fname, 'r')
 
 


=====================================
test/tst_open_mem.py
=====================================
@@ -19,7 +19,7 @@ class TestOpenMem(unittest.TestCase):
 
             # Needs: https://github.com/Unidata/netcdf-c/pull/400
             if netCDF4.__netcdf4libversion__ < '4.4.1.2':
-                with self.assertRaises(IOError):
+                with self.assertRaises(OSError):
                     netCDF4.Dataset('foo_bar', memory=nc_bytes)
                 return
 


=====================================
test/tst_utils.py
=====================================
@@ -62,9 +62,9 @@ class TestgetStartCountStride(unittest.TestCase):
         elem = [slice(None), [1,3,5], 8]
         start, count, stride, put_ind = _StartCountStride(elem, (50, 6, 10))
         # pull request #683 now does not convert integer sequences to strided
-        # slices.
-        #assert_equal(put_ind[...,1].squeeze(), slice(None,None,None))
-        assert_equal(put_ind[...,1].squeeze(), [0,1,2])
+        # slices. PR #1224 reverts this behavior.
+        assert_equal(put_ind[...,1].squeeze(), slice(None,None,None))
+        #assert_equal(put_ind[...,1].squeeze(), [0,1,2])
 
 
     def test_multiple_sequences(self):
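
The reverted behavior can be reproduced directly against the helper (note that
_StartCountStride is a private API, exercised here exactly as in the test above):

    from netCDF4.utils import _StartCountStride

    # With PR #1224, an evenly spaced increasing integer sequence such as
    # [1, 3, 5] is again collapsed into a strided slice, so the put index
    # for that axis is a plain slice(None) rather than [0, 1, 2].
    elem = [slice(None), [1, 3, 5], 8]
    start, count, stride, put_ind = _StartCountStride(elem, (50, 6, 10))
    print(put_ind[..., 1].squeeze())  # slice(None, None, None)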


=====================================
test/ubyte.nc
=====================================
Binary files a/test/ubyte.nc and b/test/ubyte.nc differ



View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/-/commit/8c905f7f1190aebfeb987f20d7beffe5e3b7b509
