[Git][debian-gis-team/netcdf4-python][upstream] New upstream version 1.6.2

Bas Couwenberg (@sebastic) gitlab@salsa.debian.org
Thu Nov 17 05:50:46 GMT 2022



Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf4-python


Commits:
fc0ecaaa by Bas Couwenberg at 2022-11-17T05:50:57+01:00
New upstream version 1.6.2
- - - - -


15 changed files:

- .github/workflows/build.yml → .github/workflows/build_latest.yml
- .github/workflows/build_master.yml
- + .github/workflows/build_old.yml
- .github/workflows/miniconda.yml
- Changelog
- README.md
- include/netCDF4.pxi
- + pyproject.toml
- setup.py
- src/netCDF4/__init__.py
- src/netCDF4/_netCDF4.pyx
- src/netCDF4/utils.py
- test/tst_alignment.py
- test/tst_atts.py
- test/tst_fancyslicing.py


Changes:

=====================================
.github/workflows/build.yml → .github/workflows/build_latest.yml
=====================================
@@ -1,4 +1,4 @@
-name: Build and Test Linux
+name: Build and Test Linux with latest netcdf-c
 on: [push, pull_request]
 jobs:
   build-linux:
@@ -13,7 +13,7 @@ jobs:
       #NO_NET: 1
     strategy:
       matrix:
-        python-version: ["3.9"]
+        python-version: ["3.10"]
     steps:
 
     - uses: actions/checkout@v2
@@ -36,7 +36,7 @@ jobs:
         make install
         popd
         echo "Download and build netCDF version ${NETCDF_VERSION}"
-        wget https://downloads.unidata.ucar.edu/netcdf-c/4.9.0/netcdf-c-${NETCDF_VERSION}.tar.gz
+        wget https://downloads.unidata.ucar.edu/netcdf-c/${NETCDF_VERSION}/netcdf-c-${NETCDF_VERSION}.tar.gz
         tar -xzf netcdf-c-${NETCDF_VERSION}.tar.gz
         pushd netcdf-c-${NETCDF_VERSION}
         export CPPFLAGS="-I/usr/include/hdf5/mpich -I${NETCDF_DIR}/include"
@@ -94,11 +94,11 @@ jobs:
           echo "pnetcdf mpi test passed!"
         fi
 
-    - name: Tarball
-      run: |
-        export PATH=${NETCDF_DIR}/bin:${PATH} 
-        python setup.py --version  
-        check-manifest --version
-        check-manifest --verbose 
-        pip wheel . -w dist --no-deps 
-        twine check dist/* 
+#   - name: Tarball
+#     run: |
+#       export PATH=${NETCDF_DIR}/bin:${PATH} 
+#       python setup.py --version  
+#       check-manifest --version
+#       check-manifest --verbose 
+#       pip wheel . -w dist --no-deps 
+#       twine check dist/* 


=====================================
.github/workflows/build_master.yml
=====================================
@@ -10,7 +10,7 @@ jobs:
       #NO_NET: 1
     strategy:
       matrix:
-        python-version: ["3.9"]
+        python-version: ["3.10"]
     steps:
 
     - uses: actions/checkout@v2


=====================================
.github/workflows/build_old.yml
=====================================
@@ -0,0 +1,104 @@
+name: Build and Test Linux with older netcdf-c
+on: [push, pull_request]
+jobs:
+  build-linux:
+    name: Python (${{ matrix.python-version }})
+    runs-on: ubuntu-latest
+    env:
+      PNETCDF_VERSION: 1.12.1
+      NETCDF_VERSION: 4.8.1
+      NETCDF_DIR: ${{ github.workspace }}/..
+      NETCDF_EXTRA_CONFIG: --enable-pnetcdf
+      CC: mpicc.mpich
+      #NO_NET: 1
+    strategy:
+      matrix:
+        python-version: ["3.10"]
+    steps:
+
+    - uses: actions/checkout@v2
+
+    - name: Set up Python ${{ matrix.python-version }}
+      uses: actions/setup-python@v2
+      with:
+        python-version: ${{ matrix.python-version }}
+
+    - name: Install Ubuntu Dependencies
+      run: |
+        sudo apt-get update
+        sudo apt-get install mpich libmpich-dev libhdf5-mpich-dev libcurl4-openssl-dev bzip2 libsnappy-dev libblosc-dev libzstd-dev
+        echo "Download and build PnetCDF version ${PNETCDF_VERSION}"
+        wget https://parallel-netcdf.github.io/Release/pnetcdf-${PNETCDF_VERSION}.tar.gz
+        tar -xzf pnetcdf-${PNETCDF_VERSION}.tar.gz
+        pushd pnetcdf-${PNETCDF_VERSION}
+        ./configure --prefix $NETCDF_DIR --enable-shared --disable-fortran --disable-cxx
+        make -j 2
+        make install
+        popd
+        echo "Download and build netCDF version ${NETCDF_VERSION}"
+        wget https://downloads.unidata.ucar.edu/netcdf-c/${NETCDF_VERSION}/netcdf-c-${NETCDF_VERSION}.tar.gz
+        tar -xzf netcdf-c-${NETCDF_VERSION}.tar.gz
+        pushd netcdf-c-${NETCDF_VERSION}
+        export CPPFLAGS="-I/usr/include/hdf5/mpich -I${NETCDF_DIR}/include"
+        export LDFLAGS="-L${NETCDF_DIR}/lib"
+        export LIBS="-lhdf5_mpich_hl -lhdf5_mpich -lm -lz"
+        ./configure --prefix $NETCDF_DIR --enable-netcdf-4 --enable-shared --enable-dap --enable-parallel4 $NETCDF_EXTRA_CONFIG
+        make -j 2
+        make install
+        popd
+
+#   - name: The job has failed
+#     if: ${{ failure() }}
+#     run: |
+#       cd netcdf-c-${NETCDF_VERSION}
+#       cat config.log 
+
+    - name: Install python dependencies via pip
+      run: |
+        python -m pip install --upgrade pip
+        pip install numpy cython cftime pytest twine wheel check-manifest mpi4py
+
+    - name: Install netcdf4-python
+      run: |
+        export PATH=${NETCDF_DIR}/bin:${PATH} 
+        export NETCDF_PLUGIN_DIR=${{ github.workspace }}/netcdf-c-${NETCDF_VERSION}/plugins/plugindir
+        python setup.py install
+    - name: Test
+      run: |
+        export PATH=${NETCDF_DIR}/bin:${PATH} 
+        python checkversion.py
+        # serial
+        cd test
+        python run_all.py
+        # parallel (hdf5 for netcdf4, pnetcdf for netcdf3)
+        cd ../examples
+        mpirun.mpich -np 4 python mpi_example.py
+        if [ $? -ne 0 ] ; then
+          echo "hdf5 mpi test failed!"
+          exit 1
+        else
+          echo "hdf5 mpi test passed!"
+        fi
+        mpirun.mpich -np 4 python mpi_example_compressed.py
+        if [ $? -ne 0 ] ; then
+          echo "hdf5 compressed mpi test failed!"
+          exit 1
+        else
+          echo "hdf5 compressed mpi test passed!"
+        fi
+        mpirun.mpich -np 4 python mpi_example.py NETCDF3_64BIT_DATA
+        if [ $? -ne 0 ] ; then
+          echo "pnetcdf mpi test failed!"
+          exit 1
+        else
+          echo "pnetcdf mpi test passed!"
+        fi
+
+#   - name: Tarball
+#     run: |
+#       export PATH=${NETCDF_DIR}/bin:${PATH} 
+#       python setup.py --version  
+#       check-manifest --version
+#       check-manifest --verbose 
+#       pip wheel . -w dist --no-deps 
+#       twine check dist/* 


=====================================
.github/workflows/miniconda.yml
=====================================
@@ -12,7 +12,7 @@ jobs:
     #  NO_NET: 1
     strategy:
       matrix:
-        python-version: ["3.6", "3.7", "3.8", "3.9", "3.10" ]
+        python-version: [ "3.7", "3.8", "3.9", "3.10", "3.11" ]
         os: [windows-latest, ubuntu-latest, macos-latest]
         platform: [x64, x32]
         exclude:
@@ -34,7 +34,7 @@ jobs:
         micromamba create --name TEST python=${{ matrix.python-version }} numpy cython pip pytest hdf5 libnetcdf cftime zlib --channel conda-forge
         micromamba activate TEST
         export PATH="${CONDA_PREFIX}/bin:${CONDA_PREFIX}/Library/bin:$PATH" # so setup.py finds nc-config
-        pip install -e . --no-deps --force-reinstall
+        pip install -v -e . --no-deps --force-reinstall
 
     - name: Debug conda
       shell: bash -l {0}
@@ -53,7 +53,7 @@ jobs:
     runs-on: ${{ matrix.os }}
     strategy:
       matrix:
-        python-version: [ "3.9" ]
+        python-version: [ "3.10" ]
         os: [ubuntu-latest]
         platform: [x64]
     steps:
@@ -70,7 +70,8 @@ jobs:
         micromamba create --name TEST python=${{ matrix.python-version }} numpy cython pip pytest mpi4py hdf5=*=mpi* libnetcdf=*=mpi* cftime zlib --channel conda-forge
         micromamba activate TEST
         export PATH="${CONDA_PREFIX}/bin:${CONDA_PREFIX}/Library/bin:$PATH" # so setup.py finds nc-config
-        pip install -e . --no-deps --force-reinstall
+        nc-config --all
+        pip install -v -e . --no-build-isolation --no-deps --force-reinstall
 
     - name: Debug conda
       shell: bash -l {0}
@@ -88,8 +89,8 @@ jobs:
         export PATH="${CONDA_PREFIX}/bin:${CONDA_PREFIX}/Library/bin:$PATH" 
         which mpirun
         mpirun --version
-        #mpirun -np 4 --oversubscribe python mpi_example.py # for openmpi
-        mpirun -np 4 python mpi_example.py
+        mpirun -np 4 --oversubscribe python mpi_example.py # for openmpi
+        #mpirun -np 4 python mpi_example.py
         if [ $? -ne 0 ] ; then
           echo "hdf5 mpi test failed!"
           exit 1


=====================================
Changelog
=====================================
@@ -1,3 +1,12 @@
+ version 1.6.2 (tag v1.6.2rel)
+==============================
+ * Added ``netCDF4.__has_set_alignment__`` property to help identify if the
+   underlying netcdf4 supports setting the HDF5 alignment.
+ * Slicing multi-dimensional variables with an all False boolean index array
+   now returns an empty numpy array (instead of raising an exception - issue #1197).
+   Behavior now consistent with numpy slicing.
+ * fix problem with compiling using netcdf-c < 4.9.0 (issue #1209)
+
  version 1.6.1 (tag v1.6.1rel)
 ==============================
  * add Dataset methods has_<name>_filter (where <name>=zstd,blosc,bzip2,szip)

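As a quick illustration of the slicing change noted in the Changelog entry above (issue #1197), the following sketch shows the new behavior; the file name, dimension names, and sizes are only illustrative:

    import numpy as np
    import netCDF4

    # Create a small file with a 2D variable (names and sizes are illustrative).
    with netCDF4.Dataset("example.nc", "w") as ds:
        ds.createDimension("x", 4)
        ds.createDimension("y", 3)
        v = ds.createVariable("data", "f8", ("x", "y"))
        v[:] = np.arange(12.0).reshape(4, 3)

        # An all-False boolean index along one dimension now yields an empty
        # array (netCDF4 >= 1.6.2) instead of raising, matching numpy slicing.
        mask = np.zeros(4, dtype=bool)
        subset = v[mask, :]
        print(subset.shape)  # expected an empty result, e.g. (0, 3)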

=====================================
README.md
=====================================
@@ -10,6 +10,11 @@
 ## News
 For details on the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog).
 
+11/15/2022:  Version [1.6.2](https://pypi.python.org/pypi/netCDF4/1.6.2) released. Fix for
+compilation with netcdf-c < 4.9.0 (issue [#1209](https://github.com/Unidata/netcdf4-python/issues/1209)).  
+Slicing multi-dimensional variables with an all False boolean index array
+now returns an empty numpy array (instead of raising an exception - issue [#1197](https://github.com/Unidata/netcdf4-python/issues/1197)).
+
 09/18/2022:  Version [1.6.1](https://pypi.python.org/pypi/netCDF4/1.6.1) released.  GIL now
 released for all C lib calls, `set_alignment` and `get_alignment` module functions
 added to modify/retrieve HDF5 data alignment properties. Added `Dataset` methods to 


=====================================
include/netCDF4.pxi
=====================================
@@ -367,7 +367,6 @@ cdef extern from "netcdf.h":
 
     int nc_inq_enum_ident(int ncid, nc_type xtype, long long value, char *identifier) nogil
 
-
 IF HAS_QUANTIZATION_SUPPORT:
     cdef extern from "netcdf.h":
         cdef enum:
@@ -377,6 +376,8 @@ IF HAS_QUANTIZATION_SUPPORT:
             NC_QUANTIZE_BITROUND
         int nc_def_var_quantize(int ncid, int varid, int quantize_mode, int nsd) nogil
         int nc_inq_var_quantize(int ncid, int varid, int *quantize_modep, int *nsdp) nogil
+
+IF HAS_NCFILTER:
     cdef extern from "netcdf_filter.h":
         int nc_inq_filter_avail(int ncid, unsigned filterid) nogil
 
@@ -395,7 +396,6 @@ IF HAS_ZSTANDARD_SUPPORT:
             H5Z_FILTER_ZSTD
         int nc_def_var_zstandard(int ncid, int varid, int level) nogil
         int nc_inq_var_zstandard(int ncid, int varid, int* hasfilterp, int *levelp) nogil
-        int nc_inq_filter_avail(int ncid, unsigned id) nogil
 
 IF HAS_BZIP2_SUPPORT:
     cdef extern from "netcdf_filter.h":


=====================================
pyproject.toml
=====================================
@@ -0,0 +1,3 @@
+[build-system]
+requires = ["setuptools>=41.2", "cython>=0.19", "oldest-supported-numpy"]
+build-backend = "setuptools.build_meta"


=====================================
setup.py
=====================================
@@ -3,11 +3,10 @@ import os.path as osp
 import shutil
 import configparser
 from setuptools import setup, Extension, find_namespace_packages
-from distutils.dist import Distribution
+from setuptools.dist import Distribution
 
 setuptools_extra_kwargs = {
     "install_requires": ["numpy>=1.9","cftime"],
-    "setup_requires": ['setuptools>=18.0', "cython>=0.19"],
     "entry_points": {
         'console_scripts': [
             'ncinfo = netCDF4.utils:ncinfo',
@@ -71,6 +70,7 @@ def check_api(inc_dirs,netcdf_lib_version):
     has_zstandard = False
     has_bzip2 = False
     has_blosc = False
+    has_ncfilter = False
     has_set_alignment = False
 
     for d in inc_dirs:
@@ -117,6 +117,8 @@ def check_api(inc_dirs,netcdf_lib_version):
                     has_bzip2 = True
                 if line.startswith('EXTERNL int nc_def_var_blosc'):
                     has_blosc = True
+                if line.startswith('EXTERNL int nc_inq_filter_avail'):
+                    has_ncfilter = True
 
         ncmetapath = os.path.join(d,'netcdf_meta.h')
         if os.path.exists(ncmetapath):
@@ -144,7 +146,7 @@ def check_api(inc_dirs,netcdf_lib_version):
     return has_rename_grp, has_nc_inq_path, has_nc_inq_format_extended, \
            has_cdf5_format, has_nc_open_mem, has_nc_create_mem, \
            has_parallel4_support, has_pnetcdf_support, has_szip_support, has_quantize, \
-           has_zstandard, has_bzip2, has_blosc, has_set_alignment
+           has_zstandard, has_bzip2, has_blosc, has_set_alignment, has_ncfilter
 
 
 def getnetcdfvers(libdirs):
@@ -558,7 +560,7 @@ if 'sdist' not in sys.argv[1:] and 'clean' not in sys.argv[1:] and '--version' n
     has_rename_grp, has_nc_inq_path, has_nc_inq_format_extended, \
     has_cdf5_format, has_nc_open_mem, has_nc_create_mem, \
     has_parallel4_support, has_pnetcdf_support, has_szip_support, has_quantize, \
-    has_zstandard, has_bzip2, has_blosc, has_set_alignment = \
+    has_zstandard, has_bzip2, has_blosc, has_set_alignment, has_ncfilter = \
     check_api(inc_dirs,netcdf_lib_version)
     # for netcdf 4.4.x CDF5 format is always enabled.
     if netcdf_lib_version is not None and\
@@ -566,11 +568,12 @@ if 'sdist' not in sys.argv[1:] and 'clean' not in sys.argv[1:] and '--version' n
         has_cdf5_format = True
 
     # disable parallel support if mpi4py not available.
-    try:
-        import mpi4py
-    except ImportError:
-        has_parallel4_support = False
-        has_pnetcdf_support = False
+    #try:
+    #    import mpi4py
+    #except ImportError:
+    #    f.write('disabling mpi parallel support because mpi4py not found\n')
+    #    has_parallel4_support = False
+    #    has_pnetcdf_support = False
 
     f = open(osp.join('include', 'constants.pyx'), 'w')
     if has_rename_grp:
@@ -672,9 +675,17 @@ if 'sdist' not in sys.argv[1:] and 'clean' not in sys.argv[1:] and '--version' n
         sys.stdout.write('netcdf lib does not have nc_set_alignment function\n')
         f.write('DEF HAS_SET_ALIGNMENT = 0\n')
 
+    if has_ncfilter:
+        sys.stdout.write('netcdf lib has nc_inq_filter_avail function\n')
+        f.write('DEF HAS_NCFILTER = 1\n')
+    else:
+        sys.stdout.write('netcdf lib does not have nc_inq_filter_avail function\n')
+        f.write('DEF HAS_NCFILTER = 0\n')
+
     f.close()
 
     if has_parallel4_support or has_pnetcdf_support:
+        import mpi4py
         inc_dirs.append(mpi4py.get_include())
         # mpi_incdir should not be needed if using nc-config
         # (should be included in nc-config --cflags)
@@ -742,6 +753,7 @@ setup(name="netCDF4",
       package_dir={'':'src'},
       package_data={"netCDF4.plugins": ["lib__nc*"]},
       ext_modules=ext_modules,
+      python_requires=">=3.6",
       **setuptools_extra_kwargs)
 
 # remove plugin files copied from outside source tree


=====================================
src/netCDF4/__init__.py
=====================================
@@ -9,7 +9,8 @@ from ._netCDF4 import (__version__, __netcdf4libversion__, __hdf5libversion__,
                        __has_nc_create_mem__, __has_cdf5_format__,
                        __has_parallel4_support__, __has_pnetcdf_support__,
                        __has_quantization_support__, __has_zstandard_support__,
-                       __has_bzip2_support__, __has_blosc_support__, __has_szip_support__)
+                       __has_bzip2_support__, __has_blosc_support__, __has_szip_support__,
+                       __has_set_alignment__)
 import os
 __all__ =\
 ['Dataset','Variable','Dimension','Group','MFDataset','MFTime','CompoundType','VLType','date2num','num2date','date2index','stringtochar','chartostring','stringtoarr','getlibversion','EnumType','get_chunk_cache','set_chunk_cache','set_alignment','get_alignment']

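The newly exported __has_set_alignment__ flag shown above can be checked at runtime before touching HDF5 alignment settings; a minimal sketch (the threshold/alignment values are only illustrative):

    import netCDF4

    # __has_set_alignment__ reports whether the underlying netcdf-c build
    # provides nc_set_alignment (see the setup.py and __init__.py hunks).
    if getattr(netCDF4, "__has_set_alignment__", False):
        netCDF4.set_alignment(4096, 4096)   # threshold, alignment (illustrative values)
        print(netCDF4.get_alignment())
    else:
        print("set_alignment not supported by this netcdf-c build")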

=====================================
src/netCDF4/_netCDF4.pyx
=====================================
@@ -1,5 +1,5 @@
 """
-Version 1.6.1
+Version 1.6.2
 -------------
 
 # Introduction
@@ -1230,7 +1230,7 @@ if sys.version_info[0:2] < (3, 7):
     # Python 3.7+ guarantees order; older versions need OrderedDict
     from collections import OrderedDict
 
-__version__ = "1.6.1"
+__version__ = "1.6.2"
 
 # Initialize numpy
 import posixpath
@@ -3543,15 +3543,21 @@ returns True if bzip2 compression filter is available"""
 **`has_szip_filter(self)`**
 returns True if szip compression filter is available"""
         cdef int ierr
-        IF HAS_SZIP_SUPPORT:
-            with nogil:
-                ierr = nc_inq_filter_avail(self._grpid, H5Z_FILTER_SZIP)
-            if ierr:
+        IF HAS_NCFILTER:
+            IF HAS_SZIP_SUPPORT:
+                with nogil:
+                    ierr = nc_inq_filter_avail(self._grpid, H5Z_FILTER_SZIP)
+                if ierr:
+                    return False
+                else:
+                    return True
+            ELSE:
                 return False
-            else:
-                return True
         ELSE:
-            return False
+             IF HAS_SZIP_SUPPORT:
+                 return True
+             ELSE:
+                 return False
 
 cdef class Group(Dataset):
     """
@@ -4953,7 +4959,7 @@ rename a `Variable` attribute named `oldname` to `newname`."""
         # put_ind for this dimension is set to -1 by _StartCountStride.
         squeeze = data.ndim * [slice(None),]
         for i,n in enumerate(put_ind.shape[:-1]):
-            if n == 1 and put_ind[...,i].ravel()[0] == -1:
+            if n == 1 and put_ind.size > 0 and put_ind[...,i].ravel()[0] == -1:
                 squeeze[i] = 0
 
         # Reshape the arrays so we can iterate over them.

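The has_szip_filter refactoring above only changes how the answer is computed (the runtime nc_inq_filter_avail check is used when HAS_NCFILTER is set, otherwise the compile-time HAS_SZIP_SUPPORT flag is reported); calling it is unchanged. A short sketch, with an illustrative file name:

    import netCDF4

    # Dataset.has_szip_filter() reports whether the szip compression filter
    # is usable with the linked netcdf-c/HDF5 build.
    with netCDF4.Dataset("example.nc", "w") as ds:
        print(ds.has_szip_filter())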

=====================================
src/netCDF4/utils.py
=====================================
@@ -457,7 +457,7 @@ def _out_array_shape(count):
     out = []
 
     for i, n in enumerate(s):
-        if n == 1:
+        if n == 1 and count.size > 0:
             c = count[..., i].ravel()[0] # All elements should be identical.
             out.append(c)
         else:


=====================================
test/tst_alignment.py
=====================================
@@ -1,5 +1,6 @@
 import numpy as np
 from netCDF4 import set_alignment, get_alignment, Dataset
+from netCDF4 import __has_set_alignment__
 import netCDF4
 import os
 import subprocess
@@ -23,6 +24,7 @@ file_name = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name
 
 class AlignmentTestCase(unittest.TestCase):
     def setUp(self):
+
         self.file = file_name
 
         # This is a global variable in netcdf4, it must be set before File
@@ -57,6 +59,10 @@ class AlignmentTestCase(unittest.TestCase):
             with self.assertRaises(RuntimeError):
                 get_alignment()
 
+    def test_reports_alignment_capabilities(self):
+        # Assert that the library reports that it supports alignment correctly
+        assert has_alignment == __has_set_alignment__
+
     # if we have no support for alignment, we have no guarantees on
     # how the data can be aligned
     @unittest.skipIf(


=====================================
test/tst_atts.py
=====================================
@@ -40,96 +40,95 @@ class VariablesTestCase(unittest.TestCase):
 
     def setUp(self):
         self.file = FILE_NAME
-        f = netCDF4.Dataset(self.file,'w')
-        # try to set a dataset attribute with one of the reserved names.
-        f.setncattr('file_format','netcdf4_format')
-        # test attribute renaming
-        f.stratt_tmp = STRATT
-        f.renameAttribute('stratt_tmp','stratt')
-        f.emptystratt = EMPTYSTRATT
-        f.intatt = INTATT
-        f.floatatt = FLOATATT
-        f.seqatt = SEQATT
-        # sequences of strings converted to a single string.
-        f.stringseqatt = STRINGSEQATT
-        f.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
-        g = f.createGroup(GROUP_NAME)
-        f.createDimension(DIM1_NAME, DIM1_LEN)
-        f.createDimension(DIM2_NAME, DIM2_LEN)
-        f.createDimension(DIM3_NAME, DIM3_LEN)
-        g.createDimension(DIM1_NAME, DIM1_LEN)
-        g.createDimension(DIM2_NAME, DIM2_LEN)
-        g.createDimension(DIM3_NAME, DIM3_LEN)
-        g.stratt_tmp = STRATT
-        g.renameAttribute('stratt_tmp','stratt')
-        g.emptystratt = EMPTYSTRATT
-        g.intatt = INTATT
-        g.floatatt = FLOATATT
-        g.seqatt = SEQATT
-        g.stringseqatt = STRINGSEQATT
-        if netCDF4.__version__ > "1.4.2":
-            with self.assertRaises(ValueError):
-                g.arrayatt = [[1, 2], [3, 4]] # issue #841
-        g.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
-        v = f.createVariable(VAR_NAME, 'f8',(DIM1_NAME,DIM2_NAME,DIM3_NAME))
-        # try to set a variable attribute with one of the reserved names..
-        v.setncattr('ndim','three')
-        v.setncatts({'foo': 1})
-        v.setncatts(OrderedDict(bar=2))
-        v.stratt_tmp = STRATT
-        v.renameAttribute('stratt_tmp','stratt')
-        v.emptystratt = EMPTYSTRATT
-        v.intatt = INTATT
-        v.floatatt = FLOATATT
-        v.seqatt = SEQATT
-        v.stringseqatt = STRINGSEQATT
-        v.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
-        v1 = g.createVariable(VAR_NAME, 'f8',(DIM1_NAME,DIM2_NAME,DIM3_NAME))
-        v1.stratt = STRATT
-        v1.emptystratt = EMPTYSTRATT
-        v1.intatt = INTATT
-        v1.floatatt = FLOATATT
-        v1.seqatt = SEQATT
-        v1.stringseqatt = STRINGSEQATT
-        v1.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
-        # issue #959: should not be able to set _FillValue after var creation
-        try:
-            v1._FillValue(-999.)
-        except AttributeError:
-            pass
-        else:
-            raise ValueError('This test should have failed.')
-        try:
-            v1.setncattr('_FillValue',-999.)
-        except AttributeError:
-            pass
-        else:
-            raise ValueError('This test should have failed.')
-        # issue #485 (triggers segfault in C lib
-        # with version 1.2.1 without pull request #486)
-        f.foo = np.array('bar','S')
-        f.foo = np.array('bar','U')
-        # issue #529 write string attribute as NC_CHAR unless
-        # it can't be decoded to ascii.  Add setncattr_string
-        # method to force NC_STRING.
-        f.charatt = 'foo' # will be written as NC_CHAR
-        f.setncattr_string('stringatt','bar') # NC_STRING
-        f.cafe = 'caf\xe9' # NC_STRING
-        f.batt = 'caf\xe9'.encode('utf-8') #NC_CHAR
-        v.setncattr_string('stringatt','bar') # NC_STRING
-        # issue #882 - provide an option to always string attribute
-        # as NC_STRINGs. Testing various approaches to setting text attributes...
-        f.set_ncstring_attrs(True)
-        f.stringatt_ncstr = 'foo' # will now be written as NC_STRING
-        f.setncattr_string('stringatt_ncstr','bar') # NC_STRING anyway
-        f.caf_ncstr = 'caf\xe9' # NC_STRING anyway
-        f.bat_ncstr = 'caf\xe9'.encode('utf-8') # now NC_STRING
-        g.stratt_ncstr = STRATT # now NC_STRING
-        #g.renameAttribute('stratt_tmp','stratt_ncstr')
-        v.setncattr_string('stringatt_ncstr','bar') # NC_STRING anyway
-        v.stratt_ncstr = STRATT
-        v1.emptystratt_ncstr = EMPTYSTRATT
-        f.close()
+        with netCDF4.Dataset(self.file,'w') as f:
+            # try to set a dataset attribute with one of the reserved names.
+            f.setncattr('file_format','netcdf4_format')
+            # test attribute renaming
+            f.stratt_tmp = STRATT
+            f.renameAttribute('stratt_tmp','stratt')
+            f.emptystratt = EMPTYSTRATT
+            f.intatt = INTATT
+            f.floatatt = FLOATATT
+            f.seqatt = SEQATT
+            # sequences of strings converted to a single string.
+            f.stringseqatt = STRINGSEQATT
+            f.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
+            g = f.createGroup(GROUP_NAME)
+            f.createDimension(DIM1_NAME, DIM1_LEN)
+            f.createDimension(DIM2_NAME, DIM2_LEN)
+            f.createDimension(DIM3_NAME, DIM3_LEN)
+            g.createDimension(DIM1_NAME, DIM1_LEN)
+            g.createDimension(DIM2_NAME, DIM2_LEN)
+            g.createDimension(DIM3_NAME, DIM3_LEN)
+            g.stratt_tmp = STRATT
+            g.renameAttribute('stratt_tmp','stratt')
+            g.emptystratt = EMPTYSTRATT
+            g.intatt = INTATT
+            g.floatatt = FLOATATT
+            g.seqatt = SEQATT
+            g.stringseqatt = STRINGSEQATT
+            if netCDF4.__version__ > "1.4.2":
+                with self.assertRaises(ValueError):
+                    g.arrayatt = [[1, 2], [3, 4]] # issue #841
+            g.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
+            v = f.createVariable(VAR_NAME, 'f8',(DIM1_NAME,DIM2_NAME,DIM3_NAME))
+            # try to set a variable attribute with one of the reserved names.
+            v.setncattr('ndim','three')
+            v.setncatts({'foo': 1})
+            v.setncatts(OrderedDict(bar=2))
+            v.stratt_tmp = STRATT
+            v.renameAttribute('stratt_tmp','stratt')
+            v.emptystratt = EMPTYSTRATT
+            v.intatt = INTATT
+            v.floatatt = FLOATATT
+            v.seqatt = SEQATT
+            v.stringseqatt = STRINGSEQATT
+            v.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
+            v1 = g.createVariable(VAR_NAME, 'f8',(DIM1_NAME,DIM2_NAME,DIM3_NAME))
+            v1.stratt = STRATT
+            v1.emptystratt = EMPTYSTRATT
+            v1.intatt = INTATT
+            v1.floatatt = FLOATATT
+            v1.seqatt = SEQATT
+            v1.stringseqatt = STRINGSEQATT
+            v1.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING
+            # issue #959: should not be able to set _FillValue after var creation
+            try:
+                v1._FillValue(-999.)
+            except AttributeError:
+                pass
+            else:
+                raise ValueError('This test should have failed.')
+            try:
+                v1.setncattr('_FillValue',-999.)
+            except AttributeError:
+                pass
+            else:
+                raise ValueError('This test should have failed.')
+            # issue #485 (triggers segfault in C lib
+            # with version 1.2.1 without pull request #486)
+            f.foo = np.array('bar','S')
+            f.foo = np.array('bar','U')
+            # issue #529 write string attribute as NC_CHAR unless
+            # it can't be decoded to ascii.  Add setncattr_string
+            # method to force NC_STRING.
+            f.charatt = 'foo' # will be written as NC_CHAR
+            f.setncattr_string('stringatt','bar') # NC_STRING
+            f.cafe = 'caf\xe9' # NC_STRING
+            f.batt = 'caf\xe9'.encode('utf-8') #NC_CHAR
+            v.setncattr_string('stringatt','bar') # NC_STRING
+            # issue #882 - provide an option to always string attribute
+            # as NC_STRINGs. Testing various approaches to setting text attributes...
+            f.set_ncstring_attrs(True)
+            f.stringatt_ncstr = 'foo' # will now be written as NC_STRING
+            f.setncattr_string('stringatt_ncstr','bar') # NC_STRING anyway
+            f.caf_ncstr = 'caf\xe9' # NC_STRING anyway
+            f.bat_ncstr = 'caf\xe9'.encode('utf-8') # now NC_STRING
+            g.stratt_ncstr = STRATT # now NC_STRING
+            #g.renameAttribute('stratt_tmp','stratt_ncstr')
+            v.setncattr_string('stringatt_ncstr','bar') # NC_STRING anyway
+            v.stratt_ncstr = STRATT
+            v1.emptystratt_ncstr = EMPTYSTRATT
 
     def tearDown(self):
         # Remove the temporary files
@@ -138,87 +137,87 @@ class VariablesTestCase(unittest.TestCase):
 
     def runTest(self):
         """testing attributes"""
-        f  = netCDF4.Dataset(self.file, 'r')
-        v = f.variables[VAR_NAME]
-        g = f.groups[GROUP_NAME]
-        v1 = g.variables[VAR_NAME]
-        # check attributes in root group.
-        # global attributes.
-        # check __dict__ method for accessing all netCDF attributes.
-        for key,val in ATTDICT.items():
-            if type(val) == np.ndarray:
-                assert f.__dict__[key].tolist() == val.tolist()
-            else:
-                assert f.__dict__[key] == val
-        # check accessing individual attributes.
-        assert f.intatt == INTATT
-        assert f.floatatt == FLOATATT
-        assert f.stratt == STRATT
-        assert f.emptystratt == EMPTYSTRATT
-        assert f.seqatt.tolist() == SEQATT.tolist()
-        #assert f.stringseqatt == ''.join(STRINGSEQATT) # issue 770
-        assert f.stringseqatt == STRINGSEQATT
-        assert f.stringseqatt_array == STRINGSEQATT
-        assert f.getncattr('file_format') == 'netcdf4_format'
-        # variable attributes.
-        # check __dict__ method for accessing all netCDF attributes.
-        for key,val in ATTDICT.items():
-            if type(val) == np.ndarray:
-                assert v.__dict__[key].tolist() == val.tolist()
-            else:
-                assert v.__dict__[key] == val
-        # check accessing individual attributes.
-        assert v.intatt == INTATT
-        assert v.floatatt == FLOATATT
-        assert v.stratt == STRATT
-        assert v.seqatt.tolist() == SEQATT.tolist()
-        #assert v.stringseqatt == ''.join(STRINGSEQATT) # issue 770
-        assert v.stringseqatt == STRINGSEQATT
-        assert v.stringseqatt_array == STRINGSEQATT
-        assert v.getncattr('ndim') == 'three'
-        assert v.getncattr('foo') == 1
-        assert v.getncattr('bar') == 2
-        # check type of attributes using ncdump (issue #529)
-        if not os.getenv('NO_CDL'):
-            ncdump_output = f.tocdl()
-            for line in ncdump_output:
-                line = line.strip('\t\n\r')
-                line = line.strip()# Must be done another time for group variables
-                if "stringatt" in line: assert line.startswith('string')
-                if "charatt" in line: assert line.startswith(':')
-                if "cafe" in line: assert line.startswith('string')
-                if "batt" in line: assert line.startswith(':')
-                if "_ncstr" in line: assert line.startswith('string')
-        # check attributes in subgroup.
-        # global attributes.
-        for key,val in ATTDICT.items():
-            if type(val) == np.ndarray:
-                assert g.__dict__[key].tolist() == val.tolist()
-            else:
-                assert g.__dict__[key] == val
-        assert g.intatt == INTATT
-        assert g.floatatt == FLOATATT
-        assert g.stratt == STRATT
-        assert g.emptystratt == EMPTYSTRATT
-        assert g.seqatt.tolist() == SEQATT.tolist()
-        #assert g.stringseqatt == ''.join(STRINGSEQATT) # issue 770
-        assert g.stringseqatt == STRINGSEQATT
-        assert g.stringseqatt_array == STRINGSEQATT
-        for key,val in ATTDICT.items():
-            if type(val) == np.ndarray:
-                assert v1.__dict__[key].tolist() == val.tolist()
-            else:
-                assert v1.__dict__[key] == val
-        assert v1.intatt == INTATT
-        assert v1.floatatt == FLOATATT
-        assert v1.stratt == STRATT
-        assert v1.emptystratt == EMPTYSTRATT
-        assert v1.seqatt.tolist() == SEQATT.tolist()
-        #assert v1.stringseqatt == ''.join(STRINGSEQATT) # issue 770
-        assert v1.stringseqatt == STRINGSEQATT
-        assert v1.stringseqatt_array == STRINGSEQATT
-        assert getattr(v1,'nonexistantatt',None) == None
-        f.close()
+        with netCDF4.Dataset(self.file, 'r') as f:
+            v = f.variables[VAR_NAME]
+            g = f.groups[GROUP_NAME]
+            v1 = g.variables[VAR_NAME]
+            # check attributes in root group.
+            # global attributes.
+            # check __dict__ method for accessing all netCDF attributes.
+            for key,val in ATTDICT.items():
+                if type(val) == np.ndarray:
+                    assert f.__dict__[key].tolist() == val.tolist()
+                else:
+                    assert f.__dict__[key] == val
+            # check accessing individual attributes.
+            assert f.intatt == INTATT
+            assert f.floatatt == FLOATATT
+            assert f.stratt == STRATT
+            assert f.emptystratt == EMPTYSTRATT
+            assert f.seqatt.tolist() == SEQATT.tolist()
+            #assert f.stringseqatt == ''.join(STRINGSEQATT) # issue 770
+            assert f.stringseqatt == STRINGSEQATT
+            assert f.stringseqatt_array == STRINGSEQATT
+            assert f.getncattr('file_format') == 'netcdf4_format'
+            # variable attributes.
+            # check __dict__ method for accessing all netCDF attributes.
+            for key,val in ATTDICT.items():
+                if type(val) == np.ndarray:
+                    assert v.__dict__[key].tolist() == val.tolist()
+                else:
+                    assert v.__dict__[key] == val
+            # check accessing individual attributes.
+            assert v.intatt == INTATT
+            assert v.floatatt == FLOATATT
+            assert v.stratt == STRATT
+            assert v.seqatt.tolist() == SEQATT.tolist()
+            #assert v.stringseqatt == ''.join(STRINGSEQATT) # issue 770
+            assert v.stringseqatt == STRINGSEQATT
+            assert v.stringseqatt_array == STRINGSEQATT
+            assert v.getncattr('ndim') == 'three'
+            assert v.getncattr('foo') == 1
+            assert v.getncattr('bar') == 2
+            # check type of attributes using ncdump (issue #529)
+            if not os.getenv('NO_CDL'):
+                ncdump_output = f.tocdl()
+                for line in ncdump_output:
+                    line = line.strip('\t\n\r')
+                    line = line.strip()# Must be done another time for group variables
+                    if "stringatt" in line: assert line.startswith('string')
+                    if "charatt" in line: assert line.startswith(':')
+                    if "cafe" in line: assert line.startswith('string')
+                    if "batt" in line: assert line.startswith(':')
+                    if "_ncstr" in line: assert line.startswith('string')
+            # check attributes in subgroup.
+            # global attributes.
+            for key,val in ATTDICT.items():
+                if type(val) == np.ndarray:
+                    assert g.__dict__[key].tolist() == val.tolist()
+                else:
+                    assert g.__dict__[key] == val
+            assert g.intatt == INTATT
+            assert g.floatatt == FLOATATT
+            assert g.stratt == STRATT
+            assert g.emptystratt == EMPTYSTRATT
+            assert g.seqatt.tolist() == SEQATT.tolist()
+            #assert g.stringseqatt == ''.join(STRINGSEQATT) # issue 770
+            assert g.stringseqatt == STRINGSEQATT
+            assert g.stringseqatt_array == STRINGSEQATT
+            for key,val in ATTDICT.items():
+                if type(val) == np.ndarray:
+                    assert v1.__dict__[key].tolist() == val.tolist()
+                else:
+                    assert v1.__dict__[key] == val
+            assert v1.intatt == INTATT
+            assert v1.floatatt == FLOATATT
+            assert v1.stratt == STRATT
+            assert v1.emptystratt == EMPTYSTRATT
+            assert v1.seqatt.tolist() == SEQATT.tolist()
+            #assert v1.stringseqatt == ''.join(STRINGSEQATT) # issue 770
+            assert v1.stringseqatt == STRINGSEQATT
+            assert v1.stringseqatt_array == STRINGSEQATT
+            assert getattr(v1,'nonexistantatt',None) == None
+
         # issue 915 empty string attribute (ncdump reports 'NIL')
         f = netCDF4.Dataset('test_gold.nc')
         assert f['RADIANCE'].VAR_NOTES == ""


=====================================
test/tst_fancyslicing.py
=====================================
@@ -142,6 +142,11 @@ class VariablesTestCase(unittest.TestCase):
 
         assert_array_equal(v[0], self.data[0])
 
+        # slicing with all False booleans (PR #1197)
+        iby[:] = False
+        data = v[ibx,iby,ibz]
+        assert(data.size == 0)
+
         f.close()
 
     def test_set(self):



View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/-/commit/fc0ecaaad21b8b0d0b9047b2574114b09135aa34


