[Git][debian-gis-team/fiona][upstream] New upstream version 1.10.1

Bas Couwenberg (@sebastic) gitlab@salsa.debian.org
Tue Sep 17 04:04:29 BST 2024



Bas Couwenberg pushed to branch upstream at Debian GIS Project / fiona


Commits:
2fbc5f55 by Bas Couwenberg at 2024-09-17T04:57:04+02:00
New upstream version 1.10.1
- - - - -


22 changed files:

- .github/workflows/scorecard.yml
- CHANGES.txt
- Makefile
- appveyor.yml
- docs/cli.rst
- docs/conf.py
- fiona/__init__.py
- fiona/_path.py
- fiona/_vsiopener.pyx
- fiona/crs.pyx
- fiona/drvsupport.py
- fiona/env.py
- fiona/fio/collect.py
- fiona/fio/dump.py
- fiona/inspector.py
- fiona/logutils.py
- fiona/meta.py
- fiona/session.py
- fiona/vfs.py
- tests/test_bigint.py
- tests/test_crs.py
- tests/test_datetime.py


Changes:

=====================================
.github/workflows/scorecard.yml
=====================================
@@ -59,7 +59,7 @@ jobs:
       # Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
       # format to the repository Actions tab.
       - name: "Upload artifact"
-        uses: actions/upload-artifact@834a144ee995460fba8ed112a2fc961b36a5ec5a # v4.3.6
+        uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874 # v4.4.0
         with:
           name: SARIF file
           path: results.sarif
@@ -67,6 +67,6 @@ jobs:
 
       # Upload the results to GitHub's code scanning dashboard.
       - name: "Upload to code-scanning"
-        uses: github/codeql-action/upload-sarif@eb055d739abdc2e8de2e5f4ba1a8b246daa779aa # v3.26.0
+        uses: github/codeql-action/upload-sarif@4dd16135b69a43b6c8efb853346f8437d92d3c93 # v3.26.6
         with:
           sarif_file: results.sarif


=====================================
CHANGES.txt
=====================================
@@ -3,6 +3,14 @@ Changes
 
 All issue numbers are relative to https://github.com/Toblerity/Fiona/issues.
 
+1.10.1 (2024-09-16)
+-------------------
+
+Bug fixes:
+
+- Logging in the CRS class no longer tries to print representations of objects
+  that may be NULL when searching for authority matches (#1445).
+
 1.10.0 (2024-09-03)
 -------------------
 
@@ -282,7 +290,7 @@ Deprecations:
 Changes:
 
 - Fiona's FionaDeprecationWarning now sub-classes DeprecationWarning.
-- Some test modules have beeen re-formatted using black.
+- Some test modules have been re-formatted using black.
 
 New features:
 
@@ -1009,7 +1017,7 @@ can't unhear Love Coffin.
 - New BytesCollection class (#215).
 - Add GDAL's OpenFileGDB driver to registered drivers (#221).
 - Implement CLI commands as plugins (#228).
-- Raise click.abort instead of calling sys.exit, preventing suprising exits
+- Raise click.abort instead of calling sys.exit, preventing surprising exits
   (#236).
 
 1.5.1 (2015-03-19)


=====================================
Makefile
=====================================
@@ -1,5 +1,5 @@
-PYTHON_VERSION ?= 3.10
-GDAL ?= ubuntu-small-3.9.0
+PYTHON_VERSION ?= 3.12
+GDAL ?= ubuntu-small-3.9.2
 all: deps clean install test
 
 .PHONY: docs


=====================================
appveyor.yml
=====================================
@@ -7,7 +7,7 @@ environment:
 
     global:
         # SDK v7.0 MSVC Express 2008's SetEnv.cmd script will fail if the
-        # /E:ON and /V:ON options are not enabled in the batch script intepreter
+        # /E:ON and /V:ON options are not enabled in the batch script interpreter
         # See: http://stackoverflow.com/a/13751649/163740
         CMD_IN_ENV: "cmd /E:ON /V:ON /C .\\appveyor\\run_with_env.cmd"
         GDAL_HOME: "C:\\gdal"


=====================================
docs/cli.rst
=====================================
@@ -232,7 +232,7 @@ dataset using another format.
     > | fio load /tmp/test.shp --driver Shapefile
 
 This command also supports GeoJSON text sequences. RS-separated sequences will
-be detected. If you want to load LF-separated sequences, you must specfiy
+be detected. If you want to load LF-separated sequences, you must specify
 ``--x-json-seq``.
 
 .. code-block:: console


=====================================
docs/conf.py
=====================================
@@ -285,7 +285,7 @@ epub_copyright = '2011, Sean Gillies'
 # The format is a list of tuples containing the path and title.
 #epub_pre_files = []
 
-# HTML files shat should be inserted after the pages created by sphinx.
+# HTML files that should be inserted after the pages created by sphinx.
 # The format is a list of tuples containing the path and title.
 #epub_post_files = []
 


=====================================
fiona/__init__.py
=====================================
@@ -78,7 +78,7 @@ __all__ = [
     "remove",
 ]
 
-__version__ = "1.10.0"
+__version__ = "1.10.1"
 __gdal_version__ = get_gdal_release_name()
 
 gdal_version = get_gdal_version_tuple()
@@ -155,7 +155,7 @@ def open(
           'example.shp', enabled_drivers=['GeoJSON', 'ESRI Shapefile'])
 
     Some format drivers permit low-level filtering of fields. Specific
-    fields can be ommitted by using the ``ignore_fields`` parameter.
+    fields can be omitted by using the ``ignore_fields`` parameter.
     Specific fields can be selected, excluding all others, by using the
     ``include_fields`` parameter.
 


=====================================
fiona/_path.py
=====================================
@@ -16,7 +16,7 @@ import attr
 from fiona.errors import PathError
 
 # Supported URI schemes and their mapping to GDAL's VSI suffix.
-# TODO: extend for other cloud plaforms.
+# TODO: extend for other cloud platforms.
 SCHEMES = {
     'ftp': 'curl',
     'gzip': 'gzip',
@@ -34,7 +34,7 @@ SCHEMES = {
 ARCHIVESCHEMES = set
 CURLSCHEMES = set([k for k, v in SCHEMES.items() if v == 'curl'])
 
-# TODO: extend for other cloud plaforms.
+# TODO: extend for other cloud platforms.
 REMOTESCHEMES = set([k for k, v in SCHEMES.items() if v in ('curl', 's3', 'oss', 'gs', 'az',)])
 
 


=====================================
fiona/_vsiopener.pyx
=====================================
@@ -28,7 +28,7 @@ cdef str VSI_NS_ROOT = "vsifiopener"
 # the plugin to determine what "files" exist on "disk".
 # Currently the only way to "create" a file in the filesystem is to add
 # an entry to this dictionary. GDAL will then Open the path later.
-_OPENER_REGISTRY = ContextVar("opener_registery")
+_OPENER_REGISTRY = ContextVar("opener_registry")
 _OPENER_REGISTRY.set({})
 _OPEN_FILE_EXIT_STACKS = ContextVar("open_file_exit_stacks")
 _OPEN_FILE_EXIT_STACKS.set({})
@@ -188,7 +188,7 @@ cdef void* pyopener_open(
 
     This function is mandatory in the GDAL Filesystem Plugin API.
     GDAL may call this function multiple times per filename and each
-    result must be seperately seekable.
+    result must be separately seekable.
     """
     cdef FSData *fsdata = <FSData *>pUserData
     path = fsdata.path.decode("utf-8")


=====================================
fiona/crs.pyx
=====================================
@@ -487,12 +487,13 @@ cdef class CRS:
         cdef int *confidences = NULL
         cdef int num_matches = 0
         cdef int i = 0
+        cdef char *c_code = NULL
+        cdef char *c_name = NULL
 
         results = defaultdict(list)
 
         try:
             osr = exc_wrap_pointer(OSRClone(self._osr))
-
             matches = OSRFindMatches(osr, NULL, &num_matches, &confidences)
 
             for i in range(num_matches):
@@ -500,18 +501,11 @@ cdef class CRS:
                 c_code = OSRGetAuthorityCode(matches[i], NULL)
                 c_name = OSRGetAuthorityName(matches[i], NULL)
 
-                if c_code == NULL:
-                    log.debug("returned authority code was null")
-                if c_name == NULL:
-                    log.debug("returned authority name was null")
-
                 if c_code != NULL and c_name != NULL and confidence >= confidence_threshold:
-                    log.debug(
-                        "Matched. confidence=%r, c_code=%r, c_name=%r",
-                        confidence, c_code, c_name)
                     code = c_code.decode('utf-8')
                     name = c_name.decode('utf-8')
                     results[name].append(code)
+
             return results
 
         finally:
@@ -966,12 +960,13 @@ cdef class CRS:
         cdef int *confidences = NULL
         cdef int num_matches = 0
         cdef int i = 0
+        cdef char *c_code = NULL
+        cdef char *c_name = NULL
 
         results = defaultdict(list)
 
         try:
             osr = exc_wrap_pointer(OSRClone(self._osr))
-
             matches = OSRFindMatches(osr, NULL, &num_matches, &confidences)
 
             for i in range(num_matches):
@@ -979,10 +974,6 @@ cdef class CRS:
                 c_code = OSRGetAuthorityCode(matches[i], NULL)
                 c_name = OSRGetAuthorityName(matches[i], NULL)
 
-                log.debug(
-                    "Matched. confidence=%r, c_code=%r, c_name=%r",
-                    confidence, c_code, c_name)
-
                 if c_code != NULL and c_name != NULL and confidence >= confidence_threshold:
                     code = c_code.decode('utf-8')
                     name = c_name.decode('utf-8')
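The 1.10.1 fix above boils down to one rule: never decode or log an authority code or name until both are known to be non-NULL. A plain-Python sketch of the guarded loop (illustrative only — `matches` and `confidences` stand in for the OSR match arrays, with `None` playing the role of a NULL `char*`):

```python
from collections import defaultdict

def collect_authority_matches(matches, confidences, confidence_threshold=70):
    """Group authority codes by authority name, skipping NULL-ish entries.

    `matches` is a list of (code, name) pairs where either member may be
    None -- the Python stand-in for a NULL pointer returned by
    OSRGetAuthorityCode / OSRGetAuthorityName.
    """
    results = defaultdict(list)
    for (code, name), confidence in zip(matches, confidences):
        # Guard first, use second: this is the shape of the #1445 fix --
        # no logging or decoding of a possibly-NULL value.
        if code is not None and name is not None and confidence >= confidence_threshold:
            results[name].append(code)
    return dict(results)
```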


=====================================
fiona/drvsupport.py
=====================================
@@ -221,7 +221,7 @@ def vector_driver_extensions():
 
     extension_to_driver = {}
     for drv, modes in supported_drivers.items():
-        # update extensions based on driver suppport
+        # update extensions based on driver support
         for extension in extensions(drv) or ():
             if "w" in modes:
                 extension_to_driver[extension] = extension_to_driver.get(extension, drv)


=====================================
fiona/env.py
=====================================
@@ -540,7 +540,7 @@ def require_gdal_version(
         def some_func():
 
     calling `some_func` with a runtime version of GDAL that is < 2.2 raises a
-    GDALVersionErorr.
+    GDALVersionError.
 
     \b
         @require_gdal_version('2.2', param='foo')
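The docstring fixed above describes a version-gating decorator. A minimal self-contained sketch of that pattern — `GDALVersionError`, `RUNTIME_GDAL`, and the parsing here are stand-ins, not fiona's real internals:

```python
import functools

class GDALVersionError(Exception):
    """Raised when the runtime GDAL is older than a function requires."""

RUNTIME_GDAL = (3, 9)  # pretend runtime version for this sketch

def require_gdal_version(minimum):
    """Decorator: raise GDALVersionError if runtime GDAL < `minimum`."""
    min_tuple = tuple(int(p) for p in minimum.split("."))

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if RUNTIME_GDAL < min_tuple:
                raise GDALVersionError(
                    f"{func.__name__} requires GDAL >= {minimum}"
                )
            return func(*args, **kwargs)
        return wrapper

    return decorator
```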


=====================================
fiona/fio/collect.py
=====================================
@@ -186,7 +186,7 @@ def collect(
                 # Log error and close up the GeoJSON, leaving it
                 # more or less valid no matter what happens above.
                 logger.critical(
-                    "failed to serialize file record %d (%s), " "quiting", i, exc
+                    "failed to serialize file record %d (%s), " "quitting", i, exc
                 )
                 sink.write("]")
                 sink.write(tail)
@@ -214,7 +214,7 @@ def collect(
                     )
                 else:
                     logger.critical(
-                        "failed to serialize file record %d (%s), " "quiting",
+                        "failed to serialize file record %d (%s), " "quitting",
                         i,
                         exc,
                     )


=====================================
fiona/fio/dump.py
=====================================
@@ -132,7 +132,7 @@ def dump(
                     # Log error and close up the GeoJSON, leaving it
                     # more or less valid no matter what happens above.
                     logger.critical(
-                        "failed to serialize file record %d (%s), " "quiting", i, exc
+                        "failed to serialize file record %d (%s), " "quitting", i, exc
                     )
                     sink.write("]")
                     sink.write(tail)
@@ -165,7 +165,7 @@ def dump(
                     else:
                         logger.critical(
                             "failed to serialize file record %d (%s), "
-                            "quiting",
+                            "quitting",
                             i, exc)
                         sink.write("]")
                         sink.write(tail)


=====================================
fiona/inspector.py
=====================================
@@ -10,7 +10,7 @@ logger = logging.getLogger('fiona.inspector')
 
 
 def main(srcfile):
-    """Open a dataset in an iteractive session."""
+    """Open a dataset in an interactive session."""
     with fiona.drivers():
         with fiona.open(srcfile) as src:
             code.interact(


=====================================
fiona/logutils.py
=====================================
@@ -4,7 +4,7 @@ import logging
 
 
 class FieldSkipLogFilter(logging.Filter):
-    """Filter field skip log messges.
+    """Filter field skip log messages.
 
     At most, one message per field skipped per loop will be passed.
     """


=====================================
fiona/meta.py
=====================================
@@ -223,8 +223,8 @@ def supports_vsi(driver):
     bool
 
     """
-    virutal_io = _get_metadata_item(driver, MetadataItem.VIRTUAL_IO)
-    return virutal_io is not None and virutal_io.upper() == "YES"
+    virtual_io = _get_metadata_item(driver, MetadataItem.VIRTUAL_IO)
+    return virtual_io is not None and virtual_io.upper() == "YES"
 
 
 @require_gdal_version('2.0')


=====================================
fiona/session.py
=====================================
@@ -275,7 +275,7 @@ class AWSSession(Session):
         profile_name : str, optional
             A shared credentials profile name, as per boto3.
         endpoint_url: str, optional
-            An endpoint_url, as per GDAL's AWS_S3_ENPOINT
+            An endpoint_url, as per GDAL's AWS_S3_ENDPOINT
         requester_pays : bool, optional
             True if the requester agrees to pay transfer costs (default:
             False)
@@ -359,7 +359,7 @@ class GSSession(Session):
     """Configures access to secured resources stored in Google Cloud Storage
     """
     def __init__(self, google_application_credentials=None):
-        """Create new Google Cloude Storage session
+        """Create new Google Cloud Storage session
 
         Parameters
         ----------


=====================================
fiona/vfs.py
=====================================
@@ -6,7 +6,7 @@ from urllib.parse import urlparse
 
 
 # Supported URI schemes and their mapping to GDAL's VSI suffix.
-# TODO: extend for other cloud plaforms.
+# TODO: extend for other cloud platforms.
 SCHEMES = {
     'ftp': 'curl',
     'gzip': 'gzip',
@@ -20,7 +20,7 @@ SCHEMES = {
 
 CURLSCHEMES = {k for k, v in SCHEMES.items() if v == 'curl'}
 
-# TODO: extend for other cloud plaforms.
+# TODO: extend for other cloud platforms.
 REMOTESCHEMES = {k for k, v in SCHEMES.items() if v in ('curl', 's3', 'gs')}
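The comments fixed above describe how URI schemes map to GDAL's VSI suffixes. A hedged sketch of how such a table translates into a GDAL `/vsi` path (function name and the trimmed `SCHEMES` table here are illustrative, not fiona's API):

```python
# Subset of a fiona-style scheme-to-VSI-suffix table.
SCHEMES = {"ftp": "curl", "http": "curl", "https": "curl", "s3": "s3", "gs": "gs"}

def to_vsi_path(scheme, netloc, path):
    """Build a GDAL /vsi path from URI parts using the SCHEMES table."""
    suffix = SCHEMES[scheme]
    if suffix == "curl":
        # HTTP-style remotes keep the full URL behind /vsicurl/.
        return f"/vsicurl/{scheme}://{netloc}{path}"
    # Cloud stores use their own handler, e.g. /vsis3/bucket/key.
    return f"/vsi{suffix}/{netloc}{path}"
```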
 
 


=====================================
tests/test_bigint.py
=====================================
@@ -5,7 +5,7 @@ characters, so to be unambiguously read as OFTInteger (and if specifying
 integer that require 10 or 11 characters. the field is dynamically extended
 like managed since a few versions). OFTInteger64 fields are created by default
 with a width of 18 digits, so to be unambiguously read as OFTInteger64, and
-extented to 19 or 20 if needed. Integer fields of width between 10 and 18
+extended to 19 or 20 if needed. Integer fields of width between 10 and 18
 will be read as OFTInteger64. Above they will be treated as OFTReal. In
 previous GDAL versions, Integer fields were created with a default with of 10,
 and thus will be now read as OFTInteger64. An open option, DETECT_TYPE=YES, can


=====================================
tests/test_crs.py
=====================================
@@ -147,3 +147,41 @@ def test_to_wkt__invalid_version():
 def test_from_func_deprecations(func, arg):
     with pytest.warns(FionaDeprecationWarning):
         _ = func(arg)
+
+
+def test_xx():
+    """Create a CRS from WKT with a vertical datum."""
+    wkt = """
+COMPD_CS["NAD83(CSRS) / UTM zone 10N + CGVD28 height",
+PROJCS["NAD83(CSRS) / UTM zone 10N",
+GEOGCS["NAD83(CSRS)",
+DATUM["NAD83_Canadian_Spatial_Reference_System",
+SPHEROID["GRS 1980",6378137,298.257222101,
+AUTHORITY["EPSG","7019"]],
+AUTHORITY["EPSG","6140"]],
+PRIMEM["Greenwich",0,
+AUTHORITY["EPSG","8901"]],
+UNIT["degree",0.0174532925199433,
+AUTHORITY["EPSG","9122"]],
+AUTHORITY["EPSG","4617"]],
+PROJECTION["Transverse_Mercator"],
+PARAMETER["latitude_of_origin",0],
+PARAMETER["central_meridian",-123],
+PARAMETER["scale_factor",0.9996],
+PARAMETER["false_easting",500000],
+PARAMETER["false_northing",0],
+UNIT["metre",1,
+AUTHORITY["EPSG","9001"]],
+AXIS["Easting",EAST],
+AXIS["Northing",NORTH],
+AUTHORITY["EPSG","3157"]],
+VERT_CS["CGVD28 height",
+VERT_DATUM["Canadian Geodetic Vertical Datum of 1928",2005,
+AUTHORITY["EPSG","5114"]],
+UNIT["metre",1,
+AUTHORITY["EPSG","9001"]],
+AXIS["Gravity-related height",UP],
+AUTHORITY["EPSG","5713"]]]
+"""
+    val = crs.CRS.from_wkt(wkt)
+    assert val.wkt.startswith("COMPD_CS")


=====================================
tests/test_datetime.py
=====================================
@@ -740,7 +740,7 @@ def test_datetime_field_type_marked_not_supported_is_not_supported(
 ):
     """Test if a date/datetime/time field type marked as not not supported is really not supported
 
-    Warning: Success of this test does not necessary mean that a field is not supported. E.g. errors can occour due to
+    Warning: Success of this test does not necessary mean that a field is not supported. E.g. errors can occur due to
     special schema requirements of drivers. This test only covers the standard case.
 
     """



View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/-/commit/2fbc5f55ead156b2160f4aaa8081f3f95dbb20d7




