[Git][debian-gis-team/fiona][experimental] 6 commits: New upstream version 1.10~b2

Bas Couwenberg (@sebastic) gitlab@salsa.debian.org
Thu Jul 11 04:52:32 BST 2024



Bas Couwenberg pushed to branch experimental at Debian GIS Project / fiona


Commits:
acb0a1cd by Bas Couwenberg at 2024-07-11T05:17:45+02:00
New upstream version 1.10~b2
- - - - -
184900c2 by Bas Couwenberg at 2024-07-11T05:17:48+02:00
Update upstream source from tag 'upstream/1.10_b2'

Update to upstream version '1.10~b2'
with Debian dir 43d98a1affa2b61af7b5520497188fcaed7dd13a
- - - - -
b9395632 by Bas Couwenberg at 2024-07-11T05:18:05+02:00
New upstream release.

- - - - -
65d5c072 by Bas Couwenberg at 2024-07-11T05:19:12+02:00
Drop pr1404-gcc13.patch, included upstream.

- - - - -
57cb6420 by Bas Couwenberg at 2024-07-11T05:45:33+02:00
Add patch to fix test_opener_fsspec_file_fs_listdir failure.

- - - - -
0f051d18 by Bas Couwenberg at 2024-07-11T05:46:01+02:00
Set distribution to experimental.

- - - - -


30 changed files:

- .github/workflows/rstcheck.yml
- .github/workflows/scorecard.yml
- .github/workflows/test_gdal_latest.yml
- .github/workflows/tests.yml
- CHANGES.txt
- Makefile
- ci/rstcheck/requirements.txt
- debian/changelog
- − debian/patches/pr1404-gcc13.patch
- debian/patches/series
- + debian/patches/test_opener_fsspec_file_fs_listdir.patch
- fiona/__init__.py
- fiona/_env.pxd
- fiona/_env.pyx
- fiona/_err.pxd
- fiona/_err.pyx
- fiona/_geometry.pyx
- fiona/_vsiopener.pxd
- fiona/_vsiopener.pyx
- fiona/drvsupport.py
- fiona/gdal.pxi
- fiona/model.py
- fiona/ogrext.pyx
- setup.py
- tests/test_bounds.py
- tests/test_memoryfile.py
- tests/test_model.py
- tests/test_pyopener.py
- tests/test_slice.py
- tests/test_topojson.py


Changes:

=====================================
.github/workflows/rstcheck.yml
=====================================
@@ -25,7 +25,7 @@ jobs:
 
     steps:
       - name: Checkout code
-        uses: actions/checkout@v4
+        uses: actions/checkout@v4.1.3
 
       - name: Set up Python
        uses: actions/setup-python@v5


=====================================
.github/workflows/scorecard.yml
=====================================
@@ -32,12 +32,12 @@ jobs:
 
     steps:
       - name: "Checkout code"
-        uses: actions/checkout@93ea575cb5d8a053eaa0ac8fa3b40d7e05a33cc8 # v3.1.0
+        uses: actions/checkout@1d96c772d19495a3b5c517cd2bc0cb401ea0529f # v4.1.3
         with:
           persist-credentials: false
 
       - name: "Run analysis"
-        uses: ossf/scorecard-action@0864cf19026789058feabb7e87baa5f140aac736 # v2.3.1
+        uses: ossf/scorecard-action@dc50aa9510b46c811795eb24b2f1ba02a914e534 # v2.3.3
         with:
           results_file: results.sarif
           results_format: sarif
@@ -59,7 +59,7 @@ jobs:
       # Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
       # format to the repository Actions tab.
       - name: "Upload artifact"
-        uses: actions/upload-artifact@5d5d22a31266ced268874388b861e4b58bb5c2f3 # v4.3.1
+        uses: actions/upload-artifact@0b2256b8c012f0828dc542b3febcab082c67f72b # v4.3.4
         with:
           name: SARIF file
           path: results.sarif
@@ -67,6 +67,6 @@ jobs:
 
       # Upload the results to GitHub's code scanning dashboard.
       - name: "Upload to code-scanning"
-        uses: github/codeql-action/upload-sarif@1b1aada464948af03b950897e5eb522f92603cc2 # v3.24.9
+        uses: github/codeql-action/upload-sarif@b611370bb5703a7efb587f9d136a52ea24c5c38c # v3.25.11
         with:
           sarif_file: results.sarif


=====================================
.github/workflows/test_gdal_latest.yml
=====================================
@@ -27,7 +27,7 @@ jobs:
       GDAL_DATA: ${{ github.workspace }}/gdal_install/share/gdal
       LD_LIBRARY_PATH: "${{ github.workspace }}/gdal_install/lib/:${LD_LIBRARY_PATH}"
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v4.1.3
       - name: Update
         run: |
           apt-get update


=====================================
.github/workflows/tests.yml
=====================================
@@ -4,7 +4,7 @@ on:
   push:
     branches: [ main, 'maint-*' ]
     paths:
-      - '.github/workflows/tests.yaml'
+      - '.github/workflows/tests.yml'
       - 'requirements*.txt'
       - 'setup.py'
       - 'setup.cfg'
@@ -15,7 +15,7 @@ on:
   pull_request:
     branches: [ main, 'maint-*' ]
     paths:
-      - '.github/workflows/tests.yaml'
+      - '.github/workflows/tests.yml'
       - 'requirements*.txt'
       - 'setup.py'
       - 'setup.cfg'
@@ -52,7 +52,7 @@ jobs:
             gdal-version: '3.8.3'
 
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v4.1.3
 
       - name: Update
         run: |
@@ -92,7 +92,9 @@ jobs:
       fail-fast: true
       matrix:
         include:
-          - os: macos-latest
+          - os: macos-13
+            python-version: '3.11'
+          - os: macos-14
             python-version: '3.11'
           - os: windows-latest
             python-version: '3.11'
@@ -100,53 +102,55 @@ jobs:
       - uses: actions/checkout@v4
 
       - name: Conda Setup
-        uses: s-weigand/setup-conda@v1
+        uses: conda-incubator/setup-miniconda@v3
         with:
-          conda-channels: conda-forge
+          miniforge-variant: Mambaforge
+          miniforge-version: latest
+          use-mamba: true
+          auto-update-conda: true
+          use-only-tar-bz2: false
 
       - name: Install Env (OSX)
-        if: matrix.os == 'macos-latest'
-        shell: bash
+        if: matrix.os == 'macos-13' || matrix.os == 'macos-14'
+        shell: bash -l {0}
         run: |
           conda config --prepend channels conda-forge
           conda config --set channel_priority strict
-          conda create -n test python=${{ matrix.python-version }} libgdal geos=3.10.3 cython=3
-          source activate test
+          conda create -n test python=${{ matrix.python-version }} libgdal geos=3.11 cython=3
+          conda activate test
           python -m pip install -e . || python -m pip install -e .
           python -m pip install -r requirements-dev.txt
 
       - name: Install Env (Windows)
         if: matrix.os == 'windows-latest'
-        shell: bash
+        shell: bash -l {0}
         run: |
           conda config --prepend channels conda-forge
           conda config --set channel_priority strict
-          conda create -n test python=${{ matrix.python-version }} libgdal geos=3.10.3 cython=3
-          source activate test
-          GDAL_VERSION="3.5" python setup.py build_ext -I"C:\\Miniconda\\envs\\test\\Library\\include" -lgdal_i -L"C:\\Miniconda\\envs\\test\\Library\\lib" install
+          conda create -n test python=${{ matrix.python-version }} libgdal geos=3.11 cython=3
+          conda activate test
+          GDAL_VERSION="3.7" python setup.py build_ext -I"/c/Users/runneradmin/miniconda3/envs/test/Library/include" -lgdal -L"/c/Users/runneradmin/miniconda3/envs/test/Library/lib" install
           python -m pip install -r requirements-dev.txt
 
       - name: Check and Log Environment
-        shell: bash
+        shell: bash -l {0}
         run: |
-          source activate test
+          conda activate test
           python -V
           conda info
-          conda list
 
       - name: Test with Coverage (Windows)
         if: matrix.os == 'windows-latest'
-        shell: bash
+        shell: bash -l {0}
         run: |
-          source activate test
+          conda activate test
           pytest -v -m "not wheel" -rxXs --cov fiona --cov-report term-missing
 
       - name: Test with Coverage (OSX)
-        if: matrix.os == 'macos-latest'
-        shell: bash
+        if: matrix.os == 'macos-13'
+        shell: bash -l {0}
         run: |
-          source activate test
+          conda activate test
           python -m pytest -v -m "not wheel" -rxXs  --cov fiona --cov-report term-missing
 
-
       - uses: codecov/codecov-action@v3


=====================================
CHANGES.txt
=====================================
@@ -3,6 +3,24 @@ Changes
 
 All issue numbers are relative to https://github.com/Toblerity/Fiona/issues.
 
+1.10b2 (2024-07-10)
+-------------------
+
+Bug fixes:
+
+- The Pyopener registry and VSI plugin have been rewritten to avoid filename
+  conflicts and to be compatible with multithreading. Now, a new plugin handler
+  is registered for each instance of using an opener (#1408). Before GDAL 3.9.0
+  plugin handlers cannot be removed and so it may be observed that the size
+  of the Pyopener registry grows during the execution of a program.
+- A CSLConstList ctypedef has been added and is used where appropriate (#1404).
+- Fiona model objects have an informative, printable representation again
+  (#1380).
+
+Packaging:
+
+- PyPI wheels include GDAL 3.9.1 and curl 8.8.0.
+
 1.10b1 (2024-04-16)
 -------------------
 


=====================================
Makefile
=====================================
@@ -33,7 +33,7 @@ dockertestimage:
 	docker build --target gdal --build-arg GDAL=$(GDAL) --build-arg PYTHON_VERSION=$(PYTHON_VERSION) -t fiona:$(GDAL)-py$(PYTHON_VERSION) .
 
 dockertest: dockertestimage
-	docker run -it -v $(shell pwd):/app -v /tmp:/tmp --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY --entrypoint=/bin/bash fiona:$(GDAL)-py$(PYTHON_VERSION) -c '/venv/bin/python -m pip install --editable .[all] --no-build-isolation && /venv/bin/python -B -m pytest -m "not wheel" --cov fiona --cov-report term-missing $(OPTS)'
+	docker run -it -v $(shell pwd):/app -v /tmp:/tmp --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY --entrypoint=/bin/bash fiona:$(GDAL)-py$(PYTHON_VERSION) -c '/venv/bin/python -m pip install -vvv --editable .[all] --no-build-isolation && /venv/bin/python -B -m pytest -m "not wheel" --cov fiona --cov-report term-missing $(OPTS)'
 
 dockershell: dockertestimage
 	docker run -it -v $(shell pwd):/app --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY --entrypoint=/bin/bash fiona:$(GDAL)-py$(PYTHON_VERSION) -c '/venv/bin/python -m pip install --editable . --no-build-isolation && /bin/bash'


=====================================
ci/rstcheck/requirements.txt
=====================================
@@ -8,7 +8,7 @@ alabaster==0.7.13
     # via sphinx
 babel==2.12.1
     # via sphinx
-certifi==2023.7.22
+certifi==2024.7.4
     # via requests
 charset-normalizer==3.2.0
     # via requests
@@ -22,30 +22,28 @@ docutils==0.19
     # via
     #   rstcheck-core
     #   sphinx
-idna==3.4
+idna==3.7
     # via requests
 imagesize==1.4.1
     # via sphinx
-jinja2==3.1.3
+jinja2==3.1.4
     # via sphinx
 markupsafe==2.1.3
     # via jinja2
 packaging==23.1
     # via sphinx
-pydantic==1.10.12
+pydantic==1.10.13
     # via rstcheck-core
 pygments==2.16.1
     # via
     #   rich
     #   sphinx
-requests==2.31.0
+requests==2.32.0
     # via sphinx
 rich==12.6.0
     # via typer
 rstcheck==6.1.2
-    # via
-    #   -r requirements.in
-    #   rstcheck
+    # via -r requirements.in
 rstcheck-core==1.0.3
     # via rstcheck
 shellingham==1.5.3
@@ -73,12 +71,10 @@ sphinxcontrib-qthelp==1.0.6
 sphinxcontrib-serializinghtml==1.1.9
     # via sphinx
 typer==0.7.0
-    # via
-    #   rstcheck
-    #   typer
+    # via rstcheck
 types-docutils==0.19.1.9
     # via rstcheck-core
 typing-extensions==4.7.1
     # via pydantic
-urllib3==2.0.7
+urllib3==2.2.2
     # via requests


=====================================
debian/changelog
=====================================
@@ -1,11 +1,12 @@
-fiona (1.10~b1-1~exp2) UNRELEASED; urgency=medium
+fiona (1.10~b2-1~exp1) experimental; urgency=medium
 
   * Team upload.
-  * Ignore test failures on mips64el.
-  * Add upstream patch to fix FTBFS with GCC 14.
     (closes: #1074959)
+  * New upstream release.
+  * Ignore test failures on mips64el.
+  * Add patch to fix test_opener_fsspec_file_fs_listdir failure.
 
- -- Bas Couwenberg <sebastic@debian.org>  Wed, 17 Apr 2024 12:22:05 +0200
+ -- Bas Couwenberg <sebastic@debian.org>  Thu, 11 Jul 2024 05:45:46 +0200
 
 fiona (1.10~b1-1~exp1) experimental; urgency=medium
 


=====================================
debian/patches/pr1404-gcc13.patch deleted
=====================================
@@ -1,80 +0,0 @@
-Description: Use CSLConstList ctypedef where appropriate
-Author: Sean Gillies <sean.gillies@gmail.com>
-Origin: https://github.com/Toblerity/Fiona/pull/1404
-Bug: https://github.com/Toblerity/Fiona/issues/1365
-
---- a/fiona/ogrext.pyx
-+++ b/fiona/ogrext.pyx
-@@ -376,7 +376,7 @@ cdef class StringListField(AbstractField
-         for item in value:
-             item_b = item.encode(encoding)
-             string_list = CSLAddString(string_list, <const char *>item_b)
--        OGR_F_SetFieldStringList(feature, i, <const char **>string_list)
-+        OGR_F_SetFieldStringList(feature, i, <CSLConstList>string_list)
- 
- 
- cdef class JSONField(AbstractField):
-@@ -1264,7 +1264,7 @@ cdef class Session:
- 
-         cdef char **metadata = NULL
-         metadata = GDALGetMetadata(obj, domain)
--        num_items = CSLCount(metadata)
-+        num_items = CSLCount(<CSLConstList>metadata)
- 
-         return dict(metadata[i].decode('utf-8').split('=', 1) for i in range(num_items))
- 
-@@ -2175,7 +2175,7 @@ def _listdir(path):
-         raise FionaValueError(f"Path '{path}' is not a directory.")
- 
-     papszFiles = VSIReadDir(path_c)
--    n = CSLCount(papszFiles)
-+    n = CSLCount(<CSLConstList>papszFiles)
-     files = []
-     for i in range(n):
-         files.append(papszFiles[i].decode("utf-8"))
---- a/fiona/gdal.pxi
-+++ b/fiona/gdal.pxi
-@@ -16,18 +16,22 @@ cdef extern from "cpl_conv.h":
-     const char *CPLFindFile(const char *pszClass, const char *pszBasename)
- 
- 
-+cdef extern from "cpl_port.h":
-+    ctypedef char **CSLConstList
-+
-+
- cdef extern from "cpl_string.h":
--    char ** CSLAddNameValue (char **list, const char *name, const char *value)
--    char ** CSLSetNameValue (char **list, const char *name, const char *value)
--    void CSLDestroy (char **list)
-+    char ** CSLAddNameValue(char **list, const char *name, const char *value)
-+    char ** CSLSetNameValue(char **list, const char *name, const char *value)
-+    void CSLDestroy(char **list)
-     char ** CSLAddString(char **list, const char *string)
--    int CSLCount(char **papszStrList)
--    char **CSLDuplicate(char **papszStrList)
--    int CSLFindName(char **papszStrList, const char *pszName)
--    int CSLFindString(char **papszStrList, const char *pszString)
--    int CSLFetchBoolean(char **papszStrList, const char *pszName, int default)
--    const char *CSLFetchNameValue(char **papszStrList, const char *pszName)
--    char **CSLMerge(char **first, char **second)
-+    int CSLCount(CSLConstList papszStrList)
-+    char **CSLDuplicate(CSLConstList papszStrList)
-+    int CSLFindName(CSLConstList papszStrList, const char *pszName)
-+    int CSLFindString(CSLConstList papszStrList, const char *pszString)
-+    int CSLFetchBoolean(CSLConstList papszStrList, const char *pszName, int default)
-+    const char *CSLFetchNameValue(CSLConstList papszStrList, const char *pszName)
-+    char **CSLMerge(char **first, CSLConstList second)
- 
- 
- cdef extern from "cpl_error.h" nogil:
---- a/fiona/_vsiopener.pyx
-+++ b/fiona/_vsiopener.pyx
-@@ -36,7 +36,7 @@ _OPEN_FILE_EXIT_STACKS.set({})
- cdef int install_pyopener_plugin(VSIFilesystemPluginCallbacksStruct *callbacks_struct):
-     """Install handlers for python file openers if it isn't already installed."""
-     cdef char **registered_prefixes = VSIGetFileSystemsPrefixes()
--    cdef int prefix_index = CSLFindString(registered_prefixes, PREFIX_BYTES)
-+    cdef int prefix_index = CSLFindString(<CSLConstList>registered_prefixes, PREFIX_BYTES)
-     CSLDestroy(registered_prefixes)
- 
-     if prefix_index < 0:


=====================================
debian/patches/series
=====================================
@@ -1,3 +1,3 @@
 0001-Rename-fio-command-to-fiona-to-avoid-name-clash.patch
 test_drvsupport.patch
-pr1404-gcc13.patch
+test_opener_fsspec_file_fs_listdir.patch


=====================================
debian/patches/test_opener_fsspec_file_fs_listdir.patch
=====================================
@@ -0,0 +1,15 @@
+Description: Fix test failure.
+Author: Bas Couwenberg <sebastic@debian.org>
+Bug: https://github.com/Toblerity/Fiona/issues/1332
+
+--- a/tests/test_pyopener.py
++++ b/tests/test_pyopener.py
+@@ -143,7 +143,7 @@ def test_opener_fsspec_file_fs_listdir()
+     """Use fsspec file filesystem as opener for listdir()."""
+     fs = fsspec.filesystem("file")
+     listing = fiona.listdir("tests/data", opener=fs)
+-    assert len(listing) >= 35
++    assert len(listing) >= 33
+     assert set(
+         ["coutwildrnp.shp", "coutwildrnp.dbf", "coutwildrnp.shx", "coutwildrnp.prj"]
+     ) & set(listing)
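The patched test above exercises fiona.listdir() with an fsspec file filesystem as the opener. Per the docstrings elsewhere in this push, such a filesystem object only needs to provide isdir(), isfile(), ls(), mtime(), open(), and size(). A minimal stdlib-only stand-in covering a subset of that interface (hypothetical; the exact contract is defined by fiona._vsiopener._AbstractOpener) looks like:

```python
import os
import tempfile


class LocalFS:
    """Minimal fsspec-style filesystem sketch: only the subset of
    methods needed here (a real opener would also implement mtime()
    and size())."""

    def isdir(self, path):
        return os.path.isdir(path)

    def isfile(self, path):
        return os.path.isfile(path)

    def ls(self, path):
        return sorted(os.listdir(path))

    def open(self, path, mode="rb"):
        return open(path, mode)  # resolves to the builtin open


# Demonstrate the listdir-style usage against a throwaway directory.
tmp = tempfile.mkdtemp()
for name in ("a.shp", "a.dbf"):
    with open(os.path.join(tmp, name), "w") as f:
        f.write("x")

fs = LocalFS()
listing = fs.ls(tmp)
assert set(["a.shp", "a.dbf"]) <= set(listing)
```

The relaxed `>= 33` assertion in the patch reflects that the number of files in tests/data varies between the upstream repository and the Debian source tree, so the test checks for a lower bound plus known members rather than an exact count.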


=====================================
fiona/__init__.py
=====================================
@@ -78,7 +78,7 @@ __all__ = [
     "remove",
 ]
 
-__version__ = "1.10b1"
+__version__ = "1.10b2"
 __gdal_version__ = get_gdal_release_name()
 
 gdal_version = get_gdal_version_tuple()
@@ -99,6 +99,10 @@ def open(
     vfs=None,
     enabled_drivers=None,
     crs_wkt=None,
+    ignore_fields=None,
+    ignore_geometry=False,
+    include_fields=None,
+    wkt_version=None,
     allow_unsupported_drivers=False,
     opener=None,
     **kwargs
@@ -150,6 +154,11 @@ def open(
       fiona.open(
           'example.shp', enabled_drivers=['GeoJSON', 'ESRI Shapefile'])
 
+    Some format drivers permit low-level filtering of fields. Specific
+    fields can be omitted by using the ``ignore_fields`` parameter.
+    Specific fields can be selected, excluding all others, by using the
+    ``include_fields`` parameter.
+
     Parameters
     ----------
     fp : URI (str or pathlib.Path), or file-like object
@@ -177,12 +186,12 @@ def open(
     crs_wkt : str
         An optional WKT representation of a coordinate reference
         system.
-    ignore_fields : list
+    ignore_fields : list[str], optional
         List of field names to ignore on load.
+    include_fields : list[str], optional
+        List of a subset of field names to include on load.
     ignore_geometry : bool
         Ignore the geometry on load.
-    include_fields : list
-        List of a subset of field names to include on load.
     wkt_version : fiona.enums.WktVersion or str, optional
         Version to use for the CRS WKT.
         Defaults to GDAL's default (WKT1_GDAL for GDAL 3).
@@ -209,6 +218,12 @@ def open(
     -------
     Collection
 
+    Raises
+    ------
+    DriverError
+        When the selected format driver cannot provide requested
+        capabilities such as ignoring fields.
+
     """
     if mode == "r" and hasattr(fp, "read"):
         memfile = MemoryFile(fp.read())
@@ -218,6 +233,10 @@ def open(
             schema=schema,
             layer=layer,
             encoding=encoding,
+            ignore_fields=ignore_fields,
+            include_fields=include_fields,
+            ignore_geometry=ignore_geometry,
+            wkt_version=wkt_version,
             enabled_drivers=enabled_drivers,
             allow_unsupported_drivers=allow_unsupported_drivers,
             **kwargs
@@ -233,6 +252,10 @@ def open(
             schema=schema,
             layer=layer,
             encoding=encoding,
+            ignore_fields=ignore_fields,
+            include_fields=include_fields,
+            ignore_geometry=ignore_geometry,
+            wkt_version=wkt_version,
             enabled_drivers=enabled_drivers,
             allow_unsupported_drivers=allow_unsupported_drivers,
             crs_wkt=crs_wkt,
@@ -273,6 +296,10 @@ def open(
                 schema=schema,
                 layer=layer,
                 encoding=encoding,
+                ignore_fields=ignore_fields,
+                include_fields=include_fields,
+                ignore_geometry=ignore_geometry,
+                wkt_version=wkt_version,
                 enabled_drivers=enabled_drivers,
                 allow_unsupported_drivers=allow_unsupported_drivers,
                 crs_wkt=crs_wkt,
@@ -297,7 +324,7 @@ def open(
                 log.debug("Registering opener: raw_dataset_path=%r, opener=%r", raw_dataset_path, opener)
                 vsi_path_ctx = _opener_registration(raw_dataset_path, opener)
                 registered_vsi_path = stack.enter_context(vsi_path_ctx)
-                log.debug("Registered vsi path: registered_vsi_path%r", registered_vsi_path)
+                log.debug("Registered vsi path: registered_vsi_path=%r", registered_vsi_path)
                 path = _UnparsedPath(registered_vsi_path)
             else:
                 if vfs:
@@ -318,6 +345,10 @@ def open(
                     driver=driver,
                     encoding=encoding,
                     layer=layer,
+                    ignore_fields=ignore_fields,
+                    include_fields=include_fields,
+                    ignore_geometry=ignore_geometry,
+                    wkt_version=wkt_version,
                     enabled_drivers=enabled_drivers,
                     allow_unsupported_drivers=allow_unsupported_drivers,
                     **kwargs
@@ -331,6 +362,10 @@ def open(
                     schema=schema,
                     encoding=encoding,
                     layer=layer,
+                    ignore_fields=ignore_fields,
+                    include_fields=include_fields,
+                    ignore_geometry=ignore_geometry,
+                    wkt_version=wkt_version,
                     enabled_drivers=enabled_drivers,
                     crs_wkt=crs_wkt,
                     allow_unsupported_drivers=allow_unsupported_drivers,
@@ -351,7 +386,7 @@ collection = open
 
 
 @ensure_env_with_credentials
-def remove(path_or_collection, driver=None, layer=None):
+def remove(path_or_collection, driver=None, layer=None, opener=None):
     """Delete an OGR data source or one of its layers.
 
     If no layer is specified, the entire dataset and all of its layers
@@ -361,6 +396,19 @@ def remove(path_or_collection, driver=None, layer=None):
     ----------
     path_or_collection : str, pathlib.Path, or Collection
         The target Collection or its path.
+    opener : callable or obj, optional
+        A custom dataset opener which can serve GDAL's virtual
+        filesystem machinery via Python file-like objects. The
+        underlying file-like object is obtained by calling *opener* with
+        (*fp*, *mode*) or (*fp*, *mode* + "b") depending on the format
+        driver's native mode. *opener* must return a Python file-like
+        object that provides read, seek, tell, and close methods. Note:
+        only one opener at a time per fp, mode pair is allowed.
+
+        Alternatively, opener may be a filesystem object from a package
+        like fsspec that provides the following methods: isdir(),
+        isfile(), ls(), mtime(), open(), and size(). The exact interface
+        is defined in the fiona._vsiopener._AbstractOpener class.
     driver : str, optional
         The name of a driver to be used for deletion, optional. Can
         usually be detected.
@@ -379,21 +427,37 @@ def remove(path_or_collection, driver=None, layer=None):
     """
     if isinstance(path_or_collection, Collection):
         collection = path_or_collection
-        path = collection.path
+        raw_dataset_path = collection.path
         driver = collection.driver
         collection.close()
-    elif isinstance(path_or_collection, Path):
-        path = str(path_or_collection)
+
     else:
-        path = path_or_collection
-    if layer is None:
-        _remove(path, driver)
+        fp = path_or_collection
+        if hasattr(fp, "path") and hasattr(fp, "fs"):
+            log.debug("Detected fp is an OpenFile: fp=%r", fp)
+            raw_dataset_path = fp.path
+            opener = fp.fs.open
+        else:
+            raw_dataset_path = os.fspath(fp)
+
+    if opener:
+        log.debug("Registering opener: raw_dataset_path=%r, opener=%r", raw_dataset_path, opener)
+        with _opener_registration(raw_dataset_path, opener) as registered_vsi_path:
+            log.debug("Registered vsi path: registered_vsi_path=%r", registered_vsi_path)
+            if layer is None:
+                _remove(registered_vsi_path, driver)
+            else:
+                _remove_layer(registered_vsi_path, layer, driver)
     else:
-        _remove_layer(path, layer, driver)
+        pobj = _parse_path(raw_dataset_path)
+        if layer is None:
+            _remove(_vsi_path(pobj), driver)
+        else:
+            _remove_layer(_vsi_path(pobj), layer, driver)
 
 
 @ensure_env_with_credentials
-def listdir(fp):
+def listdir(fp, opener=None):
     """Lists the datasets in a directory or archive file.
 
     Archive files must be prefixed like "zip://" or "tar://".
@@ -402,6 +466,19 @@ def listdir(fp):
     ----------
     fp : str or pathlib.Path
         Directory or archive path.
+    opener : callable or obj, optional
+        A custom dataset opener which can serve GDAL's virtual
+        filesystem machinery via Python file-like objects. The
+        underlying file-like object is obtained by calling *opener* with
+        (*fp*, *mode*) or (*fp*, *mode* + "b") depending on the format
+        driver's native mode. *opener* must return a Python file-like
+        object that provides read, seek, tell, and close methods. Note:
+        only one opener at a time per fp, mode pair is allowed.
+
+        Alternatively, opener may be a filesystem object from a package
+        like fsspec that provides the following methods: isdir(),
+        isfile(), ls(), mtime(), open(), and size(). The exact interface
+        is defined in the fiona._vsiopener._AbstractOpener class.
 
     Returns
     -------
@@ -414,18 +491,25 @@ def listdir(fp):
         If the input is not a str or Path.
 
     """
-    if isinstance(fp, Path):
-        fp = str(fp)
-
-    if not isinstance(fp, str):
-        raise TypeError("invalid path: %r" % fp)
+    if hasattr(fp, "path") and hasattr(fp, "fs"):
+        log.debug("Detected fp is an OpenFile: fp=%r", fp)
+        raw_dataset_path = fp.path
+        opener = fp.fs.open
+    else:
+        raw_dataset_path = os.fspath(fp)
 
-    pobj = _parse_path(fp)
-    return _listdir(_vsi_path(pobj))
+    if opener:
+        log.debug("Registering opener: raw_dataset_path=%r, opener=%r", raw_dataset_path, opener)
+        with _opener_registration(raw_dataset_path, opener) as registered_vsi_path:
+            log.debug("Registered vsi path: registered_vsi_path=%r", registered_vsi_path)
+            return _listdir(registered_vsi_path)
+    else:
+        pobj = _parse_path(raw_dataset_path)
+        return _listdir(_vsi_path(pobj))
 
 
 @ensure_env_with_credentials
-def listlayers(fp, vfs=None, **kwargs):
+def listlayers(fp, opener=None, vfs=None, **kwargs):
     """Lists the layers (collections) in a dataset.
 
     Archive files must be prefixed like "zip://" or "tar://".
@@ -434,6 +518,19 @@ def listlayers(fp, vfs=None, **kwargs):
     ----------
     fp : str, pathlib.Path, or file-like object
         A dataset identifier or file object containing a dataset.
+    opener : callable or obj, optional
+        A custom dataset opener which can serve GDAL's virtual
+        filesystem machinery via Python file-like objects. The
+        underlying file-like object is obtained by calling *opener* with
+        (*fp*, *mode*) or (*fp*, *mode* + "b") depending on the format
+        driver's native mode. *opener* must return a Python file-like
+        object that provides read, seek, tell, and close methods. Note:
+        only one opener at a time per fp, mode pair is allowed.
+
+        Alternatively, opener may be a filesystem object from a package
+        like fsspec that provides the following methods: isdir(),
+        isfile(), ls(), mtime(), open(), and size(). The exact interface
+        is defined in the fiona._vsiopener._AbstractOpener class.
     vfs : str
         This is a deprecated parameter. A URI scheme such as "zip://"
         should be used instead.
@@ -451,18 +548,26 @@ def listlayers(fp, vfs=None, **kwargs):
         If the input is not a str, Path, or file object.
 
     """
+    if vfs and not isinstance(vfs, str):
+        raise TypeError(f"invalid vfs: {vfs!r}")
+
     if hasattr(fp, 'read'):
         with MemoryFile(fp.read()) as memfile:
             return _listlayers(memfile.name, **kwargs)
-    else:
-        if isinstance(fp, Path):
-            fp = str(fp)
 
-        if not isinstance(fp, str):
-            raise TypeError(f"invalid path: {fp!r}")
-        if vfs and not isinstance(vfs, str):
-            raise TypeError(f"invalid vfs: {vfs!r}")
+    if hasattr(fp, "path") and hasattr(fp, "fs"):
+        log.debug("Detected fp is an OpenFile: fp=%r", fp)
+        raw_dataset_path = fp.path
+        opener = fp.fs.open
+    else:
+        raw_dataset_path = os.fspath(fp)
 
+    if opener:
+        log.debug("Registering opener: raw_dataset_path=%r, opener=%r", raw_dataset_path, opener)
+        with _opener_registration(raw_dataset_path, opener) as registered_vsi_path:
+            log.debug("Registered vsi path: registered_vsi_path=%r", registered_vsi_path)
+            return _listlayers(registered_vsi_path, **kwargs)
+    else:
         if vfs:
             warnings.warn(
                 "The vfs keyword argument is deprecated and will be removed in 2.0. "
@@ -471,10 +576,10 @@ def listlayers(fp, vfs=None, **kwargs):
                 stacklevel=2,
             )
             pobj_vfs = _parse_path(vfs)
-            pobj_path = _parse_path(fp)
+            pobj_path = _parse_path(raw_dataset_path)
             pobj = _ParsedPath(pobj_path.path, pobj_vfs.path, pobj_vfs.scheme)
         else:
-            pobj = _parse_path(fp)
+            pobj = _parse_path(raw_dataset_path)
 
         return _listlayers(_vsi_path(pobj), **kwargs)
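The fiona/__init__.py changes above thread ignore_fields, include_fields, ignore_geometry, and wkt_version through every fiona.open() code path. The documented field-selection semantics can be sketched in plain Python (illustrative only; in fiona the filtering is delegated to the format driver, which may raise DriverError if it lacks the capability, and the mutual-exclusivity check here is an assumption for the sketch):

```python
def filter_fields(record, include_fields=None, ignore_fields=None):
    """Apply include/ignore field selection to a mapping of properties.

    include_fields keeps only the named fields; ignore_fields drops the
    named fields. (Sketch of the documented semantics, not fiona's
    implementation.)
    """
    if include_fields is not None and ignore_fields is not None:
        raise ValueError("include_fields and ignore_fields are mutually exclusive")
    if include_fields is not None:
        return {k: v for k, v in record.items() if k in include_fields}
    if ignore_fields is not None:
        return {k: v for k, v in record.items() if k not in ignore_fields}
    return dict(record)


props = {"NAME": "Mesa Verde", "AREA": 1.0, "PERIMETER": 2.0}
assert filter_fields(props, include_fields=["NAME"]) == {"NAME": "Mesa Verde"}
assert "AREA" not in filter_fields(props, ignore_fields=["AREA"])
```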
 


=====================================
fiona/_env.pxd
=====================================
@@ -1,11 +1,6 @@
 include "gdal.pxi"
 
 
-cdef extern from "ogr_srs_api.h":
-    void OSRSetPROJSearchPaths(const char *const *papszPaths)
-    void OSRGetPROJVersion	(int *pnMajor, int *pnMinor, int *pnPatch)
-
-
 cdef class ConfigEnv(object):
     cdef public object options
 


=====================================
fiona/_env.pyx
=====================================
@@ -17,7 +17,6 @@ import threading
 
 from fiona._err cimport exc_wrap_int, exc_wrap_ogrerr
 from fiona._err import CPLE_BaseError
-from fiona._vsiopener cimport install_pyopener_plugin
 from fiona.errors import EnvError
 
 level_map = {
@@ -406,10 +405,8 @@ cdef class GDALEnv(ConfigEnv):
         if not self._have_registered_drivers:
             with threading.Lock():
                 if not self._have_registered_drivers:
-
                     GDALAllRegister()
                     OGRRegisterAll()
-                    install_pyopener_plugin(pyopener_plugin)
 
                     if 'GDAL_DATA' in os.environ:
                         log.debug("GDAL_DATA found in environment.")


=====================================
fiona/_err.pxd
=====================================
@@ -1,15 +1,14 @@
-from libc.stdio cimport *
-
-cdef extern from "cpl_vsi.h":
-
-    ctypedef FILE VSILFILE
+include "gdal.pxi"
 
-cdef extern from "ogr_core.h":
-
-    ctypedef int OGRErr
+from libc.stdio cimport *
 
 cdef get_last_error_msg()
 cdef int exc_wrap_int(int retval) except -1
 cdef OGRErr exc_wrap_ogrerr(OGRErr retval) except -1
 cdef void *exc_wrap_pointer(void *ptr) except NULL
 cdef VSILFILE *exc_wrap_vsilfile(VSILFILE *f) except NULL
+
+cdef class StackChecker:
+    cdef object error_stack
+    cdef int exc_wrap_int(self, int retval) except -1
+    cdef void *exc_wrap_pointer(self, void *ptr) except NULL


=====================================
fiona/_err.pyx
=====================================
@@ -29,23 +29,17 @@ manager raises a more useful and informative error:
     ValueError: The PNG driver does not support update access to existing datasets.
 """
 
-# CPL function declarations.
-cdef extern from "cpl_error.h":
-
-    ctypedef enum CPLErr:
-        CE_None
-        CE_Debug
-        CE_Warning
-        CE_Failure
-        CE_Fatal
+import contextlib
+from contextvars import ContextVar
+from enum import IntEnum
+from itertools import zip_longest
+import logging
 
-    int CPLGetLastErrorNo()
-    const char* CPLGetLastErrorMsg()
-    int CPLGetLastErrorType()
-    void CPLErrorReset()
+log = logging.getLogger(__name__)
 
+_ERROR_STACK = ContextVar("error_stack")
+_ERROR_STACK.set([])
 
-from enum import IntEnum
 
 # Python exceptions expressing the CPL error numbers.
 
@@ -132,6 +126,10 @@ class CPLE_AWSSignatureDoesNotMatchError(CPLE_BaseError):
     pass
 
 
+class CPLE_AWSError(CPLE_BaseError):
+    pass
+
+
 class FionaNullPointerError(CPLE_BaseError):
     """
     Returned from exc_wrap_pointer when a NULL pointer is passed, but no GDAL
@@ -148,6 +146,14 @@ class FionaCPLError(CPLE_BaseError):
     pass
 
 
+cdef dict _LEVEL_MAP = {
+    0: 0,
+    1: logging.DEBUG,
+    2: logging.WARNING,
+    3: logging.ERROR,
+    4: logging.CRITICAL
+}
+
 # Map of GDAL error numbers to the Python exceptions.
 exception_map = {
     1: CPLE_AppDefinedError,
@@ -168,8 +174,30 @@ exception_map = {
     13: CPLE_AWSObjectNotFoundError,
     14: CPLE_AWSAccessDeniedError,
     15: CPLE_AWSInvalidCredentialsError,
-    16: CPLE_AWSSignatureDoesNotMatchError}
-
+    16: CPLE_AWSSignatureDoesNotMatchError,
+    17: CPLE_AWSError
+}
+
+cdef dict _CODE_MAP = {
+    0: 'CPLE_None',
+    1: 'CPLE_AppDefined',
+    2: 'CPLE_OutOfMemory',
+    3: 'CPLE_FileIO',
+    4: 'CPLE_OpenFailed',
+    5: 'CPLE_IllegalArg',
+    6: 'CPLE_NotSupported',
+    7: 'CPLE_AssertionFailed',
+    8: 'CPLE_NoWriteAccess',
+    9: 'CPLE_UserInterrupt',
+    10: 'ObjectNull',
+    11: 'CPLE_HttpResponse',
+    12: 'CPLE_AWSBucketNotFound',
+    13: 'CPLE_AWSObjectNotFound',
+    14: 'CPLE_AWSAccessDenied',
+    15: 'CPLE_AWSInvalidCredentials',
+    16: 'CPLE_AWSSignatureDoesNotMatch',
+    17: 'CPLE_AWSError'
+}
 
 # CPL Error types as an enum.
 class GDALError(IntEnum):
@@ -305,3 +333,127 @@ cdef VSILFILE *exc_wrap_vsilfile(VSILFILE *f) except NULL:
     return f
 
 cpl_errs = GDALErrCtxManager()
+
+
+cdef class StackChecker:
+
+    def __init__(self, error_stack=None):
+        self.error_stack = error_stack or {}
+
+    cdef int exc_wrap_int(self, int err) except -1:
+        """Wrap a GDAL/OGR function that returns CPLErr (int).
+
+        Raises a Fiona exception if a non-fatal error has been set.
+        """
+        if err:
+            stack = self.error_stack.get()
+            for error, cause in zip_longest(stack[::-1], stack[::-1][1:]):
+                if error is not None and cause is not None:
+                    error.__cause__ = cause
+
+            if stack:
+                last = stack.pop()
+                if last is not None:
+                    raise last
+
+        return err
+
+    cdef void *exc_wrap_pointer(self, void *ptr) except NULL:
+        """Wrap a GDAL/OGR function that returns a pointer.
+
+        Raises a Fiona exception if a non-fatal error has been set.
+        """
+        if ptr == NULL:
+            stack = self.error_stack.get()
+            for error, cause in zip_longest(stack[::-1], stack[::-1][1:]):
+                if error is not None and cause is not None:
+                    error.__cause__ = cause
+
+            if stack:
+                last = stack.pop()
+                if last is not None:
+                    raise last
+
+        return ptr
+
+
+cdef void log_error(
+    CPLErr err_class,
+    int err_no,
+    const char* msg,
+) noexcept with gil:
+    """Send CPL errors to Python's logger.
+
+    Because this function is called by GDAL with no Python context, we
+    can't propagate exceptions that we might raise here. They'll be
+    ignored.
+
+    """
+    if err_no in _CODE_MAP:
+        # We've observed that some GDAL functions may emit multiple
+        # ERROR level messages and yet succeed. We want to see those
+        # messages in our log file, but not at the ERROR level. We
+        # turn the level down to INFO.
+        if err_class == 3:
+            log.info(
+                "GDAL signalled an error: err_no=%r, msg=%r",
+                err_no,
+                msg.decode("utf-8")
+            )
+        elif err_no == 0:
+            log.log(_LEVEL_MAP[err_class], "%s", msg.decode("utf-8"))
+        else:
+            log.log(_LEVEL_MAP[err_class], "%s:%s", _CODE_MAP[err_no], msg.decode("utf-8"))
+    else:
+        log.info("Unknown error number %r", err_no)
+
+
+IF UNAME_SYSNAME == "Windows":
+    cdef void __stdcall chaining_error_handler(
+        CPLErr err_class,
+        int err_no,
+        const char* msg
+    ) noexcept with gil:
+        global _ERROR_STACK
+        log_error(err_class, err_no, msg)
+        if err_class == 3:
+            stack = _ERROR_STACK.get()
+            stack.append(
+                exception_map.get(err_no, CPLE_BaseError)(err_class, err_no, msg.decode("utf-8")),
+            )
+            _ERROR_STACK.set(stack)
+ELSE:
+    cdef void chaining_error_handler(
+        CPLErr err_class,
+        int err_no,
+        const char* msg
+    ) noexcept with gil:
+        global _ERROR_STACK
+        log_error(err_class, err_no, msg)
+        if err_class == 3:
+            stack = _ERROR_STACK.get()
+            stack.append(
+                exception_map.get(err_no, CPLE_BaseError)(err_class, err_no, msg.decode("utf-8")),
+            )
+            _ERROR_STACK.set(stack)
+
+
+ at contextlib.contextmanager
+def stack_errors():
+    # TODO: better name?
+    # Note: this manager produces one chain of errors and thus assumes
+    # that no more than one GDAL function is called.
+    CPLErrorReset()
+    global _ERROR_STACK
+    _ERROR_STACK.set([])
+
+    # chaining_error_handler (better name a TODO) records GDAL errors
+    # in the order they occur and converts to exceptions.
+    CPLPushErrorHandlerEx(<CPLErrorHandler>chaining_error_handler, NULL)
+
+    # Run code in the `with` block.
+    yield StackChecker(_ERROR_STACK)
+
+    CPLPopErrorHandler()
+    _ERROR_STACK.set([])
+    CPLErrorReset()
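The error-chaining pattern introduced in `fiona/_err.pyx` above can be sketched in plain Python: errors accumulate on a stack while a GDAL call runs, `StackChecker` links them via `__cause__`, then raises the most recent one. The names below are illustrative, not the actual Fiona API.

```python
# Minimal sketch of the stack-based error chaining, assuming errors were
# collected newest-last (as chaining_error_handler appends them).
from itertools import zip_longest


def raise_chained(stack):
    """Chain stacked exceptions oldest-to-newest and raise the newest."""
    # Pair each error with the one that preceded it (its cause).
    for error, cause in zip_longest(stack[::-1], stack[::-1][1:]):
        if error is not None and cause is not None:
            error.__cause__ = cause
    if stack:
        raise stack[-1]


stack = [ValueError("open failed"), RuntimeError("driver error")]
try:
    raise_chained(stack)
except RuntimeError as exc:
    # The newest error is raised, with the older one as its cause.
    assert isinstance(exc.__cause__, ValueError)
```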


=====================================
fiona/_geometry.pyx
=====================================
@@ -20,6 +20,23 @@ log.addHandler(NullHandler())
 # mapping of GeoJSON type names to OGR integer geometry types
 GEOJSON2OGR_GEOMETRY_TYPES = dict((v, k) for k, v in GEOMETRY_TYPES.items())
 
+cdef set LINEAR_GEOM_TYPES = {
+    OGRGeometryType.CircularString.value,
+    OGRGeometryType.CompoundCurve.value,
+    OGRGeometryType.CurvePolygon.value,
+    OGRGeometryType.MultiCurve.value,
+    OGRGeometryType.MultiSurface.value,
+    # OGRGeometryType.Curve.value,  # Abstract type
+    # OGRGeometryType.Surface.value,  # Abstract type
+}
+
+cdef set PS_TIN_Tri_TYPES = {
+    OGRGeometryType.PolyhedralSurface.value,
+    OGRGeometryType.TIN.value,
+    OGRGeometryType.Triangle.value
+}
+
+
 cdef int ogr_get_geometry_type(void *geometry):
     # OGR_G_GetGeometryType with NULL geometry support
     if geometry == NULL:
@@ -137,14 +154,11 @@ cdef class GeomBuilder:
         parts = []
         j = 0
         count = OGR_G_GetGeometryCount(geom)
+
         while j < count:
             part = OGR_G_GetGeometryRef(geom, j)
-            code = base_geometry_type_code(ogr_get_geometry_type(part))
-            if code in (
-                OGRGeometryType.PolyhedralSurface.value,
-                OGRGeometryType.TIN.value,
-                OGRGeometryType.Triangle.value,
-            ):
+            code = base_geometry_type_code(OGR_G_GetGeometryType(part))
+            if code in PS_TIN_Tri_TYPES:
                 OGR_G_RemoveGeometry(geom, j, False)
                 # Removing a geometry will cause the geometry count to drop by one,
                 # and all “higher” geometries will shuffle down one in index.
@@ -186,11 +200,7 @@ cdef class GeomBuilder:
 
         # We need to take ownership of the geometry before we can call 
         # OGR_G_ForceToPolygon or OGR_G_ForceToMultiPolygon
-        if code in (
-            OGRGeometryType.PolyhedralSurface.value,
-            OGRGeometryType.TIN.value,
-            OGRGeometryType.Triangle.value,
-        ):
+        if code in PS_TIN_Tri_TYPES:
             cogr_geometry = OGR_F_StealGeometry(feature)
         return self.build(cogr_geometry)
 
@@ -206,28 +216,16 @@ cdef class GeomBuilder:
 
         # We convert special geometries (Curves, TIN, Triangle, ...)
         # to GeoJSON compatible geometries (LineStrings, Polygons, MultiPolygon, ...)
-        if code in (
-            OGRGeometryType.CircularString.value,
-            OGRGeometryType.CompoundCurve.value,
-            OGRGeometryType.CurvePolygon.value,
-            OGRGeometryType.MultiCurve.value,
-            OGRGeometryType.MultiSurface.value,
-            # OGRGeometryType.Curve.value,  # Abstract type
-            # OGRGeometryType.Surface.value,  # Abstract type
-        ):
+        if code in LINEAR_GEOM_TYPES:
             geometry_to_dealloc = OGR_G_GetLinearGeometry(geom, 0.0, NULL)
             code = base_geometry_type_code(ogr_get_geometry_type(geometry_to_dealloc))
             geom = geometry_to_dealloc
-        elif code in (
-            OGRGeometryType.PolyhedralSurface.value,
-            OGRGeometryType.TIN.value,
-            OGRGeometryType.Triangle.value,
-        ):
-            if code in (OGRGeometryType.PolyhedralSurface.value, OGRGeometryType.TIN.value):
-                geometry_to_dealloc = OGR_G_ForceToMultiPolygon(geom)
-            elif code == OGRGeometryType.Triangle.value:
+        elif code in PS_TIN_Tri_TYPES:
+            if code == OGRGeometryType.Triangle.value:
                 geometry_to_dealloc = OGR_G_ForceToPolygon(geom)
-            code = base_geometry_type_code(ogr_get_geometry_type(geometry_to_dealloc))
+            else:
+                geometry_to_dealloc = OGR_G_ForceToMultiPolygon(geom)
+            code = base_geometry_type_code(OGR_G_GetGeometryType(geometry_to_dealloc))
             geom = geometry_to_dealloc
         self.ndims = OGR_G_GetCoordinateDimension(geom)
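The in-place removal loop in `GeomBuilder` above depends on indices shuffling down after each `OGR_G_RemoveGeometry` call, so the index is only advanced when nothing is removed. The same pattern in plain Python, shown with a list (illustrative helper, not Fiona API):

```python
# Remove matching items by index without skipping elements: on removal,
# higher items shuffle down one slot, so j must not advance.
def remove_matching(items, predicate):
    j = 0
    count = len(items)
    while j < count:
        if predicate(items[j]):
            del items[j]   # count drops by one; don't advance j
            count -= 1
        else:
            j += 1
    return items


data = [1, 2, 2, 3, 2, 4]
assert remove_matching(data, lambda x: x == 2) == [1, 3, 4]
```

Incrementing `j` unconditionally would skip the element that shuffled into the freed slot, which is exactly the bug the comment in the diff warns about.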
 


=====================================
fiona/_vsiopener.pxd
=====================================
@@ -1,4 +1 @@
 include "gdal.pxi"
-
-cdef int install_pyopener_plugin(VSIFilesystemPluginCallbacksStruct *callbacks_struct)
-cdef void uninstall_pyopener_plugin(VSIFilesystemPluginCallbacksStruct *callbacks_struct)


=====================================
fiona/_vsiopener.pyx
=====================================
@@ -8,19 +8,17 @@ from contextvars import ContextVar
 import logging
 import os
 from pathlib import Path
-
 import stat
+from uuid import uuid4
 
 from libc.string cimport memcpy
 
+from fiona._env import get_gdal_version_tuple
 from fiona.errors import OpenerRegistrationError
 
 log = logging.getLogger(__name__)
 
-# Prefix for all in-memory paths used by GDAL's VSI system
-# Except for errors and log messages this shouldn't really be seen by the user
-cdef str PREFIX = "/vsifiopener/"
-cdef bytes PREFIX_BYTES = PREFIX.encode("utf-8")
+cdef str VSI_NS_ROOT = "vsifiopener"
 
 # This is global state for the Python filesystem plugin. It currently only
 # contains path -> PyOpenerBase (or subclass) instances. This is used by
@@ -33,38 +31,12 @@ _OPEN_FILE_EXIT_STACKS = ContextVar("open_file_exit_stacks")
 _OPEN_FILE_EXIT_STACKS.set({})
 
 
-cdef int install_pyopener_plugin(VSIFilesystemPluginCallbacksStruct *callbacks_struct):
-    """Install handlers for python file openers if it isn't already installed."""
-    cdef char **registered_prefixes = VSIGetFileSystemsPrefixes()
-    cdef int prefix_index = CSLFindString(registered_prefixes, PREFIX_BYTES)
-    CSLDestroy(registered_prefixes)
-
-    if prefix_index < 0:
-        log.debug("Installing Python opener handler plugin...")
-        callbacks_struct = VSIAllocFilesystemPluginCallbacksStruct()
-        callbacks_struct.open = <VSIFilesystemPluginOpenCallback>pyopener_open
-        callbacks_struct.eof = <VSIFilesystemPluginEofCallback>pyopener_eof
-        callbacks_struct.tell = <VSIFilesystemPluginTellCallback>pyopener_tell
-        callbacks_struct.seek = <VSIFilesystemPluginSeekCallback>pyopener_seek
-        callbacks_struct.read = <VSIFilesystemPluginReadCallback>pyopener_read
-        callbacks_struct.write = <VSIFilesystemPluginWriteCallback>pyopener_write
-        callbacks_struct.flush = <VSIFilesystemPluginFlushCallback>pyopener_flush
-        callbacks_struct.close = <VSIFilesystemPluginCloseCallback>pyopener_close
-        callbacks_struct.read_dir = <VSIFilesystemPluginReadDirCallback>pyopener_read_dir
-        callbacks_struct.stat = <VSIFilesystemPluginStatCallback>pyopener_stat
-        callbacks_struct.pUserData = <void*>_OPENER_REGISTRY
-        retval = VSIInstallPluginHandler(PREFIX_BYTES, callbacks_struct)
-        VSIFreeFilesystemPluginCallbacksStruct(callbacks_struct)
-        return retval
-    else:
-        return 0
-
-
-cdef void uninstall_pyopener_plugin(VSIFilesystemPluginCallbacksStruct *callbacks_struct):
-    if callbacks_struct is not NULL:
-        callbacks_struct.pUserData = NULL
-        VSIFreeFilesystemPluginCallbacksStruct(callbacks_struct)
-    callbacks_struct = NULL
+# When an opener is registered for a path, this structure captures the
+# path and unique registration instance. VSI stat, read_dir, and open
+# calls have access to the struct instance.
+cdef struct FSData:
+    char *path
+    char *uuid
 
 
 cdef int pyopener_stat(
@@ -74,14 +46,20 @@ cdef int pyopener_stat(
     int nFlags
 ) with gil:
     """Provides POSIX stat data to GDAL from a Python filesystem."""
-    # Convert the given filename to a registry key.
-    # Reminder: openers are registered by URI scheme, authority, and 
-    # *directory* path.
+    cdef FSData *fsdata = <FSData *>pUserData
+    path = fsdata.path.decode("utf-8")
+    uuid = fsdata.uuid.decode("utf-8")
+    key = (Path(path), uuid)
     urlpath = pszFilename.decode("utf-8")
-    key = Path(urlpath).parent
 
     registry = _OPENER_REGISTRY.get()
-    log.debug("Looking up opener in pyopener_stat: registry=%r, key=%r", registry, key)
+    log.debug(
+        "Looking up opener in pyopener_stat: urlpath=%r, registry=%r, key=%r",
+        urlpath,
+        registry,
+        key
+    )
+
     try:
         file_opener = registry[key]
     except KeyError as err:
@@ -91,15 +69,15 @@ cdef int pyopener_stat(
 
     try:
         if file_opener.isfile(urlpath):
-            fmode = 0o170000 | stat.S_IFREG
+            fmode = stat.S_IFREG
         elif file_opener.isdir(urlpath):
-            fmode = 0o170000 | stat.S_IFDIR
+            fmode = stat.S_IFDIR
         else:
             # No such file or directory.
             return -1
         size = file_opener.size(urlpath)
         mtime = file_opener.mtime(urlpath)
-    except (FileNotFoundError, KeyError):
+    except (FileNotFoundError, KeyError) as err:
         # No such file or directory.
         return -1
     except Exception as err:
@@ -113,17 +91,64 @@ cdef int pyopener_stat(
     return 0
 
 
+cdef int pyopener_unlink(
+    void *pUserData,
+    const char *pszFilename,
+) with gil:
+    """Unlink a file from a Python filesystem."""
+    cdef FSData *fsdata = <FSData *>pUserData
+    path = fsdata.path.decode("utf-8")
+    uuid = fsdata.uuid.decode("utf-8")
+    key = (Path(path), uuid)
+    urlpath = pszFilename.decode("utf-8")
+
+    registry = _OPENER_REGISTRY.get()
+    log.debug(
+        "Looking up opener in pyopener_unlink: urlpath=%r, registry=%r, key=%r",
+        urlpath,
+        registry,
+        key
+    )
+
+    try:
+        file_opener = registry[key]
+    except KeyError as err:
+        errmsg = f"Opener not found: {repr(err)}".encode("utf-8")
+        CPLError(CE_Failure, <CPLErrorNum>4, <const char *>"%s", <const char *>errmsg)
+        return -1
+
+    try:
+        file_opener.rm(urlpath)
+        return 0
+    except (FileNotFoundError, KeyError) as err:
+        # No such file or directory.
+        return -1
+    except Exception as err:
+        errmsg = f"Opener failed to remove file: {repr(err)}".encode("utf-8")
+        CPLError(CE_Failure, <CPLErrorNum>4, <const char *>"%s", <const char *>errmsg)
+        return -1
+
+
 cdef char ** pyopener_read_dir(
     void *pUserData,
     const char *pszDirname,
     int nMaxFiles
 ) with gil:
     """Provides a directory listing to GDAL from a Python filesystem."""
+    cdef FSData *fsdata = <FSData *>pUserData
+    path = fsdata.path.decode("utf-8")
+    uuid = fsdata.uuid.decode("utf-8")
+    key = (Path(path), uuid)
     urlpath = pszDirname.decode("utf-8")
-    key = Path(urlpath)
 
     registry = _OPENER_REGISTRY.get()
-    log.debug("Looking up opener in pyopener_read_dir: registry=%r, key=%r", registry, key)
+    log.debug(
+        "Looking up opener in pyopener_read_dir: urlpath=%r, registry=%r, key=%r",
+        urlpath,
+        registry,
+        key
+    )
+
     try:
         file_opener = registry[key]
     except KeyError as err:
@@ -134,8 +159,7 @@ cdef char ** pyopener_read_dir(
     try:
         # GDAL wants relative file names.
         contents = [Path(item).name for item in file_opener.ls(urlpath)]
-        log.debug("Looking for dir contents: urlpath=%r, contents=%r", urlpath, contents)
-    except (FileNotFoundError, KeyError):
+    except (FileNotFoundError, KeyError) as err:
         # No such file or directory.
         return NULL
     except Exception as err:
@@ -163,12 +187,24 @@ cdef void* pyopener_open(
     GDAL may call this function multiple times per filename and each
     result must be separately seekable.
     """
+    cdef FSData *fsdata = <FSData *>pUserData
+    path = fsdata.path.decode("utf-8")
+    uuid = fsdata.uuid.decode("utf-8")
+    key = (Path(path), uuid)
     urlpath = pszFilename.decode("utf-8")
+
     mode = pszAccess.decode("utf-8")
-    key = Path(urlpath).parent
+    if not "b" in mode:
+        mode += "b"
 
     registry = _OPENER_REGISTRY.get()
-    log.debug("Looking up opener in pyopener_open: registry=%r, key=%r", registry, key)
+    log.debug(
+        "Looking up opener in pyopener_open: urlpath=%r, registry=%r, key=%r",
+        urlpath,
+        registry,
+        key
+    )
+
     try:
         file_opener = registry[key]
     except KeyError as err:
@@ -199,7 +235,6 @@ cdef void* pyopener_open(
     try:
         file_obj = stack.enter_context(file_obj)
     except (AttributeError, TypeError) as err:
-        log.error("File object is not a context manager: file_obj=%r", file_obj)
         errmsg = f"Opener failed to open file with arguments ({repr(urlpath)}, {repr(mode)}): {repr(err)}".encode("utf-8")
         CPLError(CE_Failure, <CPLErrorNum>4, <const char *>"%s", <const char *>errmsg)
         return NULL
@@ -207,10 +242,9 @@ cdef void* pyopener_open(
         errmsg = "OpenFile didn't resolve".encode("utf-8")
         return NULL
     else:
-        exit_stacks = _OPEN_FILE_EXIT_STACKS.get()
+        exit_stacks = _OPEN_FILE_EXIT_STACKS.get({})
         exit_stacks[file_obj] = stack
         _OPEN_FILE_EXIT_STACKS.set(exit_stacks)
-        log.debug("Returning: file_obj=%r", file_obj)
         return <void *>file_obj
 
 
@@ -222,6 +256,7 @@ cdef int pyopener_eof(void *pFile) with gil:
     else:
         return 0
 
+
 cdef vsi_l_offset pyopener_tell(void *pFile) with gil:
     cdef object file_obj = <object>pFile
     return <vsi_l_offset>file_obj.tell()
@@ -249,7 +284,11 @@ cdef size_t pyopener_write(void *pFile, void *pBuffer, size_t nSize, size_t nCou
     cdef object file_obj = <object>pFile
     buffer_len = nSize * nCount
     cdef unsigned char [:] buff_view = <unsigned char[:buffer_len]>pBuffer
-    log.debug("Writing data: file_obj=%r, buff_view=%r, buffer_len=%r", file_obj, buff_view, buffer_len)
+    log.debug(
+        "Writing data: file_obj=%r, buff_view=%r, buffer_len=%r",
+        file_obj,
+        buff_view,
+        buffer_len)
     try:
         num = file_obj.write(buff_view)
     except TypeError:
@@ -279,32 +318,86 @@ cdef int pyopener_close(void *pFile) with gil:
 
 @contextlib.contextmanager
 def _opener_registration(urlpath, obj):
-    key = Path(urlpath).parent
+    cdef char **registered_prefixes = NULL
+    cdef int prefix_index = 0
+    cdef VSIFilesystemPluginCallbacksStruct *callbacks_struct = NULL
+    cdef FSData fsdata
+    cdef char *path_c = NULL
+    cdef char *uuid_c = NULL
+
+    # To resolve issue 1406 we add the opener or filesystem id to the
+    # registry key.
+    kpath = Path(urlpath).parent
+    kid = uuid4().hex
+    key = (kpath, kid)
+
+    path_b = kpath.as_posix().encode("utf-8")
+    path_c = path_b
+    uuid_b = kid.encode("utf-8")
+    uuid_c = uuid_b
+
+    fsdata = FSData(path_c, uuid_c)
+
+    namespace = f"{VSI_NS_ROOT}_{kid}"
+    cdef bytes prefix_bytes = f"/{namespace}/".encode("utf-8")
 
     # Might raise.
     opener = _create_opener(obj)
 
-    registry = _OPENER_REGISTRY.get()
+    registry = _OPENER_REGISTRY.get({})
+
     if key in registry:
         if registry[key] != opener:
             raise OpenerRegistrationError(f"Opener already registered for urlpath.")
         else:
             try:
-                yield f"{PREFIX}{urlpath}"
+                yield f"/{namespace}/{urlpath}"
             finally:
                 registry = _OPENER_REGISTRY.get()
                 _ = registry.pop(key, None)
                 _OPENER_REGISTRY.set(registry)
+
     else:
+        # Install handler.
+        registered_prefixes = VSIGetFileSystemsPrefixes()
+        prefix_index = CSLFindString(<CSLConstList>registered_prefixes, prefix_bytes)
+        CSLDestroy(registered_prefixes)
+
+        if prefix_index < 0:
+            log.debug("Installing Python opener handler plugin: prefix_bytes=%r", prefix_bytes)
+            callbacks_struct = VSIAllocFilesystemPluginCallbacksStruct()
+            callbacks_struct.open = <VSIFilesystemPluginOpenCallback>pyopener_open
+            callbacks_struct.eof = <VSIFilesystemPluginEofCallback>pyopener_eof
+            callbacks_struct.tell = <VSIFilesystemPluginTellCallback>pyopener_tell
+            callbacks_struct.seek = <VSIFilesystemPluginSeekCallback>pyopener_seek
+            callbacks_struct.read = <VSIFilesystemPluginReadCallback>pyopener_read
+            callbacks_struct.write = <VSIFilesystemPluginWriteCallback>pyopener_write
+            callbacks_struct.flush = <VSIFilesystemPluginFlushCallback>pyopener_flush
+            callbacks_struct.close = <VSIFilesystemPluginCloseCallback>pyopener_close
+            callbacks_struct.read_dir = <VSIFilesystemPluginReadDirCallback>pyopener_read_dir
+            callbacks_struct.stat = <VSIFilesystemPluginStatCallback>pyopener_stat
+            callbacks_struct.unlink = <VSIFilesystemPluginUnlinkCallback>pyopener_unlink
+            callbacks_struct.pUserData = &fsdata
+            retval = VSIInstallPluginHandler(prefix_bytes, callbacks_struct)
+            VSIFreeFilesystemPluginCallbacksStruct(callbacks_struct)
+
+        registered_prefixes = VSIGetFileSystemsPrefixes()
+        prefix_index = CSLFindString(<CSLConstList>registered_prefixes, prefix_bytes)
+        CSLDestroy(registered_prefixes)
+
         registry[key] = opener
         _OPENER_REGISTRY.set(registry)
+
         try:
-            yield f"{PREFIX}{urlpath}"
+            yield f"/{namespace}/{urlpath}"
         finally:
             registry = _OPENER_REGISTRY.get()
             _ = registry.pop(key, None)
             _OPENER_REGISTRY.set(registry)
 
+            IF (CTE_GDAL_MAJOR_VERSION, CTE_GDAL_MINOR_VERSION) >= (3, 9):
+                retval = VSIRemovePluginHandler(prefix_bytes)
+
 
 class _AbstractOpener:
     """Adapts a Python object to the opener interface."""
@@ -381,6 +474,19 @@ class _AbstractOpener:
             Modification timestamp in seconds.
         """
         raise NotImplementedError
+    def rm(self, path):
+        """Remove a resource.
+
+        Parameters
+        ----------
+        path : str
+            The identifier/locator for a resource within a filesystem.
+
+        Returns
+        -------
+        None
+        """
+        raise NotImplementedError
     def size(self, path):
         """Get the size, in bytes, of a resource..
 
@@ -427,14 +533,16 @@ class _FilesystemOpener(_AbstractOpener):
     def isdir(self, path):
         return self._obj.isdir(path)
     def ls(self, path):
-        return self._obj.ls(path)
+        # return value of ls() varies between file and zip fsspec filesystems.
+        return [item if isinstance(item, str) else item["filename"] for item in self._obj.ls(path)]
     def mtime(self, path):
         try:
             mtime = int(self._obj.modified(path).timestamp())
         except NotImplementedError:
             mtime = 0
-        log.debug("Modification time: mtime=%r", mtime)
         return mtime
+    def rm(self, path):
+        return self._obj.rm(path)
     def size(self, path):
         return self._obj.size(path)
 
@@ -447,6 +555,8 @@ class _AltFilesystemOpener(_FilesystemOpener):
         return self._obj.is_dir(path)
     def mtime(self, path):
         return 0
+    def rm(self, path):
+        self._obj.remove_file(path)
     def size(self, path):
         return self._obj.file_size(path)
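`_FilesystemOpener.ls()` above normalizes fsspec listings, which may be plain strings (e.g. local file filesystems) or dicts keyed by `"filename"` (e.g. zip filesystems). A standalone sketch of that normalization, with a hypothetical helper name:

```python
# Normalize mixed fsspec-style listings to a flat list of names,
# mirroring the list comprehension added in _FilesystemOpener.ls().
def normalize_listing(entries):
    return [e if isinstance(e, str) else e["filename"] for e in entries]


assert normalize_listing(["a.shp", "a.dbf"]) == ["a.shp", "a.dbf"]
assert normalize_listing([{"filename": "a.shp", "size": 10}]) == ["a.shp"]
```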
 


=====================================
fiona/drvsupport.py
=====================================
@@ -103,7 +103,7 @@ supported_drivers = dict(
         # multi-layer
         #   ("OpenAir", "r"),
         # (Geo)Parquet
-        ("Parquet", "raw"),
+        ("Parquet", "rw"),
         # PCI Geomatics Database File 	PCIDSK 	No 	No 	Yes, using internal PCIDSK SDK (from GDAL 1.7.0)
         ("PCIDSK", "raw"),
         # PDS 	PDS 	No 	Yes 	Yes


=====================================
fiona/gdal.pxi
=====================================
@@ -16,18 +16,22 @@ cdef extern from "cpl_conv.h":
     const char *CPLFindFile(const char *pszClass, const char *pszBasename)
 
 
+cdef extern from "cpl_port.h":
+    ctypedef char **CSLConstList
+
+
 cdef extern from "cpl_string.h":
-    char ** CSLAddNameValue (char **list, const char *name, const char *value)
-    char ** CSLSetNameValue (char **list, const char *name, const char *value)
-    void CSLDestroy (char **list)
+    char ** CSLAddNameValue(char **list, const char *name, const char *value)
+    char ** CSLSetNameValue(char **list, const char *name, const char *value)
+    void CSLDestroy(char **list)
     char ** CSLAddString(char **list, const char *string)
-    int CSLCount(char **papszStrList)
-    char **CSLDuplicate(char **papszStrList)
-    int CSLFindName(char **papszStrList, const char *pszName)
-    int CSLFindString(char **papszStrList, const char *pszString)
-    int CSLFetchBoolean(char **papszStrList, const char *pszName, int default)
-    const char *CSLFetchNameValue(char **papszStrList, const char *pszName)
-    char **CSLMerge(char **first, char **second)
+    int CSLCount(CSLConstList papszStrList)
+    char **CSLDuplicate(CSLConstList papszStrList)
+    int CSLFindName(CSLConstList papszStrList, const char *pszName)
+    int CSLFindString(CSLConstList papszStrList, const char *pszString)
+    int CSLFetchBoolean(CSLConstList papszStrList, const char *pszName, int default)
+    const char *CSLFetchNameValue(CSLConstList papszStrList, const char *pszName)
+    char **CSLMerge(char **first, CSLConstList second)
 
 
 cdef extern from "cpl_error.h" nogil:
@@ -47,7 +51,9 @@ cdef extern from "cpl_error.h" nogil:
     const char* CPLGetLastErrorMsg()
     CPLErr CPLGetLastErrorType()
     void CPLPushErrorHandler(CPLErrorHandler handler)
+    void CPLPushErrorHandlerEx(CPLErrorHandler handler, void *userdata)
     void CPLPopErrorHandler()
+    void CPLQuietErrorHandler(CPLErr eErrClass, CPLErrorNum nError, const char *pszErrorMsg)
 
 
 cdef extern from "cpl_vsi.h" nogil:
@@ -137,6 +143,11 @@ cdef extern from "cpl_vsi.h" nogil:
     int VSI_ISDIR(int mode)
 
 
+IF (CTE_GDAL_MAJOR_VERSION, CTE_GDAL_MINOR_VERSION) >= (3, 9):
+    cdef extern from "cpl_vsi.h" nogil:
+        int VSIRemovePluginHandler(const char*)
+
+
 cdef extern from "ogr_core.h" nogil:
     ctypedef int OGRErr
     char *OGRGeometryTypeToName(int type)
@@ -297,7 +308,7 @@ cdef extern from "ogr_srs_api.h" nogil:
     OGRErr OSRExportToPROJJSON(OGRSpatialReferenceH hSRS,
                                 char ** ppszReturn,
                                 const char* const* papszOptions)
-
+    void OSRGetPROJVersion(int *pnMajor, int *pnMinor, int *pnPatch)
 
 cdef extern from "gdal.h" nogil:
 


=====================================
fiona/model.py
=====================================
@@ -5,10 +5,15 @@ from collections.abc import MutableMapping
 from enum import Enum
 import itertools
 from json import JSONEncoder
+import reprlib
 from warnings import warn
 
 from fiona.errors import FionaDeprecationWarning
 
+_model_repr = reprlib.Repr()
+_model_repr.maxlist = 1
+_model_repr.maxdict = 5
+
 
 class OGRGeometryType(Enum):
     Unknown = 0
@@ -134,7 +139,10 @@ class Object(MutableMapping):
         }
 
     def __getitem__(self, item):
-        props = self._props()
+        props = {
+            k: (dict(v) if isinstance(v, Object) else v)
+            for k, v in self._props().items()
+        }
         props.update(**self._data)
         return props[item]
 
@@ -146,6 +154,13 @@ class Object(MutableMapping):
         props = self._props()
         return len(props) + len(self._data)
 
+    def __repr__(self):
+        kvs = [
+            f"{k}={v!r}"
+            for k, v in itertools.chain(self._props().items(), self._data.items())
+        ]
+        return "fiona.{}({})".format(self.__class__.__name__, ", ".join(kvs))
+
     def __setitem__(self, key, value):
         warn(
             "instances of this class -- CRS, geometry, and feature objects -- will become immutable in fiona version 2.0",
@@ -197,6 +212,10 @@ class Geometry(Object):
         )
         super().__init__(**data)
 
+    def __repr__(self):
+        kvs = [f"{k}={_model_repr.repr(v)}" for k, v in self.items() if v is not None]
+        return "fiona.Geometry({})".format(", ".join(kvs))
+
     @classmethod
     def from_dict(cls, ob=None, **kwargs):
         if ob is not None:
@@ -384,7 +403,10 @@ class ObjectEncoder(JSONEncoder):
 
     def default(self, o):
         if isinstance(o, Object):
-            o_dict = {k: self.default(v) for k, v in o.items()}
+            o_dict = {
+                k: self.default(v)
+                for k, v in itertools.chain(o._props().items(), o._data.items())
+            }
             if isinstance(o, Geometry):
                 if o.type == "GeometryCollection":
                     _ = o_dict.pop("coordinates", None)
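`fiona/model.py` above configures a `reprlib.Repr` instance with small `maxlist`/`maxdict` limits so that `Geometry.__repr__` stays short even for large coordinate arrays. A quick demonstration of what those limits do:

```python
# reprlib truncates containers past the configured limits, appending
# "..." in place of the omitted items.
import reprlib

r = reprlib.Repr()
r.maxlist = 1  # show at most one list element, as model.py does

coords = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
assert r.repr(coords) == "[(0.0, 0.0), ...]"
```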


=====================================
fiona/ogrext.pyx
=====================================
@@ -18,11 +18,12 @@ from fiona._geometry cimport (
     GeomBuilder, OGRGeomBuilder, geometry_type_code,
     normalize_geometry_type_code, base_geometry_type_code)
 from fiona._err cimport exc_wrap_int, exc_wrap_pointer, exc_wrap_vsilfile, get_last_error_msg
+from fiona._err cimport StackChecker
 
 import fiona
 from fiona._env import get_gdal_version_num, calc_gdal_version_num, get_gdal_version_tuple
 from fiona._err import (
-    cpl_errs, FionaNullPointerError, CPLE_BaseError, CPLE_AppDefinedError,
+    cpl_errs, stack_errors, FionaNullPointerError, CPLE_BaseError, CPLE_AppDefinedError,
     CPLE_OpenFailedError)
 from fiona._geometry import GEOMETRY_TYPES
 from fiona import compat
@@ -92,6 +93,10 @@ cdef void* gdal_open_vector(const char* path_c, int mode, drivers, options) exce
     cdef char **drvs = NULL
     cdef void* drv = NULL
     cdef char **open_opts = NULL
+    cdef char **registered_prefixes = NULL
+    cdef int prefix_index = 0
+    cdef VSIFilesystemPluginCallbacksStruct *callbacks_struct = NULL
+    cdef StackChecker checker
 
     flags = GDAL_OF_VECTOR | GDAL_OF_VERBOSE_ERROR
     if mode == 1:
@@ -122,15 +127,13 @@ cdef void* gdal_open_vector(const char* path_c, int mode, drivers, options) exce
     open_opts = CSLAddNameValue(open_opts, "VALIDATE_OPEN_OPTIONS", "NO")
 
     try:
-        cogr_ds = exc_wrap_pointer(
-            GDALOpenEx(path_c, flags, <const char *const *>drvs, <const char *const *>open_opts, NULL)
-        )
-        return cogr_ds
-    except FionaNullPointerError:
-        raise DriverError(
-            f"Failed to open dataset (mode={mode}): {path_c.decode('utf-8')}")
+        with stack_errors() as checker:
+            cogr_ds = GDALOpenEx(
+                path_c, flags, <const char *const *>drvs, <const char *const *>open_opts, NULL
+            )
+            return checker.exc_wrap_pointer(cogr_ds)
     except CPLE_BaseError as exc:
-        raise DriverError(str(exc))
+        raise DriverError(f"Failed to open dataset (flags={flags}): {path_c.decode('utf-8')}") from exc
     finally:
         CSLDestroy(drvs)
         CSLDestroy(open_opts)
@@ -149,9 +152,7 @@ cdef void* gdal_create(void* cogr_driver, const char *path_c, options) except NU
     creation_option_keys = option_keys & set(meta.dataset_creation_options(db.decode("utf-8")))
 
     for k, v in options.items():
-
         if k.upper() in creation_option_keys:
-
             kb = k.upper().encode('utf-8')
 
             if isinstance(v, bool):
@@ -171,7 +172,6 @@ cdef void* gdal_create(void* cogr_driver, const char *path_c, options) except NU
         CSLDestroy(creation_opts)
 
 
-
 def _explode(coords):
     """Explode a GeoJSON geometry's coordinates object and yield
     coordinate tuples. As long as the input is conforming, the type of
@@ -193,6 +193,7 @@ def _bounds(geometry):
     except (KeyError, TypeError):
         return None
 
+
 cdef int GDAL_VERSION_NUM = get_gdal_version_num()
 
 
@@ -376,7 +377,7 @@ cdef class StringListField(AbstractField):
         for item in value:
             item_b = item.encode(encoding)
             string_list = CSLAddString(string_list, <const char *>item_b)
-        OGR_F_SetFieldStringList(feature, i, <const char **>string_list)
+        OGR_F_SetFieldStringList(feature, i, <CSLConstList>string_list)
 
 
 cdef class JSONField(AbstractField):
@@ -1264,7 +1265,7 @@ cdef class Session:
 
         cdef char **metadata = NULL
         metadata = GDALGetMetadata(obj, domain)
-        num_items = CSLCount(metadata)
+        num_items = CSLCount(<CSLConstList>metadata)
 
         return dict(metadata[i].decode('utf-8').split('=', 1) for i in range(num_items))
 
@@ -2126,10 +2127,8 @@ def _remove_layer(path, layer, driver=None):
 
 
 def _listlayers(path, **kwargs):
-
     """Provides a list of the layers in an OGR data source.
     """
-
     cdef void *cogr_ds = NULL
     cdef void *cogr_layer = NULL
     cdef const char *path_c
@@ -2175,7 +2174,7 @@ def _listdir(path):
         raise FionaValueError(f"Path '{path}' is not a directory.")
 
     papszFiles = VSIReadDir(path_c)
-    n = CSLCount(papszFiles)
+    n = CSLCount(<CSLConstList>papszFiles)
     files = []
     for i in range(n):
         files.append(papszFiles[i].decode("utf-8"))


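The `gdal_open_vector` change above swaps a bare null-pointer check for a `stack_errors()` context manager whose `StackChecker` turns a NULL return from `GDALOpenEx` into the most relevant stacked GDAL error. A rough pure-Python analogue of that control flow (all names here are illustrative, not fiona's actual Cython API):

```python
from contextlib import contextmanager


class StackChecker:
    """Collects errors reported during a call; raises on a null result."""

    def __init__(self):
        self.errors = []

    def push(self, msg):
        self.errors.append(msg)

    def exc_wrap_pointer(self, ptr):
        # Treat None as GDAL's NULL: raise the most recent stacked error.
        if ptr is None:
            raise RuntimeError(self.errors[-1] if self.errors else "NULL pointer")
        return ptr


@contextmanager
def stack_errors():
    yield StackChecker()


# Usage: a failing "open" surfaces the stacked message instead of a
# generic null-pointer error.
try:
    with stack_errors() as checker:
        checker.push("Open failed: no such file")
        checker.exc_wrap_pointer(None)
except RuntimeError as exc:
    print(exc)  # Open failed: no such file
```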
=====================================
setup.py
=====================================
@@ -83,7 +83,7 @@ if 'clean' not in sys.argv:
                          " setup.py to locate needed GDAL files.\nMore"
                          " information is available in the README.")
         else:
-            logging.warn("Failed to get options via gdal-config: %s", str(e))
+            logging.warning("Failed to get options via gdal-config: %s", str(e))
 
     # Get GDAL API version from environment variable.
     if 'GDAL_VERSION' in os.environ:


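The setup.py fix above replaces `logging.warn`, a long-deprecated alias that emits a `DeprecationWarning`, with the supported `logging.warning`. A quick check of the replacement spelling (logger name and message are illustrative):

```python
import io
import logging

# Capture log output on an in-memory stream to show the supported call.
stream = io.StringIO()
logger = logging.getLogger("setup-sketch")
logger.addHandler(logging.StreamHandler(stream))

logger.warning("Failed to get options via gdal-config: %s", "gdal-config not found")
print(stream.getvalue().strip())
```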
=====================================
tests/test_bounds.py
=====================================
@@ -69,7 +69,7 @@ def test_bounds(tmpdir, driver, testdata_generator):
             ys.append(r.geometry["coordinates"][1])
         return min(xs), max(xs), min(ys), max(ys)
 
-    with fiona.open(path, "w", driver=driver, schema=schema) as c:
+    with fiona.open(path, "w", crs="OGC:CRS84", driver=driver, schema=schema) as c:
         c.writerecords(records1)
 
         try:


=====================================
tests/test_memoryfile.py
=====================================
@@ -217,7 +217,7 @@ def test_mapinfo_raises():
         for driver in supported_drivers
         if _driver_supports_mode(driver, "w")
         and supports_vsi(driver)
-        and driver not in {"MapInfo File"}
+        and driver not in {"MapInfo File", "TileDB"}
     ],
 )
 def test_write_memoryfile_drivers(driver, testdata_generator):
@@ -226,7 +226,7 @@ def test_write_memoryfile_drivers(driver, testdata_generator):
     schema, crs, records1, _, _ = testdata_generator(driver, range1, [])
 
     with MemoryFile() as memfile:
-        with memfile.open(driver=driver, schema=schema) as c:
+        with memfile.open(driver=driver, crs="OGC:CRS84", schema=schema) as c:
             c.writerecords(records1)
 
         with memfile.open(driver=driver) as c:
@@ -267,7 +267,7 @@ def test_multiple_layer_memoryfile(testdata_generator):
         for driver in supported_drivers
         if _driver_supports_mode(driver, "a")
         and supports_vsi(driver)
-        and driver not in {"MapInfo File"}
+        and driver not in {"MapInfo File", "TileDB"}
     ],
 )
 def test_append_memoryfile_drivers(driver, testdata_generator):
@@ -277,16 +277,23 @@ def test_append_memoryfile_drivers(driver, testdata_generator):
     schema, crs, records1, records2, _ = testdata_generator(driver, range1, range2)
 
     with MemoryFile() as memfile:
-        with memfile.open(driver=driver, schema=schema) as c:
+        with memfile.open(driver=driver, crs="OGC:CRS84", schema=schema) as c:
             c.writerecords(records1)
 
-        with memfile.open(mode='a', driver=driver, schema=schema) as c:
-            c.writerecords(records2)
-
-        with memfile.open(driver=driver) as c:
-            assert driver == c.driver
-            items = list(c)
-            assert len(items) == len(range1 + range2)
+        # The parquet dataset does not seem to support append mode
+        if driver == "Parquet":
+            with memfile.open(driver=driver) as c:
+                assert driver == c.driver
+                items = list(c)
+                assert len(items) == len(range1)
+        else:
+            with memfile.open(mode='a', driver=driver, schema=schema) as c:
+                c.writerecords(records2)
+
+            with memfile.open(driver=driver) as c:
+                assert driver == c.driver
+                items = list(c)
+                assert len(items) == len(range1 + range2)
 
 
 def test_memoryfile_driver_does_not_support_vsi():


=====================================
tests/test_model.py
=====================================
@@ -333,3 +333,12 @@ def test_geometry_collection_encoding():
     assert "coordinates" not in ObjectEncoder().default(
         Geometry(type="GeometryCollection", geometries=[])
     )
+
+
+def test_feature_repr():
+    feat = Feature(
+        id="1",
+        geometry=Geometry(type="LineString", coordinates=[(0, 0)] * 100),
+        properties=Properties(a=1, foo="bar"),
+    )
+    assert repr(feat) == "fiona.Feature(geometry=fiona.Geometry(coordinates=[(0, 0), ...], type='LineString'), id='1', properties=fiona.Properties(a=1, foo='bar'))"


=====================================
tests/test_pyopener.py
=====================================
@@ -1,6 +1,7 @@
 """Tests of the Python opener VSI plugin."""
 
 import io
+import os
 
 import fsspec
 import pytest
@@ -78,3 +79,85 @@ def test_opener_fsspec_fs_write(tmp_path):
         collection.write(feature)
         assert len(collection) == 1
         assert collection.crs == "OGC:CRS84"
+
+
+def test_threads_context():
+    import io
+    from threading import Thread
+
+
+    def target():
+        with fiona.open("tests/data/coutwildrnp.shp", opener=io.open) as colxn:
+            print(colxn.profile)
+            assert len(colxn) == 67
+
+
+    thread = Thread(target=target)
+    thread.start()
+    thread.join()
+
+
+def test_overwrite(data):
+    """Opener can overwrite data."""
+    schema = {"geometry": "Point", "properties": {"zero": "int"}}
+    feature = Feature.from_dict(
+        **{
+            "geometry": {"type": "Point", "coordinates": (0, 0)},
+            "properties": {"zero": "0"},
+        }
+    )
+    fs = fsspec.filesystem("file")
+    outputfile = os.path.join(str(data), "coutwildrnp.shp")
+
+    with fiona.open(
+        str(outputfile),
+        "w",
+        driver="ESRI Shapefile",
+        schema=schema,
+        crs="OGC:CRS84",
+        opener=fs,
+    ) as collection:
+        collection.write(feature)
+        assert len(collection) == 1
+        assert collection.crs == "OGC:CRS84"
+
+
+def test_opener_fsspec_zip_fs_listlayers():
+    """Use fsspec zip filesystem as opener for listlayers()."""
+    fs = fsspec.filesystem("zip", fo="tests/data/coutwildrnp.zip")
+    assert fiona.listlayers("coutwildrnp.shp", opener=fs) == ["coutwildrnp"]
+
+
+def test_opener_fsspec_zip_fs_listdir():
+    """Use fsspec zip filesystem as opener for listdir()."""
+    fs = fsspec.filesystem("zip", fo="tests/data/coutwildrnp.zip")
+    listing = fiona.listdir("/", opener=fs)
+    assert len(listing) == 4
+    assert set(
+        ["coutwildrnp.shp", "coutwildrnp.dbf", "coutwildrnp.shx", "coutwildrnp.prj"]
+    ) & set(listing)
+
+
+
+def test_opener_fsspec_file_fs_listdir():
+    """Use fsspec file filesystem as opener for listdir()."""
+    fs = fsspec.filesystem("file")
+    listing = fiona.listdir("tests/data", opener=fs)
+    assert len(listing) >= 35
+    assert set(
+        ["coutwildrnp.shp", "coutwildrnp.dbf", "coutwildrnp.shx", "coutwildrnp.prj"]
+    ) & set(listing)
+
+
+def test_opener_fsspec_file_remove(data):
+    """Opener can remove data."""
+    fs = fsspec.filesystem("file")
+    listing = fiona.listdir(str(data), opener=fs)
+    assert len(listing) == 4
+    outputfile = os.path.join(str(data), "coutwildrnp.shp")
+    fiona.remove(outputfile)
+    listing = fiona.listdir(str(data), opener=fs)
+    assert len(listing) == 0
+    assert not set(
+        ["coutwildrnp.shp", "coutwildrnp.dbf", "coutwildrnp.shx", "coutwildrnp.prj"]
+    ) & set(listing)


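The zip-filesystem `listdir` tests above check that the shapefile and its sidecars (.dbf, .shx, .prj) are visible inside the archive. The same listing can be mimicked with the standard library's `zipfile`; the archive built here is fabricated for illustration, standing in for tests/data/coutwildrnp.zip:

```python
import io
import zipfile

# Build a small in-memory zip resembling tests/data/coutwildrnp.zip.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    for name in ("coutwildrnp.shp", "coutwildrnp.dbf",
                 "coutwildrnp.shx", "coutwildrnp.prj"):
        zf.writestr(name, b"")

with zipfile.ZipFile(buf) as zf:
    listing = zf.namelist()

print(sorted(listing))
```

fiona's fsspec-based opener does the equivalent through `fsspec.filesystem("zip", fo=...)`, which is what the tests exercise.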
=====================================
tests/test_slice.py
=====================================
@@ -113,7 +113,9 @@ def slice_dataset_path(request):
     tmpdir = tempfile.mkdtemp()
     path = os.path.join(tmpdir, get_temp_filename(driver))
 
-    with fiona.open(path, "w", driver=driver, schema=schema, **create_kwargs) as c:
+    with fiona.open(
+        path, "w", driver=driver, crs="OGC:CRS84", schema=schema, **create_kwargs
+    ) as c:
         c.writerecords(records)
     yield path
     shutil.rmtree(tmpdir)


=====================================
tests/test_topojson.py
=====================================
@@ -32,6 +32,6 @@ def test_read_topojson(data_dir):
 
     assert len(features) == 3, "unexpected number of features"
     for feature in features:
-        assert isinstance(feature["properties"], Properties)
-        assert len(feature["properties"]) > 0
-        assert feature["geometry"]["type"] in {"Point", "LineString", "Polygon"}
+        assert isinstance(feature.properties, Properties)
+        assert len(feature.properties) > 0
+        assert feature.geometry.type in {"Point", "LineString", "Polygon"}



View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/-/compare/1f6ab8484ef57fa437db4d07ef7683109de27a10...0f051d18d36058380e4a6671b57f19cbad15e705
