[Git][debian-gis-team/pooch][master] 7 commits: Use <!nocheck> marker

Antonio Valentino (@antonio.valentino) gitlab@salsa.debian.org
Sat Oct 28 10:32:25 BST 2023



Antonio Valentino pushed to branch master at Debian GIS Project / pooch


Commits:
ddef3e9f by Antonio Valentino at 2023-10-28T09:16:42+00:00
Use <!nocheck> marker

- - - - -
fae60771 by Antonio Valentino at 2023-10-28T09:19:33+00:00
Switch to autopkgtest-pkg-pybuild

- - - - -
6a487c4a by Antonio Valentino at 2023-10-28T09:21:23+00:00
Update debian/python3-pooch.lintian-overrides

- - - - -
dbfc27cd by Antonio Valentino at 2023-10-28T09:21:50+00:00
New upstream version 1.8.0

- - - - -
0aa030dd by Antonio Valentino at 2023-10-28T09:21:50+00:00
Update upstream source from tag 'upstream/1.8.0'

Update to upstream version '1.8.0'
with Debian dir 75609a0831ca70f5df47267c6e03cf329f01e147
- - - - -
648bce14 by Antonio Valentino at 2023-10-28T09:30:26+00:00
New upstream release

- - - - -
37cdbaa5 by Antonio Valentino at 2023-10-28T09:30:36+00:00
Add build dependency on python3-pytest-httpserver

- - - - -


16 changed files:

- .github/workflows/docs.yml
- .github/workflows/pypi.yml
- .github/workflows/style.yml
- .github/workflows/test.yml
- README.md
- debian/changelog
- debian/control
- debian/python3-pooch.lintian-overrides
- doc/changes.rst
- doc/versions.rst
- env/requirements-test.txt
- environment.yml
- pooch/core.py
- pooch/downloaders.py
- pooch/tests/test_downloaders.py
- setup.cfg


Changes:

=====================================
.github/workflows/docs.yml
=====================================
@@ -42,7 +42,7 @@ jobs:
 
       # Checks-out your repository under $GITHUB_WORKSPACE
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
         with:
           # Need to fetch more than the last commit so that setuptools-scm can
           # create the correct version string. If the number of commits since
@@ -58,9 +58,9 @@ jobs:
         run: git fetch origin 'refs/tags/*:refs/tags/*'
 
       - name: Setup Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
         with:
-          python-version: "3.10"
+          python-version: "3.x"
 
       - name: Collect requirements
         run: |
@@ -86,7 +86,7 @@ jobs:
           echo "::set-output name=dir::$(pip cache dir)"
 
       - name: Setup caching for pip packages
-        uses: actions/cache@v2
+        uses: actions/cache@v3
         with:
           path: ${{ steps.pip-cache.outputs.dir }}
           key: ${{ runner.os }}-pip-${{ hashFiles('requirements-full.txt') }}
@@ -113,7 +113,7 @@ jobs:
 
       # Store the docs as a build artifact so we can deploy it later
       - name: Upload HTML documentation as an artifact
-        uses: actions/upload-artifact@v2
+        uses: actions/upload-artifact@v3
         with:
           name: docs-${{ github.sha }}
           path: doc/_build/html
@@ -127,11 +127,11 @@ jobs:
 
     steps:
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
 
       # Fetch the built docs from the "build" job
       - name: Download HTML documentation artifact
-        uses: actions/download-artifact@v2
+        uses: actions/download-artifact@v3
         with:
           name: docs-${{ github.sha }}
           path: doc/_build/html


=====================================
.github/workflows/pypi.yml
=====================================
@@ -29,7 +29,7 @@ jobs:
     steps:
       # Checks-out your repository under $GITHUB_WORKSPACE
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
         with:
           # Need to fetch more than the last commit so that setuptools_scm can
           # create the correct version string. If the number of commits since
@@ -45,9 +45,9 @@ jobs:
         run: git fetch origin 'refs/tags/*:refs/tags/*'
 
       - name: Setup Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
         with:
-          python-version: "3.10"
+          python-version: "3.x"
 
       - name: Install requirements
         run: |
@@ -79,7 +79,7 @@ jobs:
       - name: Upload archives as artifacts
         # Only if not a pull request
         if: success() && github.event_name != 'pull_request'
-        uses: actions/upload-artifact@v2
+        uses: actions/upload-artifact@v3
         with:
           name: pypi-${{ github.sha }}
           path: dist
@@ -94,7 +94,7 @@ jobs:
 
     steps:
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
         with:
           # The GitHub token is preserved by default but this job doesn't need
           # to be able to push to GitHub.
@@ -102,7 +102,7 @@ jobs:
 
       # Fetch the built archives from the "build" job
       - name: Download built archives artifact
-        uses: actions/download-artifact@v2
+        uses: actions/download-artifact@v3
         with:
           name: pypi-${{ github.sha }}
           path: dist


=====================================
.github/workflows/style.yml
=====================================
@@ -20,12 +20,12 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
         with:
           persist-credentials: false
 
       - name: Setup Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
         with:
           python-version: "3.10"
 
@@ -42,12 +42,12 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
         with:
           persist-credentials: false
 
       - name: Setup Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
         with:
           python-version: "3.10"
 


=====================================
.github/workflows/test.yml
=====================================
@@ -32,6 +32,7 @@ jobs:
   # Run tests and upload to codecov
   test:
     name: ${{ matrix.os }} python=${{ matrix.python }} dependencies=${{ matrix.dependencies }}
+    if: ${{ github.repository_owner == 'fatiando' || github.event_name != 'schedule' }}
     runs-on: ${{ matrix.os }}-latest
     strategy:
       # Otherwise, the workflow would stop if a single job fails. We want to
@@ -44,7 +45,7 @@ jobs:
           - windows
         python:
           - "3.7"
-          - "3.10"
+          - "3.11"
         dependencies:
           - latest
           - optional
@@ -67,7 +68,7 @@ jobs:
 
       # Checks-out your repository under $GITHUB_WORKSPACE
       - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
         with:
           # Need to fetch more than the last commit so that setuptools-scm can
           # create the correct version string. If the number of commits since
@@ -83,7 +84,7 @@ jobs:
         run: git fetch origin 'refs/tags/*:refs/tags/*'
 
       - name: Setup Python
-        uses: actions/setup-python@v2
+        uses: actions/setup-python@v4
         with:
           python-version: ${{ matrix.python }}
 
@@ -118,7 +119,7 @@ jobs:
           echo "::set-output name=dir::$(pip cache dir)"
 
       - name: Setup caching for pip packages
-        uses: actions/cache@v2
+        uses: actions/cache@v3
         with:
           path: ${{ steps.pip-cache.outputs.dir }}
           key: ${{ runner.os }}-pip-${{ hashFiles('requirements-full.txt') }}


=====================================
README.md
=====================================
@@ -149,6 +149,7 @@ def fetch_gravity_data():
 * [climlab](https://github.com/climlab/climlab)
 * [napari](https://github.com/napari/napari)
 * [mne-python](https://github.com/mne-tools/mne-python)
+* [GemGIS](https://github.com/cgre-aachen/gemgis)
 
 *If you're using Pooch, send us a pull request adding your project to the list.*
 


=====================================
debian/changelog
=====================================
@@ -1,10 +1,19 @@
-pooch (1.7.0-2) UNRELEASED; urgency=medium
+pooch (1.8.0-1) UNRELEASED; urgency=medium
 
+  [ Bas Couwenberg ]
   * Team upload.
   * Remove generated files in clean target.
   * Enable Salsa CI.
   * Switch to dh-sequence-*.
 
+  [ Antonio Valentino ]
+  * New upstream release.
+  * debian/control:
+    - Use <!nocheck> marker.
+    - Switch to autopkgtest-pkg-pybuild.
+    - Add build-dependency on python3-pytest-httpserver.
+  * Update debian/python3-pooch.lintian-overrides.
+
 -- Bas Couwenberg <sebastic@debian.org>  Sun, 13 Aug 2023 09:15:02 +0200
 
 pooch (1.7.0-1) unstable; urgency=medium


=====================================
debian/control
=====================================
@@ -3,7 +3,7 @@ Section: python
 Priority: optional
 Maintainer: Debian GIS Project <pkg-grass-devel@lists.alioth.debian.org>
 Uploaders: Antonio Valentino <antonio.valentino@tiscali.it>
-Testsuite: autopkgtest-pkg-python
+Testsuite: autopkgtest-pkg-pybuild
 Build-Depends: debhelper-compat (= 13),
                dh-python,
                dh-sequence-python3,
@@ -12,7 +12,8 @@ Build-Depends: debhelper-compat (= 13),
                python3-packaging,
                python3-paramiko,
                python3-platformdirs,
-               python3-pytest,
+               python3-pytest <!nocheck>,
+               python3-pytest-httpserver <!nocheck>,
                python3-requests,
                python3-setuptools,
                python3-setuptools-scm,
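
A note on the `<!nocheck>` restrictions added above: they tell dpkg that these build dependencies are only needed when the package's test suite actually runs. A sketch of how that behaves at build time (standard Debian tooling, invocation shown for illustration):

```shell
# With the "nocheck" build profile active, dpkg omits dependencies marked
# <!nocheck> (here python3-pytest and python3-pytest-httpserver) and
# debhelper skips the test suite entirely.
DEB_BUILD_OPTIONS=nocheck DEB_BUILD_PROFILES=nocheck dpkg-buildpackage -us -uc
```

This is what makes the marker safe: the test-only packages never need to be installed on builders that skip tests, e.g. during bootstrapping.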


=====================================
debian/python3-pooch.lintian-overrides
=====================================
@@ -3,3 +3,6 @@ compressed-duplicate [usr/lib/python3/dist-packages/pooch/tests/data/*]
 # .txt files are erroneously assumed to be documentation files,
 # they are data files part of the test suite instead
 package-contains-documentation-outside-usr-share-doc [usr/lib/python3/dist-packages/pooch/tests/data/*]
+
+# false positive
+package-contains-documentation-outside-usr-share-doc [usr/lib/python3/dist-packages/pooch-*.*-info/*]


=====================================
doc/changes.rst
=====================================
@@ -3,6 +3,45 @@
 Changelog
 =========
 
+Version 1.8.0
+-------------
+
+*Released on: 2023/10/24*
+
+doi:`10.5281/zenodo.10037888 <https://doi.org/10.5281/zenodo.10037888>`__
+
+Bug fixes:
+
+* Fix bug: add support for old and new Zenodo APIs (`#375 <https://github.com/fatiando/pooch/pull/375>`__)
+
+New features:
+
+* Only create local data directories if necessary (`#370 <https://github.com/fatiando/pooch/pull/370>`__)
+* Speed up import time by lazy loading requests (`#328 <https://github.com/fatiando/pooch/pull/328>`__)
+
+Maintenance:
+
+* Add support for Python 3.11 (`#348 <https://github.com/fatiando/pooch/pull/348>`__)
+* Only run CI cron job for the upstream repository (`#361 <https://github.com/fatiando/pooch/pull/361>`__)
+
+Documentation:
+
+* Add GemGIS to list of projects using Pooch (`#349 <https://github.com/fatiando/pooch/pull/349>`__)
+* Fix spelling of Dataverse (`#353 <https://github.com/fatiando/pooch/pull/353>`__)
+* Fix grammar on retrieve documentation (`#359 <https://github.com/fatiando/pooch/pull/359>`__)
+
+
+This release contains contributions from:
+
+* Hugo van Kemenade
+* AlexanderJuestel
+* Mark Harfouche
+* Philip Durbin
+* Rob Luke
+* Santiago Soler
+* Stephan Hoyer
+
+
 Version 1.7.0
 -------------
 


=====================================
doc/versions.rst
=====================================
@@ -7,6 +7,7 @@ Use the links below to access documentation for specific versions
 * `Latest release <https://www.fatiando.org/pooch/latest>`__
 * `Development <https://www.fatiando.org/pooch/dev>`__
   (reflects the current development branch on GitHub)
+* `v1.8.0 <https://www.fatiando.org/pooch/v1.8.0>`__
 * `v1.7.0 <https://www.fatiando.org/pooch/v1.7.0>`__
 * `v1.6.0 <https://www.fatiando.org/pooch/v1.6.0>`__
 * `v1.5.2 <https://www.fatiando.org/pooch/v1.5.2>`__


=====================================
env/requirements-test.txt
=====================================
@@ -2,4 +2,5 @@
 pytest
 pytest-cov
 pytest-localftpserver
+pytest-httpserver
 coverage


=====================================
environment.yml
=====================================
@@ -3,7 +3,7 @@ channels:
     - conda-forge
     - defaults
 dependencies:
-    - python==3.10
+    - python==3.11
     - pip
     # Run
     - requests
@@ -15,6 +15,7 @@ dependencies:
     - pytest
     - pytest-cov
     - pytest-localftpserver
+    - pytest-httpserver
     - coverage
     # Documentation
     - sphinx==4.4.*


=====================================
pooch/core.py
=====================================
@@ -14,8 +14,6 @@ from pathlib import Path
 import shlex
 import shutil
 
-import requests
-import requests.exceptions
 
 from .hashes import hash_matches, file_hash
 from .utils import (
@@ -86,7 +84,7 @@ def retrieve(
         existing file needs to be updated.
     fname : str or None
         The name that will be used to save the file. Should NOT include the
-        full the path, just the file name (it will be appended to *path*). If
+        full path, just the file name (it will be appended to *path*). If
         None, will create a unique file name using a combination of the last
         part of the URL (assuming it's the file name) and the MD5 hash of the
         URL. For example, ``81whdo2d2e928yd1wi22-data-file.csv``. This ensures
@@ -217,15 +215,17 @@ def retrieve(
         path = os_cache("pooch")
     if fname is None:
         fname = unique_file_name(url)
-    # Create the local data directory if it doesn't already exist and make the
-    # path absolute.
+    # Make the path absolute.
     path = cache_location(path, env=None, version=None)
-    make_local_storage(path)
 
     full_path = path.resolve() / fname
     action, verb = download_action(full_path, known_hash)
 
     if action in ("download", "update"):
+        # We need to write data, so create the local data directory if it
+        # doesn't already exist.
+        make_local_storage(path)
+
         get_logger().info(
             "%s data from '%s' to file '%s'.",
             verb,
@@ -560,9 +560,6 @@ class Pooch:
         """
         self._assert_file_in_registry(fname)
 
-        # Create the local data directory if it doesn't already exist
-        make_local_storage(str(self.abspath))
-
         url = self.get_url(fname)
         full_path = self.abspath / fname
         known_hash = self.registry[fname]
@@ -574,6 +571,10 @@ class Pooch:
             )
 
         if action in ("download", "update"):
+            # We need to write data, so create the local data directory if it
+            # doesn't already exist.
+            make_local_storage(str(self.abspath))
+
             get_logger().info(
                 "%s file '%s' from '%s' to '%s'.",
                 verb,
@@ -789,6 +790,9 @@ def stream_download(url, fname, known_hash, downloader, pooch=None, retry_if_fai
     will retry the download the specified number of times in case the failure
     was due to a network error.
     """
+    # Lazy import requests to speed up import time
+    import requests.exceptions  # pylint: disable=C0415
+
     # Ensure the parent directory exists in case the file is in a subdirectory.
     # Otherwise, move will cause an error.
     if not fname.parent.exists():
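
The `pooch/core.py` hunks above move the `requests` import from module level into the function that first needs it, so `import pooch` no longer pays for importing `requests`. A minimal sketch of this lazy-import pattern (illustrative names, not Pooch's actual code):

```python
import json  # used only to build the demo return value


def stream_download(url):
    """Download helper that defers its heavy dependency until first use."""
    # Importing here instead of at module level means importing the module
    # that defines this function stays fast; the cost of the heavy import
    # is only paid when a download actually happens.
    import urllib.request  # stand-in for `requests`, the heavy import in Pooch

    # Build a request object but don't send it; enough to show the import worked.
    request = urllib.request.Request(url)
    return json.dumps({"url": request.full_url})


print(stream_download("https://example.org/data.csv"))
```

The trade-off is a tiny per-call overhead (a dictionary lookup in `sys.modules` after the first import) in exchange for a faster module import, which is what the "Speed up import time by lazy loading requests" change in the 1.8.0 changelog refers to.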


=====================================
pooch/downloaders.py
=====================================
@@ -12,7 +12,6 @@ import sys
 import ftplib
 
 import warnings
-import requests
 
 from .utils import parse_url
 
@@ -192,6 +191,9 @@ class HTTPDownloader:  # pylint: disable=too-few-public-methods
             is available on the server. Otherwise, returns ``None``.
 
         """
+        # Lazy import requests to speed up import time
+        import requests  # pylint: disable=C0415
+
         if check_only:
             response = requests.head(url, allow_redirects=True)
             available = bool(response.status_code == 200)
@@ -514,7 +516,7 @@ class DOIDownloader:  # pylint: disable=too-few-public-methods
 
     * `figshare <https://www.figshare.com>`__
     * `Zenodo <https://www.zenodo.org>`__
-    * `DataVerse <https://dataverse.org/>`__ instances
+    * `Dataverse <https://dataverse.org/>`__ instances
 
     .. attention::
 
@@ -626,6 +628,9 @@ def doi_to_url(doi):
         The URL of the archive in the data repository.
 
     """
+    # Lazy import requests to speed up import time
+    import requests  # pylint: disable=C0415
+
     # Use doi.org to resolve the DOI to the repository website.
     response = requests.get(f"https://doi.org/{doi}")
     url = response.url
@@ -743,10 +748,13 @@ class DataRepository:  # pylint: disable=too-few-public-methods, missing-class-d
 
 
 class ZenodoRepository(DataRepository):  # pylint: disable=missing-class-docstring
+    base_api_url = "https://zenodo.org/api/records"
+
     def __init__(self, doi, archive_url):
         self.archive_url = archive_url
         self.doi = doi
         self._api_response = None
+        self._api_version = None
 
     @classmethod
     def initialize(cls, doi, archive_url):
@@ -777,15 +785,49 @@ class ZenodoRepository(DataRepository):  # pylint: disable=missing-class-docstri
     @property
     def api_response(self):
         """Cached API response from Zenodo"""
-
         if self._api_response is None:
+            # Lazy import requests to speed up import time
+            import requests  # pylint: disable=C0415
+
             article_id = self.archive_url.split("/")[-1]
             self._api_response = requests.get(
-                f"https://zenodo.org/api/records/{article_id}"
+                f"{self.base_api_url}/{article_id}"
             ).json()
 
         return self._api_response
 
+    @property
+    def api_version(self):
+        """
+        Version of the Zenodo API we are interacting with
+
+        The versions can either be :
+
+        - ``"legacy"``: corresponds to the Zenodo API that was supported until
+          2023-10-12 (before the migration to InvenioRDM).
+        - ``"new"``: corresponds to the new API that went online on 2023-10-13
+          after the migration to InvenioRDM.
+
+        The ``"new"`` API breaks backward compatibility with the ``"legacy"``
+        one and could probably be replaced by an updated version that restores
+        the behaviour of the ``"legacy"`` one.
+
+        Returns
+        -------
+        str
+        """
+        if self._api_version is None:
+            if all(["key" in file for file in self.api_response["files"]]):
+                self._api_version = "legacy"
+            elif all(["filename" in file for file in self.api_response["files"]]):
+                self._api_version = "new"
+            else:
+                raise ValueError(
+                    "Couldn't determine the version of the Zenodo API for "
+                    f"{self.archive_url} (doi:{self.doi})."
+                )
+        return self._api_version
+
     def download_url(self, file_name):
         """
         Use the repository API to get the download URL for a file given
@@ -800,14 +842,35 @@ class ZenodoRepository(DataRepository):  # pylint: disable=missing-class-docstri
         -------
         download_url : str
             The HTTP URL that can be used to download the file.
-        """
 
-        files = {item["key"]: item for item in self.api_response["files"]}
+        Notes
+        -----
+        After Zenodo migrated to InvenioRDM on Oct 2023, their API changed. The
+        link to the desired files that appears in the API response leads to 404
+        errors (by 2023-10-17). The files are available in the following url:
+        ``https://zenodo.org/records/{article_id}/files/{file_name}?download=1``.
+
+        This method supports both the legacy and the new API.
+        """
+        # Create list of files in the repository
+        if self.api_version == "legacy":
+            files = {item["key"]: item for item in self.api_response["files"]}
+        else:
+            files = [item["filename"] for item in self.api_response["files"]]
+        # Check if file exists in the repository
         if file_name not in files:
             raise ValueError(
-                f"File '{file_name}' not found in data archive {self.archive_url} (doi:{self.doi})."
+                f"File '{file_name}' not found in data archive "
+                f"{self.archive_url} (doi:{self.doi})."
+            )
+        # Build download url
+        if self.api_version == "legacy":
+            download_url = files[file_name]["links"]["self"]
+        else:
+            article_id = self.api_response["id"]
+            download_url = (
+                f"https://zenodo.org/records/{article_id}/files/{file_name}?download=1"
             )
-        download_url = files[file_name]["links"]["self"]
         return download_url
 
     def populate_registry(self, pooch):
@@ -818,10 +881,22 @@ class ZenodoRepository(DataRepository):  # pylint: disable=missing-class-docstri
         ----------
         pooch : Pooch
             The pooch instance that the registry will be added to.
-        """
 
+        Notes
+        -----
+        After Zenodo migrated to InvenioRDM on Oct 2023, their API changed. The
+        checksums for each file listed in the API reference is now an md5 sum.
+
+        This method supports both the legacy and the new API.
+        """
         for filedata in self.api_response["files"]:
-            pooch.registry[filedata["key"]] = filedata["checksum"]
+            checksum = filedata["checksum"]
+            if self.api_version == "legacy":
+                key = "key"
+            else:
+                key = "filename"
+                checksum = f"md5:{checksum}"
+            pooch.registry[filedata[key]] = checksum
 
 
 class FigshareRepository(DataRepository):  # pylint: disable=missing-class-docstring
@@ -875,8 +950,10 @@ class FigshareRepository(DataRepository):  # pylint: disable=missing-class-docst
     @property
     def api_response(self):
         """Cached API response from Figshare"""
-
         if self._api_response is None:
+            # Lazy import requests to speed up import time
+            import requests  # pylint: disable=C0415
+
             # Use the figshare API to find the article ID from the DOI
             article = requests.get(
                 f"https://api.figshare.com/v2/articles?doi={self.doi}"
@@ -927,7 +1004,6 @@ class FigshareRepository(DataRepository):  # pylint: disable=missing-class-docst
         download_url : str
             The HTTP URL that can be used to download the file.
         """
-
         files = {item["name"]: item for item in self.api_response}
         if file_name not in files:
             raise ValueError(
@@ -974,7 +1050,6 @@ class DataverseRepository(DataRepository):  # pylint: disable=missing-class-docs
         archive_url : str
             The resolved URL for the DOI
         """
-
         # Access the DOI as if this was a DataVerse instance
         response = cls._get_api_response(doi, archive_url)
 
@@ -995,6 +1070,9 @@ class DataverseRepository(DataRepository):  # pylint: disable=missing-class-docs
         This has been separated into a separate ``classmethod``, as it can be
         used prior and after the initialization.
         """
+        # Lazy import requests to speed up import time
+        import requests  # pylint: disable=C0415
+
         parsed = parse_url(archive_url)
         response = requests.get(
             f"{parsed['protocol']}://{parsed['netloc']}/api/datasets/"
@@ -1034,7 +1112,6 @@ class DataverseRepository(DataRepository):  # pylint: disable=missing-class-docs
         download_url : str
             The HTTP URL that can be used to download the file.
         """
-
         parsed = parse_url(self.archive_url)
 
         # Iterate over the given files until we find one of the requested name
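
The `api_version` property added in the diff above tells the two Zenodo response shapes apart by which per-file field is present. A self-contained sketch of that detection logic (sample payloads are made up, mirroring the test data later in this email):

```python
def detect_api_version(files):
    """Classify a Zenodo API response by the shape of its file entries."""
    # The legacy API (pre-InvenioRDM) lists each file under a "key" field;
    # the new API (online since 2023-10-13) uses "filename" instead.
    if all("key" in entry for entry in files):
        return "legacy"
    if all("filename" in entry for entry in files):
        return "new"
    # Mixed or unrecognized entries: refuse to guess.
    raise ValueError("Couldn't determine the version of the Zenodo API")


legacy_files = [{"key": "my-file.zip", "checksum": "md5:2942bf"}]
new_files = [{"filename": "my-file.zip", "checksum": "2942bf"}]
print(detect_api_version(legacy_files))  # prints "legacy"
print(detect_api_version(new_files))  # prints "new"
```

Detecting the version from the response shape (rather than, say, a date cutoff) is what lets `download_url` and `populate_registry` above support both APIs with one code path per version.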


=====================================
pooch/tests/test_downloaders.py
=====================================
@@ -23,6 +23,7 @@ try:
 except ImportError:
     paramiko = None
 
+from .. import Pooch
 from ..downloaders import (
     HTTPDownloader,
     FTPDownloader,
@@ -384,3 +385,157 @@ def test_downloader_arbitrary_progressbar(capsys):
 
         # Check that the downloaded file has the right content
         check_large_data(outfile)
+
+
+class TestZenodoAPISupport:
+    """
+    Test support for different Zenodo APIs
+    """
+
+    article_id = 123456
+    doi = f"10.0001/zenodo.{article_id}"
+    doi_url = f"https://doi.org/{doi}"
+    file_name = "my-file.zip"
+    file_url = (
+        "https://zenodo.org/api/files/513d7033-93a2-4eeb-821c-2fb0bbab0012/my-file.zip"
+    )
+    file_checksum = "2942bfabb3d05332b66eb128e0842cff"
+
+    legacy_api_response = dict(
+        created="2021-20-19T08:00:00.000000+00:00",
+        modified="2021-20-19T08:00:00.000000+00:00",
+        id=article_id,
+        doi=doi,
+        doi_url=doi_url,
+        files=[
+            {
+                "id": "513d7033-93a2-4eeb-821c-2fb0bbab0012",
+                "key": file_name,
+                "checksum": f"md5:{file_checksum}",
+                "links": {
+                    "self": file_url,
+                },
+            }
+        ],
+    )
+
+    new_api_response = dict(
+        created="2021-20-19T08:00:00.000000+00:00",
+        modified="2021-20-19T08:00:00.000000+00:00",
+        id=article_id,
+        doi=doi,
+        doi_url=doi_url,
+        files=[
+            {
+                "id": "513d7033-93a2-4eeb-821c-2fb0bbab0012",
+                "filename": file_name,
+                "checksum": file_checksum,
+                "links": {
+                    "self": file_url,
+                },
+            }
+        ],
+    )
+
+    invalid_api_response = dict(
+        created="2021-20-19T08:00:00.000000+00:00",
+        modified="2021-20-19T08:00:00.000000+00:00",
+        id=article_id,
+        doi=doi,
+        doi_url=doi_url,
+        files=[
+            {
+                "id": "513d7033-93a2-4eeb-821c-2fb0bbab0012",
+                "filename": file_name,
+                "checksum": file_checksum,
+                "links": {
+                    "self": file_url,
+                },
+            },
+            {
+                "id": "513d7033-93a2-4eeb-821c-2fb0bbab0012",
+                "key": file_name,
+                "checksum": f"md5:{file_checksum}",
+                "links": {
+                    "self": file_url,
+                },
+            },
+        ],
+    )
+
+    @pytest.mark.parametrize(
+        "api_version, api_response",
+        [
+            ("legacy", legacy_api_response),
+            ("new", new_api_response),
+            ("invalid", invalid_api_response),
+        ],
+    )
+    def test_api_version(self, httpserver, api_version, api_response):
+        """
+        Test if the API version is correctly detected.
+        """
+        # Create a local http server
+        httpserver.expect_request(f"/zenodo.{self.article_id}").respond_with_json(
+            api_response
+        )
+        # Create Zenodo downloader
+        downloader = ZenodoRepository(doi=self.doi, archive_url=self.doi_url)
+        # Override base url for the API of the downloader
+        downloader.base_api_url = httpserver.url_for("")
+        # Check if the API version is correctly identified
+        if api_version != "invalid":
+            assert downloader.api_version == api_version
+        else:
+            msg = "Couldn't determine the version of the Zenodo API"
+            with pytest.raises(ValueError, match=msg):
+                api_version = downloader.api_version
+
+    @pytest.mark.parametrize(
+        "api_version, api_response",
+        [("legacy", legacy_api_response), ("new", new_api_response)],
+    )
+    def test_download_url(self, httpserver, api_version, api_response):
+        """
+        Test if the download url is correct for each API version.
+        """
+        # Create a local http server
+        httpserver.expect_request(f"/zenodo.{self.article_id}").respond_with_json(
+            api_response
+        )
+        # Create Zenodo downloader
+        downloader = ZenodoRepository(doi=self.doi, archive_url=self.doi_url)
+        # Override base url for the API of the downloader
+        downloader.base_api_url = httpserver.url_for("")
+        # Check if the download url is correct
+        download_url = downloader.download_url(file_name=self.file_name)
+        if api_version == "legacy":
+            assert download_url == self.file_url
+        else:
+            expected_url = (
+                "https://zenodo.org/records/"
+                f"{self.article_id}/files/{self.file_name}?download=1"
+            )
+            assert download_url == expected_url
+
+    @pytest.mark.parametrize(
+        "api_response",
+        [legacy_api_response, new_api_response],
+    )
+    def test_populate_registry(self, httpserver, tmp_path, api_response):
+        """
+        Test if population of registry is correctly done for each API version.
+        """
+        # Create a local http server
+        httpserver.expect_request(f"/zenodo.{self.article_id}").respond_with_json(
+            api_response
+        )
+        # Create sample pooch object
+        puppy = Pooch(base_url="", path=tmp_path)
+        # Create Zenodo downloader
+        downloader = ZenodoRepository(doi=self.doi, archive_url=self.doi_url)
+        # Override base url for the API of the downloader
+        downloader.base_api_url = httpserver.url_for("")
+        # Populate registry
+        downloader.populate_registry(puppy)
+        assert puppy.registry == {self.file_name: f"md5:{self.file_checksum}"}


=====================================
setup.cfg
=====================================
@@ -27,6 +27,7 @@ classifiers =
     Programming Language :: Python :: 3.8
     Programming Language :: Python :: 3.9
     Programming Language :: Python :: 3.10
+    Programming Language :: Python :: 3.11
 url = https://github.com/fatiando/pooch
 project_urls =
     Documentation = https://www.fatiando.org/pooch



View it on GitLab: https://salsa.debian.org/debian-gis-team/pooch/-/compare/dfaa29141ed1a4b430c84c4b47750ee9fd76297f...37cdbaa5cb0f1f15168c21e203aaf9491a0cb3ef




