[Git][debian-gis-team/python-geotiepoints][upstream] New upstream version 1.4.1

Antonio Valentino (@antonio.valentino) gitlab at salsa.debian.org
Sat Jun 11 12:01:31 BST 2022



Antonio Valentino pushed to branch upstream at Debian GIS Project / python-geotiepoints


Commits:
180474b1 by Antonio Valentino at 2022-06-11T09:04:17+00:00
New upstream version 1.4.1
- - - - -


19 changed files:

- .github/workflows/ci.yaml
- .github/workflows/deploy.yaml
- CHANGELOG.md
- README.md
- RELEASING.md
- continuous_integration/build-manylinux-wheels.sh
- continuous_integration/environment.yaml
- geotiepoints/__init__.py
- geotiepoints/modisinterpolator.py
- geotiepoints/simple_modis_interpolator.py
- geotiepoints/tests/test_modisinterpolator.py
- geotiepoints/tests/test_simple_modis_interpolator.py
- geotiepoints/version.py
- pyproject.toml
- setup.cfg
- setup.py
- + testdata/create_modis_test_data.py
- testdata/modis_test_data.h5
- versioneer.py


Changes:

=====================================
.github/workflows/ci.yaml
=====================================
@@ -80,7 +80,9 @@ jobs:
           env_vars: OS,PYTHON_VERSION,UNSTABLE
 
       - name: Coveralls Parallel
-        uses: AndreMiras/coveralls-python-action@develop
+        # See https://github.com/AndreMiras/coveralls-python-action/pull/16
+        uses: miurahr/coveralls-python-action@patch-pyprject-toml
+#        uses: AndreMiras/coveralls-python-action@develop
         with:
           flag-name: run-${{ matrix.test_number }}
           parallel: true
@@ -91,7 +93,9 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Coveralls Finished
-        uses: AndreMiras/coveralls-python-action@develop
+        # See https://github.com/AndreMiras/coveralls-python-action/pull/16
+        uses: miurahr/coveralls-python-action@patch-pyprject-toml
+#        uses: AndreMiras/coveralls-python-action@develop
         with:
           parallel-finished: true
 


=====================================
.github/workflows/deploy.yaml
=====================================
@@ -41,11 +41,7 @@ jobs:
           - name: manylinux 64-bit
             os: ubuntu-latest
             python-version: 3.8
-            docker-image: manylinux1_x86_64
-          - name: manylinux 32-bit
-            os: ubuntu-latest
-            python-version: 3.8
-            docker-image: manylinux1_i686
+            docker-image: manylinux2014_x86_64
 
     steps:
      - uses: actions/checkout@v2
@@ -78,6 +74,14 @@ jobs:
             quay.io/pypa/${{ matrix.docker-image }} \
             /io/continuous_integration/build-manylinux-wheels.sh
 
+      - name: Check version number from inside wheel
+        if: matrix.docker-image != 'manylinux2014_i686'
+        run: |
+          mv geotiepoints unused_src_to_prevent_local_import
+          python -m pip install --find-links=./dist/ python-geotiepoints
+          python -c "import geotiepoints; print(geotiepoints.__file__, geotiepoints.__version__)"
+          python -c "import geotiepoints; assert 'unknown' not in geotiepoints.__version__, 'incorrect version found'"
+
       - name: Upload wheel(s) as build artifacts
        uses: actions/upload-artifact@v2
         with:


=====================================
CHANGELOG.md
=====================================
@@ -1,5 +1,25 @@
-## Version 1.4.0 (2022/02/21)
+## Version 1.4.1 (2022/06/08)
+
+### Issues Closed
+
+* [Issue 39](https://github.com/pytroll/python-geotiepoints/issues/39) - MODIS Interpolation Comparisons ([PR 41](https://github.com/pytroll/python-geotiepoints/pull/41) by [@djhoese](https://github.com/djhoese))
+
+In this release 1 issue was closed.
+
+### Pull Requests Merged
+
+#### Bugs fixed
+
+* [PR 41](https://github.com/pytroll/python-geotiepoints/pull/41) - Fix MODIS cviirs-based interpolation ([39](https://github.com/pytroll/python-geotiepoints/issues/39))
+
+#### Features added
+
+* [PR 35](https://github.com/pytroll/python-geotiepoints/pull/35) - Optimize angle-based modis interpolation for dask
 
+In this release 2 pull requests were closed.
+
+
+## Version 1.4.0 (2022/02/21)
 
 ### Pull Requests Merged
 


=====================================
README.md
=====================================
@@ -1,9 +1,8 @@
 python-geotiepoints
 ===================
 
-[![Build Status](https://travis-ci.org/pytroll/python-geotiepoints.svg?branch=main)](https://travis-ci.org/pytroll/python-geotiepoints)
+[![Build Status](https://github.com/pytroll/python-geotiepoints/workflows/CI/badge.svg?branch=main)](https://github.com/pytroll/python-geotiepoints/actions?query=workflow%3A%22CI%22)
 [![Coverage Status](https://coveralls.io/repos/github/pytroll/python-geotiepoints/badge.svg?branch=main)](https://coveralls.io/github/pytroll/python-geotiepoints?branch=main)
-[![Code Health](https://landscape.io/github/pytroll/python-geotiepoints/main/landscape.svg?style=flat)](https://landscape.io/github/pytroll/python-geotiepoints/main)
 
 
 Python-geotiepoints is a python module that interpolates (and extrapolates if


=====================================
RELEASING.md
=====================================
@@ -8,7 +8,7 @@ prerequisites: `pip install loghub setuptools twine`
 4. run `loghub` and update the `CHANGELOG.md` file:
 
 ```
-loghub pytroll/python-geotiepoints --token $LOGHUB_GITHUB_TOKEN -st v0.8.0 -plg bug "Bugs fixed" -plg enhancement "Features added" -plg documentation "Documentation changes" -plg backwards-incompatibility "Backward incompatible changes" -plg refactor "Refactoring"
+loghub pytroll/python-geotiepoints --token $LOGHUB_GITHUB_TOKEN -st $(git tag --sort=-version:refname --list 'v*' | head -n 1) -plg bug "Bugs fixed" -plg enhancement "Features added" -plg documentation "Documentation changes" -plg backwards-incompatibility "Backward incompatible changes" -plg refactor "Refactoring"
 ```
 
 This uses a `LOGHUB_GITHUB_TOKEN` environment variable. This must be created
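
[Annotation] The updated command replaces the hard-coded `-st v0.8.0` with the newest `v*` tag, selected by `git tag --sort=-version:refname --list 'v*' | head -n 1`. As a rough illustration only (the helper name and tag list below are invented, not part of the repository), that version-ordered selection can be sketched in Python:

```python
def latest_version_tag(tags):
    """Pick the highest v-prefixed tag by version order, not string order."""
    def version_key(tag):
        # "v1.4.1" -> (1, 4, 1); non-numeric parts sort lowest
        return tuple(int(p) if p.isdigit() else -1
                     for p in tag.lstrip("v").split("."))
    candidates = [t for t in tags if t.startswith("v")]
    return max(candidates, key=version_key, default=None)

# version order beats lexicographic order: v1.10.0 > v1.9.0
print(latest_version_tag(["v0.8.0", "v1.9.0", "v1.10.0", "v1.4.1"]))  # prints v1.10.0
```

This is why `--sort=-version:refname` matters: a plain string sort would rank `v1.9.0` above `v1.10.0`.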


=====================================
continuous_integration/build-manylinux-wheels.sh
=====================================
@@ -3,14 +3,14 @@ set -e -x
 
 # This is to be run by Docker inside a Docker image.
 # You can test it locally on a Linux machine by installing docker and running from this repo's root:
-# $ docker run -e PLAT=manylinux1_x86_64 -v `pwd`:/io quay.io/pypa/manylinux1_x86_64 /io/scripts/build-manylinux-wheels.sh
+# $ docker run -e PLAT=manylinux2014_x86_64 -v `pwd`:/io quay.io/pypa/manylinux2014_x86_64 /io/continuous_integration/build-manylinux-wheels.sh
 
 # * The -e just defines an environment variable PLAT=[docker name] inside the
 #    docker - auditwheel can't detect the docker name automatically.
 # * The -v gives a directory alias for passing files in and out of the docker
 #    (/io is arbitrary). E.g the `setup.py` script would be accessed in the
 #    docker via `/io/setup.py`.
-# * quay.io/pypa/manylinux1_x86_64 is the full docker image name. Docker
+# * quay.io/pypa/manylinux2014_x86_64 is the full docker image name. Docker
 #    downloads it automatically.
 # * The last argument is a shell command that the Docker will execute.
 #    Filenames must be from the Docker's perspective.
@@ -24,9 +24,13 @@ mkdir -p /io/temp-wheels
 # Clean out any old existing wheels.
 find /io/temp-wheels/ -type f -delete
 
+# /io might be owned by someone else since we are in docker
+# this may stop versioneer from using git the way it needs
+git config --global --add safe.directory /io
+
 # Iterate through available pythons.
-for PYBIN in /opt/python/cp*/bin; do
-    "${PYBIN}/pip" install -q -U setuptools wheel nose --cache-dir /io/pip-cache
+for PYBIN in /opt/python/cp3{7,8,9,10}*/bin; do
+    "${PYBIN}/pip" install -q -U setuptools wheel build --cache-dir /io/pip-cache
     # Run the following in root of this repo.
     (cd /io/ && "${PYBIN}/pip" install -q .)
     (cd /io/ && "${PYBIN}/python" -m build -w -o /io/temp-wheels)
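
[Annotation] The new `cp3{7,8,9,10}*` glob relies on shell brace expansion producing four separate patterns, so only CPython 3.7-3.10 interpreters in the manylinux image are matched, instead of the old `cp*` catch-all. A hypothetical sketch of that selection (the function and the directory names below are made up for illustration):

```python
import fnmatch

def select_pybins(candidates):
    """Keep only interpreter dirs matched by the cp3{7,8,9,10}* expansion."""
    patterns = ["cp37*", "cp38*", "cp39*", "cp310*"]  # brace expansion result
    return [c for c in candidates
            if any(fnmatch.fnmatch(c, p) for p in patterns)]

dirs = ["cp36-cp36m", "cp37-cp37m", "cp38-cp38", "cp39-cp39",
        "cp310-cp310", "pp38-pypy38_pp73"]
print(select_pybins(dirs))  # cp36 and PyPy entries are skipped
```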


=====================================
continuous_integration/environment.yaml
=====================================
@@ -17,3 +17,4 @@ dependencies:
   - h5py
   - pytest
   - pytest-cov
+  - pyproj


=====================================
geotiepoints/__init__.py
=====================================
@@ -176,6 +176,7 @@ def modis1kmto250m(lons1km, lats1km, cores=1):
 
     return lons250m, lats250m
 
-from .version import get_versions
-__version__ = get_versions()['version']
-del get_versions
+
+from . import version
+__version__ = version.get_versions()['version']
+


=====================================
geotiepoints/modisinterpolator.py
=====================================
@@ -20,49 +20,56 @@
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
-"""Interpolation of geographical tiepoints using the second order interpolation
+"""Interpolation of MODIS data using satellite zenith angle.
+
+Interpolation of geographical tiepoints using the second order interpolation
 scheme implemented in the CVIIRS software, as described here:
 Compact VIIRS SDR Product Format User Guide (V1J)
-http://www.eumetsat.int/website/wcm/idc/idcplg?IdcService=GET_FILE&dDocName=PDF_DMT_708025&RevisionSelectionMethod=LatestReleased&Rendition=Web
+https://www.eumetsat.int/media/45988
+and
+Anders Meier Soerensen, Stephan Zinke,
+A tie-point zone group compaction schema for the geolocation data of S-NPP and NOAA-20 VIIRS SDRs to reduce file sizes
+in memory-sensitive environments,
+Applied Computing and Geosciences, Volume 6, 2020, 100025, ISSN 2590-1974,
+https://doi.org/10.1016/j.acags.2020.100025.
+(https://www.sciencedirect.com/science/article/pii/S2590197420300070)
 """
 
-import xarray as xr
-import dask.array as da
 import numpy as np
 import warnings
 
 from .geointerpolator import lonlat2xyz, xyz2lonlat
+from .simple_modis_interpolator import scanline_mapblocks
 
 R = 6371.
-# Aqua scan width and altitude in km
-scan_width = 10.00017
-H = 705.
+# Aqua altitude in km
+H = 709.
 
 
-def compute_phi(zeta):
+def _compute_phi(zeta):
     return np.arcsin(R * np.sin(zeta) / (R + H))
 
 
-def compute_theta(zeta, phi):
+def _compute_theta(zeta, phi):
     return zeta - phi
 
 
-def compute_zeta(phi):
+def _compute_zeta(phi):
     return np.arcsin((R + H) * np.sin(phi) / R)
 
 
-def compute_expansion_alignment(satz_a, satz_b, satz_c, satz_d):
+def _compute_expansion_alignment(satz_a, satz_b, scan_width):
     """All angles in radians."""
     zeta_a = satz_a
     zeta_b = satz_b
 
-    phi_a = compute_phi(zeta_a)
-    phi_b = compute_phi(zeta_b)
-    theta_a = compute_theta(zeta_a, phi_a)
-    theta_b = compute_theta(zeta_b, phi_b)
+    phi_a = _compute_phi(zeta_a)
+    phi_b = _compute_phi(zeta_b)
+    theta_a = _compute_theta(zeta_a, phi_a)
+    theta_b = _compute_theta(zeta_b, phi_b)
     phi = (phi_a + phi_b) / 2
-    zeta = compute_zeta(phi)
-    theta = compute_theta(zeta, phi)
+    zeta = _compute_zeta(phi)
+    theta = _compute_theta(zeta, phi)
     # Workaround for tiepoints symmetrical about the subsatellite-track
     denominator = np.where(theta_a == theta_b, theta_a * 2, theta_a - theta_b)
 
@@ -78,7 +85,7 @@ def compute_expansion_alignment(satz_a, satz_b, satz_c, satz_d):
     return c_expansion, c_alignment
 
 
-def get_corners(arr):
+def _get_corners(arr):
     arr_a = arr[:, :-1, :-1]
     arr_b = arr[:, :-1, 1:]
     arr_c = arr[:, 1:, 1:]
@@ -86,169 +93,206 @@ def get_corners(arr):
     return arr_a, arr_b, arr_c, arr_d
 
 
-class ModisInterpolator(object):
-
-    def __init__(self, cres, fres, cscan_full_width=None):
-        if cres == 1000:
-            self.cscan_len = 10
-            self.cscan_width = 1
-            self.cscan_full_width = 1354
-            self.get_coords = self._get_coords_1km
-            self.expand_tiepoint_array = self._expand_tiepoint_array_1km
-        elif cres == 5000:
-            self.cscan_len = 2
-            self.cscan_width = 5
-            if cscan_full_width is None:
-                self.cscan_full_width = 271
-            else:
-                self.cscan_full_width = cscan_full_width
-            self.expand_tiepoint_array = self._expand_tiepoint_array_5km
-            self.get_coords = self._get_coords_5km
-
-        if fres == 250:
-            self.fscan_width = 4 * self.cscan_width
-            self.fscan_full_width = 1354 * 4
-            self.fscan_len = 4 * 10 // self.cscan_len
-        elif fres == 500:
-            self.fscan_width = 2 * self.cscan_width
-            self.fscan_full_width = 1354 * 2
-            self.fscan_len = 2 * 10 // self.cscan_len
-        elif fres == 1000:
-            self.fscan_width = 1 * self.cscan_width
-            self.fscan_full_width = 1354
-            self.fscan_len = 1 * 10 // self.cscan_len
-
-    def _expand_tiepoint_array_1km(self, arr, lines, cols):
-        arr = da.repeat(arr, lines, axis=1)
-        arr = da.concatenate((arr[:, :lines//2, :], arr, arr[:, -(lines//2):, :]), axis=1)
-        arr = da.repeat(arr.reshape((-1, self.cscan_full_width - 1)), cols, axis=1)
-        return da.hstack((arr, arr[:, -cols:]))
+@scanline_mapblocks
+def _interpolate(
+        lon1,
+        lat1,
+        satz1,
+        coarse_resolution=None,
+        fine_resolution=None,
+        coarse_scan_width=None,
+):
+    """Helper function to interpolate scan-aligned arrays.
 
-    def _get_coords_1km(self, scans):
-        y = (np.arange((self.cscan_len + 1) * self.fscan_len) % self.fscan_len) + .5
-        y = y[self.fscan_len // 2:-(self.fscan_len // 2)]
-        y[:self.fscan_len//2] = np.arange(-self.fscan_len/2 + .5, 0)
-        y[-(self.fscan_len//2):] = np.arange(self.fscan_len + .5, self.fscan_len * 3 / 2)
-        y = np.tile(y, scans)
+    This function's decorator runs this function for each dask block/chunk of
+    scans. The arrays are scan-aligned meaning they are an even number of scans
+    (N rows per scan) and contain the entire scan width.
 
-        x = np.arange(self.fscan_full_width) % self.fscan_width
-        x[-self.fscan_width:] = np.arange(self.fscan_width, self.fscan_width * 2)
-        return x, y
+    """
+    interp = _MODISInterpolator(coarse_resolution, fine_resolution, coarse_scan_width=coarse_scan_width)
+    return interp.interpolate(lon1, lat1, satz1)
 
-    def _expand_tiepoint_array_5km(self, arr, lines, cols):
-        arr = da.repeat(arr, lines * 2, axis=1)
-        arr = da.repeat(arr.reshape((-1, self.cscan_full_width - 1)), cols, axis=1)
-        factor = self.fscan_width // self.cscan_width
-        if self.cscan_full_width == 271:
-            return da.hstack((arr[:, :2 * factor], arr, arr[:, -2 * factor:]))
-        else:
-            return da.hstack((arr[:, :2 * factor], arr, arr[:, -self.fscan_width:], arr[:, -2 * factor:]))
 
-    def _get_coords_5km(self, scans):
-        y = np.arange(self.fscan_len * self.cscan_len) - 2
-        y = np.tile(y, scans)
-
-        x = (np.arange(self.fscan_full_width) - 2) % self.fscan_width
-        x[0] = -2
-        x[1] = -1
-        if self.cscan_full_width == 271:
-            x[-2] = 5
-            x[-1] = 6
-        elif self.cscan_full_width == 270:
-            x[-7] = 5
-            x[-6] = 6
-            x[-5] = 7
-            x[-4] = 8
-            x[-3] = 9
-            x[-2] = 10
-            x[-1] = 11
-        else:
-            raise NotImplementedError("Can't interpolate if 5km tiepoints have less than 270 columns.")
-        return x, y
+class _MODISInterpolator:
+    """Helper class for MODIS interpolation.
+
+    Not intended for public use. Use ``modis_X_to_Y`` functions instead.
+
+    """
+    def __init__(self, coarse_resolution, fine_resolution, coarse_scan_width=None):
+        if coarse_resolution == 1000:
+            coarse_scan_length = 10
+            coarse_scan_width = 1354
+            self._get_coords = self._get_coords_1km
+            self._expand_tiepoint_array = self._expand_tiepoint_array_1km
+        elif coarse_resolution == 5000:
+            coarse_scan_length = 2
+            self._get_coords = self._get_coords_5km
+            self._expand_tiepoint_array = self._expand_tiepoint_array_5km
+            if coarse_scan_width is None:
+                coarse_scan_width = 271
+            else:
+                coarse_scan_width = coarse_scan_width
+        self._coarse_scan_length = coarse_scan_length
+        self._coarse_scan_width = coarse_scan_width
+        self._coarse_pixels_per_1km = coarse_resolution // 1000
+
+        fine_pixels_per_1km = {
+            250: 4,
+            500: 2,
+            1000: 1,
+        }[fine_resolution]
+        self._fine_pixels_per_coarse_pixel = fine_pixels_per_1km * self._coarse_pixels_per_1km
+        self._fine_scan_width = 1354 * fine_pixels_per_1km
+        self._fine_scan_length = fine_pixels_per_1km * 10 // coarse_scan_length
+        self._coarse_resolution = coarse_resolution
+        self._fine_resolution = fine_resolution
 
     def interpolate(self, lon1, lat1, satz1):
-        cscan_len = self.cscan_len
-        cscan_full_width = self.cscan_full_width
-
-        fscan_width = self.fscan_width
-        fscan_len = self.fscan_len
-
-        scans = satz1.shape[0] // cscan_len
-        satz1 = satz1.data
-
-        satz1 = satz1.reshape((-1, cscan_len, cscan_full_width))
+        """Interpolate MODIS geolocation from 'coarse_resolution' to 'fine_resolution'."""
+        scans = satz1.shape[0] // self._coarse_scan_length
+        # reshape to (num scans, rows per scan, columns per scan)
+        satz1 = satz1.reshape((-1, self._coarse_scan_length, self._coarse_scan_width))
 
-        satz_a, satz_b, satz_c, satz_d = get_corners(da.deg2rad(satz1))
+        satz_a, satz_b = _get_corners(np.deg2rad(satz1))[:2]
+        c_exp, c_ali = _compute_expansion_alignment(satz_a, satz_b, self._coarse_pixels_per_1km)
 
-        c_exp, c_ali = compute_expansion_alignment(satz_a, satz_b, satz_c, satz_d)
-
-        x, y = self.get_coords(scans)
-        i_rs, i_rt = da.meshgrid(x, y)
+        x, y = self._get_coords(scans)
+        i_rs, i_rt = np.meshgrid(x, y)
 
         p_os = 0
         p_ot = 0
+        s_s = (p_os + i_rs) * 1.0 / self._fine_pixels_per_coarse_pixel
+        s_t = (p_ot + i_rt) * 1.0 / self._fine_scan_length
 
-        s_s = (p_os + i_rs) * 1. / fscan_width
-        s_t = (p_ot + i_rt) * 1. / fscan_len
-
-        cols = fscan_width
-        lines = fscan_len
-
-        c_exp_full = self.expand_tiepoint_array(c_exp, lines, cols)
-        c_ali_full = self.expand_tiepoint_array(c_ali, lines, cols)
+        c_exp_full = self._expand_tiepoint_array(c_exp)
+        c_ali_full = self._expand_tiepoint_array(c_ali)
 
         a_track = s_t
-        a_scan = (s_s + s_s * (1 - s_s) * c_exp_full + s_t * (1 - s_t) * c_ali_full)
+        a_scan = s_s + s_s * (1 - s_s) * c_exp_full + s_t * (1 - s_t) * c_ali_full
 
         res = []
         datasets = lonlat2xyz(lon1, lat1)
         for data in datasets:
-            data = data.data
-            data = data.reshape((-1, cscan_len, cscan_full_width))
-            data_a, data_b, data_c, data_d = get_corners(data)
-            data_a = self.expand_tiepoint_array(data_a, lines, cols)
-            data_b = self.expand_tiepoint_array(data_b, lines, cols)
-            data_c = self.expand_tiepoint_array(data_c, lines, cols)
-            data_d = self.expand_tiepoint_array(data_d, lines, cols)
+            data = data.reshape((-1, self._coarse_scan_length, self._coarse_scan_width))
+            data_a, data_b, data_c, data_d = _get_corners(data)
+            data_a = self._expand_tiepoint_array(data_a)
+            data_b = self._expand_tiepoint_array(data_b)
+            data_c = self._expand_tiepoint_array(data_c)
+            data_d = self._expand_tiepoint_array(data_d)
 
             data_1 = (1 - a_scan) * data_a + a_scan * data_b
             data_2 = (1 - a_scan) * data_d + a_scan * data_c
             data = (1 - a_track) * data_1 + a_track * data_2
 
             res.append(data)
-        lon, lat = xyz2lonlat(*res)
-        return xr.DataArray(lon, dims=lon1.dims), xr.DataArray(lat, dims=lat1.dims)
+        new_lons, new_lats = xyz2lonlat(*res)
+        return new_lons.astype(lon1.dtype), new_lats.astype(lat1.dtype)
+
+    def _get_coords_1km(self, scans):
+        y = (
+            np.arange((self._coarse_scan_length + 1) * self._fine_scan_length) % self._fine_scan_length
+        ) + 0.5
+        half_scan_length = self._fine_scan_length // 2
+        y = y[half_scan_length:-half_scan_length]
+        y[:half_scan_length] = np.arange(-self._fine_scan_length / 2 + 0.5, 0)
+        y[-half_scan_length:] = np.arange(self._fine_scan_length + 0.5, self._fine_scan_length * 3 / 2)
+        y = np.tile(y, scans)
+
+        x = np.arange(self._fine_scan_width) % self._fine_pixels_per_coarse_pixel
+        x[-self._fine_pixels_per_coarse_pixel:] = np.arange(
+            self._fine_pixels_per_coarse_pixel,
+            self._fine_pixels_per_coarse_pixel * 2)
+        return x, y
+
+    def _get_coords_5km(self, scans):
+        y = np.arange(self._fine_scan_length * self._coarse_scan_length) - 2
+        y = np.tile(y, scans)
+
+        x = (np.arange(self._fine_scan_width) - 2) % self._fine_pixels_per_coarse_pixel
+        x[0] = -2
+        x[1] = -1
+        if self._coarse_scan_width == 271:
+            x[-2] = 5
+            x[-1] = 6
+        elif self._coarse_scan_width == 270:
+            x[-7] = 5
+            x[-6] = 6
+            x[-5] = 7
+            x[-4] = 8
+            x[-3] = 9
+            x[-2] = 10
+            x[-1] = 11
+        else:
+            raise NotImplementedError(
+                "Can't interpolate if 5km tiepoints have less than 270 columns."
+            )
+        return x, y
+
+    def _expand_tiepoint_array_1km(self, arr):
+        arr = np.repeat(arr, self._fine_scan_length, axis=1)
+        arr = np.concatenate(
+            (arr[:, :self._fine_scan_length // 2, :], arr, arr[:, -(self._fine_scan_length // 2):, :]), axis=1
+        )
+        arr = np.repeat(arr.reshape((-1, self._coarse_scan_width - 1)), self._fine_pixels_per_coarse_pixel, axis=1)
+        return np.hstack((arr, arr[:, -self._fine_pixels_per_coarse_pixel:]))
+
+    def _expand_tiepoint_array_5km(self, arr):
+        arr = np.repeat(arr, self._fine_scan_length * 2, axis=1)
+        arr = np.repeat(arr.reshape((-1, self._coarse_scan_width - 1)), self._fine_pixels_per_coarse_pixel, axis=1)
+        factor = self._fine_pixels_per_coarse_pixel // self._coarse_pixels_per_1km
+        if self._coarse_scan_width == 271:
+            return np.hstack((arr[:, :2 * factor], arr, arr[:, -2 * factor:]))
+        else:
+            return np.hstack(
+                (
+                    arr[:, :2 * factor],
+                    arr,
+                    arr[:, -self._fine_pixels_per_coarse_pixel:],
+                    arr[:, -2 * factor:],
+                )
+            )
 
 
 def modis_1km_to_250m(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 1km to 250m resolution."""
-    interp = ModisInterpolator(1000, 250)
-    return interp.interpolate(lon1, lat1, satz1)
+    return _interpolate(lon1, lat1, satz1,
+                        coarse_resolution=1000,
+                        fine_resolution=250)
 
 
 def modis_1km_to_500m(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 1km to 500m resolution."""
-    interp = ModisInterpolator(1000, 500)
-    return interp.interpolate(lon1, lat1, satz1)
+    return _interpolate(lon1, lat1, satz1,
+                        coarse_resolution=1000,
+                        fine_resolution=500)
 
 
 def modis_5km_to_1km(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 5km to 1km resolution."""
-    interp = ModisInterpolator(5000, 1000, lon1.shape[1])
-    return interp.interpolate(lon1, lat1, satz1)
+    return _interpolate(lon1, lat1, satz1,
+                        coarse_resolution=5000,
+                        fine_resolution=1000,
+                        coarse_scan_width=lon1.shape[1])
 
 
 def modis_5km_to_500m(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 5km to 500m resolution."""
-    warnings.warn("Interpolating 5km geolocation to 500m resolution "
-                  "may result in poor quality")
-    interp = ModisInterpolator(5000, 500, lon1.shape[1])
-    return interp.interpolate(lon1, lat1, satz1)
+    warnings.warn(
+        "Interpolating 5km geolocation to 500m resolution " "may result in poor quality"
+    )
+    return _interpolate(lon1, lat1, satz1,
+                        coarse_resolution=5000,
+                        fine_resolution=500,
+                        coarse_scan_width=lon1.shape[1])
 
 
 def modis_5km_to_250m(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 5km to 250m resolution."""
-    warnings.warn("Interpolating 5km geolocation to 250m resolution "
-                  "may result in poor quality")
-    interp = ModisInterpolator(5000, 250, lon1.shape[1])
-    return interp.interpolate(lon1, lat1, satz1)
+    warnings.warn(
+        "Interpolating 5km geolocation to 250m resolution " "may result in poor quality"
+    )
+    return _interpolate(lon1, lat1, satz1,
+                        coarse_resolution=5000,
+                        fine_resolution=250,
+                        coarse_scan_width=lon1.shape[1])


=====================================
geotiepoints/simple_modis_interpolator.py
=====================================
@@ -49,83 +49,130 @@ try:
 except ImportError:
     xr = None
 
-# MODIS has 10 rows of data in the array for every scan line
-ROWS_PER_SCAN = 10
+
+def _rows_per_scan_for_resolution(res):
+    return {
+        5000: 2,
+        1000: 10,
+        500: 20,
+        250: 40,
+    }[res]
 
 
 def scanline_mapblocks(func):
     """Convert dask array inputs to appropriate map_blocks calls.
 
     This function, applied as a decorator, will call the wrapped function
-    using dask's ``map_blocks``. It will rechunk inputs when necessary to make
-    sure that the input chunks are entire scanlines to avoid incorrect
-    interpolation.
+    using dask's ``map_blocks``. It will rechunk dask array inputs when
+    necessary to make sure that the input chunks are entire scanlines to
+    avoid incorrect interpolation.
 
     """
     @wraps(func)
-    def _wrapper(lon_data, lat_data, res_factor=4):
-        if lon_data.ndim != 2 or lat_data.ndim != 2:
-            raise ValueError("Expected 2D lon/lat arrays.")
-        if hasattr(lon_data, "compute"):
+    def _wrapper(*args, coarse_resolution=None, fine_resolution=None, **kwargs):
+        if coarse_resolution is None or fine_resolution is None:
+            raise ValueError("'coarse_resolution' and 'fine_resolution' are required keyword arguments.")
+        first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
+        if first_arr.ndim != 2:
+            raise ValueError("Expected 2D input arrays.")
+        if hasattr(first_arr, "compute"):
             # assume it is dask or xarray with dask, ensure proper chunk size
             # if DataArray get just the dask array
-            lon_dask = lon_data.data if hasattr(lon_data, "dims") else lon_data
-            lat_dask = lat_data.data if hasattr(lat_data, "dims") else lat_data
-            lon_dask, lat_dask = _rechunk_lonlat_if_needed(lon_dask, lat_dask)
-            new_lons, new_lats = _call_map_blocks_interp(func, lon_dask, lat_dask, res_factor)
-            if hasattr(lon_data, "dims"):
+            dask_args = _extract_dask_arrays_from_args(args)
+            rows_per_scan = _rows_per_scan_for_resolution(coarse_resolution)
+            rechunked_args = _rechunk_dask_arrays_if_needed(dask_args, rows_per_scan)
+            results = _call_map_blocks_interp(
+                func,
+                coarse_resolution,
+                fine_resolution,
+                *rechunked_args,
+                **kwargs
+            )
+            if hasattr(first_arr, "dims"):
                 # recreate DataArrays
-                new_lons = xr.DataArray(new_lons, dims=lon_data.dims)
-                new_lats = xr.DataArray(new_lats, dims=lon_data.dims)
-            return new_lons, new_lats
-
-        return func(lon_data, lat_data, res_factor=res_factor)
+                results = _results_to_data_arrays(first_arr.dims, *results)
+            return results
+        return func(
+            *args,
+            coarse_resolution=coarse_resolution,
+            fine_resolution=fine_resolution,
+            **kwargs
+        )
 
     return _wrapper
 
 
-def _call_map_blocks_interp(func, lon_dask, lat_dask, res_factor):
-    new_row_chunks = tuple(x * res_factor for x in lon_dask.chunks[0])
-    new_col_chunks = tuple(x * res_factor for x in lon_dask.chunks[1])
+def _extract_dask_arrays_from_args(args):
+    return [arr_obj.data if hasattr(arr_obj, "dims") else arr_obj for arr_obj in args]
+
+
+def _call_map_blocks_interp(func, coarse_resolution, fine_resolution, *args, **kwargs):
+    first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
+    res_factor = coarse_resolution // fine_resolution
+    new_row_chunks = tuple(x * res_factor for x in first_arr.chunks[0])
+    fine_pixels_per_1km = {250: 4, 500: 2, 1000: 1}[fine_resolution]
+    fine_scan_width = 1354 * fine_pixels_per_1km
+    new_col_chunks = (fine_scan_width,)
     wrapped_func = _map_blocks_handler(func)
-    res = da.map_blocks(wrapped_func, lon_dask, lat_dask, res_factor,
+    res = da.map_blocks(wrapped_func, *args,
+                        coarse_resolution=coarse_resolution,
+                        fine_resolution=fine_resolution,
+                        **kwargs,
                         new_axis=[0],
                         chunks=(2, new_row_chunks, new_col_chunks),
-                        dtype=lon_dask.dtype,
-                        meta=np.empty((2, 2, 2), dtype=lon_dask.dtype))
-    return res[0], res[1]
+                        dtype=first_arr.dtype,
+                        meta=np.empty((2, 2, 2), dtype=first_arr.dtype))
+    return tuple(res[idx] for idx in range(res.shape[0]))
+
 
+def _results_to_data_arrays(dims, *results):
+    new_results = []
+    for result in results:
+        if not isinstance(result, da.Array):
+            continue
+        data_arr = xr.DataArray(result, dims=dims)
+        new_results.append(data_arr)
+    return new_results
 
-def _rechunk_lonlat_if_needed(lon_data, lat_data):
+
+def _rechunk_dask_arrays_if_needed(args, rows_per_scan: int):
     # take current chunk size and get a relatively similar chunk size
-    row_chunks = lon_data.chunks[0]
-    col_chunks = lon_data.chunks[1]
-    num_rows = lon_data.shape[0]
-    num_cols = lon_data.shape[-1]
-    good_row_chunks = all(x % ROWS_PER_SCAN == 0 for x in row_chunks)
+    first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
+    row_chunks = first_arr.chunks[0]
+    col_chunks = first_arr.chunks[1]
+    num_rows = first_arr.shape[0]
+    num_cols = first_arr.shape[-1]
+    good_row_chunks = all(x % rows_per_scan == 0 for x in row_chunks)
     good_col_chunks = len(col_chunks) == 1 and col_chunks[0] == num_cols
-    lonlat_same_chunks = lon_data.chunks == lat_data.chunks
-    if num_rows % ROWS_PER_SCAN != 0:
+    all_orig_chunks = [arr.chunks for arr in args if hasattr(arr, "chunks")]
+
+    if num_rows % rows_per_scan != 0:
         raise ValueError("Input longitude/latitude data does not consist of "
                          f"whole scans ({rows_per_scan} rows per scan).")
-    if good_row_chunks and good_col_chunks and lonlat_same_chunks:
-        return lon_data, lat_data
+    all_same_chunks = all(
+        all_orig_chunks[0] == some_chunks
+        for some_chunks in all_orig_chunks[1:]
+    )
+    if good_row_chunks and good_col_chunks and all_same_chunks:
+        return args
 
-    new_row_chunks = (row_chunks[0] // ROWS_PER_SCAN) * ROWS_PER_SCAN
-    lon_data = lon_data.rechunk((new_row_chunks, -1))
-    lat_data = lat_data.rechunk((new_row_chunks, -1))
-    return lon_data, lat_data
+    new_row_chunks = (row_chunks[0] // rows_per_scan) * rows_per_scan
+    new_args = [arr.rechunk((new_row_chunks, -1)) if hasattr(arr, "chunks") else arr for arr in args]
+    return new_args
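The rechunking rule above rounds the first row-chunk size down to a whole number of scans and collapses the columns into a single chunk. The row arithmetic can be sketched without dask (the scan length of 10 rows matches MODIS 1 km geolocation; the helper name here is illustrative, not from the package):

```python
def scan_aligned_row_chunk(first_row_chunk: int, rows_per_scan: int) -> int:
    """Round a dask row-chunk size down to a whole number of scans."""
    return (first_row_chunk // rows_per_scan) * rows_per_scan

# A 4096-row chunk is not a whole number of 10-row scans,
# so it is rounded down to 4090 rows per chunk.
aligned = scan_aligned_row_chunk(4096, 10)
```

Chunks that already divide evenly (for example 100 rows with 10-row scans) pass through unchanged, which is why the function first checks `good_row_chunks` before rechunking at all.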
 
 
 def _map_blocks_handler(func):
-    def _map_blocks_wrapper(lon_array, lat_array, res_factor):
-        lons, lats = func(lon_array, lat_array, res_factor=res_factor)
-        return np.concatenate((lons[np.newaxis], lats[np.newaxis]), axis=0)
+    @wraps(func)
+    def _map_blocks_wrapper(*args, **kwargs):
+        results = func(*args, **kwargs)
+        return np.concatenate(
+            tuple(result[np.newaxis] for result in results),
+            axis=0)
     return _map_blocks_wrapper
 
 
 @scanline_mapblocks
-def interpolate_geolocation_cartesian(lon_array, lat_array, res_factor=4):
+def interpolate_geolocation_cartesian(lon_array, lat_array, coarse_resolution, fine_resolution):
     """Interpolate MODIS navigation from 1000m resolution to 250m.
 
     Python rewrite of the IDL function ``MODIS_GEO_INTERP_250`` but converts to cartesian (X, Y, Z) coordinates
@@ -143,14 +190,16 @@ def interpolate_geolocation_cartesian(lon_array, lat_array, res_factor=4):
         A two-element tuple (lon, lat).
 
     """
+    rows_per_scan = _rows_per_scan_for_resolution(coarse_resolution)
+    res_factor = coarse_resolution // fine_resolution
     num_rows, num_cols = lon_array.shape
-    num_scans = int(num_rows / ROWS_PER_SCAN)
+    num_scans = int(num_rows / rows_per_scan)
     x_in, y_in, z_in = lonlat2xyz(lon_array, lat_array)
 
     # Create an array of indexes that we want our result to have
     x = np.arange(res_factor * num_cols, dtype=np.float32) * (1. / res_factor)
     # 0.375 for 250m, 0.25 for 500m
-    y = np.arange(res_factor * ROWS_PER_SCAN, dtype=np.float32) * \
+    y = np.arange(res_factor * rows_per_scan, dtype=np.float32) * \
         (1. / res_factor) - (res_factor * (1. / 16) + (1. / 8))
     x, y = np.meshgrid(x, y)
     coordinates = np.array([y, x])  # Used by map_coordinates, major optimization
@@ -163,14 +212,15 @@ def interpolate_geolocation_cartesian(lon_array, lat_array, res_factor=4):
     # Interpolate each scan, one at a time, otherwise the math doesn't work well
     for scan_idx in range(num_scans):
         # Calculate indexes
-        j0 = ROWS_PER_SCAN * scan_idx
-        j1 = j0 + ROWS_PER_SCAN
-        k0 = ROWS_PER_SCAN * res_factor * scan_idx
-        k1 = k0 + ROWS_PER_SCAN * res_factor
+        j0 = rows_per_scan * scan_idx
+        j1 = j0 + rows_per_scan
+        k0 = rows_per_scan * res_factor * scan_idx
+        k1 = k0 + rows_per_scan * res_factor
 
         for nav_array, result_array in nav_arrays:
             # Use bilinear interpolation for all 250 meter pixels
             map_coordinates(nav_array[j0:j1, :], coordinates, output=result_array[k0:k1, :], order=1, mode='nearest')
+            _extrapolate_rightmost_columns(res_factor, result_array[k0:k1])
 
             if res_factor == 4:
                 # Use linear extrapolation for the first two 250 meter pixels along track
@@ -198,6 +248,14 @@ def interpolate_geolocation_cartesian(lon_array, lat_array, res_factor=4):
     return new_lons.astype(lon_array.dtype), new_lats.astype(lon_array.dtype)
 
 
+def _extrapolate_rightmost_columns(res_factor, result_array):
+    outer_columns_offset = 3 if res_factor == 4 else 1
+    # take the last two interpolated (not extrapolated) columns and find the difference
+    right_diff = result_array[:, -(outer_columns_offset + 1)] - result_array[:, -(outer_columns_offset + 2)]
+    for factor, column_idx in enumerate(range(-outer_columns_offset, 0)):
+        result_array[:, column_idx] += right_diff * (factor + 1)
+
+
 def _calc_slope_offset_250(result_array, y, start_idx, offset):
     m = (result_array[start_idx + offset + 3, :] - result_array[start_idx + offset, :]) / \
         (y[offset + 3, 0] - y[offset, 0])
@@ -214,9 +272,18 @@ def _calc_slope_offset_500(result_array, y, start_idx, offset):
 
 def modis_1km_to_250m(lon1, lat1):
     """Interpolate MODIS geolocation from 1km to 250m resolution."""
-    return interpolate_geolocation_cartesian(lon1, lat1, res_factor=4)
+    return interpolate_geolocation_cartesian(
+        lon1,
+        lat1,
+        coarse_resolution=1000,
+        fine_resolution=250,
+    )
 
 
 def modis_1km_to_500m(lon1, lat1):
     """Interpolate MODIS geolocation from 1km to 500m resolution."""
-    return interpolate_geolocation_cartesian(lon1, lat1, res_factor=2)
+    return interpolate_geolocation_cartesian(
+        lon1,
+        lat1,
+        coarse_resolution=1000,
+        fine_resolution=500)


=====================================
geotiepoints/tests/test_modisinterpolator.py
=====================================
@@ -1,6 +1,6 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
-# Copyright (c) 2017-2021 Python-geotiepoints developers
+# Copyright (c) 2017-2022 Python-geotiepoints developers
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -16,10 +16,15 @@
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """Tests for MODIS interpolators."""
 
-import unittest
 import numpy as np
+from pyproj import Geod
 import h5py
 import os
+import dask
+import dask.array as da
+import xarray as xr
+import pytest
+from .utils import CustomScheduler
 from geotiepoints.modisinterpolator import (modis_1km_to_250m,
                                             modis_1km_to_500m,
                                             modis_5km_to_1km,
@@ -29,86 +34,139 @@ FILENAME_DATA = os.path.join(
     os.path.dirname(__file__), '../../testdata/modis_test_data.h5')
 
 
-def to_da(arr):
-    import xarray as xr
-    import dask.array as da
-
-    return xr.DataArray(da.from_array(arr, chunks=4096), dims=['y', 'x'])
-
-
-class TestModisInterpolator(unittest.TestCase):
-    def test_modis(self):
-        h5f = h5py.File(FILENAME_DATA, 'r')
-        lon1 = to_da(h5f['lon_1km'])
-        lat1 = to_da(h5f['lat_1km'])
-        satz1 = to_da(h5f['satz_1km'])
-
-        lon250 = to_da(h5f['lon_250m'])
-        lon500 = to_da(h5f['lon_500m'])
-
-        lat250 = to_da(h5f['lat_250m'])
-        lat500 = to_da(h5f['lat_500m'])
-
-        lons, lats = modis_1km_to_250m(lon1, lat1, satz1)
-        self.assertTrue(np.allclose(lon250, lons, atol=1e-2))
-        self.assertTrue(np.allclose(lat250, lats, atol=1e-2))
-
-        lons, lats = modis_1km_to_500m(lon1, lat1, satz1)
-        self.assertTrue(np.allclose(lon500, lons, atol=1e-2))
-        self.assertTrue(np.allclose(lat500, lats, atol=1e-2))
-
-        lat5 = lat1[2::5, 2::5]
-        lon5 = lon1[2::5, 2::5]
-
-        satz5 = satz1[2::5, 2::5]
-        lons, lats = modis_5km_to_1km(lon5, lat5, satz5)
-        self.assertTrue(np.allclose(lon1, lons, atol=1e-2))
-        self.assertTrue(np.allclose(lat1, lats, atol=1e-2))
-
-        # 5km to 500m
-        lons, lats = modis_5km_to_500m(lon5, lat5, satz5)
-        self.assertEqual(lon500.shape, lons.shape)
-        self.assertEqual(lat500.shape, lats.shape)
-        # self.assertTrue(np.allclose(lon500, lons, atol=1e-2))
-        # self.assertTrue(np.allclose(lat500, lats, atol=1e-2))
-
-        # 5km to 250m
-        lons, lats = modis_5km_to_250m(lon5, lat5, satz5)
-        self.assertEqual(lon250.shape, lons.shape)
-        self.assertEqual(lat250.shape, lats.shape)
-        # self.assertTrue(np.allclose(lon250, lons, atol=1e-2))
-        # self.assertTrue(np.allclose(lat250, lats, atol=1e-2))
-
-        # Test level 2
-        lat5 = lat1[2::5, 2:-5:5]
-        lon5 = lon1[2::5, 2:-5:5]
-
-        satz5 = satz1[2::5, 2:-5:5]
-        lons, lats = modis_5km_to_1km(lon5, lat5, satz5)
-        self.assertTrue(np.allclose(lon1, lons, atol=1e-2))
-        self.assertTrue(np.allclose(lat1, lats, atol=1e-2))
-
-        # Test nans issue (#19)
-        satz1 = to_da(abs(np.linspace(-65.4, 65.4, 1354)).repeat(20).reshape(-1, 20).T)
-        lons, lats = modis_1km_to_500m(lon1, lat1, satz1)
-        self.assertFalse(np.any(np.isnan(lons.compute())))
-        self.assertFalse(np.any(np.isnan(lats.compute())))
-
-    def test_poles_datum(self):
-        import xarray as xr
-        h5f = h5py.File(FILENAME_DATA, 'r')
-        orig_lon = to_da(h5f['lon_1km'])
-        lon1 = orig_lon + 180
-        lon1 = xr.where(lon1 > 180, lon1 - 360, lon1)
-        lat1 = to_da(h5f['lat_1km'])
-        satz1 = to_da(h5f['satz_1km'])
-
-        lat5 = lat1[2::5, 2::5]
-        lon5 = lon1[2::5, 2::5]
-
-        satz5 = satz1[2::5, 2::5]
-        lons, lats = modis_5km_to_1km(lon5, lat5, satz5)
-        lons = lons + 180
-        lons = xr.where(lons > 180, lons - 360, lons)
-        self.assertTrue(np.allclose(orig_lon, lons, atol=1e-2))
-        self.assertTrue(np.allclose(lat1, lats, atol=1e-2))
+def _to_dask(arr):
+    return da.from_array(arr, chunks=4096)
+
+
+def _to_da(arr):
+    return xr.DataArray(_to_dask(arr), dims=['y', 'x'])
+
+
+def _load_h5_geo_vars(*var_names):
+    h5f = h5py.File(FILENAME_DATA, 'r')
+    return tuple(h5f[var_name] for var_name in var_names)
+
+
+def load_1km_lonlat_as_numpy():
+    lon1, lat1 = _load_h5_geo_vars('lon_1km', 'lat_1km')
+    return lon1[:], lat1[:]
+
+
+def load_1km_lonlat_as_dask():
+    lon1, lat1 = _load_h5_geo_vars('lon_1km', 'lat_1km')
+    return _to_dask(lon1), _to_dask(lat1)
+
+
+def load_1km_lonlat_as_xarray_dask():
+    lon1, lat1 = _load_h5_geo_vars('lon_1km', 'lat_1km')
+    return _to_da(lon1), _to_da(lat1)
+
+
+def load_1km_lonlat_satz_as_xarray_dask():
+    lon1, lat1, satz1 = _load_h5_geo_vars('lon_1km', 'lat_1km', 'satz_1km')
+    return _to_da(lon1), _to_da(lat1), _to_da(satz1)
+
+
+def load_5km_lonlat_satz1_as_xarray_dask():
+    lon1, lat1, satz1 = _load_h5_geo_vars('lon_1km', 'lat_1km', 'satz_1km')
+    lon5 = lon1[2::5, 2::5]
+    lat5 = lat1[2::5, 2::5]
+    satz5 = satz1[2::5, 2::5]
+    return _to_da(lon5), _to_da(lat5), _to_da(satz5)
+
+
+def load_l2_5km_lonlat_satz1_as_xarray_dask():
+    lon1, lat1, satz1 = _load_h5_geo_vars('lon_1km', 'lat_1km', 'satz_1km')
+    lon5 = lon1[2::5, 2:-5:5]
+    lat5 = lat1[2::5, 2:-5:5]
+    satz5 = satz1[2::5, 2:-5:5]
+    return _to_da(lon5), _to_da(lat5), _to_da(satz5)
+
+
+def load_500m_lonlat_expected_as_xarray_dask():
+    h5f = h5py.File(FILENAME_DATA, 'r')
+    lon500 = _to_da(h5f['lon_500m'])
+    lat500 = _to_da(h5f['lat_500m'])
+    return lon500, lat500
+
+
+def load_250m_lonlat_expected_as_xarray_dask():
+    h5f = h5py.File(FILENAME_DATA, 'r')
+    lon250 = _to_da(h5f['lon_250m'])
+    lat250 = _to_da(h5f['lat_250m'])
+    return lon250, lat250
+
+
+def assert_geodetic_distance(
+        lons_actual: np.ndarray,
+        lats_actual: np.ndarray,
+        lons_desired: np.ndarray,
+        lats_desired: np.ndarray,
+        max_distance_diff: float,
+) -> None:
+    """Check that the geodetic distance between two sets of coordinates is smaller than a threshold.
+
+    Args:
+        lons_actual: Longitude array produced by interpolation being tested.
+        lats_actual: Latitude array produced by interpolation being tested.
+        lons_desired: Longitude array of expected/truth coordinates.
+        lats_desired: Latitude array of expected/truth coordinates.
+        max_distance_diff: Limit of allowed distance difference in meters.
+
+    """
+    g = Geod(ellps="WGS84")
+    _, _, dist = g.inv(lons_actual, lats_actual, lons_desired, lats_desired)
+    np.testing.assert_array_less(
+        dist, max_distance_diff,
+        err_msg=f"Coordinates are greater than {max_distance_diff} geodetic "
+                "meters from the expected coordinates.")
+
+
+@pytest.mark.parametrize(
+    ("input_func", "exp_func", "interp_func", "dist_max"),
+    [
+        (load_1km_lonlat_satz_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 5),
+        (load_1km_lonlat_satz_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 8),
+        (load_5km_lonlat_satz1_as_xarray_dask, load_1km_lonlat_as_xarray_dask, modis_5km_to_1km, 25),
+        (load_l2_5km_lonlat_satz1_as_xarray_dask, load_1km_lonlat_as_xarray_dask, modis_5km_to_1km, 110),
+        (load_5km_lonlat_satz1_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_5km_to_500m, 19500),
+        (load_5km_lonlat_satz1_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_5km_to_250m, 25800),
+    ]
+)
+def test_sat_angle_based_interp(input_func, exp_func, interp_func, dist_max):
+    lon1, lat1, satz1 = input_func()
+    lons_exp, lats_exp = exp_func()
+
+    # when working with dask arrays, we shouldn't compute anything
+    with dask.config.set(scheduler=CustomScheduler(0)):
+        lons, lats = interp_func(lon1, lat1, satz1)
+
+    if hasattr(lons, "compute"):
+        lons, lats = da.compute(lons, lats)
+    assert_geodetic_distance(lons, lats, lons_exp, lats_exp, dist_max)
+    assert not np.any(np.isnan(lons))
+    assert not np.any(np.isnan(lats))
+
+
+def test_sat_angle_based_interp_nan_handling():
+    # See GH #19
+    lon1, lat1, satz1 = load_1km_lonlat_satz_as_xarray_dask()
+    satz1 = _to_da(abs(np.linspace(-65.4, 65.4, 1354)).repeat(20).reshape(-1, 20).T)
+    lons, lats = modis_1km_to_500m(lon1, lat1, satz1)
+    assert not np.any(np.isnan(lons.compute()))
+    assert not np.any(np.isnan(lats.compute()))
+
+
+def test_poles_datum():
+    orig_lon, lat1, satz1 = load_1km_lonlat_satz_as_xarray_dask()
+    lon1 = orig_lon + 180
+    lon1 = xr.where(lon1 > 180, lon1 - 360, lon1)
+
+    lat5 = lat1[2::5, 2::5]
+    lon5 = lon1[2::5, 2::5]
+    satz5 = satz1[2::5, 2::5]
+    lons, lats = modis_5km_to_1km(lon5, lat5, satz5)
+
+    lons = lons + 180
+    lons = xr.where(lons > 180, lons - 360, lons)
+    assert_geodetic_distance(lons, lats, orig_lon, lat1, 25.0)


=====================================
geotiepoints/tests/test_simple_modis_interpolator.py
=====================================
@@ -16,77 +16,36 @@
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """Tests for simple MODIS interpolators."""
 
-import os
 import numpy as np
 import pytest
-import h5py
 import dask
 import dask.array as da
-import xarray as xr
 
 from geotiepoints.simple_modis_interpolator import modis_1km_to_250m, modis_1km_to_500m
-from .utils import CustomScheduler
-
-FILENAME_DATA = os.path.join(
-    os.path.dirname(__file__), '../../testdata/modis_test_data.h5')
-
-
-def _to_dask(arr):
-    return da.from_array(arr, chunks=4096)
-
-
-def _to_da(arr):
-    return xr.DataArray(_to_dask(arr), dims=['y', 'x'])
-
-
-def _load_h5_lonlat_vars(lon_var, lat_var):
-    h5f = h5py.File(FILENAME_DATA, 'r')
-    lon1 = h5f[lon_var]
-    lat1 = h5f[lat_var]
-    return lon1, lat1
-
-
-def _load_1km_lonlat_as_numpy():
-    lon1, lat1 = _load_h5_lonlat_vars('lon_1km', 'lat_1km')
-    return lon1[:], lat1[:]
-
-
-def _load_1km_lonlat_as_dask():
-    lon1, lat1 = _load_h5_lonlat_vars('lon_1km', 'lat_1km')
-    return _to_dask(lon1), _to_dask(lat1)
-
-
-def _load_1km_lonlat_as_xarray_dask():
-    lon1, lat1 = _load_h5_lonlat_vars('lon_1km', 'lat_1km')
-    return _to_da(lon1), _to_da(lat1)
-
-
-def _load_500m_lonlat_expected_as_xarray_dask():
-    h5f = h5py.File(FILENAME_DATA, 'r')
-    lon500 = _to_da(h5f['lon_500m'])
-    lat500 = _to_da(h5f['lat_500m'])
-    return lon500, lat500
-
+from .test_modisinterpolator import (
+    assert_geodetic_distance,
+    load_1km_lonlat_as_xarray_dask,
+    load_1km_lonlat_as_dask,
+    load_1km_lonlat_as_numpy,
+    load_500m_lonlat_expected_as_xarray_dask,
+    load_250m_lonlat_expected_as_xarray_dask,
+)
 
-def _load_250m_lonlat_expected_as_xarray_dask():
-    h5f = h5py.File(FILENAME_DATA, 'r')
-    lon250 = _to_da(h5f['lon_250m'])
-    lat250 = _to_da(h5f['lat_250m'])
-    return lon250, lat250
+from .utils import CustomScheduler
 
 
 @pytest.mark.parametrize(
-    ("input_func", "exp_func", "interp_func"),
+    ("input_func", "exp_func", "interp_func", "dist_max"),
     [
-        (_load_1km_lonlat_as_xarray_dask, _load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m),
-        (_load_1km_lonlat_as_xarray_dask, _load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m),
-        (_load_1km_lonlat_as_dask, _load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m),
-        (_load_1km_lonlat_as_dask, _load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m),
-        (_load_1km_lonlat_as_numpy, _load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m),
-        (_load_1km_lonlat_as_numpy, _load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m),
+        (load_1km_lonlat_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 16),
+        (load_1km_lonlat_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27),
+        (load_1km_lonlat_as_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 16),
+        (load_1km_lonlat_as_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27),
+        (load_1km_lonlat_as_numpy, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 16),
+        (load_1km_lonlat_as_numpy, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27),
     ]
 )
-def test_basic_interp(input_func, exp_func, interp_func):
+def test_basic_interp(input_func, exp_func, interp_func, dist_max):
     lon1, lat1 = input_func()
     lons_exp, lats_exp = exp_func()
 
@@ -96,17 +55,13 @@ def test_basic_interp(input_func, exp_func, interp_func):
 
     if hasattr(lons, "compute"):
         lons, lats = da.compute(lons, lats)
-    # our "truth" values are from the modisinterpolator results
-    atol = 0.038  # 1e-2
-    rtol = 0
-    np.testing.assert_allclose(lons_exp, lons, atol=atol, rtol=rtol)
-    np.testing.assert_allclose(lats_exp, lats, atol=atol, rtol=rtol)
+    assert_geodetic_distance(lons, lats, lons_exp, lats_exp, dist_max)
     assert not np.any(np.isnan(lons))
     assert not np.any(np.isnan(lats))
 
 
 def test_nonstandard_scan_size():
-    lon1, lat1 = _load_1km_lonlat_as_xarray_dask()
+    lon1, lat1 = load_1km_lonlat_as_xarray_dask()
     # remove 1 row from the end
     lon1 = lon1[:-1]
     lat1 = lat1[:-1]
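The right-edge fine-resolution columns these tests compare come from the interpolator's new `_extrapolate_rightmost_columns` step. A standalone numpy sketch of that linear extrapolation (assuming the 250 m case, where the last three columns are extended from the slope of the last two interpolated columns):

```python
import numpy as np

def extrapolate_rightmost(result, n_outer):
    """Linearly extend the last ``n_outer`` columns of ``result`` in place,
    using the difference of the last two interpolated columns as the slope."""
    right_diff = result[:, -(n_outer + 1)] - result[:, -(n_outer + 2)]
    for factor, col in enumerate(range(-n_outer, 0)):
        result[:, col] += right_diff * (factor + 1)
    return result

# A linear ramp whose edge values were duplicated by nearest-neighbor
# interpolation is restored to a straight line.
row = np.array([[0., 1., 2., 3., 4., 4., 4., 4.]])
extrapolate_rightmost(row, 3)
```

This matters because bilinear interpolation alone cannot produce the fine-resolution pixels that lie outside the coarse grid along the right edge of a scan.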


=====================================
geotiepoints/version.py
=====================================
@@ -6,7 +6,7 @@
 # that just contains the computed version number.
 
 # This file is released into the public domain. Generated by
-# versioneer-0.18 (https://github.com/warner/python-versioneer)
+# versioneer-0.22 (https://github.com/python-versioneer/python-versioneer)
 
 """Git implementation of _version.py."""
 
@@ -15,6 +15,8 @@ import os
 import re
 import subprocess
 import sys
+from typing import Callable, Dict
+import functools
 
 
 def get_keywords():
@@ -23,9 +25,9 @@ def get_keywords():
     # setup.py/versioneer.py will grep for the variable names, so they must
     # each be defined on a line of their own. _version.py will just call
     # get_keywords().
-    git_refnames = " (HEAD -> main, tag: v1.4.0)"
-    git_full = "95e07ab583727b1f0989f91098339fb9872e9d89"
-    git_date = "2022-02-21 14:34:20 +0100"
+    git_refnames = " (HEAD -> main, tag: v1.4.1)"
+    git_full = "79b95e346852d8e5d21a6f7c310f457fe45467c7"
+    git_date = "2022-06-08 14:50:58 -0500"
     keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
     return keywords
 
@@ -52,12 +54,12 @@ class NotThisMethod(Exception):
     """Exception raised if a method is not valid for the current scenario."""
 
 
-LONG_VERSION_PY = {}
-HANDLERS = {}
+LONG_VERSION_PY: Dict[str, str] = {}
+HANDLERS: Dict[str, Dict[str, Callable]] = {}
 
 
 def register_vcs_handler(vcs, method):  # decorator
-    """Decorator to mark a method as the handler for a particular VCS."""
+    """Create decorator to mark a method as the handler of a VCS."""
     def decorate(f):
         """Store f in HANDLERS[vcs][method]."""
         if vcs not in HANDLERS:
@@ -71,17 +73,25 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
                 env=None):
     """Call the given command(s)."""
     assert isinstance(commands, list)
-    p = None
-    for c in commands:
+    process = None
+
+    popen_kwargs = {}
+    if sys.platform == "win32":
+        # This hides the console window if pythonw.exe is used
+        startupinfo = subprocess.STARTUPINFO()
+        startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
+        popen_kwargs["startupinfo"] = startupinfo
+
+    for command in commands:
         try:
-            dispcmd = str([c] + args)
+            dispcmd = str([command] + args)
             # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
+            process = subprocess.Popen([command] + args, cwd=cwd, env=env,
+                                       stdout=subprocess.PIPE,
+                                       stderr=(subprocess.PIPE if hide_stderr
+                                               else None), **popen_kwargs)
             break
-        except EnvironmentError:
+        except OSError:
             e = sys.exc_info()[1]
             if e.errno == errno.ENOENT:
                 continue
@@ -93,15 +103,13 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
         if verbose:
             print("unable to find command, tried %s" % (commands,))
         return None, None
-    stdout = p.communicate()[0].strip()
-    if sys.version_info[0] >= 3:
-        stdout = stdout.decode()
-    if p.returncode != 0:
+    stdout = process.communicate()[0].strip().decode()
+    if process.returncode != 0:
         if verbose:
             print("unable to run %s (error)" % dispcmd)
             print("stdout was %s" % stdout)
-        return None, p.returncode
-    return stdout, p.returncode
+        return None, process.returncode
+    return stdout, process.returncode
 
 
 def versions_from_parentdir(parentdir_prefix, root, verbose):
@@ -113,15 +121,14 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
     """
     rootdirs = []
 
-    for i in range(3):
+    for _ in range(3):
         dirname = os.path.basename(root)
         if dirname.startswith(parentdir_prefix):
             return {"version": dirname[len(parentdir_prefix):],
                     "full-revisionid": None,
                     "dirty": False, "error": None, "date": None}
-        else:
-            rootdirs.append(root)
-            root = os.path.dirname(root)  # up a level
+        rootdirs.append(root)
+        root = os.path.dirname(root)  # up a level
 
     if verbose:
         print("Tried directories %s but none started with prefix %s" %
@@ -138,22 +145,21 @@ def git_get_keywords(versionfile_abs):
     # _version.py.
     keywords = {}
     try:
-        f = open(versionfile_abs, "r")
-        for line in f.readlines():
-            if line.strip().startswith("git_refnames ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["refnames"] = mo.group(1)
-            if line.strip().startswith("git_full ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["full"] = mo.group(1)
-            if line.strip().startswith("git_date ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["date"] = mo.group(1)
-        f.close()
-    except EnvironmentError:
+        with open(versionfile_abs, "r") as fobj:
+            for line in fobj:
+                if line.strip().startswith("git_refnames ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["refnames"] = mo.group(1)
+                if line.strip().startswith("git_full ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["full"] = mo.group(1)
+                if line.strip().startswith("git_date ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["date"] = mo.group(1)
+    except OSError:
         pass
     return keywords
 
@@ -161,10 +167,14 @@ def git_get_keywords(versionfile_abs):
 @register_vcs_handler("git", "keywords")
 def git_versions_from_keywords(keywords, tag_prefix, verbose):
     """Get version information from git keywords."""
-    if not keywords:
-        raise NotThisMethod("no keywords at all, weird")
+    if "refnames" not in keywords:
+        raise NotThisMethod("Short version file found")
     date = keywords.get("date")
     if date is not None:
+        # Use only the last line.  Previous lines may contain GPG signature
+        # information.
+        date = date.splitlines()[-1]
+
         # git-2.2.0 added "%cI", which expands to an ISO-8601 -compliant
         # datestamp. However we prefer "%ci" (which expands to an "ISO-8601
         # -like" string, which we must then edit to make compliant), because
@@ -177,11 +187,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         if verbose:
             print("keywords are unexpanded, not using")
         raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
+    refs = {r.strip() for r in refnames.strip("()").split(",")}
     # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
     # just "foo-1.0". If we see a "tag: " prefix, prefer those.
     TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
+    tags = {r[len(TAG):] for r in refs if r.startswith(TAG)}
     if not tags:
         # Either we're using git < 1.8.3, or there really are no tags. We use
         # a heuristic: assume all version tags have a digit. The old git %d
@@ -190,7 +200,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # between branches and tags. By ignoring refnames without digits, we
         # filter out many common branch names like "release" and
         # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
+        tags = {r for r in refs if re.search(r'\d', r)}
         if verbose:
             print("discarding '%s', no digits" % ",".join(refs - tags))
     if verbose:
@@ -199,6 +209,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # sorting will prefer e.g. "2.0" over "2.0rc1"
         if ref.startswith(tag_prefix):
             r = ref[len(tag_prefix):]
+            # Filter out refs that exactly match prefix or that don't start
+            # with a number once the prefix is stripped (mostly a concern
+            # when prefix is '')
+            if not re.match(r'\d', r):
+                continue
             if verbose:
                 print("picking %s" % r)
             return {"version": r,
@@ -214,7 +229,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
 
 
 @register_vcs_handler("git", "pieces_from_vcs")
-def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
+def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     """Get version from 'git describe' in the root of the source tree.
 
     This only gets called if the git-archive 'subst' keywords were *not*
@@ -225,24 +240,32 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     if sys.platform == "win32":
         GITS = ["git.cmd", "git.exe"]
 
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
+    # GIT_DIR can interfere with correct operation of Versioneer.
+    # It may be intended to be passed to the Versioneer-versioned project,
+    # but that should not change where we get our version from.
+    env = os.environ.copy()
+    env.pop("GIT_DIR", None)
+    runner = functools.partial(runner, env=env)
+
+    _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root,
+                   hide_stderr=True)
     if rc != 0:
         if verbose:
             print("Directory %s not under git control" % root)
         raise NotThisMethod("'git rev-parse --git-dir' returned error")
 
+    MATCH_ARGS = ["--match", "%s*" % tag_prefix] if tag_prefix else []
+
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%s*" % tag_prefix],
-                                   cwd=root)
+    describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
+                                     "--always", "--long", *MATCH_ARGS],
+                              cwd=root)
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
     describe_out = describe_out.strip()
-    full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
+    full_out, rc = runner(GITS, ["rev-parse", "HEAD"], cwd=root)
     if full_out is None:
         raise NotThisMethod("'git rev-parse' failed")
     full_out = full_out.strip()
@@ -252,6 +275,39 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     pieces["short"] = full_out[:7]  # maybe improved later
     pieces["error"] = None
 
+    branch_name, rc = runner(GITS, ["rev-parse", "--abbrev-ref", "HEAD"],
+                             cwd=root)
+    # --abbrev-ref was added in git-1.6.3
+    if rc != 0 or branch_name is None:
+        raise NotThisMethod("'git rev-parse --abbrev-ref' returned error")
+    branch_name = branch_name.strip()
+
+    if branch_name == "HEAD":
+        # If we aren't exactly on a branch, pick a branch which represents
+        # the current commit. If all else fails, we are on a branchless
+        # commit.
+        branches, rc = runner(GITS, ["branch", "--contains"], cwd=root)
+        # --contains was added in git-1.5.4
+        if rc != 0 or branches is None:
+            raise NotThisMethod("'git branch --contains' returned error")
+        branches = branches.split("\n")
+
+        # Remove the first line if we're running detached
+        if "(" in branches[0]:
+            branches.pop(0)
+
+        # Strip off the leading "* " from the list of branches.
+        branches = [branch[2:] for branch in branches]
+        if "master" in branches:
+            branch_name = "master"
+        elif not branches:
+            branch_name = None
+        else:
+            # Pick the first branch that is returned. Good or bad.
+            branch_name = branches[0]
+
+    pieces["branch"] = branch_name
+
     # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
     # TAG might have hyphens.
     git_describe = describe_out
@@ -268,7 +324,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
         # TAG-NUM-gHEX
         mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
         if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
+            # unparsable. Maybe git-describe is misbehaving?
             pieces["error"] = ("unable to parse git-describe output: '%s'"
                                % describe_out)
             return pieces
@@ -293,13 +349,14 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
+        count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
         pieces["distance"] = int(count_out)  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
-                       cwd=root)[0].strip()
+    date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip()
+    # Use only the last line.  Previous lines may contain GPG signature
+    # information.
+    date = date.splitlines()[-1]
     pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
 
     return pieces
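
The date handling above can be sketched in isolation. The sample string below is invented, but the transformation is the one the new code performs: keep only the last line (GPG-signed commits may prepend signature output), then turn git's "ISO-8601-like" `%ci` format into a compliant timestamp.

```python
# Sketch of the date normalization done above: swap the first space for
# "T" and drop the space before the UTC offset. splitlines()[-1] discards
# any GPG signature lines that precede the date itself.
raw = "gpg: Signature made Sat 11 Jun 2022\n2022-06-11 09:04:17 +0000"
date = raw.splitlines()[-1]
iso = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
print(iso)  # 2022-06-11T09:04:17+0000
```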
@@ -337,19 +394,67 @@ def render_pep440(pieces):
     return rendered
 
 
-def render_pep440_pre(pieces):
-    """TAG[.post.devDISTANCE] -- No -dirty.
+def render_pep440_branch(pieces):
+    """TAG[[.dev0]+DISTANCE.gHEX[.dirty]] .
+
+    The ".dev0" means not master branch. Note that .dev0 sorts backwards
+    (a feature branch will appear "older" than the master branch).
 
     Exceptions:
-    1: no tags. 0.post.devDISTANCE
+    1: no tags. 0[.dev0]+untagged.DISTANCE.gHEX[.dirty]
     """
     if pieces["closest-tag"]:
         rendered = pieces["closest-tag"]
+        if pieces["distance"] or pieces["dirty"]:
+            if pieces["branch"] != "master":
+                rendered += ".dev0"
+            rendered += plus_or_dot(pieces)
+            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
+            if pieces["dirty"]:
+                rendered += ".dirty"
+    else:
+        # exception #1
+        rendered = "0"
+        if pieces["branch"] != "master":
+            rendered += ".dev0"
+        rendered += "+untagged.%d.g%s" % (pieces["distance"],
+                                          pieces["short"])
+        if pieces["dirty"]:
+            rendered += ".dirty"
+    return rendered
+
+
+def pep440_split_post(ver):
+    """Split pep440 version string at the post-release segment.
+
+    Returns the release segments before the post-release and the
+    post-release version number (or -1 if no post-release segment is present).
+    """
+    vc = str.split(ver, ".post")
+    return vc[0], int(vc[1] or 0) if len(vc) == 2 else None
+
+
+def render_pep440_pre(pieces):
+    """TAG[.postN.devDISTANCE] -- No -dirty.
+
+    Exceptions:
+    1: no tags. 0.post0.devDISTANCE
+    """
+    if pieces["closest-tag"]:
         if pieces["distance"]:
-            rendered += ".post.dev%d" % pieces["distance"]
+            # update the post release segment
+            tag_version, post_version = pep440_split_post(pieces["closest-tag"])
+            rendered = tag_version
+            if post_version is not None:
+                rendered += ".post%d.dev%d" % (post_version+1, pieces["distance"])
+            else:
+                rendered += ".post0.dev%d" % (pieces["distance"])
+        else:
+            # no commits, use the tag as the version
+            rendered = pieces["closest-tag"]
     else:
         # exception #1
-        rendered = "0.post.dev%d" % pieces["distance"]
+        rendered = "0.post0.dev%d" % pieces["distance"]
     return rendered
 
 
@@ -380,12 +485,41 @@ def render_pep440_post(pieces):
     return rendered
 
 
+def render_pep440_post_branch(pieces):
+    """TAG[.postDISTANCE[.dev0]+gHEX[.dirty]] .
+
+    The ".dev0" means not master branch.
+
+    Exceptions:
+    1: no tags. 0.postDISTANCE[.dev0]+gHEX[.dirty]
+    """
+    if pieces["closest-tag"]:
+        rendered = pieces["closest-tag"]
+        if pieces["distance"] or pieces["dirty"]:
+            rendered += ".post%d" % pieces["distance"]
+            if pieces["branch"] != "master":
+                rendered += ".dev0"
+            rendered += plus_or_dot(pieces)
+            rendered += "g%s" % pieces["short"]
+            if pieces["dirty"]:
+                rendered += ".dirty"
+    else:
+        # exception #1
+        rendered = "0.post%d" % pieces["distance"]
+        if pieces["branch"] != "master":
+            rendered += ".dev0"
+        rendered += "+g%s" % pieces["short"]
+        if pieces["dirty"]:
+            rendered += ".dirty"
+    return rendered
+
+
 def render_pep440_old(pieces):
     """TAG[.postDISTANCE[.dev0]] .
 
     The ".dev0" means dirty.
 
-    Eexceptions:
+    Exceptions:
     1: no tags. 0.postDISTANCE[.dev0]
     """
     if pieces["closest-tag"]:
@@ -456,10 +590,14 @@ def render(pieces, style):
 
     if style == "pep440":
         rendered = render_pep440(pieces)
+    elif style == "pep440-branch":
+        rendered = render_pep440_branch(pieces)
     elif style == "pep440-pre":
         rendered = render_pep440_pre(pieces)
     elif style == "pep440-post":
         rendered = render_pep440_post(pieces)
+    elif style == "pep440-post-branch":
+        rendered = render_pep440_post_branch(pieces)
     elif style == "pep440-old":
         rendered = render_pep440_old(pieces)
     elif style == "git-describe":
@@ -495,7 +633,7 @@ def get_versions():
         # versionfile_source is the relative path from the top of the source
         # tree (where the .git directory might live) to this file. Invert
         # this to find the root from __file__.
-        for i in cfg.versionfile_source.split('/'):
+        for _ in cfg.versionfile_source.split('/'):
             root = os.path.dirname(root)
     except NameError:
         return {"version": "0+unknown", "full-revisionid": None,


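
The reworked `render_pep440_pre` no longer emits the non-compliant `.post.dev` segment: when the closest tag already contains a post-release it bumps that number, and otherwise it starts at `.post0`. A self-contained sketch (the two function bodies mirror the `pep440_split_post` / `render_pep440_pre` hunks above; the version strings and distances are made up for illustration):

```python
# Standalone sketch of the new pre-release rendering; not part of the patch.

def pep440_split_post(ver):
    """Split a PEP 440 version at its post-release segment."""
    vc = str.split(ver, ".post")
    return vc[0], int(vc[1] or 0) if len(vc) == 2 else None

def render_pep440_pre(closest_tag, distance):
    """Render TAG[.postN.devDISTANCE]; 0.post0.devDISTANCE when untagged."""
    if not closest_tag:
        return "0.post0.dev%d" % distance        # exception #1: no tags
    if not distance:
        return closest_tag                       # exactly on a tag
    tag_version, post_version = pep440_split_post(closest_tag)
    if post_version is not None:
        # bump the existing post-release segment instead of nesting one
        return "%s.post%d.dev%d" % (tag_version, post_version + 1, distance)
    return "%s.post0.dev%d" % (tag_version, distance)

print(render_pep440_pre("1.4.1", 3))      # 1.4.1.post0.dev3
print(render_pep440_pre("1.4.post2", 5))  # 1.4.post3.dev5
```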
=====================================
pyproject.toml
=====================================
@@ -1,2 +1,7 @@
 [build-system]
-requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython"]
\ No newline at end of file
+requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython", "versioneer-518"]
+build-backend = "setuptools.build_meta"
+
+[tool.coverage.run]
+relative_files = true
+


=====================================
setup.cfg
=====================================
@@ -10,5 +10,5 @@ ignore = D107
 VCS = git
 style = pep440
 versionfile_source = geotiepoints/version.py
-versionfile_build =
+versionfile_build = geotiepoints/version.py
 tag_prefix = v


=====================================
setup.py
=====================================
@@ -32,7 +32,7 @@ import numpy as np
 from Cython.Build import cythonize
 
 requirements = ['numpy', 'scipy', 'pandas']
-test_requires = ['pytest', 'pytest-cov', 'h5py', 'xarray', 'dask']
+test_requires = ['pytest', 'pytest-cov', 'h5py', 'xarray', 'dask', 'pyproj']
 
 if sys.platform.startswith("win"):
     extra_compile_args = []
@@ -50,10 +50,15 @@ EXTENSIONS = [
 
 cmdclass = versioneer.get_cmdclass()
 
+with open('README.md', 'r') as readme:
+    README = readme.read()
+
 if __name__ == "__main__":
     setup(name='python-geotiepoints',
           version=versioneer.get_version(),
           description='Interpolation of geographic tiepoints in Python',
+          long_description=README,
+          long_description_content_type='text/markdown',
           author='Adam Dybbroe, Martin Raspaud',
           author_email='martin.raspaud at smhi.se',
           classifiers=["Development Status :: 4 - Beta",
@@ -67,7 +72,7 @@ if __name__ == "__main__":
           packages=['geotiepoints'],
           # packages=find_packages(),
           setup_requires=['numpy', 'cython'],
-          python_requires='>=3.6',
+          python_requires='>=3.7',
           cmdclass=cmdclass,
           install_requires=requirements,
           ext_modules=cythonize(EXTENSIONS),


=====================================
testdata/create_modis_test_data.py
=====================================
@@ -0,0 +1,103 @@
+#!/usr/bin/env python
+"""Generate MODIS interpolation test data from real input data.
+
+This script is used to generate the "testdata/modis_test_data.h5" file that is
+used to validate the various modis interpolation algorithms in
+python-geotiepoints. The test data file consists of 1km "truth" longitude and
+latitude arrays from an input MOD03 HDF4 file and interpolated longitude and
+latitude arrays at 500m and 250m resolution. The interpolation is done using
+the CVIIRS based algorithm in ``geotiepoints/modisinterpolator.py``.
+The CVIIRS algorithm was used as opposed to the "simple" or other interpolation
+methods due to the smoother interpolation between pixels (no linear "steps").
+
+The geolocation data in MOD03 files is terrain corrected. This means that the
+interpolation methods currently in python-geotiepoints can't produce an
+exact matching result for a round trip test of 1km (truth) ->
+5km (every 5th pixel) -> 1km (interpolation result).
+The input MOD03 test data was chosen due to its lack of varying terrain
+(almost entirely ocean view) to minimize error/differences between the
+1km truth and 1km interpolation results.
+
+To limit the size of the test data file and reduce the execution time of tests
+the test data is limited to the last 2 scans (20 rows of 1km data) of the
+provided input data.
+
+"""
+import os
+import sys
+from datetime import datetime
+
+import h5py
+import numpy as np
+from pyhdf.SD import SD, SDC
+import xarray as xr
+import dask.array as da
+
+from geotiepoints.modisinterpolator import modis_1km_to_500m, modis_1km_to_250m
+
+
+def main():
+    import argparse
+    parser = argparse.ArgumentParser()
+    parser.add_argument("-i", "--input", required=True,
+                        help="Input MOD03 geolocation HDF4 filename to read 1km lon/lat data from.")
+    parser.add_argument("-o", "--output", default="modis_test_data.h5",
+                        help="Output test data HDF5 filename being created")
+    args = parser.parse_args()
+
+    num_1km_rows = 20
+    lons_1km, lats_1km, satz_1km = _get_1km_lon_lat_satz_from_mod03(args.input)
+    lons_1km = lons_1km[-num_1km_rows:]
+    lats_1km = lats_1km[-num_1km_rows:]
+    satz_1km = satz_1km[-num_1km_rows:]
+    lons_1km = xr.DataArray(da.from_array(lons_1km), dims=("y", "x"))
+    lats_1km = xr.DataArray(da.from_array(lats_1km), dims=("y", "x"))
+    satz_1km = xr.DataArray(da.from_array(satz_1km), dims=("y", "x"))
+
+    with h5py.File(args.output, "w") as output_h:
+        lons_500m, lats_500m = modis_1km_to_500m(lons_1km, lats_1km, satz_1km)
+        lons_500m = lons_500m.astype(np.float32, copy=False)
+        lats_500m = lats_500m.astype(np.float32, copy=False)
+
+        lons_250m, lats_250m = modis_1km_to_250m(lons_1km, lats_1km, satz_1km)
+        lons_250m = lons_250m.astype(np.float32, copy=False)
+        lats_250m = lats_250m.astype(np.float32, copy=False)
+
+        output_h.create_dataset("lon_1km", data=lons_1km, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.create_dataset("lat_1km", data=lats_1km, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.create_dataset("satz_1km", data=satz_1km, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.create_dataset("lon_500m", data=lons_500m, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.create_dataset('lat_500m', data=lats_500m, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.create_dataset("lon_250m", data=lons_250m, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.create_dataset("lat_250m", data=lats_250m, compression="gzip", compression_opts=9, shuffle=True)
+        output_h.attrs["1km_data_origin"] = os.path.basename(args.input)
+        output_h.attrs["description"] = (
+            "MODIS interpolation test data for the python-geotiepoints package. "
+            "The 1 km data is taken directly from a MOD03 file. The 250m and "
+            "500m is generated using the cviirs-based algorithm in "
+            "`geotiepoints/modisinterpolator.py`. For more information see "
+            "the generation script in `testdata/create_modis_test_data.py` in "
+            "the python-geotiepoints git repository."
+        )
+        output_h.attrs["creation_date"] = datetime.utcnow().strftime("%Y-%m-%d")
+
+
+def _get_1km_lon_lat_satz_from_mod03(hdf4_filename: str) -> tuple:
+    h = SD(hdf4_filename, mode=SDC.READ)
+    lon_var = h.select("Longitude")
+    lat_var = h.select("Latitude")
+    sat_zen_var = h.select("SensorZenith")
+
+    # ensure 32-bit float
+    lon_data = lon_var[:].astype(np.float32, copy=False)
+    lat_data = lat_var[:].astype(np.float32, copy=False)
+    sat_zen_attrs = sat_zen_var.attributes()
+    scale_factor = sat_zen_attrs.get("scale_factor", 1.0)
+    add_offset = sat_zen_attrs.get("add_offset", 0.0)
+    sat_zen_data = (sat_zen_var[:] * scale_factor + add_offset).astype(np.float32, copy=False)
+
+    return lon_data, lat_data, sat_zen_data
+
+
+if __name__ == "__main__":
+    sys.exit(main())
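
As a rough sanity check on the shapes this script produces (not part of the patch): interpolating 1 km geolocation to 500 m doubles each dimension and to 250 m quadruples it. The 1354-pixel swath width below is the standard MODIS 1 km scan width, assumed here purely for illustration; the script itself keeps only the last 20 rows of 1 km data.

```python
# Expected output grid shapes for the interpolated test datasets.
def interpolated_shape(shape_1km, resolution_m):
    """Shape of lon/lat arrays interpolated from 1 km to 500 m or 250 m."""
    factor = {500: 2, 250: 4}[resolution_m]
    return tuple(dim * factor for dim in shape_1km)

print(interpolated_shape((20, 1354), 500))  # (40, 2708)
print(interpolated_shape((20, 1354), 250))  # (80, 5416)
```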


=====================================
testdata/modis_test_data.h5
=====================================
Binary files a/testdata/modis_test_data.h5 and b/testdata/modis_test_data.h5 differ


=====================================
versioneer.py
=====================================
@@ -1,5 +1,5 @@
 
-# Version: 0.18
+# Version: 0.22
 
 """The Versioneer - like a rocketeer, but for versions.
 
@@ -7,18 +7,14 @@ The Versioneer
 ==============
 
 * like a rocketeer, but for versions!
-* https://github.com/warner/python-versioneer
+* https://github.com/python-versioneer/python-versioneer
 * Brian Warner
 * License: Public Domain
-* Compatible With: python2.6, 2.7, 3.2, 3.3, 3.4, 3.5, 3.6, and pypy
-* [![Latest Version]
-(https://pypip.in/version/versioneer/badge.svg?style=flat)
-](https://pypi.python.org/pypi/versioneer/)
-* [![Build Status]
-(https://travis-ci.org/warner/python-versioneer.png?branch=master)
-](https://travis-ci.org/warner/python-versioneer)
-
-This is a tool for managing a recorded version number in distutils-based
+* Compatible with: Python 3.6, 3.7, 3.8, 3.9, 3.10 and pypy3
+* [![Latest Version][pypi-image]][pypi-url]
+* [![Build Status][travis-image]][travis-url]
+
+This is a tool for managing a recorded version number in distutils/setuptools-based
 python projects. The goal is to remove the tedious and error-prone "update
 the embedded version string" step from your release process. Making a new
 release should be as easy as recording a new tag in your version-control
@@ -27,9 +23,10 @@ system, and maybe making new tarballs.
 
 ## Quick Install
 
-* `pip install versioneer` to somewhere to your $PATH
-* add a `[versioneer]` section to your setup.cfg (see below)
+* `pip install versioneer` to somewhere in your $PATH
+* add a `[versioneer]` section to your setup.cfg (see [Install](INSTALL.md))
 * run `versioneer install` in your source tree, commit the results
+* Verify version information with `python setup.py version`
 
 ## Version Identifiers
 
@@ -61,7 +58,7 @@ version 1.3). Many VCS systems can report a description that captures this,
 for example `git describe --tags --dirty --always` reports things like
 "0.7-1-g574ab98-dirty" to indicate that the checkout is one revision past the
 0.7 tag, has a unique revision id of "574ab98", and is "dirty" (it has
-uncommitted changes.
+uncommitted changes).
 
 The version identifier is used for multiple purposes:
 
@@ -166,7 +163,7 @@ which may help identify what went wrong).
 
 Some situations are known to cause problems for Versioneer. This details the
 most significant ones. More can be found on Github
-[issues page](https://github.com/warner/python-versioneer/issues).
+[issues page](https://github.com/python-versioneer/python-versioneer/issues).
 
 ### Subprojects
 
@@ -180,7 +177,7 @@ two common reasons why `setup.py` might not be in the root:
   `setup.cfg`, and `tox.ini`. Projects like these produce multiple PyPI
   distributions (and upload multiple independently-installable tarballs).
 * Source trees whose main purpose is to contain a C library, but which also
-  provide bindings to Python (and perhaps other langauges) in subdirectories.
+  provide bindings to Python (and perhaps other languages) in subdirectories.
 
 Versioneer will look for `.git` in parent directories, and most operations
 should get the right version string. However `pip` and `setuptools` have bugs
@@ -194,9 +191,9 @@ work too.
 Pip-8.1.1 is known to have this problem, but hopefully it will get fixed in
 some later version.
 
-[Bug #38](https://github.com/warner/python-versioneer/issues/38) is tracking
+[Bug #38](https://github.com/python-versioneer/python-versioneer/issues/38) is tracking
 this issue. The discussion in
-[PR #61](https://github.com/warner/python-versioneer/pull/61) describes the
+[PR #61](https://github.com/python-versioneer/python-versioneer/pull/61) describes the
 issue from the Versioneer side in more detail.
 [pip PR#3176](https://github.com/pypa/pip/pull/3176) and
 [pip PR#3615](https://github.com/pypa/pip/pull/3615) contain work to improve
@@ -224,22 +221,10 @@ regenerated while a different version is checked out. Many setup.py commands
 cause egg_info to be rebuilt (including `sdist`, `wheel`, and installing into
 a different virtualenv), so this can be surprising.
 
-[Bug #83](https://github.com/warner/python-versioneer/issues/83) describes
+[Bug #83](https://github.com/python-versioneer/python-versioneer/issues/83) describes
 this one, but upgrading to a newer version of setuptools should probably
 resolve it.
 
-### Unicode version strings
-
-While Versioneer works (and is continually tested) with both Python 2 and
-Python 3, it is not entirely consistent with bytes-vs-unicode distinctions.
-Newer releases probably generate unicode version strings on py2. It's not
-clear that this is wrong, but it may be surprising for applications when then
-write these strings to a network connection or include them in bytes-oriented
-APIs like cryptographic checksums.
-
-[Bug #71](https://github.com/warner/python-versioneer/issues/71) investigates
-this question.
-
 
 ## Updating Versioneer
 
@@ -265,6 +250,14 @@ installation by editing setup.py . Alternatively, it might go the other
 direction and include code from all supported VCS systems, reducing the
 number of intermediate scripts.
 
+## Similar projects
+
+* [setuptools_scm](https://github.com/pypa/setuptools_scm/) - a non-vendored build-time
+  dependency
+* [minver](https://github.com/jbweston/miniver) - a lightweight reimplementation of
+  versioneer
+* [versioningit](https://github.com/jwodder/versioningit) - a PEP 518-based setuptools
+  plugin
 
 ## License
 
@@ -274,19 +267,28 @@ Specifically, both are released under the Creative Commons "Public Domain
 Dedication" license (CC0-1.0), as described in
 https://creativecommons.org/publicdomain/zero/1.0/ .
 
+[pypi-image]: https://img.shields.io/pypi/v/versioneer.svg
+[pypi-url]: https://pypi.python.org/pypi/versioneer/
+[travis-image]:
+https://img.shields.io/travis/com/python-versioneer/python-versioneer.svg
+[travis-url]: https://travis-ci.com/github/python-versioneer/python-versioneer
+
 """
+# pylint:disable=invalid-name,import-outside-toplevel,missing-function-docstring
+# pylint:disable=missing-class-docstring,too-many-branches,too-many-statements
+# pylint:disable=raise-missing-from,too-many-lines,too-many-locals,import-error
+# pylint:disable=too-few-public-methods,redefined-outer-name,consider-using-with
+# pylint:disable=attribute-defined-outside-init,too-many-arguments
 
-from __future__ import print_function
-try:
-    import configparser
-except ImportError:
-    import ConfigParser as configparser
+import configparser
 import errno
 import json
 import os
 import re
 import subprocess
 import sys
+from typing import Callable, Dict
+import functools
 
 
 class VersioneerConfig:
@@ -321,12 +323,12 @@ def get_root():
         # module-import table will cache the first one. So we can't use
         # os.path.dirname(__file__), as that will find whichever
         # versioneer.py was first imported, even in later projects.
-        me = os.path.realpath(os.path.abspath(__file__))
-        me_dir = os.path.normcase(os.path.splitext(me)[0])
+        my_path = os.path.realpath(os.path.abspath(__file__))
+        me_dir = os.path.normcase(os.path.splitext(my_path)[0])
         vsr_dir = os.path.normcase(os.path.splitext(versioneer_py)[0])
         if me_dir != vsr_dir:
             print("Warning: build in %s is using versioneer.py from %s"
-                  % (os.path.dirname(me), versioneer_py))
+                  % (os.path.dirname(my_path), versioneer_py))
     except NameError:
         pass
     return root
@@ -334,30 +336,29 @@ def get_root():
 
 def get_config_from_root(root):
     """Read the project setup.cfg file to determine Versioneer config."""
-    # This might raise EnvironmentError (if setup.cfg is missing), or
+    # This might raise OSError (if setup.cfg is missing), or
     # configparser.NoSectionError (if it lacks a [versioneer] section), or
     # configparser.NoOptionError (if it lacks "VCS="). See the docstring at
     # the top of versioneer.py for instructions on writing your setup.cfg .
     setup_cfg = os.path.join(root, "setup.cfg")
-    parser = configparser.SafeConfigParser()
-    with open(setup_cfg, "r") as f:
-        parser.readfp(f)
+    parser = configparser.ConfigParser()
+    with open(setup_cfg, "r") as cfg_file:
+        parser.read_file(cfg_file)
     VCS = parser.get("versioneer", "VCS")  # mandatory
 
-    def get(parser, name):
-        if parser.has_option("versioneer", name):
-            return parser.get("versioneer", name)
-        return None
+    # Dict-like interface for non-mandatory entries
+    section = parser["versioneer"]
+
     cfg = VersioneerConfig()
     cfg.VCS = VCS
-    cfg.style = get(parser, "style") or ""
-    cfg.versionfile_source = get(parser, "versionfile_source")
-    cfg.versionfile_build = get(parser, "versionfile_build")
-    cfg.tag_prefix = get(parser, "tag_prefix")
+    cfg.style = section.get("style", "")
+    cfg.versionfile_source = section.get("versionfile_source")
+    cfg.versionfile_build = section.get("versionfile_build")
+    cfg.tag_prefix = section.get("tag_prefix")
     if cfg.tag_prefix in ("''", '""'):
         cfg.tag_prefix = ""
-    cfg.parentdir_prefix = get(parser, "parentdir_prefix")
-    cfg.verbose = get(parser, "verbose")
+    cfg.parentdir_prefix = section.get("parentdir_prefix")
+    cfg.verbose = section.get("verbose")
     return cfg
 
 
@@ -366,17 +367,15 @@ class NotThisMethod(Exception):
 
 
 # these dictionaries contain VCS-specific tools
-LONG_VERSION_PY = {}
-HANDLERS = {}
+LONG_VERSION_PY: Dict[str, str] = {}
+HANDLERS: Dict[str, Dict[str, Callable]] = {}
 
 
 def register_vcs_handler(vcs, method):  # decorator
-    """Decorator to mark a method as the handler for a particular VCS."""
+    """Create decorator to mark a method as the handler of a VCS."""
     def decorate(f):
         """Store f in HANDLERS[vcs][method]."""
-        if vcs not in HANDLERS:
-            HANDLERS[vcs] = {}
-        HANDLERS[vcs][method] = f
+        HANDLERS.setdefault(vcs, {})[method] = f
         return f
     return decorate
 
@@ -385,17 +384,25 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
                 env=None):
     """Call the given command(s)."""
     assert isinstance(commands, list)
-    p = None
-    for c in commands:
+    process = None
+
+    popen_kwargs = {}
+    if sys.platform == "win32":
+        # This hides the console window if pythonw.exe is used
+        startupinfo = subprocess.STARTUPINFO()
+        startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
+        popen_kwargs["startupinfo"] = startupinfo
+
+    for command in commands:
         try:
-            dispcmd = str([c] + args)
+            dispcmd = str([command] + args)
             # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
+            process = subprocess.Popen([command] + args, cwd=cwd, env=env,
+                                       stdout=subprocess.PIPE,
+                                       stderr=(subprocess.PIPE if hide_stderr
+                                               else None), **popen_kwargs)
             break
-        except EnvironmentError:
+        except OSError:
             e = sys.exc_info()[1]
             if e.errno == errno.ENOENT:
                 continue
@@ -407,18 +414,16 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
         if verbose:
             print("unable to find command, tried %s" % (commands,))
         return None, None
-    stdout = p.communicate()[0].strip()
-    if sys.version_info[0] >= 3:
-        stdout = stdout.decode()
-    if p.returncode != 0:
+    stdout = process.communicate()[0].strip().decode()
+    if process.returncode != 0:
         if verbose:
             print("unable to run %s (error)" % dispcmd)
             print("stdout was %s" % stdout)
-        return None, p.returncode
-    return stdout, p.returncode
+        return None, process.returncode
+    return stdout, process.returncode
 
 
-LONG_VERSION_PY['git'] = '''
+LONG_VERSION_PY['git'] = r'''
 # This file helps to compute a version number in source trees obtained from
 # git-archive tarball (such as those provided by githubs download-from-tag
 # feature). Distribution tarballs (built by setup.py sdist) and build
@@ -426,7 +431,7 @@ LONG_VERSION_PY['git'] = '''
 # that just contains the computed version number.
 
 # This file is released into the public domain. Generated by
-# versioneer-0.18 (https://github.com/warner/python-versioneer)
+# versioneer-0.22 (https://github.com/python-versioneer/python-versioneer)
 
 """Git implementation of _version.py."""
 
@@ -435,6 +440,8 @@ import os
 import re
 import subprocess
 import sys
+from typing import Callable, Dict
+import functools
 
 
 def get_keywords():
@@ -472,12 +479,12 @@ class NotThisMethod(Exception):
     """Exception raised if a method is not valid for the current scenario."""
 
 
-LONG_VERSION_PY = {}
-HANDLERS = {}
+LONG_VERSION_PY: Dict[str, str] = {}
+HANDLERS: Dict[str, Dict[str, Callable]] = {}
 
 
 def register_vcs_handler(vcs, method):  # decorator
-    """Decorator to mark a method as the handler for a particular VCS."""
+    """Create decorator to mark a method as the handler of a VCS."""
     def decorate(f):
         """Store f in HANDLERS[vcs][method]."""
         if vcs not in HANDLERS:
@@ -491,17 +498,25 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
                 env=None):
     """Call the given command(s)."""
     assert isinstance(commands, list)
-    p = None
-    for c in commands:
+    process = None
+
+    popen_kwargs = {}
+    if sys.platform == "win32":
+        # This hides the console window if pythonw.exe is used
+        startupinfo = subprocess.STARTUPINFO()
+        startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
+        popen_kwargs["startupinfo"] = startupinfo
+
+    for command in commands:
         try:
-            dispcmd = str([c] + args)
+            dispcmd = str([command] + args)
             # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
+            process = subprocess.Popen([command] + args, cwd=cwd, env=env,
+                                       stdout=subprocess.PIPE,
+                                       stderr=(subprocess.PIPE if hide_stderr
+                                               else None), **popen_kwargs)
             break
-        except EnvironmentError:
+        except OSError:
             e = sys.exc_info()[1]
             if e.errno == errno.ENOENT:
                 continue
@@ -513,15 +528,13 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
         if verbose:
             print("unable to find command, tried %%s" %% (commands,))
         return None, None
-    stdout = p.communicate()[0].strip()
-    if sys.version_info[0] >= 3:
-        stdout = stdout.decode()
-    if p.returncode != 0:
+    stdout = process.communicate()[0].strip().decode()
+    if process.returncode != 0:
         if verbose:
             print("unable to run %%s (error)" %% dispcmd)
             print("stdout was %%s" %% stdout)
-        return None, p.returncode
-    return stdout, p.returncode
+        return None, process.returncode
+    return stdout, process.returncode
 
 
 def versions_from_parentdir(parentdir_prefix, root, verbose):
@@ -533,15 +546,14 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
     """
     rootdirs = []
 
-    for i in range(3):
+    for _ in range(3):
         dirname = os.path.basename(root)
         if dirname.startswith(parentdir_prefix):
             return {"version": dirname[len(parentdir_prefix):],
                     "full-revisionid": None,
                     "dirty": False, "error": None, "date": None}
-        else:
-            rootdirs.append(root)
-            root = os.path.dirname(root)  # up a level
+        rootdirs.append(root)
+        root = os.path.dirname(root)  # up a level
 
     if verbose:
         print("Tried directories %%s but none started with prefix %%s" %%
@@ -558,22 +570,21 @@ def git_get_keywords(versionfile_abs):
     # _version.py.
     keywords = {}
     try:
-        f = open(versionfile_abs, "r")
-        for line in f.readlines():
-            if line.strip().startswith("git_refnames ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["refnames"] = mo.group(1)
-            if line.strip().startswith("git_full ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["full"] = mo.group(1)
-            if line.strip().startswith("git_date ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["date"] = mo.group(1)
-        f.close()
-    except EnvironmentError:
+        with open(versionfile_abs, "r") as fobj:
+            for line in fobj:
+                if line.strip().startswith("git_refnames ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["refnames"] = mo.group(1)
+                if line.strip().startswith("git_full ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["full"] = mo.group(1)
+                if line.strip().startswith("git_date ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["date"] = mo.group(1)
+    except OSError:
         pass
     return keywords
 
@@ -581,10 +592,14 @@ def git_get_keywords(versionfile_abs):
 @register_vcs_handler("git", "keywords")
 def git_versions_from_keywords(keywords, tag_prefix, verbose):
     """Get version information from git keywords."""
-    if not keywords:
-        raise NotThisMethod("no keywords at all, weird")
+    if "refnames" not in keywords:
+        raise NotThisMethod("Short version file found")
     date = keywords.get("date")
     if date is not None:
+        # Use only the last line.  Previous lines may contain GPG signature
+        # information.
+        date = date.splitlines()[-1]
+
         # git-2.2.0 added "%%cI", which expands to an ISO-8601 -compliant
         # datestamp. However we prefer "%%ci" (which expands to an "ISO-8601
         # -like" string, which we must then edit to make compliant), because
@@ -597,11 +612,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         if verbose:
             print("keywords are unexpanded, not using")
         raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
+    refs = {r.strip() for r in refnames.strip("()").split(",")}
     # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
     # just "foo-1.0". If we see a "tag: " prefix, prefer those.
     TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
+    tags = {r[len(TAG):] for r in refs if r.startswith(TAG)}
     if not tags:
         # Either we're using git < 1.8.3, or there really are no tags. We use
         # a heuristic: assume all version tags have a digit. The old git %%d
@@ -610,7 +625,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # between branches and tags. By ignoring refnames without digits, we
         # filter out many common branch names like "release" and
         # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
+        tags = {r for r in refs if re.search(r'\d', r)}
         if verbose:
             print("discarding '%%s', no digits" %% ",".join(refs - tags))
     if verbose:
@@ -619,6 +634,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # sorting will prefer e.g. "2.0" over "2.0rc1"
         if ref.startswith(tag_prefix):
             r = ref[len(tag_prefix):]
+            # Filter out refs that exactly match prefix or that don't start
+            # with a number once the prefix is stripped (mostly a concern
+            # when prefix is '')
+            if not re.match(r'\d', r):
+                continue
             if verbose:
                 print("picking %%s" %% r)
             return {"version": r,
@@ -634,7 +654,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
 
 
 @register_vcs_handler("git", "pieces_from_vcs")
-def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
+def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     """Get version from 'git describe' in the root of the source tree.
 
     This only gets called if the git-archive 'subst' keywords were *not*
@@ -645,24 +665,32 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     if sys.platform == "win32":
         GITS = ["git.cmd", "git.exe"]
 
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
+    # GIT_DIR can interfere with correct operation of Versioneer.
+    # It may be intended to be passed to the Versioneer-versioned project,
+    # but that should not change where we get our version from.
+    env = os.environ.copy()
+    env.pop("GIT_DIR", None)
+    runner = functools.partial(runner, env=env)
+
+    _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root,
+                   hide_stderr=True)
     if rc != 0:
         if verbose:
             print("Directory %%s not under git control" %% root)
         raise NotThisMethod("'git rev-parse --git-dir' returned error")
 
+    MATCH_ARGS = ["--match", "%%s*" %% tag_prefix] if tag_prefix else []
+
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%%s*" %% tag_prefix],
-                                   cwd=root)
+    describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
+                                     "--always", "--long", *MATCH_ARGS],
+                              cwd=root)
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
     describe_out = describe_out.strip()
-    full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
+    full_out, rc = runner(GITS, ["rev-parse", "HEAD"], cwd=root)
     if full_out is None:
         raise NotThisMethod("'git rev-parse' failed")
     full_out = full_out.strip()
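The GIT_DIR handling added above can be exercised without git; a minimal sketch of the functools.partial rebinding, using a hypothetical fake_runner in place of versioneer's real run_command:

```python
import functools
import os

def fake_runner(commands, args, cwd=None, env=None):
    # stand-in for versioneer's run_command: report whether GIT_DIR leaked in
    return ("GIT_DIR" in (env or {}), 0)

env = os.environ.copy()
env["GIT_DIR"] = "/somewhere/else/.git"  # simulate an interfering setting
env.pop("GIT_DIR", None)                 # the patch strips it before binding
runner = functools.partial(fake_runner, env=env)

leaked, rc = runner(["git"], ["rev-parse", "--git-dir"])
print(leaked)  # False: the bound env no longer carries GIT_DIR
```

Every subsequent call through `runner` now sees the sanitized environment, without threading `env=` through each call site.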
@@ -672,6 +700,39 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     pieces["short"] = full_out[:7]  # maybe improved later
     pieces["error"] = None
 
+    branch_name, rc = runner(GITS, ["rev-parse", "--abbrev-ref", "HEAD"],
+                             cwd=root)
+    # --abbrev-ref was added in git-1.6.3
+    if rc != 0 or branch_name is None:
+        raise NotThisMethod("'git rev-parse --abbrev-ref' returned error")
+    branch_name = branch_name.strip()
+
+    if branch_name == "HEAD":
+        # If we aren't exactly on a branch, pick a branch which represents
+        # the current commit. If all else fails, we are on a branchless
+        # commit.
+        branches, rc = runner(GITS, ["branch", "--contains"], cwd=root)
+        # --contains was added in git-1.5.4
+        if rc != 0 or branches is None:
+            raise NotThisMethod("'git branch --contains' returned error")
+        branches = branches.split("\n")
+
+        # Remove the first line if we're running detached
+        if "(" in branches[0]:
+            branches.pop(0)
+
+        # Strip off the leading "* " from the list of branches.
+        branches = [branch[2:] for branch in branches]
+        if "master" in branches:
+            branch_name = "master"
+        elif not branches:
+            branch_name = None
+        else:
+            # Pick the first branch that is returned. Good or bad.
+            branch_name = branches[0]
+
+    pieces["branch"] = branch_name
+
     # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
     # TAG might have hyphens.
     git_describe = describe_out
@@ -688,7 +749,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
         # TAG-NUM-gHEX
         mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
         if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
+            # unparsable. Maybe git-describe is misbehaving?
             pieces["error"] = ("unable to parse git-describe output: '%%s'"
                                %% describe_out)
             return pieces
@@ -713,13 +774,14 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
+        count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
         pieces["distance"] = int(count_out)  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%%ci", "HEAD"],
-                       cwd=root)[0].strip()
+    date = runner(GITS, ["show", "-s", "--format=%%ci", "HEAD"], cwd=root)[0].strip()
+    # Use only the last line.  Previous lines may contain GPG signature
+    # information.
+    date = date.splitlines()[-1]
     pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
 
     return pieces
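The date normalization at the end of this hunk is compact; a sketch of what splitlines() and the two replace() calls do to a (hypothetical) signed-commit date string:

```python
# git show -s --format=%ci may emit GPG signature lines before the date
raw = "gpg: Signature made Sat 11 Jun 2022\n2022-06-11 09:04:17 +0000"
date = raw.splitlines()[-1]                               # keep the date line only
date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
print(date)  # 2022-06-11T09:04:17+0000
```

The first replace turns the date/time separator into "T"; the second drops the space before the timezone offset, yielding an ISO-8601-compliant stamp.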
@@ -757,19 +819,67 @@ def render_pep440(pieces):
     return rendered
 
 
-def render_pep440_pre(pieces):
-    """TAG[.post.devDISTANCE] -- No -dirty.
+def render_pep440_branch(pieces):
+    """TAG[[.dev0]+DISTANCE.gHEX[.dirty]] .
+
+    The ".dev0" means not master branch. Note that .dev0 sorts backwards
+    (a feature branch will appear "older" than the master branch).
 
     Exceptions:
-    1: no tags. 0.post.devDISTANCE
+    1: no tags. 0[.dev0]+untagged.DISTANCE.gHEX[.dirty]
     """
     if pieces["closest-tag"]:
         rendered = pieces["closest-tag"]
+        if pieces["distance"] or pieces["dirty"]:
+            if pieces["branch"] != "master":
+                rendered += ".dev0"
+            rendered += plus_or_dot(pieces)
+            rendered += "%%d.g%%s" %% (pieces["distance"], pieces["short"])
+            if pieces["dirty"]:
+                rendered += ".dirty"
+    else:
+        # exception #1
+        rendered = "0"
+        if pieces["branch"] != "master":
+            rendered += ".dev0"
+        rendered += "+untagged.%%d.g%%s" %% (pieces["distance"],
+                                          pieces["short"])
+        if pieces["dirty"]:
+            rendered += ".dirty"
+    return rendered
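A sketch of the new pep440-branch style for the tagged case, with plus_or_dot reimplemented from its usual versioneer definition (assumed, since it is not shown in this hunk):

```python
def plus_or_dot(pieces):
    # '+' unless the tag already contains one (then '.')
    return "." if "+" in (pieces.get("closest-tag") or "") else "+"

def render_pep440_branch(pieces):
    # mirror of the patched renderer, trimmed to the tagged case
    rendered = pieces["closest-tag"]
    if pieces["distance"] or pieces["dirty"]:
        if pieces["branch"] != "master":
            rendered += ".dev0"           # feature branch sorts before master
        rendered += plus_or_dot(pieces)
        rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
        if pieces["dirty"]:
            rendered += ".dirty"
    return rendered

pieces = {"closest-tag": "1.4.1", "distance": 3, "dirty": False,
          "branch": "feature-x", "short": "180474b"}
print(render_pep440_branch(pieces))  # 1.4.1.dev0+3.g180474b
```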
+
+
+def pep440_split_post(ver):
+    """Split pep440 version string at the post-release segment.
+
+    Returns the release segments before the post-release and the
+    post-release version number (or None if no post-release segment is present).
+    """
+    vc = str.split(ver, ".post")
+    return vc[0], int(vc[1] or 0) if len(vc) == 2 else None
+
+
+def render_pep440_pre(pieces):
+    """TAG[.postN.devDISTANCE] -- No -dirty.
+
+    Exceptions:
+    1: no tags. 0.post0.devDISTANCE
+    """
+    if pieces["closest-tag"]:
         if pieces["distance"]:
-            rendered += ".post.dev%%d" %% pieces["distance"]
+            # update the post release segment
+            tag_version, post_version = pep440_split_post(pieces["closest-tag"])
+            rendered = tag_version
+            if post_version is not None:
+                rendered += ".post%%d.dev%%d" %% (post_version+1, pieces["distance"])
+            else:
+                rendered += ".post0.dev%%d" %% (pieces["distance"])
+        else:
+            # no commits, use the tag as the version
+            rendered = pieces["closest-tag"]
     else:
         # exception #1
-        rendered = "0.post.dev%%d" %% pieces["distance"]
+        rendered = "0.post0.dev%%d" %% pieces["distance"]
     return rendered
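The reworked pre-release renderer now bumps an existing .postN segment instead of emitting a bare ".post"; a sketch of pep440_split_post driving that logic on sample tags:

```python
def pep440_split_post(ver):
    # mirror of the helper added in this patch
    vc = str.split(ver, ".post")
    return vc[0], int(vc[1] or 0) if len(vc) == 2 else None

# a tag with an existing post segment gets its counter bumped
tag, post = pep440_split_post("1.4.post2")
print(tag, post)                                 # 1.4 2
print("%s.post%d.dev%d" % (tag, post + 1, 5))    # 1.4.post3.dev5

# a plain tag starts a fresh .post0 segment
tag, post = pep440_split_post("1.4.1")
print(post is None)                              # True
print("%s.post0.dev%d" % (tag, 5))               # 1.4.1.post0.dev5
```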
 
 
@@ -800,12 +910,41 @@ def render_pep440_post(pieces):
     return rendered
 
 
+def render_pep440_post_branch(pieces):
+    """TAG[.postDISTANCE[.dev0]+gHEX[.dirty]] .
+
+    The ".dev0" means not master branch.
+
+    Exceptions:
+    1: no tags. 0.postDISTANCE[.dev0]+gHEX[.dirty]
+    """
+    if pieces["closest-tag"]:
+        rendered = pieces["closest-tag"]
+        if pieces["distance"] or pieces["dirty"]:
+            rendered += ".post%%d" %% pieces["distance"]
+            if pieces["branch"] != "master":
+                rendered += ".dev0"
+            rendered += plus_or_dot(pieces)
+            rendered += "g%%s" %% pieces["short"]
+            if pieces["dirty"]:
+                rendered += ".dirty"
+    else:
+        # exception #1
+        rendered = "0.post%%d" %% pieces["distance"]
+        if pieces["branch"] != "master":
+            rendered += ".dev0"
+        rendered += "+g%%s" %% pieces["short"]
+        if pieces["dirty"]:
+            rendered += ".dirty"
+    return rendered
+
+
 def render_pep440_old(pieces):
     """TAG[.postDISTANCE[.dev0]] .
 
     The ".dev0" means dirty.
 
-    Eexceptions:
+    Exceptions:
     1: no tags. 0.postDISTANCE[.dev0]
     """
     if pieces["closest-tag"]:
@@ -876,10 +1015,14 @@ def render(pieces, style):
 
     if style == "pep440":
         rendered = render_pep440(pieces)
+    elif style == "pep440-branch":
+        rendered = render_pep440_branch(pieces)
     elif style == "pep440-pre":
         rendered = render_pep440_pre(pieces)
     elif style == "pep440-post":
         rendered = render_pep440_post(pieces)
+    elif style == "pep440-post-branch":
+        rendered = render_pep440_post_branch(pieces)
     elif style == "pep440-old":
         rendered = render_pep440_old(pieces)
     elif style == "git-describe":
@@ -915,7 +1058,7 @@ def get_versions():
         # versionfile_source is the relative path from the top of the source
         # tree (where the .git directory might live) to this file. Invert
         # this to find the root from __file__.
-        for i in cfg.versionfile_source.split('/'):
+        for _ in cfg.versionfile_source.split('/'):
             root = os.path.dirname(root)
     except NameError:
         return {"version": "0+unknown", "full-revisionid": None,
@@ -950,22 +1093,21 @@ def git_get_keywords(versionfile_abs):
     # _version.py.
     keywords = {}
     try:
-        f = open(versionfile_abs, "r")
-        for line in f.readlines():
-            if line.strip().startswith("git_refnames ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["refnames"] = mo.group(1)
-            if line.strip().startswith("git_full ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["full"] = mo.group(1)
-            if line.strip().startswith("git_date ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["date"] = mo.group(1)
-        f.close()
-    except EnvironmentError:
+        with open(versionfile_abs, "r") as fobj:
+            for line in fobj:
+                if line.strip().startswith("git_refnames ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["refnames"] = mo.group(1)
+                if line.strip().startswith("git_full ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["full"] = mo.group(1)
+                if line.strip().startswith("git_date ="):
+                    mo = re.search(r'=\s*"(.*)"', line)
+                    if mo:
+                        keywords["date"] = mo.group(1)
+    except OSError:
         pass
     return keywords
 
@@ -973,10 +1115,14 @@ def git_get_keywords(versionfile_abs):
 @register_vcs_handler("git", "keywords")
 def git_versions_from_keywords(keywords, tag_prefix, verbose):
     """Get version information from git keywords."""
-    if not keywords:
-        raise NotThisMethod("no keywords at all, weird")
+    if "refnames" not in keywords:
+        raise NotThisMethod("Short version file found")
     date = keywords.get("date")
     if date is not None:
+        # Use only the last line.  Previous lines may contain GPG signature
+        # information.
+        date = date.splitlines()[-1]
+
         # git-2.2.0 added "%cI", which expands to an ISO-8601 -compliant
         # datestamp. However we prefer "%ci" (which expands to an "ISO-8601
         # -like" string, which we must then edit to make compliant), because
@@ -989,11 +1135,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         if verbose:
             print("keywords are unexpanded, not using")
         raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
+    refs = {r.strip() for r in refnames.strip("()").split(",")}
     # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
     # just "foo-1.0". If we see a "tag: " prefix, prefer those.
     TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
+    tags = {r[len(TAG):] for r in refs if r.startswith(TAG)}
     if not tags:
         # Either we're using git < 1.8.3, or there really are no tags. We use
         # a heuristic: assume all version tags have a digit. The old git %d
@@ -1002,7 +1148,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # between branches and tags. By ignoring refnames without digits, we
         # filter out many common branch names like "release" and
         # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
+        tags = {r for r in refs if re.search(r'\d', r)}
         if verbose:
             print("discarding '%s', no digits" % ",".join(refs - tags))
     if verbose:
@@ -1011,6 +1157,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # sorting will prefer e.g. "2.0" over "2.0rc1"
         if ref.startswith(tag_prefix):
             r = ref[len(tag_prefix):]
+            # Filter out refs that exactly match prefix or that don't start
+            # with a number once the prefix is stripped (mostly a concern
+            # when prefix is '')
+            if not re.match(r'\d', r):
+                continue
             if verbose:
                 print("picking %s" % r)
             return {"version": r,
@@ -1026,7 +1177,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
 
 
 @register_vcs_handler("git", "pieces_from_vcs")
-def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
+def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     """Get version from 'git describe' in the root of the source tree.
 
     This only gets called if the git-archive 'subst' keywords were *not*
@@ -1037,24 +1188,32 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     if sys.platform == "win32":
         GITS = ["git.cmd", "git.exe"]
 
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
+    # GIT_DIR can interfere with correct operation of Versioneer.
+    # It may be intended to be passed to the Versioneer-versioned project,
+    # but that should not change where we get our version from.
+    env = os.environ.copy()
+    env.pop("GIT_DIR", None)
+    runner = functools.partial(runner, env=env)
+
+    _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root,
+                   hide_stderr=True)
     if rc != 0:
         if verbose:
             print("Directory %s not under git control" % root)
         raise NotThisMethod("'git rev-parse --git-dir' returned error")
 
+    MATCH_ARGS = ["--match", "%s*" % tag_prefix] if tag_prefix else []
+
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%s*" % tag_prefix],
-                                   cwd=root)
+    describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
+                                     "--always", "--long", *MATCH_ARGS],
+                              cwd=root)
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
     describe_out = describe_out.strip()
-    full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
+    full_out, rc = runner(GITS, ["rev-parse", "HEAD"], cwd=root)
     if full_out is None:
         raise NotThisMethod("'git rev-parse' failed")
     full_out = full_out.strip()
@@ -1064,6 +1223,39 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     pieces["short"] = full_out[:7]  # maybe improved later
     pieces["error"] = None
 
+    branch_name, rc = runner(GITS, ["rev-parse", "--abbrev-ref", "HEAD"],
+                             cwd=root)
+    # --abbrev-ref was added in git-1.6.3
+    if rc != 0 or branch_name is None:
+        raise NotThisMethod("'git rev-parse --abbrev-ref' returned error")
+    branch_name = branch_name.strip()
+
+    if branch_name == "HEAD":
+        # If we aren't exactly on a branch, pick a branch which represents
+        # the current commit. If all else fails, we are on a branchless
+        # commit.
+        branches, rc = runner(GITS, ["branch", "--contains"], cwd=root)
+        # --contains was added in git-1.5.4
+        if rc != 0 or branches is None:
+            raise NotThisMethod("'git branch --contains' returned error")
+        branches = branches.split("\n")
+
+        # Remove the first line if we're running detached
+        if "(" in branches[0]:
+            branches.pop(0)
+
+        # Strip off the leading "* " from the list of branches.
+        branches = [branch[2:] for branch in branches]
+        if "master" in branches:
+            branch_name = "master"
+        elif not branches:
+            branch_name = None
+        else:
+            # Pick the first branch that is returned. Good or bad.
+            branch_name = branches[0]
+
+    pieces["branch"] = branch_name
+
     # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
     # TAG might have hyphens.
     git_describe = describe_out
@@ -1080,7 +1272,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
         # TAG-NUM-gHEX
         mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
         if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
+            # unparsable. Maybe git-describe is misbehaving?
             pieces["error"] = ("unable to parse git-describe output: '%s'"
                                % describe_out)
             return pieces
@@ -1105,13 +1297,14 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
+        count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
         pieces["distance"] = int(count_out)  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
-                       cwd=root)[0].strip()
+    date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip()
+    # Use only the last line.  Previous lines may contain GPG signature
+    # information.
+    date = date.splitlines()[-1]
     pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
 
     return pieces
@@ -1130,27 +1323,26 @@ def do_vcs_install(manifest_in, versionfile_source, ipy):
     if ipy:
         files.append(ipy)
     try:
-        me = __file__
-        if me.endswith(".pyc") or me.endswith(".pyo"):
-            me = os.path.splitext(me)[0] + ".py"
-        versioneer_file = os.path.relpath(me)
+        my_path = __file__
+        if my_path.endswith(".pyc") or my_path.endswith(".pyo"):
+            my_path = os.path.splitext(my_path)[0] + ".py"
+        versioneer_file = os.path.relpath(my_path)
     except NameError:
         versioneer_file = "versioneer.py"
     files.append(versioneer_file)
     present = False
     try:
-        f = open(".gitattributes", "r")
-        for line in f.readlines():
-            if line.strip().startswith(versionfile_source):
-                if "export-subst" in line.strip().split()[1:]:
-                    present = True
-        f.close()
-    except EnvironmentError:
+        with open(".gitattributes", "r") as fobj:
+            for line in fobj:
+                if line.strip().startswith(versionfile_source):
+                    if "export-subst" in line.strip().split()[1:]:
+                        present = True
+                        break
+    except OSError:
         pass
     if not present:
-        f = open(".gitattributes", "a+")
-        f.write("%s export-subst\n" % versionfile_source)
-        f.close()
+        with open(".gitattributes", "a+") as fobj:
+            fobj.write(f"{versionfile_source} export-subst\n")
         files.append(".gitattributes")
     run_command(GITS, ["add", "--"] + files)
 
@@ -1164,15 +1356,14 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
     """
     rootdirs = []
 
-    for i in range(3):
+    for _ in range(3):
         dirname = os.path.basename(root)
         if dirname.startswith(parentdir_prefix):
             return {"version": dirname[len(parentdir_prefix):],
                     "full-revisionid": None,
                     "dirty": False, "error": None, "date": None}
-        else:
-            rootdirs.append(root)
-            root = os.path.dirname(root)  # up a level
+        rootdirs.append(root)
+        root = os.path.dirname(root)  # up a level
 
     if verbose:
         print("Tried directories %s but none started with prefix %s" %
@@ -1181,7 +1372,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
 
 
 SHORT_VERSION_PY = """
-# This file was generated by 'versioneer.py' (0.18) from
+# This file was generated by 'versioneer.py' (0.22) from
 # revision-control system data, or from the parent directory name of an
 # unpacked source archive. Distribution tarballs contain a pre-generated copy
 # of this file.
@@ -1203,7 +1394,7 @@ def versions_from_file(filename):
     try:
         with open(filename) as f:
             contents = f.read()
-    except EnvironmentError:
+    except OSError:
         raise NotThisMethod("unable to read _version.py")
     mo = re.search(r"version_json = '''\n(.*)'''  # END VERSION_JSON",
                    contents, re.M | re.S)
@@ -1258,19 +1449,67 @@ def render_pep440(pieces):
     return rendered
 
 
-def render_pep440_pre(pieces):
-    """TAG[.post.devDISTANCE] -- No -dirty.
+def render_pep440_branch(pieces):
+    """TAG[[.dev0]+DISTANCE.gHEX[.dirty]] .
+
+    The ".dev0" means not master branch. Note that .dev0 sorts backwards
+    (a feature branch will appear "older" than the master branch).
 
     Exceptions:
-    1: no tags. 0.post.devDISTANCE
+    1: no tags. 0[.dev0]+untagged.DISTANCE.gHEX[.dirty]
     """
     if pieces["closest-tag"]:
         rendered = pieces["closest-tag"]
+        if pieces["distance"] or pieces["dirty"]:
+            if pieces["branch"] != "master":
+                rendered += ".dev0"
+            rendered += plus_or_dot(pieces)
+            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
+            if pieces["dirty"]:
+                rendered += ".dirty"
+    else:
+        # exception #1
+        rendered = "0"
+        if pieces["branch"] != "master":
+            rendered += ".dev0"
+        rendered += "+untagged.%d.g%s" % (pieces["distance"],
+                                          pieces["short"])
+        if pieces["dirty"]:
+            rendered += ".dirty"
+    return rendered
+
+
+def pep440_split_post(ver):
+    """Split pep440 version string at the post-release segment.
+
+    Returns the release segments before the post-release and the
+    post-release version number (or None if no post-release segment is present).
+    """
+    vc = str.split(ver, ".post")
+    return vc[0], int(vc[1] or 0) if len(vc) == 2 else None
+
+
+def render_pep440_pre(pieces):
+    """TAG[.postN.devDISTANCE] -- No -dirty.
+
+    Exceptions:
+    1: no tags. 0.post0.devDISTANCE
+    """
+    if pieces["closest-tag"]:
         if pieces["distance"]:
-            rendered += ".post.dev%d" % pieces["distance"]
+            # update the post release segment
+            tag_version, post_version = pep440_split_post(pieces["closest-tag"])
+            rendered = tag_version
+            if post_version is not None:
+                rendered += ".post%d.dev%d" % (post_version+1, pieces["distance"])
+            else:
+                rendered += ".post0.dev%d" % (pieces["distance"])
+        else:
+            # no commits, use the tag as the version
+            rendered = pieces["closest-tag"]
     else:
         # exception #1
-        rendered = "0.post.dev%d" % pieces["distance"]
+        rendered = "0.post0.dev%d" % pieces["distance"]
     return rendered
 
 
@@ -1301,12 +1540,41 @@ def render_pep440_post(pieces):
     return rendered
 
 
+def render_pep440_post_branch(pieces):
+    """TAG[.postDISTANCE[.dev0]+gHEX[.dirty]] .
+
+    The ".dev0" means not master branch.
+
+    Exceptions:
+    1: no tags. 0.postDISTANCE[.dev0]+gHEX[.dirty]
+    """
+    if pieces["closest-tag"]:
+        rendered = pieces["closest-tag"]
+        if pieces["distance"] or pieces["dirty"]:
+            rendered += ".post%d" % pieces["distance"]
+            if pieces["branch"] != "master":
+                rendered += ".dev0"
+            rendered += plus_or_dot(pieces)
+            rendered += "g%s" % pieces["short"]
+            if pieces["dirty"]:
+                rendered += ".dirty"
+    else:
+        # exception #1
+        rendered = "0.post%d" % pieces["distance"]
+        if pieces["branch"] != "master":
+            rendered += ".dev0"
+        rendered += "+g%s" % pieces["short"]
+        if pieces["dirty"]:
+            rendered += ".dirty"
+    return rendered
+
+
 def render_pep440_old(pieces):
     """TAG[.postDISTANCE[.dev0]] .
 
     The ".dev0" means dirty.
 
-    Eexceptions:
+    Exceptions:
     1: no tags. 0.postDISTANCE[.dev0]
     """
     if pieces["closest-tag"]:
@@ -1377,10 +1645,14 @@ def render(pieces, style):
 
     if style == "pep440":
         rendered = render_pep440(pieces)
+    elif style == "pep440-branch":
+        rendered = render_pep440_branch(pieces)
     elif style == "pep440-pre":
         rendered = render_pep440_pre(pieces)
     elif style == "pep440-post":
         rendered = render_pep440_post(pieces)
+    elif style == "pep440-post-branch":
+        rendered = render_pep440_post_branch(pieces)
     elif style == "pep440-old":
         rendered = render_pep440_old(pieces)
     elif style == "git-describe":
@@ -1480,8 +1752,12 @@ def get_version():
     return get_versions()["version"]
 
 
-def get_cmdclass():
-    """Get the custom setuptools/distutils subclasses used by Versioneer."""
+def get_cmdclass(cmdclass=None):
+    """Get the custom setuptools/distutils subclasses used by Versioneer.
+
+    If the package uses a different cmdclass (e.g. one from numpy), it
+    should be provided as an argument.
+    """
     if "versioneer" in sys.modules:
         del sys.modules["versioneer"]
         # this fixes the "python setup.py develop" case (also 'install' and
@@ -1495,12 +1771,15 @@ def get_cmdclass():
         # parent is protected against the child's "import versioneer". By
         # removing ourselves from sys.modules here, before the child build
         # happens, we protect the child from the parent's versioneer too.
-        # Also see https://github.com/warner/python-versioneer/issues/52
+        # Also see https://github.com/python-versioneer/python-versioneer/issues/52
 
-    cmds = {}
+    cmds = {} if cmdclass is None else cmdclass.copy()
 
     # we add "version" to both distutils and setuptools
-    from distutils.core import Command
+    try:
+        from setuptools import Command
+    except ImportError:
+        from distutils.core import Command
 
     class cmd_version(Command):
         description = "report generated version string"
@@ -1539,7 +1818,9 @@ def get_cmdclass():
     #  setup.py egg_info -> ?
 
     # we override different "build_py" commands for both environments
-    if "setuptools" in sys.modules:
+    if 'build_py' in cmds:
+        _build_py = cmds['build_py']
+    elif "setuptools" in sys.modules:
         from setuptools.command.build_py import build_py as _build_py
     else:
         from distutils.command.build_py import build_py as _build_py
@@ -1559,6 +1840,33 @@ def get_cmdclass():
                 write_to_version_file(target_versionfile, versions)
     cmds["build_py"] = cmd_build_py
 
+    if 'build_ext' in cmds:
+        _build_ext = cmds['build_ext']
+    elif "setuptools" in sys.modules:
+        from setuptools.command.build_ext import build_ext as _build_ext
+    else:
+        from distutils.command.build_ext import build_ext as _build_ext
+
+    class cmd_build_ext(_build_ext):
+        def run(self):
+            root = get_root()
+            cfg = get_config_from_root(root)
+            versions = get_versions()
+            _build_ext.run(self)
+            if self.inplace:
+                # build_ext --inplace will only build extensions in
+                # build/lib<..> dir with no _version.py to write to.
+                # As in place builds will already have a _version.py
+                # in the module dir, we do not need to write one.
+                return
+            # now locate _version.py in the new build/ directory and replace
+            # it with an updated value
+            target_versionfile = os.path.join(self.build_lib,
+                                              cfg.versionfile_build)
+            print("UPDATING %s" % target_versionfile)
+            write_to_version_file(target_versionfile, versions)
+    cmds["build_ext"] = cmd_build_ext
+
     if "cx_Freeze" in sys.modules:  # cx_freeze enabled?
         from cx_Freeze.dist import build_exe as _build_exe
         # nczeczulin reports that py2exe won't like the pep440-style string
@@ -1592,10 +1900,7 @@ def get_cmdclass():
         del cmds["build_py"]
 
     if 'py2exe' in sys.modules:  # py2exe enabled?
-        try:
-            from py2exe.distutils_buildexe import py2exe as _py2exe  # py3
-        except ImportError:
-            from py2exe.build_exe import py2exe as _py2exe  # py2
+        from py2exe.distutils_buildexe import py2exe as _py2exe
 
         class cmd_py2exe(_py2exe):
             def run(self):
@@ -1620,7 +1925,9 @@ def get_cmdclass():
         cmds["py2exe"] = cmd_py2exe
 
     # we override different "sdist" commands for both environments
-    if "setuptools" in sys.modules:
+    if 'sdist' in cmds:
+        _sdist = cmds['sdist']
+    elif "setuptools" in sys.modules:
         from setuptools.command.sdist import sdist as _sdist
     else:
         from distutils.command.sdist import sdist as _sdist
@@ -1687,21 +1994,26 @@ SAMPLE_CONFIG = """
 
 """
 
-INIT_PY_SNIPPET = """
+OLD_SNIPPET = """
 from ._version import get_versions
 __version__ = get_versions()['version']
 del get_versions
 """
 
+INIT_PY_SNIPPET = """
+from . import {0}
+__version__ = {0}.get_versions()['version']
+"""
+
 
 def do_setup():
-    """Main VCS-independent setup function for installing Versioneer."""
+    """Do main VCS-independent setup function for installing Versioneer."""
     root = get_root()
     try:
         cfg = get_config_from_root(root)
-    except (EnvironmentError, configparser.NoSectionError,
+    except (OSError, configparser.NoSectionError,
             configparser.NoOptionError) as e:
-        if isinstance(e, (EnvironmentError, configparser.NoSectionError)):
+        if isinstance(e, (OSError, configparser.NoSectionError)):
             print("Adding sample versioneer config to setup.cfg",
                   file=sys.stderr)
             with open(os.path.join(root, "setup.cfg"), "a") as f:
@@ -1725,12 +2037,18 @@ def do_setup():
         try:
             with open(ipy, "r") as f:
                 old = f.read()
-        except EnvironmentError:
+        except OSError:
             old = ""
-        if INIT_PY_SNIPPET not in old:
+        module = os.path.splitext(os.path.basename(cfg.versionfile_source))[0]
+        snippet = INIT_PY_SNIPPET.format(module)
+        if OLD_SNIPPET in old:
+            print(" replacing boilerplate in %s" % ipy)
+            with open(ipy, "w") as f:
+                f.write(old.replace(OLD_SNIPPET, snippet))
+        elif snippet not in old:
             print(" appending to %s" % ipy)
             with open(ipy, "a") as f:
-                f.write(INIT_PY_SNIPPET)
+                f.write(snippet)
         else:
             print(" %s unmodified" % ipy)
     else:
@@ -1749,7 +2067,7 @@ def do_setup():
                 if line.startswith("include "):
                     for include in line.split()[1:]:
                         simple_includes.add(include)
-    except EnvironmentError:
+    except OSError:
         pass
     # That doesn't cover everything MANIFEST.in can do
     # (http://docs.python.org/2/distutils/sourcedist.html#commands), so
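The `INIT_PY_SNIPPET` change in this commit generalises the `__init__.py` boilerplate so the version module name is derived from `cfg.versionfile_source` rather than hard-coded as `_version`. A minimal sketch of how the new template renders (the `versionfile_source` value is an assumption based on `geotiepoints/version.py` appearing in the changed-files list; the rendering itself is plain `str.format`):

```python
import os

# New template from the patch: {0} is filled with the version module name.
INIT_PY_SNIPPET = """
from . import {0}
__version__ = {0}.get_versions()['version']
"""

# Assumed value for this project (geotiepoints/version.py is in the diff stat).
versionfile_source = "geotiepoints/version.py"

# Same derivation as the patched do_setup(): strip the directory and extension.
module = os.path.splitext(os.path.basename(versionfile_source))[0]
print(INIT_PY_SNIPPET.format(module))
```

For this project the snippet therefore becomes `from . import version` followed by `__version__ = version.get_versions()['version']`, matching the old behaviour whenever the version file is still named `_version.py`.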



View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geotiepoints/-/commit/180474b16e33a4743e2286202add868987414c96
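The recurring `if 'build_py' in cmds: ... elif "setuptools" in sys.modules:` pattern in this commit (applied to `build_py`, `build_ext`, and `sdist`) lets versioneer subclass a command class the caller already registered via `cmdclass`, instead of unconditionally replacing it with a subclass of the stock command. A self-contained sketch of that chaining pattern; the class names here are hypothetical illustrations, not versioneer's own:

```python
class StockBuildPy:
    """Stand-in for the stock setuptools/distutils command class."""
    def run(self):
        return "stock"

class UserBuildPy(StockBuildPy):
    """Stand-in for a command the caller customised before calling versioneer."""
    def run(self):
        return "user"

def get_cmdclass(cmdclass=None):
    # Start from the caller's mapping so earlier customisations survive
    # (this mirrors the patched `cmds = {} if cmdclass is None else cmdclass.copy()`).
    cmds = {} if cmdclass is None else cmdclass.copy()
    # Chain off the caller's command when one was supplied, else the stock one.
    base = cmds.get("build_py", StockBuildPy)

    class VersioneerBuildPy(base):
        def run(self):
            result = super().run()
            # ...versioneer would write the updated _version.py here...
            return result + "+versioneer"

    cmds["build_py"] = VersioneerBuildPy
    return cmds

print(get_cmdclass()["build_py"]().run())                           # stock+versioneer
print(get_cmdclass({"build_py": UserBuildPy})["build_py"]().run())  # user+versioneer
```

The practical effect for downstream `setup.py` files is that `versioneer.get_cmdclass(cmdclass=my_cmds)` can wrap, rather than silently discard, commands registered by other tools (Cython's `build_ext`, for example).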

