[Git][debian-gis-team/python-geotiepoints][master] 5 commits: New upstream version 1.5.0

Antonio Valentino (@antonio.valentino) gitlab at salsa.debian.org
Thu Oct 27 07:46:09 BST 2022



Antonio Valentino pushed to branch master at Debian GIS Project / python-geotiepoints


Commits:
c5812708 by Antonio Valentino at 2022-10-27T06:23:25+00:00
New upstream version 1.5.0
- - - - -
d4b93c87 by Antonio Valentino at 2022-10-27T06:23:55+00:00
Update upstream source from tag 'upstream/1.5.0'

Update to upstream version '1.5.0'
with Debian dir 810e916a689f3313685f4b64025678d770020a4f
- - - - -
5afc1014 by Antonio Valentino at 2022-10-27T06:25:45+00:00
New upstream release

- - - - -
bd00445d by Antonio Valentino at 2022-10-27T06:41:18+00:00
Refresh all patches

- - - - -
4c847737 by Antonio Valentino at 2022-10-27T06:41:40+00:00
Set distribution to unstable

- - - - -


19 changed files:

- .github/workflows/ci.yaml
- CHANGELOG.md
- MANIFEST.in
- debian/changelog
- debian/patches/0001-Skip-tests-using-external-data.patch
- debian/patches/0002-Remove-versineer-from-pyproject-requires.patch
- + geotiepoints/_modis_interpolator.pyx
- + geotiepoints/_modis_utils.pxd
- + geotiepoints/_modis_utils.pyx
- + geotiepoints/_simple_modis_interpolator.pyx
- geotiepoints/modisinterpolator.py
- geotiepoints/simple_modis_interpolator.py
- geotiepoints/tests/test_modis.py
- geotiepoints/tests/test_modisinterpolator.py
- geotiepoints/tests/test_simple_modis_interpolator.py
- geotiepoints/version.py
- pyproject.toml
- setup.py
- versioneer.py


Changes:

=====================================
.github/workflows/ci.yaml
=====================================
@@ -60,6 +60,7 @@ jobs:
         shell: bash -l {0}
         run: |
           pip install -e .
+          python setup.py build_ext --inplace --cython-coverage --force
 
       - name: Run unit tests
         shell: bash -l {0}
@@ -81,8 +82,7 @@ jobs:
 
       - name: Coveralls Parallel
         # See https://github.com/AndreMiras/coveralls-python-action/pull/16
-        uses: miurahr/coveralls-python-action@patch-pyprject-toml
-#        uses: AndreMiras/coveralls-python-action@develop
+        uses: djhoese/coveralls-python-action@feature-cython-coverage
         with:
           flag-name: run-${{ matrix.test_number }}
           parallel: true
@@ -94,8 +94,7 @@ jobs:
     steps:
       - name: Coveralls Finished
         # See https://github.com/AndreMiras/coveralls-python-action/pull/16
-        uses: miurahr/coveralls-python-action@patch-pyprject-toml
-#        uses: AndreMiras/coveralls-python-action@develop
+        uses: djhoese/coveralls-python-action@feature-cython-coverage
         with:
           parallel-finished: true
 


=====================================
CHANGELOG.md
=====================================
@@ -1,3 +1,14 @@
+## Version 1.5.0 (2022/10/25)
+
+### Pull Requests Merged
+
+#### Features added
+
+* [PR 38](https://github.com/pytroll/python-geotiepoints/pull/38) - Rewrite simple and tiepoint modis interpolation in cython
+
+In this release 1 pull request was closed.
+
+
 ## Version 1.4.1 (2022/06/08)
 
 ### Issues Closed


=====================================
MANIFEST.in
=====================================
@@ -1,6 +1,7 @@
 include doc/Makefile
 recursive-include doc/source *
 include LICENSE.txt
-include geotiepoints/multilinear_cython.pyx
+include geotiepoints/*.pyx
+include geotiepoints/*.pxd
 include versioneer.py
 include geotiepoints/version.py


=====================================
debian/changelog
=====================================
@@ -1,9 +1,12 @@
-python-geotiepoints (1.4.1-3) UNRELEASED; urgency=medium
+python-geotiepoints (1.5.0-1) unstable; urgency=medium
 
+  * New upstream release.
   * debian/rules:
     - remove debug statement.
+  * debian/patches:
+    - refresh all patches.
 
- -- Antonio Valentino <antonio.valentino@tiscali.it>  Sat, 15 Oct 2022 17:25:34 +0000
+ -- Antonio Valentino <antonio.valentino@tiscali.it>  Thu, 27 Oct 2022 06:41:32 +0000
 
 python-geotiepoints (1.4.1-2) unstable; urgency=medium
 


=====================================
debian/patches/0001-Skip-tests-using-external-data.patch
=====================================
@@ -4,47 +4,55 @@ Subject: Skip tests using external data
 
 Forwarded: not-needed
 ---
- geotiepoints/tests/test_modis.py                     | 4 ++++
+ geotiepoints/tests/test_modis.py                     | 5 +++++
  geotiepoints/tests/test_modisinterpolator.py         | 3 +++
  geotiepoints/tests/test_simple_modis_interpolator.py | 7 +++++++
  setup.py                                             | 2 +-
- 4 files changed, 15 insertions(+), 1 deletion(-)
+ 4 files changed, 16 insertions(+), 1 deletion(-)
 
 diff --git a/geotiepoints/tests/test_modis.py b/geotiepoints/tests/test_modis.py
-index ef9c562..a2a1eac 100644
+index c039683..1f52c76 100644
 --- a/geotiepoints/tests/test_modis.py
 +++ b/geotiepoints/tests/test_modis.py
-@@ -61,6 +61,8 @@ class TestMODIS(unittest.TestCase):
-     def setUp(self):
-         pass
+@@ -19,6 +19,7 @@
+ import numpy as np
+ import h5py
+ import os
++import unittest
+ 
+ FILENAME_250M_RESULT = os.path.join(
+     os.path.dirname(__file__), '../../testdata/250m_lonlat_section_result.h5')
+@@ -52,6 +53,8 @@ class TestUtils:
+ class TestMODIS:
+     """Class for system testing the MODIS interpolation."""
  
 +    @unittest.skipIf(not os.path.isfile(FILENAME_FULL), 'data file not available')
 +    @unittest.skipIf(not os.path.isfile(FILENAME_5KM), 'data file not available')
      def test_5_to_1(self):
-         """test the 5km to 1km interpolation facility
-         """
-@@ -78,6 +80,8 @@ class TestMODIS(unittest.TestCase):
-         self.assert_(np.allclose(tlons, glons, atol=0.05))
-         self.assert_(np.allclose(tlats, glats, atol=0.05))
+         """Test the 5km to 1km interpolation facility."""
+ 
+@@ -68,6 +71,8 @@ class TestMODIS:
+         np.testing.assert_allclose(tlons, glons, atol=0.05)
+         np.testing.assert_allclose(tlats, glats, atol=0.05)
  
 +    @unittest.skipIf(not os.path.isfile(FILENAME_250M_RESULT), 'data file not available')
 +    @unittest.skipIf(not os.path.isfile(FILENAME_250M_INPUT), 'data file not available')
      def test_1000m_to_250m(self):
-         """test the 1 km to 250 meter interpolation facility
-         """
+         """Test the 1 km to 250 meter interpolation facility."""
+ 
 diff --git a/geotiepoints/tests/test_modisinterpolator.py b/geotiepoints/tests/test_modisinterpolator.py
-index 88056e7..cf32b08 100644
+index 35d5ff5..78ac1ce 100644
 --- a/geotiepoints/tests/test_modisinterpolator.py
 +++ b/geotiepoints/tests/test_modisinterpolator.py
-@@ -122,6 +122,7 @@ def assert_geodetic_distance(
+@@ -123,6 +123,7 @@ def assert_geodetic_distance(
                  "meters from the expected coordinates.")
  
  
 +@pytest.mark.skipif(not os.path.isfile(FILENAME_DATA), reason='data file not available')
  @pytest.mark.parametrize(
-     ("input_func", "exp_func", "interp_func", "dist_max"),
+     ("input_func", "exp_func", "interp_func", "dist_max", "exp_5km_warning"),
      [
-@@ -148,6 +149,7 @@ def test_sat_angle_based_interp(input_func, exp_func, interp_func, dist_max):
+@@ -156,6 +157,7 @@ def test_sat_angle_based_interp(input_func, exp_func, interp_func, dist_max, exp
      assert not np.any(np.isnan(lats))
  
  
@@ -52,7 +60,7 @@ index 88056e7..cf32b08 100644
  def test_sat_angle_based_interp_nan_handling():
      # See GH #19
      lon1, lat1, satz1 = load_1km_lonlat_satz_as_xarray_dask()
-@@ -157,6 +159,7 @@ def test_sat_angle_based_interp_nan_handling():
+@@ -165,6 +167,7 @@ def test_sat_angle_based_interp_nan_handling():
      assert not np.any(np.isnan(lats.compute()))
  
  
@@ -61,7 +69,7 @@ index 88056e7..cf32b08 100644
      orig_lon, lat1, satz1 = load_1km_lonlat_satz_as_xarray_dask()
      lon1 = orig_lon + 180
 diff --git a/geotiepoints/tests/test_simple_modis_interpolator.py b/geotiepoints/tests/test_simple_modis_interpolator.py
-index 45acff9..a6e32e7 100644
+index 3af03ef..34ccc24 100644
 --- a/geotiepoints/tests/test_simple_modis_interpolator.py
 +++ b/geotiepoints/tests/test_simple_modis_interpolator.py
 @@ -20,6 +20,11 @@ import numpy as np
@@ -93,11 +101,11 @@ index 45acff9..a6e32e7 100644
      lon1, lat1 = load_1km_lonlat_as_xarray_dask()
      # remove 1 row from the end
 diff --git a/setup.py b/setup.py
-index a9d59ef..a9a96e5 100644
+index 302aedb..78594f3 100644
 --- a/setup.py
 +++ b/setup.py
-@@ -69,7 +69,7 @@ if __name__ == "__main__":
-                        "Programming Language :: Python",
+@@ -114,7 +114,7 @@ if __name__ == "__main__":
+                        "Programming Language :: Cython",
                         "Topic :: Scientific/Engineering"],
            url="https://github.com/pytroll/python-geotiepoints",
 -          packages=['geotiepoints'],


=====================================
debian/patches/0002-Remove-versineer-from-pyproject-requires.patch
=====================================
@@ -8,12 +8,12 @@ Forwarded: not-needed
  1 file changed, 1 insertion(+), 1 deletion(-)
 
 diff --git a/pyproject.toml b/pyproject.toml
-index a37f62a..8ec99f2 100644
+index 8dbf46a..fbf64b5 100644
 --- a/pyproject.toml
 +++ b/pyproject.toml
 @@ -1,5 +1,5 @@
  [build-system]
--requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython", "versioneer-518"]
+-requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython", "versioneer"]
 +requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython"]
  build-backend = "setuptools.build_meta"
  


=====================================
geotiepoints/_modis_interpolator.pyx
=====================================
@@ -0,0 +1,702 @@
+cimport cython
+from ._modis_utils cimport lonlat2xyz, xyz2lonlat, floating, deg2rad
+from .simple_modis_interpolator import scanline_mapblocks
+
+from libc.math cimport asin, sin, cos, sqrt
+cimport numpy as np
+import numpy as np
+
+DEF R = 6370.997
+# Aqua altitude in km
+DEF H = 709.0
+
+
+@scanline_mapblocks
+def interpolate(
+        np.ndarray[floating, ndim=2] lon1,
+        np.ndarray[floating, ndim=2] lat1,
+        np.ndarray[floating, ndim=2] satz1,
+        unsigned int coarse_resolution=0,
+        unsigned int fine_resolution=0,
+        unsigned int coarse_scan_width=0,
+):
+    """Helper function to interpolate scan-aligned arrays.
+
+    This function's decorator runs this function for each dask block/chunk of
+    scans. The arrays are scan-aligned meaning they are an even number of scans
+    (N rows per scan) and contain the entire scan width.
+
+    """
+    if coarse_resolution == 5000 and coarse_scan_width not in (0, 270, 271):
+        raise NotImplementedError(
+            "Can't interpolate if 5km tiepoints have less than 270 columns."
+        )
+    interp = MODISInterpolator(coarse_resolution, fine_resolution, coarse_scan_width=coarse_scan_width or 0)
+    return interp.interpolate(lon1, lat1, satz1)
+
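For orientation, the scan-aligned splitting that the `scanline_mapblocks` decorator relies on can be sketched in plain Python. This is a hedged illustration only; `iter_scans` and `rows_per_scan` are made-up names, not the upstream API:

```python
import numpy as np

def iter_scans(arr, rows_per_scan):
    """Yield one scan (rows_per_scan full-width rows) at a time.

    Assumes arr is scan-aligned: its row count is an exact multiple of
    rows_per_scan, as the docstring above requires.
    """
    if arr.shape[0] % rows_per_scan != 0:
        raise ValueError("array is not scan-aligned")
    for start in range(0, arr.shape[0], rows_per_scan):
        yield arr[start:start + rows_per_scan]

# Example: 4 scans of 10 coarse rows each (1 km MODIS scan geometry)
satz = np.zeros((40, 1354))
scans = list(iter_scans(satz, 10))
```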
+
+@cython.boundscheck(False)
+@cython.cdivision(True)
+@cython.wraparound(False)
+@cython.initializedcheck(False)
+cdef void _compute_expansion_alignment(floating[:, :] satz_a, floating [:, :] satz_b, int scan_width,
+                                       floating[:, ::1] c_expansion, floating[:, ::1] c_alignment) nogil:
+    """Fill in expansion and alignment.
+    
+    Input angles should be in degrees and will be converted to radians.
+    
+    """
+    cdef Py_ssize_t i, j
+    cdef floating satz_a_rad, satz_b_rad, phi_a, phi_b, theta_a, theta_b, phi, zeta, theta, denominator, sin_beta_2, d, e
+    for i in range(satz_a.shape[0]):
+        for j in range(satz_a.shape[1]):
+            satz_a_rad = deg2rad(satz_a[i, j])
+            satz_b_rad = deg2rad(satz_b[i, j])
+            phi_a = _compute_phi(satz_a_rad)
+            phi_b = _compute_phi(satz_b_rad)
+            theta_a = _compute_theta(satz_a_rad, phi_a)
+            theta_b = _compute_theta(satz_b_rad, phi_b)
+            phi = (phi_a + phi_b) / 2
+            zeta = _compute_zeta(phi)
+            theta = _compute_theta(zeta, phi)
+            # Workaround for tiepoints symmetrical about the subsatellite-track
+            denominator = theta_a * 2 if theta_a == theta_b else theta_a - theta_b
+
+            c_expansion[i, j] = 4 * (((theta_a + theta_b) / 2 - theta) / denominator)
+
+            sin_beta_2 = scan_width / (2 * H)
+            d = ((R + H) / R * cos(phi) - cos(zeta)) * sin_beta_2
+            e = cos(zeta) - sqrt(cos(zeta) ** 2 - d ** 2)
+
+            c_alignment[i, j] = 4 * e * sin(zeta) / denominator
+
+
+cdef inline floating _compute_phi(floating zeta) nogil:
+    return asin(R * sin(zeta) / (R + H))
+
+
+cdef inline floating _compute_theta(floating zeta, floating phi) nogil:
+    return zeta - phi
+
+
+cdef inline floating _compute_zeta(floating phi) nogil:
+    return asin((R + H) * sin(phi) / R)
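The viewing-geometry helpers above encode a spherical-Earth relation between the satellite zenith angle zeta and the Earth-centre angle phi. A plain-Python sketch (same R and H constants as the module above; function names shortened for illustration) shows that the phi and zeta computations are inverses of each other:

```python
from math import asin, sin

R = 6370.997  # Earth radius in km (constant from the module above)
H = 709.0     # Aqua altitude in km

def compute_phi(zeta):
    # Angle at the Earth's centre for satellite zenith angle zeta (radians)
    return asin(R * sin(zeta) / (R + H))

def compute_zeta(phi):
    # Inverse relation: recover the satellite zenith angle from phi
    return asin((R + H) * sin(phi) / R)

def compute_theta(zeta, phi):
    # Angle subtended at the satellite
    return zeta - phi

zeta = 0.5  # ~28.6 degrees
phi = compute_phi(zeta)
```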
+
+
+@cython.boundscheck(False)
+@cython.cdivision(True)
+@cython.wraparound(False)
+@cython.initializedcheck(False)
+cdef inline floating[:, :] _get_upper_left_corner(floating[:, ::1] arr) nogil:
+    return arr[:-1, :-1]
+
+
+@cython.boundscheck(False)
+@cython.cdivision(True)
+@cython.wraparound(False)
+@cython.initializedcheck(False)
+cdef inline floating[:, :] _get_upper_right_corner(floating[:, ::1] arr) nogil:
+    return arr[:-1, 1:]
+
+
+@cython.boundscheck(False)
+@cython.cdivision(True)
+@cython.wraparound(False)
+@cython.initializedcheck(False)
+cdef inline floating[:, :] _get_lower_right_corner(floating[:, ::1] arr) nogil:
+    return arr[1:, 1:]
+
+
+@cython.boundscheck(False)
+@cython.cdivision(True)
+@cython.wraparound(False)
+@cython.initializedcheck(False)
+cdef inline floating[:, :] _get_lower_left_corner(floating[:, ::1] arr) nogil:
+    return arr[1:, :-1]
+
+
+cdef class MODISInterpolator:
+    """Helper class for MODIS interpolation.
+
+    Not intended for public use. Use ``modis_X_to_Y`` functions instead.
+
+    """
+
+    cdef int _coarse_scan_length
+    cdef int _coarse_scan_width
+    cdef int _coarse_pixels_per_1km
+    cdef int _fine_pixels_per_coarse_pixel
+    cdef int _fine_scan_width
+    cdef int _fine_scan_length
+    cdef int _coarse_resolution
+    cdef int _fine_resolution
+    cdef Py_ssize_t _factor_5km
+
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    def __cinit__(self, unsigned int coarse_resolution, unsigned int fine_resolution, unsigned int coarse_scan_width=0):
+        if coarse_resolution == 1000:
+            self._coarse_scan_length = 10
+            self._coarse_scan_width = 1354
+        elif coarse_resolution == 5000:
+            self._coarse_scan_length = 2
+            if coarse_scan_width == 0:
+                self._coarse_scan_width = 271
+            else:
+                self._coarse_scan_width = coarse_scan_width
+        self._coarse_pixels_per_1km = coarse_resolution // 1000
+
+        cdef int fine_pixels_per_1km = 1000 // fine_resolution
+        self._fine_pixels_per_coarse_pixel = fine_pixels_per_1km * self._coarse_pixels_per_1km
+        self._fine_scan_width = 1354 * fine_pixels_per_1km
+        self._coarse_resolution = coarse_resolution
+        self._fine_resolution = fine_resolution
+        # partial rows/columns to repeat: 5km->1km => 2, 5km->500m => 4, 5km->250m => 8
+        self._factor_5km = self._fine_pixels_per_coarse_pixel // self._coarse_pixels_per_1km * 2
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef interpolate(
+            self,
+            np.ndarray[floating, ndim=2] lon1,
+            np.ndarray[floating, ndim=2] lat1,
+            np.ndarray[floating, ndim=2] satz1):
+        """Interpolate MODIS geolocation from 'coarse_resolution' to 'fine_resolution'."""
+        lon1 = np.ascontiguousarray(lon1)
+        lat1 = np.ascontiguousarray(lat1)
+        satz1 = np.ascontiguousarray(satz1)
+
+        cdef int num_fine_scan_rows = self._coarse_scan_length * self._fine_pixels_per_coarse_pixel
+        cdef int num_fine_scan_cols = self._fine_scan_width
+        cdef np.ndarray[floating, ndim=2] a_track = np.empty((num_fine_scan_rows, num_fine_scan_cols), dtype=lon1.dtype)
+        cdef np.ndarray[floating, ndim=2] a_scan = np.empty((num_fine_scan_rows, num_fine_scan_cols), dtype=lon1.dtype)
+        cdef floating[:, ::1] a_track_view = a_track
+        cdef floating[:, ::1] a_scan_view = a_scan
+
+        cdef tuple coords_xy = self._get_coords()
+        cdef np.ndarray[floating, ndim=1] x = coords_xy[0]
+        cdef np.ndarray[floating, ndim=1] y = coords_xy[1]
+        cdef floating[::1] x_view = x
+        cdef floating[::1] y_view = y
+
+        cdef np.ndarray[floating, ndim=2] tmp_tiepoint_a, tmp_tiepoint_b, tmp_tiepoint_c, tmp_tiepoint_d
+        cdef floating[:, ::1] tmp_tiepoint_a_view, tmp_tiepoint_b_view, tmp_tiepoint_c_view, tmp_tiepoint_d_view
+        tmp_tiepoint_a = np.empty((num_fine_scan_rows, num_fine_scan_cols), dtype=lon1.dtype)
+        tmp_tiepoint_b = np.empty((num_fine_scan_rows, num_fine_scan_cols), dtype=lon1.dtype)
+        tmp_tiepoint_c = np.empty((num_fine_scan_rows, num_fine_scan_cols), dtype=lon1.dtype)
+        tmp_tiepoint_d = np.empty((num_fine_scan_rows, num_fine_scan_cols), dtype=lon1.dtype)
+        tmp_tiepoint_a_view = tmp_tiepoint_a
+        tmp_tiepoint_b_view = tmp_tiepoint_b
+        tmp_tiepoint_c_view = tmp_tiepoint_c
+        tmp_tiepoint_d_view = tmp_tiepoint_d
+
+        cdef floating[:, :, ::1] xyz_coarse_view = np.empty((self._coarse_scan_length, self._coarse_scan_width, 3), dtype=lon1.dtype)
+        cdef floating[:, :, ::1] xyz_fine_view = np.empty((num_fine_scan_rows, num_fine_scan_cols, 3), dtype=lon1.dtype)
+
+        cdef unsigned int scans = satz1.shape[0] // self._coarse_scan_length
+        cdef np.ndarray[floating, ndim=2] new_lons = np.empty((satz1.shape[0] * self._fine_pixels_per_coarse_pixel, self._fine_scan_width), dtype=lon1.dtype)
+        cdef np.ndarray[floating, ndim=2] new_lats = np.empty((satz1.shape[0] * self._fine_pixels_per_coarse_pixel, self._fine_scan_width), dtype=lon1.dtype)
+        cdef floating[:, ::1] coarse_lons = lon1
+        cdef floating[:, ::1] coarse_lats = lat1
+        cdef floating[:, ::1] coarse_satz = satz1
+        cdef floating[:, ::1] fine_lons = new_lons
+        cdef floating[:, ::1] fine_lats = new_lats
+        with nogil:
+            self._interpolate_lons_lats(
+                scans,
+                coarse_lons,
+                coarse_lats,
+                coarse_satz,
+                x_view,
+                y_view,
+                a_track_view,
+                a_scan_view,
+                tmp_tiepoint_a_view,
+                tmp_tiepoint_b_view,
+                tmp_tiepoint_c_view,
+                tmp_tiepoint_d_view,
+                xyz_coarse_view,
+                xyz_fine_view,
+                fine_lons,
+                fine_lats,
+            )
+        return new_lons, new_lats
+
+    cdef tuple _get_coords(self):
+        cdef np.ndarray[np.float32_t, ndim=1] x, y
+        cdef np.float32_t[::1] x_view, y_view
+        if self._coarse_scan_length == 10:
+            x = np.arange(self._fine_scan_width, dtype=np.float32) % self._fine_pixels_per_coarse_pixel
+            y = np.empty((self._coarse_scan_length * self._fine_pixels_per_coarse_pixel,), dtype=np.float32)
+            x_view = x
+            y_view = y
+            with nogil:
+                self._get_coords_1km(x_view, y_view)
+        else:
+            x, y = self._get_coords_5km()
+        return x, y
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _get_coords_1km(
+            self,
+            floating[::1] x_view,
+            floating[::1] y_view,
+    ) nogil:
+        cdef unsigned int scan_idx
+        cdef int i
+        cdef int fine_idx
+        cdef int half_scan_length = self._fine_pixels_per_coarse_pixel // 2
+        cdef unsigned int fine_pixels_per_scan = self._coarse_scan_length * self._fine_pixels_per_coarse_pixel
+        for fine_idx in range(fine_pixels_per_scan):
+            if fine_idx < half_scan_length:
+                y_view[fine_idx] = -half_scan_length + 0.5 + fine_idx
+            elif fine_idx >= fine_pixels_per_scan - half_scan_length:
+                y_view[fine_idx] = (self._fine_pixels_per_coarse_pixel + 0.5) + (half_scan_length - (fine_pixels_per_scan - fine_idx))
+            else:
+                y_view[fine_idx] = ((fine_idx + half_scan_length) % self._fine_pixels_per_coarse_pixel) + 0.5
+
+        for i in range(self._fine_pixels_per_coarse_pixel):
+            x_view[(self._fine_scan_width - self._fine_pixels_per_coarse_pixel) + i] = self._fine_pixels_per_coarse_pixel + i
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.initializedcheck(False)
+    cdef tuple _get_coords_5km(self):
+        cdef np.ndarray[np.float32_t, ndim=1] y = np.arange(self._fine_pixels_per_coarse_pixel * self._coarse_scan_length, dtype=np.float32) - 2
+        cdef np.ndarray[np.float32_t, ndim=1] x = (np.arange(self._fine_scan_width, dtype=np.float32) - 2) % self._fine_pixels_per_coarse_pixel
+        x[0] = -2
+        x[1] = -1
+        if self._coarse_scan_width == 271:
+            x[-2] = 5
+            x[-1] = 6
+        else:
+            # self._coarse_scan_width == 270
+            x[-7] = 5
+            x[-6] = 6
+            x[-5] = 7
+            x[-4] = 8
+            x[-3] = 9
+            x[-2] = 10
+            x[-1] = 11
+        return x, y
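The column-coordinate pattern produced by `_get_coords_5km` can be reproduced in NumPy for the 271-column case only (a simplified sketch; the function name and defaults are illustrative, and the 270-column branch is omitted):

```python
import numpy as np

def coords_5km_to_1km(fine_scan_width=1354, fine_per_coarse=5):
    # Fractional column positions relative to the nearest 5 km tiepoint;
    # tiepoints sit every fine_per_coarse fine pixels, offset by 2.
    x = (np.arange(fine_scan_width, dtype=np.float32) - 2) % fine_per_coarse
    # The outermost fine columns fall outside the tiepoint grid and must
    # extrapolate, hence the out-of-range values patched in at the edges.
    x[0] = -2
    x[1] = -1
    x[-2] = 5
    x[-1] = 6
    return x
```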
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _interpolate_lons_lats(self,
+                                     unsigned int scans,
+                                     floating[:, ::1] coarse_lons,
+                                     floating[:, ::1] coarse_lats,
+                                     floating[:, ::1] coarse_satz,
+                                     floating[::1] x,
+                                     floating[::1] y,
+                                     floating[:, ::1] a_track,
+                                     floating[:, ::1] a_scan,
+                                     floating[:, ::1] tmp_fine_a,
+                                     floating[:, ::1] tmp_fine_b,
+                                     floating[:, ::1] tmp_fine_c,
+                                     floating[:, ::1] tmp_fine_d,
+                                     floating[:, :, ::1] coarse_xyz,
+                                     floating[:, :, ::1] fine_xyz,
+                                     floating[:, ::1] fine_lons,
+                                     floating[:, ::1] fine_lats,
+                                     ) nogil:
+        cdef floating[:, ::1] lons_scan, lats_scan, satz_scan
+        cdef floating[:, ::1] new_lons_scan, new_lats_scan
+        cdef Py_ssize_t scan_idx
+        for scan_idx in range(0, scans):
+            lons_scan = coarse_lons[scan_idx * self._coarse_scan_length:(scan_idx + 1) * self._coarse_scan_length]
+            lats_scan = coarse_lats[scan_idx * self._coarse_scan_length:(scan_idx + 1) * self._coarse_scan_length]
+            satz_scan = coarse_satz[scan_idx * self._coarse_scan_length:(scan_idx + 1) * self._coarse_scan_length]
+            new_lons_scan = fine_lons[scan_idx * self._coarse_scan_length * self._fine_pixels_per_coarse_pixel:(scan_idx + 1) * self._coarse_scan_length * self._fine_pixels_per_coarse_pixel]
+            new_lats_scan = fine_lats[scan_idx * self._coarse_scan_length * self._fine_pixels_per_coarse_pixel:(scan_idx + 1) * self._coarse_scan_length * self._fine_pixels_per_coarse_pixel]
+            self._get_atrack_ascan(
+                satz_scan,
+                x,
+                y,
+                tmp_fine_a[:self._coarse_scan_length - 1, :self._coarse_scan_width - 1],
+                tmp_fine_b[:self._coarse_scan_length - 1, :self._coarse_scan_width - 1],
+                tmp_fine_c,
+                tmp_fine_d,
+                a_track,
+                a_scan)
+            lonlat2xyz(lons_scan, lats_scan, coarse_xyz)
+            self._compute_fine_xyz(a_track, a_scan, coarse_xyz,
+                                   tmp_fine_a, tmp_fine_b, tmp_fine_c, tmp_fine_d,
+                                   fine_xyz)
+
+            xyz2lonlat(fine_xyz, new_lons_scan, new_lats_scan)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _get_atrack_ascan(
+            self,
+            floating[:, ::1] satz,
+            floating[::1] x,
+            floating[::1] y,
+            floating[:, ::1] c_exp_coarse,
+            floating[:, ::1] c_ali_coarse,
+            floating[:, ::1] c_exp_fine,
+            floating[:, ::1] c_ali_fine,
+            floating[:, ::1] a_track,
+            floating[:, ::1] a_scan
+    ) nogil:
+        cdef floating[:, :] satz_a_view = _get_upper_left_corner(satz)
+        cdef floating[:, :] satz_b_view = _get_upper_right_corner(satz)
+        cdef Py_ssize_t scan_idx
+        _compute_expansion_alignment(
+            satz_a_view,
+            satz_b_view,
+            self._coarse_pixels_per_1km,
+            c_exp_coarse,
+            c_ali_coarse)
+
+        cdef floating[:, :] c_exp_view2 = c_exp_coarse
+        self._expand_tiepoint_array(c_exp_view2, c_exp_fine)
+
+        cdef floating[:, :] c_ali_view2 = c_ali_coarse
+        self._expand_tiepoint_array(c_ali_view2, c_ali_fine)
+
+        self._calculate_atrack_ascan(
+            x, y,
+            c_exp_fine, c_ali_fine,
+            a_track, a_scan)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _calculate_atrack_ascan(
+            self,
+            floating[::1] coords_x,
+            floating[::1] coords_y,
+            floating[:, ::1] c_exp_full,
+            floating[:, ::1] c_ali_full,
+            floating[:, ::1] a_track,
+            floating[:, ::1] a_scan,
+    ) nogil:
+        cdef Py_ssize_t i, j
+        cdef floating s_s, s_t
+        for j in range(coords_y.shape[0]):
+            for i in range(coords_x.shape[0]):
+                s_s = coords_x[i] / self._fine_pixels_per_coarse_pixel
+                s_t = coords_y[j] / self._fine_pixels_per_coarse_pixel
+                a_track[j, i] = s_t
+                a_scan[j, i] = s_s + s_s * (1 - s_s) * c_exp_full[j, i] + s_t * (1 - s_t) * c_ali_full[j, i]
+
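The fractional coordinates computed above follow the standard MODIS tiepoint scheme: the along-track position is linear, while the along-scan position receives a quadratic correction from the expansion and alignment coefficients. In isolation (scalar form, illustrative name):

```python
def ascan_position(s_s, s_t, c_exp, c_ali):
    """Along-scan fractional position with expansion/alignment correction.

    s_s and s_t are fractional scan/track coordinates in [0, 1];
    c_exp and c_ali are the per-pixel coefficients derived from the
    satellite zenith angles.
    """
    return s_s + s_s * (1 - s_s) * c_exp + s_t * (1 - s_t) * c_ali
```

With both coefficients zero the correction vanishes and the mapping reduces to the plain linear position `s_s`.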
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _compute_fine_xyz(
+            self,
+            floating[:, ::1] a_track_view,
+            floating[:, ::1] a_scan_view,
+            floating[:, :, ::1] xyz_view,
+            floating[:, ::1] data_tiepoint_a_view,
+            floating[:, ::1] data_tiepoint_b_view,
+            floating[:, ::1] data_tiepoint_c_view,
+            floating[:, ::1] data_tiepoint_d_view,
+            floating[:, :, ::1] xyz_comp_view,
+    ) nogil:
+        cdef Py_ssize_t k
+        cdef floating[:, :] comp_a_view, comp_b_view, comp_c_view, comp_d_view
+        for k in range(3):  # xyz
+            comp_a_view = xyz_view[:-1, :-1, k]  # upper left
+            comp_b_view = xyz_view[:-1, 1:, k]  # upper right
+            comp_c_view = xyz_view[1:, 1:, k]  # lower right
+            comp_d_view = xyz_view[1:, :-1, k]  # lower left
+            self._expand_tiepoint_array(comp_a_view, data_tiepoint_a_view)
+            self._expand_tiepoint_array(comp_b_view, data_tiepoint_b_view)
+            self._expand_tiepoint_array(comp_c_view, data_tiepoint_c_view)
+            self._expand_tiepoint_array(comp_d_view, data_tiepoint_d_view)
+            self._compute_fine_xyz_component(
+                a_track_view,
+                a_scan_view,
+                data_tiepoint_a_view,
+                data_tiepoint_b_view,
+                data_tiepoint_c_view,
+                data_tiepoint_d_view,
+                xyz_comp_view,
+                k,
+            )
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _compute_fine_xyz_component(
+            self,
+            floating[:, ::1] a_track_view,
+            floating[:, ::1] a_scan_view,
+            floating[:, ::1] data_a_2d_view,
+            floating[:, ::1] data_b_2d_view,
+            floating[:, ::1] data_c_2d_view,
+            floating[:, ::1] data_d_2d_view,
+            floating[:, :, ::1] xyz_comp_view,
+            Py_ssize_t k,
+    ) nogil:
+        cdef Py_ssize_t i, j
+        cdef floating scan1_tmp, scan2_tmp, atrack1, ascan1
+        for i in range(a_scan_view.shape[0]):
+            for j in range(a_scan_view.shape[1]):
+                atrack1 = a_track_view[i, j]
+                ascan1 = a_scan_view[i, j]
+                scan1_tmp = (1 - ascan1) * data_a_2d_view[i, j] + ascan1 * data_b_2d_view[i, j]
+                scan2_tmp = (1 - ascan1) * data_d_2d_view[i, j] + ascan1 * data_c_2d_view[i, j]
+                xyz_comp_view[i, j, k] = (1 - atrack1) * scan1_tmp + atrack1 * scan2_tmp
+
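The inner loop above is a textbook bilinear blend of the four corner tiepoints (a/b on the upper row, d/c on the lower row, matching the corner helpers earlier in the file). A scalar sketch:

```python
def bilinear(a, b, c, d, a_scan, a_track):
    """Blend four corner values: a=upper-left, b=upper-right,
    c=lower-right, d=lower-left (same ordering as the loop above)."""
    upper = (1 - a_scan) * a + a_scan * b  # interpolate along the upper edge
    lower = (1 - a_scan) * d + a_scan * c  # interpolate along the lower edge
    return (1 - a_track) * upper + a_track * lower  # blend the two edges
```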
+    cdef void _expand_tiepoint_array(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] output_arr,
+    ) nogil:
+        if self._coarse_scan_length == 10:
+            self._expand_tiepoint_array_1km(input_arr, output_arr)
+        else:
+            self._expand_tiepoint_array_5km(input_arr, output_arr)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_1km(self, floating[:, :] input_arr, floating[:, ::1] expanded_arr) nogil:
+        self._expand_tiepoint_array_1km_main(input_arr, expanded_arr)
+        self._expand_tiepoint_array_1km_right_column(input_arr, expanded_arr)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_1km_main(self, floating[:, :] input_arr, floating[:, ::1] expanded_arr) nogil:
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, col_idx, length_repeat_cycle, width_repeat_cycle, half_coarse_pixel_fine_offset, row_offset, col_offset
+        cdef Py_ssize_t row_repeat_offset, col_repeat_offset
+        half_coarse_pixel_fine_offset = self._fine_pixels_per_coarse_pixel // 2
+        for row_idx in range(input_arr.shape[0]):
+            row_offset = row_idx * self._fine_pixels_per_coarse_pixel + half_coarse_pixel_fine_offset
+            for col_idx in range(input_arr.shape[1]):
+                col_offset = col_idx * self._fine_pixels_per_coarse_pixel
+                tiepoint_value = input_arr[row_idx, col_idx]
+                self._expand_tiepoint_array_1km_with_repeat(tiepoint_value, expanded_arr, row_offset, col_offset)
+
+        row_idx = 0
+        row_offset = row_idx * self._fine_pixels_per_coarse_pixel
+        for col_idx in range(input_arr.shape[1]):
+            col_offset = col_idx * self._fine_pixels_per_coarse_pixel
+            tiepoint_value = input_arr[row_idx, col_idx]
+            self._expand_tiepoint_array_1km_with_repeat(tiepoint_value, expanded_arr, row_offset, col_offset)
+
+        row_idx = input_arr.shape[0] - 1
+        row_offset = row_idx * self._fine_pixels_per_coarse_pixel + self._fine_pixels_per_coarse_pixel
+        for col_idx in range(input_arr.shape[1]):
+            col_offset = col_idx * self._fine_pixels_per_coarse_pixel
+            tiepoint_value = input_arr[row_idx, col_idx]
+            self._expand_tiepoint_array_1km_with_repeat(tiepoint_value, expanded_arr, row_offset, col_offset)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_1km_right_column(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr
+    ) nogil:
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, col_idx, length_repeat_cycle, width_repeat_cycle, half_coarse_pixel_fine_offset, row_offset, col_offset
+        cdef Py_ssize_t row_repeat_offset, col_repeat_offset
+        half_coarse_pixel_fine_offset = self._fine_pixels_per_coarse_pixel // 2
+        col_idx = input_arr.shape[1] - 1
+        col_offset = col_idx * self._fine_pixels_per_coarse_pixel + self._fine_pixels_per_coarse_pixel
+        for row_idx in range(input_arr.shape[0]):
+            row_offset = row_idx * self._fine_pixels_per_coarse_pixel + half_coarse_pixel_fine_offset
+            tiepoint_value = input_arr[row_idx, col_idx]
+            self._expand_tiepoint_array_1km_with_repeat(tiepoint_value, expanded_arr, row_offset, col_offset)
+
+        self._expand_tiepoint_array_1km_right_column_top_row(input_arr, expanded_arr)
+        self._expand_tiepoint_array_1km_right_column_bottom_row(input_arr, expanded_arr)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_1km_right_column_top_row(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr
+    ) nogil:
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, col_idx, row_offset, col_offset
+        row_idx = 0
+        col_idx = input_arr.shape[1] - 1
+        tiepoint_value = input_arr[row_idx, col_idx]
+        row_offset = row_idx * self._fine_pixels_per_coarse_pixel
+        col_offset = col_idx * self._fine_pixels_per_coarse_pixel + self._fine_pixels_per_coarse_pixel
+        self._expand_tiepoint_array_1km_with_repeat(tiepoint_value, expanded_arr, row_offset, col_offset)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_1km_right_column_bottom_row(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr
+    ) nogil:
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, col_idx, row_offset, col_offset
+        row_idx = input_arr.shape[0] - 1
+        col_idx = input_arr.shape[1] - 1
+        tiepoint_value = input_arr[row_idx, col_idx]
+        row_offset = row_idx * self._fine_pixels_per_coarse_pixel + self._fine_pixels_per_coarse_pixel
+        col_offset = col_idx * self._fine_pixels_per_coarse_pixel + self._fine_pixels_per_coarse_pixel
+        self._expand_tiepoint_array_1km_with_repeat(tiepoint_value, expanded_arr, row_offset, col_offset)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_1km_with_repeat(
+            self,
+            floating tiepoint_value,
+            floating[:, ::1] expanded_arr,
+            Py_ssize_t row_offset,
+            Py_ssize_t col_offset,
+    ) nogil:
+        cdef Py_ssize_t length_repeat_cycle, width_repeat_cycle
+        cdef Py_ssize_t row_repeat_offset, col_repeat_offset
+        for length_repeat_cycle in range(self._fine_pixels_per_coarse_pixel):
+            row_repeat_offset = row_offset + length_repeat_cycle
+            for width_repeat_cycle in range(self._fine_pixels_per_coarse_pixel):
+                col_repeat_offset = col_offset + width_repeat_cycle
+                expanded_arr[row_repeat_offset, col_repeat_offset] = tiepoint_value
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km(self, floating[:, :] input_arr, floating[:, ::1] expanded_arr) nogil:
+        self._expand_tiepoint_array_5km_main(input_arr, expanded_arr)
+        self._expand_tiepoint_array_5km_left(input_arr, expanded_arr)
+        if self._coarse_scan_width == 270:
+            self._expand_tiepoint_array_5km_270_extra_column(input_arr, expanded_arr)
+        self._expand_tiepoint_array_5km_right(input_arr, expanded_arr)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km_main(self, floating[:, :] input_arr, floating[:, ::1] expanded_arr) nogil:
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, col_idx, row_offset, col_offset
+        for row_idx in range(input_arr.shape[0]):
+            row_offset = row_idx * self._fine_pixels_per_coarse_pixel * 2
+            for col_idx in range(input_arr.shape[1]):
+                col_offset = col_idx * self._fine_pixels_per_coarse_pixel + self._factor_5km
+                tiepoint_value = input_arr[row_idx, col_idx]
+                self._expand_tiepoint_array_5km_with_repeat(
+                    tiepoint_value,
+                    expanded_arr,
+                    row_offset,
+                    col_offset,
+                    self._fine_pixels_per_coarse_pixel)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km_270_extra_column(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr
+    ) nogil:
+        """Copy an extra coarse pixel column between the main copied area and the right-most columns."""
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, col_idx, row_offset, col_offset
+        col_idx = input_arr.shape[1] - 1
+        col_offset = col_idx * self._fine_pixels_per_coarse_pixel + self._fine_pixels_per_coarse_pixel + self._factor_5km
+        for row_idx in range(input_arr.shape[0]):
+            row_offset = row_idx * self._fine_pixels_per_coarse_pixel * 2
+            tiepoint_value = input_arr[row_idx, col_idx]
+            self._expand_tiepoint_array_5km_with_repeat(
+                tiepoint_value,
+                expanded_arr,
+                row_offset,
+                col_offset,
+                self._fine_pixels_per_coarse_pixel)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km_left(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr,
+    ) nogil:
+        self._expand_tiepoint_array_5km_edges(input_arr, expanded_arr, 0, 0)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km_right(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr,
+    ) nogil:
+        self._expand_tiepoint_array_5km_edges(
+            input_arr,
+            expanded_arr,
+            input_arr.shape[1] - 1,
+            expanded_arr.shape[1] - self._factor_5km)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km_edges(
+            self,
+            floating[:, :] input_arr,
+            floating[:, ::1] expanded_arr,
+            Py_ssize_t coarse_col_idx,
+            Py_ssize_t fine_col_idx,
+    ) nogil:
+        cdef floating tiepoint_value
+        cdef Py_ssize_t row_idx, row_offset
+        for row_idx in range(input_arr.shape[0]):
+            row_offset = row_idx * self._fine_pixels_per_coarse_pixel * 2
+            tiepoint_value = input_arr[row_idx, coarse_col_idx]
+            self._expand_tiepoint_array_5km_with_repeat(
+                tiepoint_value,
+                expanded_arr,
+                row_offset,
+                fine_col_idx,
+                self._factor_5km)
+
+    @cython.boundscheck(False)
+    @cython.cdivision(True)
+    @cython.wraparound(False)
+    @cython.initializedcheck(False)
+    cdef void _expand_tiepoint_array_5km_with_repeat(
+            self,
+            floating tiepoint_value,
+            floating[:, ::1] expanded_arr,
+            Py_ssize_t row_offset,
+            Py_ssize_t col_offset,
+            Py_ssize_t col_width,
+    ) nogil:
+        cdef Py_ssize_t length_repeat_cycle, width_repeat_cycle
+        for length_repeat_cycle in range(self._fine_pixels_per_coarse_pixel * 2):
+            for width_repeat_cycle in range(col_width):
+                expanded_arr[row_offset + length_repeat_cycle,
+                             col_offset + width_repeat_cycle] = tiepoint_value
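The expansion methods above replicate each coarse tiepoint value into a block of fine pixels. Ignoring the half-pixel offsets and edge handling that the Cython code adds, the main-area copy is equivalent to this NumPy sketch (the function name is illustrative, not part of the module):

```python
import numpy as np

def expand_tiepoint_array(coarse, factor):
    """Repeat each coarse value into a factor x factor block of fine pixels.

    NumPy sketch of the main-area expansion only; the Cython methods above
    additionally handle the half-pixel offset and the extra edge rows/columns.
    """
    return np.repeat(np.repeat(coarse, factor, axis=0), factor, axis=1)

coarse = np.array([[1.0, 2.0],
                   [3.0, 4.0]])
fine = expand_tiepoint_array(coarse, 2)
# fine is 4x4: each coarse value fills a 2x2 block
```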


=====================================
geotiepoints/_modis_utils.pxd
=====================================
@@ -0,0 +1,22 @@
+cimport numpy as np
+
+ctypedef fused floating:
+    np.float32_t
+    np.float64_t
+
+cdef void lonlat2xyz(
+        floating[:, ::1] lons,
+        floating[:, ::1] lats,
+        floating[:, :, ::1] xyz,
+) nogil
+
+cdef void xyz2lonlat(
+        floating[:, :, ::1] xyz,
+        floating[:, ::1] lons,
+        floating[:, ::1] lats,
+        bint low_lat_z=*,
+        floating thr=*,
+) nogil
+
+cdef floating rad2deg(floating x) nogil
+cdef floating deg2rad(floating x) nogil


=====================================
geotiepoints/_modis_utils.pyx
=====================================
@@ -0,0 +1,205 @@
+from functools import wraps
+
+cimport cython
+cimport numpy as np
+from libc.math cimport asin, sin, cos, sqrt, acos, M_PI
+import numpy as np
+
+try:
+    import dask.array as da
+except ImportError:
+    # if dask can't be imported then we aren't going to be given dask arrays
+    da = None
+
+try:
+    import xarray as xr
+except ImportError:
+    xr = None
+
+
+DEF EARTH_RADIUS = 6370997.0
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+cdef void lonlat2xyz(
+        floating[:, ::1] lons,
+        floating[:, ::1] lats,
+        floating[:, :, ::1] xyz,
+) nogil:
+    """Convert lons and lats to cartesian coordinates."""
+    cdef Py_ssize_t i, j, k
+    cdef floating lon_rad, lat_rad
+    for i in range(lons.shape[0]):
+        for j in range(lons.shape[1]):
+            lon_rad = deg2rad(lons[i, j])
+            lat_rad = deg2rad(lats[i, j])
+            xyz[i, j, 0] = EARTH_RADIUS * cos(lat_rad) * cos(lon_rad)
+            xyz[i, j, 1] = EARTH_RADIUS * cos(lat_rad) * sin(lon_rad)
+            xyz[i, j, 2] = EARTH_RADIUS * sin(lat_rad)
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+cdef void xyz2lonlat(
+        floating[:, :, ::1] xyz,
+        floating[:, ::1] lons,
+        floating[:, ::1] lats,
+        bint low_lat_z=True,
+        floating thr=0.8) nogil:
+    """Get longitudes and latitudes from cartesian coordinates."""
+    cdef Py_ssize_t i, j
+    cdef np.float64_t x, y, z
+    for i in range(xyz.shape[0]):
+        for j in range(xyz.shape[1]):
+            # use 64-bit precision to limit rounding error in the trig operations below
+            x = <np.float64_t>xyz[i, j, 0]
+            y = <np.float64_t>xyz[i, j, 1]
+            z = <np.float64_t>xyz[i, j, 2]
+            lons[i, j] = rad2deg(acos(x / sqrt(x ** 2 + y ** 2))) * _sign(y)
+            # if we are at low latitudes - small z, then get the
+            # latitudes only from z. If we are at high latitudes (close to the poles)
+            # then derive the latitude using x and y:
+            if low_lat_z and (z < thr * EARTH_RADIUS) and (z > -1.0 * thr * EARTH_RADIUS):
+                lats[i, j] = 90 - rad2deg(acos(z / EARTH_RADIUS))
+            else:
+                lats[i, j] = _sign(z) * (90 - rad2deg(asin(sqrt(x ** 2 + y ** 2) / EARTH_RADIUS)))
+
+
+cdef inline int _sign(floating x) nogil:
+    return 1 if x > 0 else (-1 if x < 0 else 0)
+
+
+ at cython.cdivision(True)
+cdef inline floating rad2deg(floating x) nogil:
+    return x * (180.0 / M_PI)
+
+
+ at cython.cdivision(True)
+cdef inline floating deg2rad(floating x) nogil:
+    return x * (M_PI / 180.0)
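The lon/lat to cartesian conversions above (spherical Earth, radius `EARTH_RADIUS`) can be sanity-checked with a scalar round trip in plain Python. This is a simplified sketch: it only uses the low-latitude branch of `xyz2lonlat` and a naive sign for the longitude.

```python
import math

EARTH_RADIUS = 6370997.0  # same spherical radius as the Cython module

def lonlat2xyz(lon_deg, lat_deg):
    # degrees -> cartesian, as in the Cython lonlat2xyz
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    return (EARTH_RADIUS * math.cos(lat) * math.cos(lon),
            EARTH_RADIUS * math.cos(lat) * math.sin(lon),
            EARTH_RADIUS * math.sin(lat))

def xyz2lonlat(x, y, z):
    # cartesian -> degrees; low-latitude branch only (lat from z)
    lon = math.degrees(math.acos(x / math.sqrt(x**2 + y**2))) * (1 if y >= 0 else -1)
    lat = 90.0 - math.degrees(math.acos(z / EARTH_RADIUS))
    return lon, lat

lon, lat = xyz2lonlat(*lonlat2xyz(12.5, 45.0))
# round trip recovers the input to within floating point error
```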
+
+
+def rows_per_scan_for_resolution(res):
+    return {
+        5000: 2,
+        1000: 10,
+        500: 20,
+        250: 40,
+    }[res]
+
+
+def scanline_mapblocks(func):
+    """Convert dask array inputs to appropriate map_blocks calls.
+
+    This function, applied as a decorator, will call the wrapped function
+    using dask's ``map_blocks``. It will rechunk dask array inputs when
+    necessary to make sure that the input chunks are entire scanlines to
+    avoid incorrect interpolation.
+
+    """
+    @wraps(func)
+    def _wrapper(*args, coarse_resolution=None, fine_resolution=None, **kwargs):
+        if coarse_resolution is None or fine_resolution is None:
+            raise ValueError("'coarse_resolution' and 'fine_resolution' are required keyword arguments.")
+        first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
+        if any(arr.ndim != 2 for arr in args if hasattr(arr, "ndim")):
+            raise ValueError("Expected 2D input arrays.")
+        if hasattr(first_arr, "compute"):
+            # assume it is dask or xarray with dask, ensure proper chunk size
+            # if DataArray get just the dask array
+            dask_args = _extract_dask_arrays_from_args(args)
+            rows_per_scan = rows_per_scan_for_resolution(coarse_resolution)
+            rechunked_args = _rechunk_dask_arrays_if_needed(dask_args, rows_per_scan)
+            results = _call_map_blocks_interp(
+                func,
+                coarse_resolution,
+                fine_resolution,
+                *rechunked_args,
+                **kwargs
+            )
+            if hasattr(first_arr, "dims"):
+                # recreate DataArrays
+                results = _results_to_data_arrays(first_arr.dims, *results)
+            return results
+        return func(
+            *args,
+            coarse_resolution=coarse_resolution,
+            fine_resolution=fine_resolution,
+            **kwargs
+        )
+
+    return _wrapper
+
+
+def _extract_dask_arrays_from_args(args):
+    return [arr_obj.data if hasattr(arr_obj, "dims") else arr_obj for arr_obj in args]
+
+
+def _call_map_blocks_interp(func, coarse_resolution, fine_resolution, *args, **kwargs):
+    first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
+    res_factor = coarse_resolution // fine_resolution
+    new_row_chunks = tuple(x * res_factor for x in first_arr.chunks[0])
+    fine_pixels_per_1km = {250: 4, 500: 2, 1000: 1}[fine_resolution]
+    fine_scan_width = 1354 * fine_pixels_per_1km
+    new_col_chunks = (fine_scan_width,)
+    wrapped_func = _map_blocks_handler(func)
+    res = da.map_blocks(wrapped_func, *args,
+                        coarse_resolution=coarse_resolution,
+                        fine_resolution=fine_resolution,
+                        **kwargs,
+                        new_axis=[0],
+                        chunks=(2, new_row_chunks, new_col_chunks),
+                        dtype=first_arr.dtype,
+                        meta=np.empty((2, 2, 2), dtype=first_arr.dtype))
+    return tuple(res[idx] for idx in range(res.shape[0]))
+
+
+def _results_to_data_arrays(dims, *results):
+    new_results = []
+    for result in results:
+        if not isinstance(result, da.Array):
+            continue
+        data_arr = xr.DataArray(result, dims=dims)
+        new_results.append(data_arr)
+    return new_results
+
+
+def _rechunk_dask_arrays_if_needed(args, rows_per_scan: int):
+    # take current chunk size and get a relatively similar chunk size
+    first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
+    row_chunks = first_arr.chunks[0]
+    col_chunks = first_arr.chunks[1]
+    num_rows = first_arr.shape[0]
+    num_cols = first_arr.shape[-1]
+    good_row_chunks = all(x % rows_per_scan == 0 for x in row_chunks)
+    good_col_chunks = len(col_chunks) == 1 and col_chunks[0] == num_cols
+    all_orig_chunks = [arr.chunks for arr in args if hasattr(arr, "chunks")]
+
+    if num_rows % rows_per_scan != 0:
+        raise ValueError("Input longitude/latitude data does not consist of "
+                         f"whole scans ({rows_per_scan} rows per scan).")
+    all_same_chunks = all(
+        all_orig_chunks[0] == some_chunks
+        for some_chunks in all_orig_chunks[1:]
+    )
+    if good_row_chunks and good_col_chunks and all_same_chunks:
+        return args
+
+    new_row_chunks = (row_chunks[0] // rows_per_scan) * rows_per_scan
+    new_args = [arr.rechunk((new_row_chunks, -1)) if hasattr(arr, "chunks") else arr for arr in args]
+    return new_args
+
+
+def _map_blocks_handler(func):
+    @wraps(func)
+    def _map_blocks_wrapper(*args, **kwargs):
+        results = func(*args, **kwargs)
+        return np.concatenate(
+            tuple(result[np.newaxis] for result in results),
+            axis=0)
+    return _map_blocks_wrapper
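The handler stacks the wrapped function's (lons, lats) outputs along a new leading axis so `map_blocks` can return a single array, which `_call_map_blocks_interp` then splits again by indexing axis 0. The stacking step in NumPy terms:

```python
import numpy as np

def stack_results(*results):
    # same stacking as _map_blocks_wrapper: new leading axis, one slot per output
    return np.concatenate(tuple(r[np.newaxis] for r in results), axis=0)

lons = np.zeros((4, 6))
lats = np.ones((4, 6))
stacked = stack_results(lons, lats)
# stacked.shape == (2, 4, 6); stacked[0] is lons, stacked[1] is lats
```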
+
+


=====================================
geotiepoints/_simple_modis_interpolator.pyx
=====================================
@@ -0,0 +1,275 @@
+cimport cython
+
+from ._modis_utils cimport floating
+from ._modis_utils cimport lonlat2xyz, xyz2lonlat
+from ._modis_utils import rows_per_scan_for_resolution
+cimport numpy as np
+import numpy as np
+from scipy.ndimage import map_coordinates
+
+
+def interpolate_geolocation_cartesian(
+        np.ndarray[floating, ndim=2] lon_array,
+        np.ndarray[floating, ndim=2] lat_array,
+        unsigned int coarse_resolution,
+        unsigned int fine_resolution):
+    lon_array = np.ascontiguousarray(lon_array)
+    lat_array = np.ascontiguousarray(lat_array)
+    cdef unsigned int rows_per_scan = rows_per_scan_for_resolution(coarse_resolution)
+    cdef unsigned int res_factor = coarse_resolution // fine_resolution
+    cdef Py_ssize_t num_rows = lon_array.shape[0]
+    cdef Py_ssize_t num_cols = lon_array.shape[1]
+    cdef unsigned int num_scans = num_rows // rows_per_scan
+
+    # SciPy's map_coordinates requires the x/y dimension to be first
+    cdef np.ndarray[floating, ndim=3] coordinates = np.empty(
+        (2, res_factor * rows_per_scan, res_factor * num_cols), dtype=lon_array.dtype)
+    cdef floating[:, :, ::1] coordinates_view = coordinates
+    _compute_yx_coordinate_arrays(res_factor, coordinates_view)
+
+    cdef np.ndarray[floating, ndim=3] xyz_result = np.empty(
+        (res_factor * rows_per_scan, num_cols * res_factor, 3), dtype=lon_array.dtype)
+    cdef floating[:, :, ::1] xyz_result_view = xyz_result
+    cdef np.ndarray[floating, ndim=3] xyz_in = np.empty(
+        (rows_per_scan, num_cols, 3), dtype=lon_array.dtype)
+    cdef floating[:, :, ::1] xyz_in_view = xyz_in
+    cdef floating[:, ::1] lon_in_view = lon_array
+    cdef floating[:, ::1] lat_in_view = lat_array
+
+    cdef np.ndarray[floating, ndim=2] new_lons = np.empty((res_factor * num_rows, res_factor * num_cols),
+                                                          dtype=lon_array.dtype)
+    cdef np.ndarray[floating, ndim=2] new_lats = np.empty((res_factor * num_rows, res_factor * num_cols),
+                                                          dtype=lon_array.dtype)
+    cdef floating[:, ::1] new_lons_view = new_lons
+    cdef floating[:, ::1] new_lats_view = new_lats
+
+    # Interpolate one scan at a time; the tiepoint geometry is only consistent within a single scan
+    cdef Py_ssize_t scan_idx, j0, j1, k0, k1, comp_index
+    with nogil:
+        for scan_idx in range(num_scans):
+            # Calculate indexes
+            j0 = rows_per_scan * scan_idx
+            j1 = j0 + rows_per_scan
+            k0 = rows_per_scan * res_factor * scan_idx
+            k1 = k0 + rows_per_scan * res_factor
+            lonlat2xyz(lon_in_view[j0:j1, :], lat_in_view[j0:j1, :], xyz_in_view)
+
+            _compute_interpolated_xyz_scan(
+                res_factor, coordinates_view, xyz_in_view,
+                xyz_result_view)
+
+            xyz2lonlat(xyz_result_view, new_lons_view[k0:k1], new_lats_view[k0:k1], low_lat_z=True)
+    return new_lons, new_lats
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef void _compute_yx_coordinate_arrays(
+        unsigned int res_factor,
+        floating[:, :, ::1] coordinates,
+) nogil:
+    cdef Py_ssize_t i, j
+    for j in range(coordinates.shape[1]):
+        for i in range(coordinates.shape[2]):
+            # y coordinate - 0.375 for 250m, 0.25 for 500m
+            coordinates[0, j, i] = j * (1.0 / res_factor) - (res_factor * (1.0 / 16) + (1.0 / 8))
+            # x coordinate
+            coordinates[1, j, i] = i * (1.0 / res_factor)
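The y-coordinate offset noted in the comment above ("0.375 for 250m, 0.25 for 500m") follows directly from the formula; a quick arithmetic check (illustrative function name):

```python
def y_offset(res_factor):
    # offset subtracted from every fine-row y coordinate
    # in _compute_yx_coordinate_arrays above
    return res_factor * (1.0 / 16) + (1.0 / 8)

# res_factor 4 is 1 km -> 250 m, res_factor 2 is 1 km -> 500 m
offsets = (y_offset(4), y_offset(2))
# offsets == (0.375, 0.25)
```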
+
+
+ at cython.boundscheck(False)
+cdef void _compute_interpolated_xyz_scan(
+        unsigned int res_factor,
+        floating[:, :, ::1] coordinates_view,
+        floating[:, :, ::1] xyz_input_view,
+        floating[:, :, ::1] xyz_result_view,
+) nogil:
+    cdef Py_ssize_t comp_index
+    cdef floating[:, :] input_view, result_view
+    with gil:
+        for comp_index in range(3):
+            input_view = xyz_input_view[:, :, comp_index]
+            result_view = xyz_result_view[:, :, comp_index]
+            _call_map_coordinates(
+                input_view,
+                coordinates_view,
+                result_view,
+            )
+
+    if res_factor == 4:
+        for comp_index in range(3):
+            result_view = xyz_result_view[:, :, comp_index]
+            _extrapolate_xyz_rightmost_columns(result_view, 3)
+            _interpolate_xyz_250(
+                result_view,
+                coordinates_view,
+            )
+    else:
+        for comp_index in range(3):
+            result_view = xyz_result_view[:, :, comp_index]
+            _extrapolate_xyz_rightmost_columns(result_view, 1)
+            _interpolate_xyz_500(
+                result_view,
+                coordinates_view,
+            )
+
+
+cdef void _call_map_coordinates(
+        floating[:, :] nav_array_view,
+        floating[:, :, ::1] coordinates_view,
+        floating[:, :] result_view,
+):
+    cdef np.ndarray[floating, ndim=2] nav_array = np.asarray(nav_array_view)
+    cdef np.ndarray[floating, ndim=3] coordinates_array = np.asarray(coordinates_view)
+    cdef np.ndarray[floating, ndim=2] result_array = np.asarray(result_view)
+    # Use bilinear interpolation for all 250 meter pixels
+    map_coordinates(nav_array, coordinates_array,
+                    output=result_array,
+                    order=1, mode='nearest')
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef void _extrapolate_xyz_rightmost_columns(
+        floating[:, :] result_view,
+        int num_columns,
+) nogil:
+    cdef Py_ssize_t row_idx, col_offset
+    cdef floating last_interp_col_diff
+    for row_idx in range(result_view.shape[0]):
+        last_interp_col_diff = result_view[row_idx, result_view.shape[1] - num_columns - 1] - \
+                               result_view[row_idx, result_view.shape[1] - num_columns - 2]
+        for col_offset in range(num_columns):
+            # map_coordinates repeated the last columns value, we now add more to it as an "extrapolation"
+            result_view[row_idx, result_view.shape[1] - num_columns + col_offset] += last_interp_col_diff * (col_offset + 1)
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef void _interpolate_xyz_250(
+        floating[:, :] result_view,
+        floating[:, :, ::1] coordinates_view,
+) nogil:
+    cdef Py_ssize_t col_idx
+    cdef floating m, b
+    cdef floating[:] result_col_view
+    cdef floating[:, ::1] y_coordinates = coordinates_view[0]
+    for col_idx in range(result_view.shape[1]):
+        result_col_view = result_view[:, col_idx]
+        # Use linear extrapolation for the first two 250 meter pixels along track
+        m = _calc_slope_250(result_col_view,
+                            y_coordinates,
+                            2)
+        b = _calc_offset_250(result_col_view,
+                             y_coordinates,
+                             m,
+                             2)
+        result_view[0, col_idx] = m * y_coordinates[0, 0] + b
+        result_view[1, col_idx] = m * y_coordinates[1, 0] + b
+
+        # Use linear extrapolation for the last two 250 meter pixels along track
+        # m = (result_array[k0 + 37, :] - result_array[k0 + 34, :]) / (y[37, 0] - y[34, 0])
+        # b = result_array[k0 + 37, :] - m * y[37, 0]
+        m = _calc_slope_250(result_col_view,
+                            y_coordinates,
+                            34)
+        b = _calc_offset_250(result_col_view,
+                             y_coordinates,
+                             m,
+                             34)
+        result_view[38, col_idx] = m * y_coordinates[38, 0] + b
+        result_view[39, col_idx] = m * y_coordinates[39, 0] + b
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef void _interpolate_xyz_500(
+        floating[:, :] result_view,
+        floating[:, :, ::1] coordinates_view,
+) nogil:
+    cdef Py_ssize_t col_idx
+    cdef floating m, b
+    for col_idx in range(result_view.shape[1]):
+        # Use linear extrapolation for the first 500 meter pixel along track
+        m = _calc_slope_500(
+            result_view[:, col_idx],
+            coordinates_view[0],
+            1)
+        b = _calc_offset_500(
+            result_view[:, col_idx],
+            coordinates_view[0],
+            m,
+            1)
+        result_view[0, col_idx] = m * coordinates_view[0, 0, 0] + b
+
+        # Use linear extrapolation for the last 500 meter pixel along track
+        m = _calc_slope_500(
+            result_view[:, col_idx],
+            coordinates_view[0],
+            17)
+        b = _calc_offset_500(
+            result_view[:, col_idx],
+            coordinates_view[0],
+            m,
+            17)
+        result_view[19, col_idx] = m * coordinates_view[0, 19, 0] + b
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef inline floating _calc_slope_250(
+        floating[:] result_view,
+        floating[:, ::1] y,
+        Py_ssize_t offset,
+) nogil:
+    return (result_view[offset + 3] - result_view[offset]) / \
+           (y[offset + 3, 0] - y[offset, 0])
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef inline floating _calc_offset_250(
+        floating[:] result_view,
+        floating[:, ::1] y,
+        floating m,
+        Py_ssize_t offset,
+) nogil:
+    return result_view[offset + 3] - m * y[offset + 3, 0]
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef inline floating _calc_slope_500(
+        floating[:] result_view,
+        floating[:, ::1] y,
+        Py_ssize_t offset,
+) nogil:
+    return (result_view[offset + 1] - result_view[offset]) / \
+           (y[offset + 1, 0] - y[offset, 0])
+
+
+ at cython.boundscheck(False)
+ at cython.cdivision(True)
+ at cython.wraparound(False)
+ at cython.initializedcheck(False)
+cdef inline floating _calc_offset_500(
+        floating[:] result_view,
+        floating[:, ::1] y,
+        floating m,
+        Py_ssize_t offset,
+) nogil:
+    return result_view[offset + 1] - m * y[offset + 1, 0]
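The per-scan interpolation in this file ultimately relies on SciPy's `map_coordinates` with `order=1` (bilinear) and `mode='nearest'` (edge values repeated beyond the grid), as in `_call_map_coordinates`. A minimal standalone example of that call, with hand-checkable inputs:

```python
import numpy as np
from scipy.ndimage import map_coordinates

# 2 x 3 "coarse" grid and fractional (y, x) sample coordinates,
# mirroring the order=1, mode='nearest' call in _call_map_coordinates
nav = np.array([[0.0, 1.0, 2.0],
                [3.0, 4.0, 5.0]])
coords = np.array([[[0.5, 0.5]],   # y coordinates
                   [[0.5, 2.5]]])  # x coordinates (2.5 lies past the last column)
out = np.empty((1, 2), dtype=nav.dtype)
map_coordinates(nav, coords, output=out, order=1, mode='nearest')
# out[0, 0]: bilinear mean of the 2x2 block {0, 1, 3, 4} -> 2.0
# out[0, 1]: mode='nearest' repeats the last column, so only y interpolates -> 3.5
```

The edge repetition at x=2.5 is exactly why `_extrapolate_xyz_rightmost_columns` above then replaces those repeated columns with a linear extrapolation.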


=====================================
geotiepoints/modisinterpolator.py
=====================================
@@ -19,7 +19,6 @@
 
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
-
 """Interpolation of MODIS data using satellite zenith angle.
 
 Interpolation of geographical tiepoints using the second order interpolation
@@ -35,245 +34,31 @@ https://doi.org/10.1016/j.acags.2020.100025.
 (https://www.sciencedirect.com/science/article/pii/S2590197420300070)
 """
 
-import numpy as np
 import warnings
 
-from .geointerpolator import lonlat2xyz, xyz2lonlat
-from .simple_modis_interpolator import scanline_mapblocks
-
-R = 6371.
-# Aqua altitude in km
-H = 709.
-
-
-def _compute_phi(zeta):
-    return np.arcsin(R * np.sin(zeta) / (R + H))
-
-
-def _compute_theta(zeta, phi):
-    return zeta - phi
-
-
-def _compute_zeta(phi):
-    return np.arcsin((R + H) * np.sin(phi) / R)
-
-
-def _compute_expansion_alignment(satz_a, satz_b, scan_width):
-    """All angles in radians."""
-    zeta_a = satz_a
-    zeta_b = satz_b
-
-    phi_a = _compute_phi(zeta_a)
-    phi_b = _compute_phi(zeta_b)
-    theta_a = _compute_theta(zeta_a, phi_a)
-    theta_b = _compute_theta(zeta_b, phi_b)
-    phi = (phi_a + phi_b) / 2
-    zeta = _compute_zeta(phi)
-    theta = _compute_theta(zeta, phi)
-    # Workaround for tiepoints symetrical about the subsatellite-track
-    denominator = np.where(theta_a == theta_b, theta_a * 2, theta_a - theta_b)
-
-    c_expansion = 4 * (((theta_a + theta_b) / 2 - theta) / denominator)
-
-    sin_beta_2 = scan_width / (2 * H)
-
-    d = ((R + H) / R * np.cos(phi) - np.cos(zeta)) * sin_beta_2
-    e = np.cos(zeta) - np.sqrt(np.cos(zeta) ** 2 - d ** 2)
-
-    c_alignment = 4 * e * np.sin(zeta) / denominator
-
-    return c_expansion, c_alignment
-
-
-def _get_corners(arr):
-    arr_a = arr[:, :-1, :-1]
-    arr_b = arr[:, :-1, 1:]
-    arr_c = arr[:, 1:, 1:]
-    arr_d = arr[:, 1:, :-1]
-    return arr_a, arr_b, arr_c, arr_d
-
-
- at scanline_mapblocks
-def _interpolate(
-        lon1,
-        lat1,
-        satz1,
-        coarse_resolution=None,
-        fine_resolution=None,
-        coarse_scan_width=None,
-):
-    """Helper function to interpolate scan-aligned arrays.
-
-    This function's decorator runs this function for each dask block/chunk of
-    scans. The arrays are scan-aligned meaning they are an even number of scans
-    (N rows per scan) and contain the entire scan width.
-
-    """
-    interp = _MODISInterpolator(coarse_resolution, fine_resolution, coarse_scan_width=coarse_scan_width)
-    return interp.interpolate(lon1, lat1, satz1)
-
-
-class _MODISInterpolator:
-    """Helper class for MODIS interpolation.
-
-    Not intended for public use. Use ``modis_X_to_Y`` functions instead.
-
-    """
-    def __init__(self, coarse_resolution, fine_resolution, coarse_scan_width=None):
-        if coarse_resolution == 1000:
-            coarse_scan_length = 10
-            coarse_scan_width = 1354
-            self._get_coords = self._get_coords_1km
-            self._expand_tiepoint_array = self._expand_tiepoint_array_1km
-        elif coarse_resolution == 5000:
-            coarse_scan_length = 2
-            self._get_coords = self._get_coords_5km
-            self._expand_tiepoint_array = self._expand_tiepoint_array_5km
-            if coarse_scan_width is None:
-                coarse_scan_width = 271
-            else:
-                coarse_scan_width = coarse_scan_width
-        self._coarse_scan_length = coarse_scan_length
-        self._coarse_scan_width = coarse_scan_width
-        self._coarse_pixels_per_1km = coarse_resolution // 1000
-
-        fine_pixels_per_1km = {
-            250: 4,
-            500: 2,
-            1000: 1,
-        }[fine_resolution]
-        self._fine_pixels_per_coarse_pixel = fine_pixels_per_1km * self._coarse_pixels_per_1km
-        self._fine_scan_width = 1354 * fine_pixels_per_1km
-        self._fine_scan_length = fine_pixels_per_1km * 10 // coarse_scan_length
-        self._coarse_resolution = coarse_resolution
-        self._fine_resolution = fine_resolution
-
-    def interpolate(self, lon1, lat1, satz1):
-        """Interpolate MODIS geolocation from 'coarse_resolution' to 'fine_resolution'."""
-        scans = satz1.shape[0] // self._coarse_scan_length
-        # reshape to (num scans, rows per scan, columns per scan)
-        satz1 = satz1.reshape((-1, self._coarse_scan_length, self._coarse_scan_width))
-
-        satz_a, satz_b = _get_corners(np.deg2rad(satz1))[:2]
-        c_exp, c_ali = _compute_expansion_alignment(satz_a, satz_b, self._coarse_pixels_per_1km)
-
-        x, y = self._get_coords(scans)
-        i_rs, i_rt = np.meshgrid(x, y)
-
-        p_os = 0
-        p_ot = 0
-        s_s = (p_os + i_rs) * 1.0 / self._fine_pixels_per_coarse_pixel
-        s_t = (p_ot + i_rt) * 1.0 / self._fine_scan_length
-
-        c_exp_full = self._expand_tiepoint_array(c_exp)
-        c_ali_full = self._expand_tiepoint_array(c_ali)
-
-        a_track = s_t
-        a_scan = s_s + s_s * (1 - s_s) * c_exp_full + s_t * (1 - s_t) * c_ali_full
-
-        res = []
-        datasets = lonlat2xyz(lon1, lat1)
-        for data in datasets:
-            data = data.reshape((-1, self._coarse_scan_length, self._coarse_scan_width))
-            data_a, data_b, data_c, data_d = _get_corners(data)
-            data_a = self._expand_tiepoint_array(data_a)
-            data_b = self._expand_tiepoint_array(data_b)
-            data_c = self._expand_tiepoint_array(data_c)
-            data_d = self._expand_tiepoint_array(data_d)
-
-            data_1 = (1 - a_scan) * data_a + a_scan * data_b
-            data_2 = (1 - a_scan) * data_d + a_scan * data_c
-            data = (1 - a_track) * data_1 + a_track * data_2
-
-            res.append(data)
-        new_lons, new_lats = xyz2lonlat(*res)
-        return new_lons.astype(lon1.dtype), new_lats.astype(lat1.dtype)
-
-    def _get_coords_1km(self, scans):
-        y = (
-            np.arange((self._coarse_scan_length + 1) * self._fine_scan_length) % self._fine_scan_length
-        ) + 0.5
-        half_scan_length = self._fine_scan_length // 2
-        y = y[half_scan_length:-half_scan_length]
-        y[:half_scan_length] = np.arange(-self._fine_scan_length / 2 + 0.5, 0)
-        y[-half_scan_length:] = np.arange(self._fine_scan_length + 0.5, self._fine_scan_length * 3 / 2)
-        y = np.tile(y, scans)
-
-        x = np.arange(self._fine_scan_width) % self._fine_pixels_per_coarse_pixel
-        x[-self._fine_pixels_per_coarse_pixel:] = np.arange(
-            self._fine_pixels_per_coarse_pixel,
-            self._fine_pixels_per_coarse_pixel * 2)
-        return x, y
-
-    def _get_coords_5km(self, scans):
-        y = np.arange(self._fine_scan_length * self._coarse_scan_length) - 2
-        y = np.tile(y, scans)
-
-        x = (np.arange(self._fine_scan_width) - 2) % self._fine_pixels_per_coarse_pixel
-        x[0] = -2
-        x[1] = -1
-        if self._coarse_scan_width == 271:
-            x[-2] = 5
-            x[-1] = 6
-        elif self._coarse_scan_width == 270:
-            x[-7] = 5
-            x[-6] = 6
-            x[-5] = 7
-            x[-4] = 8
-            x[-3] = 9
-            x[-2] = 10
-            x[-1] = 11
-        else:
-            raise NotImplementedError(
-                "Can't interpolate if 5km tiepoints have less than 270 columns."
-            )
-        return x, y
-
-    def _expand_tiepoint_array_1km(self, arr):
-        arr = np.repeat(arr, self._fine_scan_length, axis=1)
-        arr = np.concatenate(
-            (arr[:, :self._fine_scan_length // 2, :], arr, arr[:, -(self._fine_scan_length // 2):, :]), axis=1
-        )
-        arr = np.repeat(arr.reshape((-1, self._coarse_scan_width - 1)), self._fine_pixels_per_coarse_pixel, axis=1)
-        return np.hstack((arr, arr[:, -self._fine_pixels_per_coarse_pixel:]))
-
-    def _expand_tiepoint_array_5km(self, arr):
-        arr = np.repeat(arr, self._fine_scan_length * 2, axis=1)
-        arr = np.repeat(arr.reshape((-1, self._coarse_scan_width - 1)), self._fine_pixels_per_coarse_pixel, axis=1)
-        factor = self._fine_pixels_per_coarse_pixel // self._coarse_pixels_per_1km
-        if self._coarse_scan_width == 271:
-            return np.hstack((arr[:, :2 * factor], arr, arr[:, -2 * factor:]))
-        else:
-            return np.hstack(
-                (
-                    arr[:, :2 * factor],
-                    arr,
-                    arr[:, -self._fine_pixels_per_coarse_pixel:],
-                    arr[:, -2 * factor:],
-                )
-            )
+from ._modis_interpolator import interpolate
 
 
 def modis_1km_to_250m(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 1km to 250m resolution."""
-    return _interpolate(lon1, lat1, satz1,
-                        coarse_resolution=1000,
-                        fine_resolution=250)
+    return interpolate(lon1, lat1, satz1,
+                       coarse_resolution=1000,
+                       fine_resolution=250)
 
 
 def modis_1km_to_500m(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 1km to 500m resolution."""
-    return _interpolate(lon1, lat1, satz1,
-                        coarse_resolution=1000,
-                        fine_resolution=500)
+    return interpolate(lon1, lat1, satz1,
+                       coarse_resolution=1000,
+                       fine_resolution=500)
 
 
 def modis_5km_to_1km(lon1, lat1, satz1):
     """Interpolate MODIS geolocation from 5km to 1km resolution."""
-    return _interpolate(lon1, lat1, satz1,
-                        coarse_resolution=5000,
-                        fine_resolution=1000,
-                        coarse_scan_width=lon1.shape[1])
+    return interpolate(lon1, lat1, satz1,
+                       coarse_resolution=5000,
+                       fine_resolution=1000,
+                       coarse_scan_width=lon1.shape[1])
 
 
 def modis_5km_to_500m(lon1, lat1, satz1):
@@ -281,10 +66,10 @@ def modis_5km_to_500m(lon1, lat1, satz1):
     warnings.warn(
         "Interpolating 5km geolocation to 500m resolution " "may result in poor quality"
     )
-    return _interpolate(lon1, lat1, satz1,
-                        coarse_resolution=5000,
-                        fine_resolution=500,
-                        coarse_scan_width=lon1.shape[1])
+    return interpolate(lon1, lat1, satz1,
+                       coarse_resolution=5000,
+                       fine_resolution=500,
+                       coarse_scan_width=lon1.shape[1])
 
 
 def modis_5km_to_250m(lon1, lat1, satz1):
@@ -292,7 +77,7 @@ def modis_5km_to_250m(lon1, lat1, satz1):
     warnings.warn(
         "Interpolating 5km geolocation to 250m resolution " "may result in poor quality"
     )
-    return _interpolate(lon1, lat1, satz1,
-                        coarse_resolution=5000,
-                        fine_resolution=250,
-                        coarse_scan_width=lon1.shape[1])
+    return interpolate(lon1, lat1, satz1,
+                       coarse_resolution=5000,
+                       fine_resolution=250,
+                       coarse_scan_width=lon1.shape[1])
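The per-pixel blend that the removed `_MODISInterpolator.interpolate` performed (now ported to `_modis_interpolator.pyx`) is a bilinear mix of the four corner tiepoints: blend along-scan first, then along-track. A minimal standalone sketch of that step, with a hypothetical function name, operating on the already-expanded corner arrays and the `a_scan`/`a_track` weights:

```python
def blend_tiepoints(data_a, data_b, data_c, data_d, a_scan, a_track):
    # Along-scan blend on both scan edges of the tiepoint cell...
    data_1 = (1 - a_scan) * data_a + a_scan * data_b
    data_2 = (1 - a_scan) * data_d + a_scan * data_c
    # ...then along-track blend between the two edges.
    return (1 - a_track) * data_1 + a_track * data_2

# At the cell center both weights are 0.5, giving the mean of the corners.
center = blend_tiepoints(0.0, 1.0, 1.0, 0.0, 0.5, 0.5)
```
In the real code `a_scan` additionally carries the satellite-zenith-dependent expansion/alignment correction (`c_exp_full`, `c_ali_full`); the blend itself is unchanged by the Cython port.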


=====================================
geotiepoints/simple_modis_interpolator.py
=====================================
@@ -32,143 +32,8 @@ in these files.
 
 """
 
-from functools import wraps
-
-import numpy as np
-from scipy.ndimage.interpolation import map_coordinates
-from .geointerpolator import lonlat2xyz, xyz2lonlat
-
-try:
-    import dask.array as da
-except ImportError:
-    # if dask can't be imported then we aren't going to be given dask arrays
-    da = None
-
-try:
-    import xarray as xr
-except ImportError:
-    xr = None
-
-
-def _rows_per_scan_for_resolution(res):
-    return {
-        5000: 2,
-        1000: 10,
-        500: 20,
-        250: 40,
-    }[res]
-
-
-def scanline_mapblocks(func):
-    """Convert dask array inputs to appropriate map_blocks calls.
-
-    This function, applied as a decorator, will call the wrapped function
-    using dask's ``map_blocks``. It will rechunk dask array inputs when
-    necessary to make sure that the input chunks are entire scanlines to
-    avoid incorrect interpolation.
-
-    """
-    @wraps(func)
-    def _wrapper(*args, coarse_resolution=None, fine_resolution=None, **kwargs):
-        if coarse_resolution is None or fine_resolution is None:
-            raise ValueError("'coarse_resolution' and 'fine_resolution' are required keyword arguments.")
-        first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
-        if first_arr.ndim != 2 or first_arr.ndim != 2:
-            raise ValueError("Expected 2D input arrays.")
-        if hasattr(first_arr, "compute"):
-            # assume it is dask or xarray with dask, ensure proper chunk size
-            # if DataArray get just the dask array
-            dask_args = _extract_dask_arrays_from_args(args)
-            rows_per_scan = _rows_per_scan_for_resolution(coarse_resolution)
-            rechunked_args = _rechunk_dask_arrays_if_needed(dask_args, rows_per_scan)
-            results = _call_map_blocks_interp(
-                func,
-                coarse_resolution,
-                fine_resolution,
-                *rechunked_args,
-                **kwargs
-            )
-            if hasattr(first_arr, "dims"):
-                # recreate DataArrays
-                results = _results_to_data_arrays(first_arr.dims, *results)
-            return results
-        return func(
-            *args,
-            coarse_resolution=coarse_resolution,
-            fine_resolution=fine_resolution,
-            **kwargs
-        )
-
-    return _wrapper
-
-
-def _extract_dask_arrays_from_args(args):
-    return [arr_obj.data if hasattr(arr_obj, "dims") else arr_obj for arr_obj in args]
-
-
-def _call_map_blocks_interp(func, coarse_resolution, fine_resolution, *args, **kwargs):
-    first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
-    res_factor = coarse_resolution // fine_resolution
-    new_row_chunks = tuple(x * res_factor for x in first_arr.chunks[0])
-    fine_pixels_per_1km = {250: 4, 500: 2, 1000: 1}[fine_resolution]
-    fine_scan_width = 1354 * fine_pixels_per_1km
-    new_col_chunks = (fine_scan_width,)
-    wrapped_func = _map_blocks_handler(func)
-    res = da.map_blocks(wrapped_func, *args,
-                        coarse_resolution=coarse_resolution,
-                        fine_resolution=fine_resolution,
-                        **kwargs,
-                        new_axis=[0],
-                        chunks=(2, new_row_chunks, new_col_chunks),
-                        dtype=first_arr.dtype,
-                        meta=np.empty((2, 2, 2), dtype=first_arr.dtype))
-    return tuple(res[idx] for idx in range(res.shape[0]))
-
-
-def _results_to_data_arrays(dims, *results):
-    new_results = []
-    for result in results:
-        if not isinstance(result, da.Array):
-            continue
-        data_arr = xr.DataArray(result, dims=dims)
-        new_results.append(data_arr)
-    return new_results
-
-
-def _rechunk_dask_arrays_if_needed(args, rows_per_scan: int):
-    # take current chunk size and get a relatively similar chunk size
-    first_arr = [arr for arr in args if hasattr(arr, "ndim")][0]
-    row_chunks = first_arr.chunks[0]
-    col_chunks = first_arr.chunks[1]
-    num_rows = first_arr.shape[0]
-    num_cols = first_arr.shape[-1]
-    good_row_chunks = all(x % rows_per_scan == 0 for x in row_chunks)
-    good_col_chunks = len(col_chunks) == 1 and col_chunks[0] != num_cols
-    all_orig_chunks = [arr.chunks for arr in args if hasattr(arr, "chunks")]
-
-    if num_rows % rows_per_scan != 0:
-        raise ValueError("Input longitude/latitude data does not consist of "
-                         "whole scans (10 rows per scan).")
-    all_same_chunks = all(
-        all_orig_chunks[0] == some_chunks
-        for some_chunks in all_orig_chunks[1:]
-    )
-    if good_row_chunks and good_col_chunks and all_same_chunks:
-        return args
-
-    new_row_chunks = (row_chunks[0] // rows_per_scan) * rows_per_scan
-    new_args = [arr.rechunk((new_row_chunks, -1)) if hasattr(arr, "chunks") else arr for arr in args]
-    return new_args
-
-
-def _map_blocks_handler(func):
-    @wraps(func)
-    def _map_blocks_wrapper(*args, **kwargs):
-        results = func(*args, **kwargs)
-        return np.concatenate(
-            tuple(result[np.newaxis] for result in results),
-            axis=0)
-    return _map_blocks_wrapper
+from ._modis_utils import scanline_mapblocks
+from ._simple_modis_interpolator import interpolate_geolocation_cartesian as interp_cython
 
 
 @scanline_mapblocks
@@ -190,84 +55,9 @@ def interpolate_geolocation_cartesian(lon_array, lat_array, coarse_resolution, f
         A two-element tuple (lon, lat).
 
     """
-    rows_per_scan = _rows_per_scan_for_resolution(coarse_resolution)
-    res_factor = coarse_resolution // fine_resolution
-    num_rows, num_cols = lon_array.shape
-    num_scans = int(num_rows / rows_per_scan)
-    x_in, y_in, z_in = lonlat2xyz(lon_array, lat_array)
-
-    # Create an array of indexes that we want our result to have
-    x = np.arange(res_factor * num_cols, dtype=np.float32) * (1. / res_factor)
-    # 0.375 for 250m, 0.25 for 500m
-    y = np.arange(res_factor * rows_per_scan, dtype=np.float32) * \
-        (1. / res_factor) - (res_factor * (1. / 16) + (1. / 8))
-    x, y = np.meshgrid(x, y)
-    coordinates = np.array([y, x])  # Used by map_coordinates, major optimization
-
-    new_x = np.empty((num_rows * res_factor, num_cols * res_factor), dtype=lon_array.dtype)
-    new_y = new_x.copy()
-    new_z = new_x.copy()
-    nav_arrays = [(x_in, new_x), (y_in, new_y), (z_in, new_z)]
-
-    # Interpolate each scan, one at a time, otherwise the math doesn't work well
-    for scan_idx in range(num_scans):
-        # Calculate indexes
-        j0 = rows_per_scan * scan_idx
-        j1 = j0 + rows_per_scan
-        k0 = rows_per_scan * res_factor * scan_idx
-        k1 = k0 + rows_per_scan * res_factor
-
-        for nav_array, result_array in nav_arrays:
-            # Use bilinear interpolation for all 250 meter pixels
-            map_coordinates(nav_array[j0:j1, :], coordinates, output=result_array[k0:k1, :], order=1, mode='nearest')
-            _extrapolate_rightmost_columns(res_factor, result_array[k0:k1])
-
-            if res_factor == 4:
-                # Use linear extrapolation for the first two 250 meter pixels along track
-                m, b = _calc_slope_offset_250(result_array, y, k0, 2)
-                result_array[k0 + 0, :] = m * y[0, 0] + b
-                result_array[k0 + 1, :] = m * y[1, 0] + b
-
-                # Use linear extrapolation for the last  two 250 meter pixels along track
-                # m = (result_array[k0 + 37, :] - result_array[k0 + 34, :]) / (y[37, 0] - y[34, 0])
-                # b = result_array[k0 + 37, :] - m * y[37, 0]
-                m, b = _calc_slope_offset_250(result_array, y, k0, 34)
-                result_array[k0 + 38, :] = m * y[38, 0] + b
-                result_array[k0 + 39, :] = m * y[39, 0] + b
-            else:
-                # 500m
-                # Use linear extrapolation for the first two 250 meter pixels along track
-                m, b = _calc_slope_offset_500(result_array, y, k0, 1)
-                result_array[k0 + 0, :] = m * y[0, 0] + b
-
-                # Use linear extrapolation for the last two 250 meter pixels along track
-                m, b = _calc_slope_offset_500(result_array, y, k0, 17)
-                result_array[k0 + 19, :] = m * y[19, 0] + b
-
-    new_lons, new_lats = xyz2lonlat(new_x, new_y, new_z, low_lat_z=True)
-    return new_lons.astype(lon_array.dtype), new_lats.astype(lon_array.dtype)
-
-
-def _extrapolate_rightmost_columns(res_factor, result_array):
-    outer_columns_offset = 3 if res_factor == 4 else 1
-    # take the last two interpolated (not extrapolated) columns and find the difference
-    right_diff = result_array[:, -(outer_columns_offset + 1)] - result_array[:, -(outer_columns_offset + 2)]
-    for factor, column_idx in enumerate(range(-outer_columns_offset, 0)):
-        result_array[:, column_idx] += right_diff * (factor + 1)
-
-
-def _calc_slope_offset_250(result_array, y, start_idx, offset):
-    m = (result_array[start_idx + offset + 3, :] - result_array[start_idx + offset, :]) / \
-        (y[offset + 3, 0] - y[offset, 0])
-    b = result_array[start_idx + offset + 3, :] - m * y[offset + 3, 0]
-    return m, b
-
-
-def _calc_slope_offset_500(result_array, y, start_idx, offset):
-    m = (result_array[start_idx + offset + 1, :] - result_array[start_idx + offset, :]) / \
-        (y[offset + 1, 0] - y[offset, 0])
-    b = result_array[start_idx + offset + 1, :] - m * y[offset + 1, 0]
-    return m, b
+    return interp_cython(
+        lon_array, lat_array, coarse_resolution, fine_resolution
+    )
 
 
 def modis_1km_to_250m(lon1, lat1):
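The along-track edge handling that moved into `_simple_modis_interpolator.pyx` fits a line through two interpolated fine-resolution rows and extrapolates past the scan edge, as the removed `_calc_slope_offset_250`/`_calc_slope_offset_500` helpers did. The arithmetic in isolation (helper name hypothetical):

```python
def slope_offset(v0, v1, y0, y1):
    # Fit v = m*y + b through two known rows (v0 at y0, v1 at y1),
    # then evaluate m*y + b at rows outside [y0, y1] to extrapolate.
    m = (v1 - v0) / (y1 - y0)
    b = v1 - m * y1
    return m, b

m, b = slope_offset(2.0, 5.0, 1.0, 4.0)
first_row = m * 0.0 + b  # extrapolated value one row before y0
```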


=====================================
geotiepoints/tests/test_modis.py
=====================================
@@ -16,7 +16,6 @@
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """Unit tests for python-geotiepoints: MODIS examples."""
 
-import unittest
 import numpy as np
 import h5py
 import os
@@ -36,34 +35,25 @@ from geotiepoints import (modis5kmto1km, modis1kmto250m)
 from geotiepoints import get_scene_splits
 
 
-class TestUtils(unittest.TestCase):
-
-    """Class for unit testing the ancillary interpolation functions
-    """
-
-    def setUp(self):
-        pass
+class TestUtils:
+    """Class for unit testing the ancillary interpolation functions."""
 
     def test_get_numof_subscene_lines(self):
-        """Test getting the number of sub-scene lines dependent on the number
-        of CPUs and for various number of lines in a scan"""
+        """Test getting the number of sub-scene lines.
 
+        Function is dependent on the number of CPUs and for various number of
+        lines in a scan.
+        """
         ncpus = 3
-        scene_splits = get_scene_splits(1060,
-                                        10, ncpus)
-
-
-class TestMODIS(unittest.TestCase):
+        scene_splits = get_scene_splits(1060, 10, ncpus)
+        assert list(scene_splits) == [350, 700, 1050]
 
-    """Class for system testing the MODIS interpolation.
-    """
 
-    def setUp(self):
-        pass
+class TestMODIS:
+    """Class for system testing the MODIS interpolation."""
 
     def test_5_to_1(self):
-        """test the 5km to 1km interpolation facility
-        """
+        """Test the 5km to 1km interpolation facility."""
 
         with h5py.File(FILENAME_FULL) as h5f:
             glons = h5f['longitude'][:] / 1000.
@@ -75,12 +65,11 @@ class TestMODIS(unittest.TestCase):
 
         tlons, tlats = modis5kmto1km(lons, lats)
 
-        self.assert_(np.allclose(tlons, glons, atol=0.05))
-        self.assert_(np.allclose(tlats, glats, atol=0.05))
+        np.testing.assert_allclose(tlons, glons, atol=0.05)
+        np.testing.assert_allclose(tlats, glats, atol=0.05)
 
     def test_1000m_to_250m(self):
-        """test the 1 km to 250 meter interpolation facility
-        """
+        """Test the 1 km to 250 meter interpolation facility."""
 
         with h5py.File(FILENAME_250M_RESULT) as h5f:
             glons = h5f['longitude'][:] / 1000.
@@ -91,11 +80,9 @@ class TestMODIS(unittest.TestCase):
             lats = h5f['latitude'][:] / 1000.
 
         tlons, tlats = modis1kmto250m(lons, lats)
-
-        self.assert_(np.allclose(tlons, glons, atol=0.05))
-        self.assert_(np.allclose(tlats, glats, atol=0.05))
+        np.testing.assert_allclose(tlons, glons, atol=0.05)
+        np.testing.assert_allclose(tlats, glats, atol=0.05)
 
         tlons, tlats = modis1kmto250m(lons, lats, cores=4)
-
-        self.assert_(np.allclose(tlons, glons, atol=0.05))
-        self.assert_(np.allclose(tlats, glats, atol=0.05))
+        np.testing.assert_allclose(tlons, glons, atol=0.05)
+        np.testing.assert_allclose(tlats, glats, atol=0.05)


=====================================
geotiepoints/tests/test_modisinterpolator.py
=====================================
@@ -15,6 +15,7 @@
 # You should have received a copy of the GNU General Public License
 # along with this program.  If not, see <http://www.gnu.org/licenses/>.
 """Tests for MODIS interpolators."""
+import warnings
 
 import numpy as np
 from pyproj import Geod
@@ -123,23 +124,30 @@ def assert_geodetic_distance(
 
 
 @pytest.mark.parametrize(
-    ("input_func", "exp_func", "interp_func", "dist_max"),
+    ("input_func", "exp_func", "interp_func", "dist_max", "exp_5km_warning"),
     [
-        (load_1km_lonlat_satz_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 5),
-        (load_1km_lonlat_satz_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 8),
-        (load_5km_lonlat_satz1_as_xarray_dask, load_1km_lonlat_as_xarray_dask, modis_5km_to_1km, 25),
-        (load_l2_5km_lonlat_satz1_as_xarray_dask, load_1km_lonlat_as_xarray_dask, modis_5km_to_1km, 110),
-        (load_5km_lonlat_satz1_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_5km_to_500m, 19500),
-        (load_5km_lonlat_satz1_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_5km_to_250m, 25800),
+        (load_1km_lonlat_satz_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 5, False),
+        (load_1km_lonlat_satz_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 8.30, False),
+        (load_5km_lonlat_satz1_as_xarray_dask, load_1km_lonlat_as_xarray_dask, modis_5km_to_1km, 25, False),
+        (load_l2_5km_lonlat_satz1_as_xarray_dask, load_1km_lonlat_as_xarray_dask, modis_5km_to_1km, 110, False),
+        (load_5km_lonlat_satz1_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_5km_to_500m,
+         19500, True),
+        (load_5km_lonlat_satz1_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_5km_to_250m,
+         25800, True),
     ]
 )
-def test_sat_angle_based_interp(input_func, exp_func, interp_func, dist_max):
+def test_sat_angle_based_interp(input_func, exp_func, interp_func, dist_max, exp_5km_warning):
     lon1, lat1, satz1 = input_func()
     lons_exp, lats_exp = exp_func()
 
     # when working with dask arrays, we shouldn't compute anything
-    with dask.config.set(scheduler=CustomScheduler(0)):
+    with dask.config.set(scheduler=CustomScheduler(0)), warnings.catch_warnings(record=True) as warns:
         lons, lats = interp_func(lon1, lat1, satz1)
+    has_5km_warning = any("may result in poor quality" in str(w.message) for w in warns)
+    if exp_5km_warning:
+        assert has_5km_warning
+    else:
+        assert not has_5km_warning
 
     if hasattr(lons, "compute"):
         lons, lats = da.compute(lons, lats)
@@ -151,7 +159,7 @@ def test_sat_angle_based_interp(input_func, exp_func, interp_func, dist_max):
 def test_sat_angle_based_interp_nan_handling():
     # See GH #19
     lon1, lat1, satz1 = load_1km_lonlat_satz_as_xarray_dask()
-    satz1 = _to_da(abs(np.linspace(-65.4, 65.4, 1354)).repeat(20).reshape(-1, 20).T)
+    satz1 = _to_da(abs(np.linspace(-65.4, 65.4, 1354, dtype=np.float32)).repeat(20).reshape(-1, 20).T)
     lons, lats = modis_1km_to_500m(lon1, lat1, satz1)
     assert not np.any(np.isnan(lons.compute()))
     assert not np.any(np.isnan(lats.compute()))
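The new `exp_5km_warning` check above relies on `warnings.catch_warnings(record=True)` to capture warnings without suppressing the call. The pattern in isolation, with a stand-in `noisy()` in place of the interpolation functions (adding `simplefilter("always")`, which the test can omit but which guards against an already-triggered once-only warning):

```python
import warnings

def noisy():
    # Stand-in for modis_5km_to_500m/modis_5km_to_250m, which warn
    # that the result "may result in poor quality".
    warnings.warn("Interpolating 5km geolocation may result in poor quality")
    return 42

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # ensure the warning is recorded
    result = noisy()

has_warning = any("may result in poor quality" in str(w.message) for w in caught)
```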


=====================================
geotiepoints/tests/test_simple_modis_interpolator.py
=====================================
@@ -38,11 +38,11 @@ from .utils import CustomScheduler
     ("input_func", "exp_func", "interp_func", "dist_max"),
     [
         (load_1km_lonlat_as_xarray_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 16),
-        (load_1km_lonlat_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27),
+        (load_1km_lonlat_as_xarray_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27.35),
         (load_1km_lonlat_as_dask, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 16),
-        (load_1km_lonlat_as_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27),
+        (load_1km_lonlat_as_dask, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27.35),
         (load_1km_lonlat_as_numpy, load_500m_lonlat_expected_as_xarray_dask, modis_1km_to_500m, 16),
-        (load_1km_lonlat_as_numpy, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27),
+        (load_1km_lonlat_as_numpy, load_250m_lonlat_expected_as_xarray_dask, modis_1km_to_250m, 27.35),
     ]
 )
 def test_basic_interp(input_func, exp_func, interp_func, dist_max):


=====================================
geotiepoints/version.py
=====================================
@@ -5,8 +5,9 @@
 # directories (produced by setup.py build) will contain a much shorter file
 # that just contains the computed version number.
 
-# This file is released into the public domain. Generated by
-# versioneer-0.22 (https://github.com/python-versioneer/python-versioneer)
+# This file is released into the public domain.
+# Generated by versioneer-0.27
+# https://github.com/python-versioneer/python-versioneer
 
 """Git implementation of _version.py."""
 
@@ -25,9 +26,9 @@ def get_keywords():
     # setup.py/versioneer.py will grep for the variable names, so they must
     # each be defined on a line of their own. _version.py will just call
     # get_keywords().
-    git_refnames = " (HEAD -> main, tag: v1.4.1)"
-    git_full = "79b95e346852d8e5d21a6f7c310f457fe45467c7"
-    git_date = "2022-06-08 14:50:58 -0500"
+    git_refnames = " (HEAD -> main, tag: v1.5.0)"
+    git_full = "0d8a1bc7e526cd2d1fe863a36bee8de79286273f"
+    git_date = "2022-10-25 11:26:49 -0500"
     keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
     return keywords
 
@@ -248,19 +249,18 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     runner = functools.partial(runner, env=env)
 
     _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                   hide_stderr=True)
+                   hide_stderr=not verbose)
     if rc != 0:
         if verbose:
             print("Directory %s not under git control" % root)
         raise NotThisMethod("'git rev-parse --git-dir' returned error")
 
-    MATCH_ARGS = ["--match", "%s*" % tag_prefix] if tag_prefix else []
-
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
-                                     "--always", "--long", *MATCH_ARGS],
-                              cwd=root)
+    describe_out, rc = runner(GITS, [
+        "describe", "--tags", "--dirty", "--always", "--long",
+        "--match", f"{tag_prefix}[[:digit:]]*"
+    ], cwd=root)
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
@@ -349,8 +349,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
-        pieces["distance"] = int(count_out)  # total number of commits
+        out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root)
+        pieces["distance"] = len(out.split())  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
     date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip()
@@ -446,7 +446,7 @@ def render_pep440_pre(pieces):
             tag_version, post_version = pep440_split_post(pieces["closest-tag"])
             rendered = tag_version
             if post_version is not None:
-                rendered += ".post%d.dev%d" % (post_version+1, pieces["distance"])
+                rendered += ".post%d.dev%d" % (post_version + 1, pieces["distance"])
             else:
                 rendered += ".post0.dev%d" % (pieces["distance"])
         else:
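The whitespace fix above sits inside versioneer's `render_pep440_pre`: after a tag like `1.4.1.post2`, a pre-release build `distance` commits later is rendered by bumping the post segment and appending a dev segment. A simplified sketch of just that branch (function name hypothetical; real versioneer first splits the tag with `pep440_split_post`):

```python
def render_post_dev(tag_version, post_version, distance):
    # Mirrors the hunk above: bump .postN by one and append .devDISTANCE,
    # or start at .post0 when the closest tag had no post segment.
    if post_version is not None:
        return "%s.post%d.dev%d" % (tag_version, post_version + 1, distance)
    return "%s.post0.dev%d" % (tag_version, distance)

version = render_post_dev("1.4.1", 2, 5)
```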


=====================================
pyproject.toml
=====================================
@@ -1,7 +1,8 @@
 [build-system]
-requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython", "versioneer-518"]
+requires = ["setuptools", "wheel", "oldest-supported-numpy", "Cython", "versioneer"]
 build-backend = "setuptools.build_meta"
 
 [tool.coverage.run]
 relative_files = true
-
+plugins = ["Cython.Coverage"]
+omit = ["geotiepoints/version.py"]


=====================================
setup.py
=====================================
@@ -26,10 +26,11 @@
 
 import sys
 
-from setuptools import Extension, setup
+from setuptools import setup
 import versioneer
 import numpy as np
-from Cython.Build import cythonize
+from Cython.Build import build_ext
+from Cython.Distutils import Extension
 
 requirements = ['numpy', 'scipy', 'pandas']
 test_requires = ['pytest', 'pytest-cov', 'h5py', 'xarray', 'dask', 'pyproj']
@@ -46,9 +47,53 @@ EXTENSIONS = [
         extra_compile_args=extra_compile_args,
         include_dirs=[np.get_include()],
     ),
+    Extension(
+        'geotiepoints._modis_interpolator',
+        sources=['geotiepoints/_modis_interpolator.pyx'],
+        extra_compile_args=extra_compile_args,
+        include_dirs=[np.get_include()],
+    ),
+    Extension(
+        'geotiepoints._simple_modis_interpolator',
+        sources=['geotiepoints/_simple_modis_interpolator.pyx'],
+        extra_compile_args=extra_compile_args,
+        include_dirs=[np.get_include()],
+    ),
+    Extension(
+        'geotiepoints._modis_utils',
+        sources=['geotiepoints/_modis_utils.pyx'],
+        extra_compile_args=extra_compile_args,
+        include_dirs=[np.get_include()],
+    ),
 ]
 
-cmdclass = versioneer.get_cmdclass()
+
+try:
+    sys.argv.remove("--cython-coverage")
+    cython_coverage = True
+except ValueError:
+    cython_coverage = False
+
+
+cython_directives = {
+    "language_level": "3",
+}
+define_macros = []
+if cython_coverage:
+    print("Enabling directives/macros for Cython coverage support")
+    cython_directives.update({
+        "linetrace": True,
+        "profile": True,
+    })
+    define_macros.extend([
+        ("CYTHON_TRACE", "1"),
+        ("CYTHON_TRACE_NOGIL", "1"),
+    ])
+    for ext in EXTENSIONS:
+        ext.define_macros = define_macros
+        ext.cython_directives.update(cython_directives)
+
+cmdclass = versioneer.get_cmdclass(cmdclass={"build_ext": build_ext})
 
 with open('README.md', 'r') as readme:
     README = readme.read()
@@ -61,12 +106,12 @@ if __name__ == "__main__":
           long_description_content_type='text/markdown',
           author='Adam Dybbroe, Martin Raspaud',
           author_email='martin.raspaud at smhi.se',
-          classifiers=["Development Status :: 4 - Beta",
+          classifiers=["Development Status :: 5 - Production/Stable",
                        "Intended Audience :: Science/Research",
-                       "License :: OSI Approved :: GNU General Public License v3 " +
-                       "or later (GPLv3+)",
+                       "License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
                        "Operating System :: OS Independent",
                        "Programming Language :: Python",
+                       "Programming Language :: Cython",
                        "Topic :: Scientific/Engineering"],
           url="https://github.com/pytroll/python-geotiepoints",
           packages=['geotiepoints'],
@@ -75,7 +120,7 @@ if __name__ == "__main__":
           python_requires='>=3.7',
           cmdclass=cmdclass,
           install_requires=requirements,
-          ext_modules=cythonize(EXTENSIONS),
+          ext_modules=EXTENSIONS,
           tests_require=test_requires,
           zip_safe=False
           )


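The setup.py hunk above adds a custom `--cython-coverage` flag that is popped from `sys.argv` before setuptools sees it, then used to switch on Cython's line-tracing directives and macros. A minimal standalone sketch of that logic (function names are illustrative, not part of the upstream code):

```python
def parse_coverage_flag(argv):
    """Pop the custom --cython-coverage flag from an argv list.

    Mirrors the try/except pattern in the setup.py hunk:
    list.remove raises ValueError when the flag is absent.
    """
    try:
        argv.remove("--cython-coverage")
        return True
    except ValueError:
        return False


def coverage_build_options(cython_coverage):
    """Return (directives, macros) matching the setup.py logic.

    linetrace/profile directives plus the CYTHON_TRACE macros are
    what Cython needs to emit coverage-traceable C code.
    """
    directives = {"language_level": "3"}
    macros = []
    if cython_coverage:
        directives.update({"linetrace": True, "profile": True})
        macros.extend([("CYTHON_TRACE", "1"),
                       ("CYTHON_TRACE_NOGIL", "1")])
    return directives, macros
```

In the real setup.py these values are then attached to each extension via `ext.define_macros` and `ext.cython_directives` before the build runs.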
=====================================
versioneer.py
=====================================
@@ -1,5 +1,5 @@
 
-# Version: 0.22
+# Version: 0.27
 
 """The Versioneer - like a rocketeer, but for versions.
 
@@ -9,12 +9,12 @@ The Versioneer
 * like a rocketeer, but for versions!
 * https://github.com/python-versioneer/python-versioneer
 * Brian Warner
-* License: Public Domain
-* Compatible with: Python 3.6, 3.7, 3.8, 3.9, 3.10 and pypy3
+* License: Public Domain (Unlicense)
+* Compatible with: Python 3.7, 3.8, 3.9, 3.10 and pypy3
 * [![Latest Version][pypi-image]][pypi-url]
 * [![Build Status][travis-image]][travis-url]
 
-This is a tool for managing a recorded version number in distutils/setuptools-based
+This is a tool for managing a recorded version number in setuptools-based
 python projects. The goal is to remove the tedious and error-prone "update
 the embedded version string" step from your release process. Making a new
 release should be as easy as recording a new tag in your version-control
@@ -23,10 +23,38 @@ system, and maybe making new tarballs.
 
 ## Quick Install
 
+Versioneer provides two installation modes. The "classic" vendored mode installs
+a copy of versioneer into your repository. The experimental build-time dependency mode
+is intended to allow you to skip this step and simplify the process of upgrading.
+
+### Vendored mode
+
+* `pip install versioneer` to somewhere in your $PATH
+   * A [conda-forge recipe](https://github.com/conda-forge/versioneer-feedstock) is
+     available, so you can also use `conda install -c conda-forge versioneer`
+* add a `[tool.versioneer]` section to your `pyproject.toml` or a
+  `[versioneer]` section to your `setup.cfg` (see [Install](INSTALL.md))
+   * Note that you will need to add `tomli` to your build-time dependencies if you
+     use `pyproject.toml`
+* run `versioneer install --vendor` in your source tree, commit the results
+* verify version information with `python setup.py version`
+
+### Build-time dependency mode
+
 * `pip install versioneer` to somewhere in your $PATH
-* add a `[versioneer]` section to your setup.cfg (see [Install](INSTALL.md))
-* run `versioneer install` in your source tree, commit the results
-* Verify version information with `python setup.py version`
+   * A [conda-forge recipe](https://github.com/conda-forge/versioneer-feedstock) is
+     available, so you can also use `conda install -c conda-forge versioneer`
+* add a `[tool.versioneer]` section to your `pyproject.toml` or a
+  `[versioneer]` section to your `setup.cfg` (see [Install](INSTALL.md))
+* add `versioneer` (with `[toml]` extra, if configuring in `pyproject.toml`)
+  to the `requires` key of the `build-system` table in `pyproject.toml`:
+  ```toml
+  [build-system]
+  requires = ["setuptools", "versioneer[toml]"]
+  build-backend = "setuptools.build_meta"
+  ```
+* run `versioneer install --no-vendor` in your source tree, commit the results
+* verify version information with `python setup.py version`
 
 ## Version Identifiers
 
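Putting the build-time dependency steps above together, a project's `pyproject.toml` might look like the following. The `[build-system]` table is taken from the snippet above; the `[tool.versioneer]` keys correspond to the config fields versioneer reads (`VCS`, `style`, `versionfile_source`, `versionfile_build`, `tag_prefix`), with a hypothetical package name used for illustration:

```toml
[build-system]
requires = ["setuptools", "versioneer[toml]"]
build-backend = "setuptools.build_meta"

[tool.versioneer]
VCS = "git"
style = "pep440"
versionfile_source = "mypkg/_version.py"   # hypothetical package name
versionfile_build = "mypkg/_version.py"
tag_prefix = "v"
```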
@@ -231,9 +259,10 @@ resolve it.
 To upgrade your project to a new release of Versioneer, do the following:
 
 * install the new Versioneer (`pip install -U versioneer` or equivalent)
-* edit `setup.cfg`, if necessary, to include any new configuration settings
-  indicated by the release notes. See [UPGRADING](./UPGRADING.md) for details.
-* re-run `versioneer install` in your source tree, to replace
+* edit `setup.cfg` and `pyproject.toml`, if necessary,
+  to include any new configuration settings indicated by the release notes.
+  See [UPGRADING](./UPGRADING.md) for details.
+* re-run `versioneer install --[no-]vendor` in your source tree, to replace
   `SRC/_version.py`
 * commit any changed files
 
@@ -263,9 +292,8 @@ number of intermediate scripts.
 
 To make Versioneer easier to embed, all its code is dedicated to the public
 domain. The `_version.py` that it creates is also in the public domain.
-Specifically, both are released under the Creative Commons "Public Domain
-Dedication" license (CC0-1.0), as described in
-https://creativecommons.org/publicdomain/zero/1.0/ .
+Specifically, both are released under the "Unlicense", as described in
+https://unlicense.org/.
 
 [pypi-image]: https://img.shields.io/pypi/v/versioneer.svg
 [pypi-url]: https://pypi.python.org/pypi/versioneer/
@@ -287,8 +315,14 @@ import os
 import re
 import subprocess
 import sys
+from pathlib import Path
 from typing import Callable, Dict
 import functools
+try:
+    import tomli
+    have_tomli = True
+except ImportError:
+    have_tomli = False
 
 
 class VersioneerConfig:
@@ -326,7 +360,7 @@ def get_root():
         my_path = os.path.realpath(os.path.abspath(__file__))
         me_dir = os.path.normcase(os.path.splitext(my_path)[0])
         vsr_dir = os.path.normcase(os.path.splitext(versioneer_py)[0])
-        if me_dir != vsr_dir:
+        if me_dir != vsr_dir and "VERSIONEER_PEP518" not in globals():
             print("Warning: build in %s is using versioneer.py from %s"
                   % (os.path.dirname(my_path), versioneer_py))
     except NameError:
@@ -340,22 +374,32 @@ def get_config_from_root(root):
     # configparser.NoSectionError (if it lacks a [versioneer] section), or
     # configparser.NoOptionError (if it lacks "VCS="). See the docstring at
     # the top of versioneer.py for instructions on writing your setup.cfg .
-    setup_cfg = os.path.join(root, "setup.cfg")
-    parser = configparser.ConfigParser()
-    with open(setup_cfg, "r") as cfg_file:
-        parser.read_file(cfg_file)
-    VCS = parser.get("versioneer", "VCS")  # mandatory
+    root = Path(root)
+    pyproject_toml = root / "pyproject.toml"
+    setup_cfg = root / "setup.cfg"
+    section = None
+    if pyproject_toml.exists() and have_tomli:
+        try:
+            with open(pyproject_toml, 'rb') as fobj:
+                pp = tomli.load(fobj)
+            section = pp['tool']['versioneer']
+        except (tomli.TOMLDecodeError, KeyError):
+            pass
+    if not section:
+        parser = configparser.ConfigParser()
+        with open(setup_cfg) as cfg_file:
+            parser.read_file(cfg_file)
+        parser.get("versioneer", "VCS")  # raise error if missing
 
-    # Dict-like interface for non-mandatory entries
-    section = parser["versioneer"]
+        section = parser["versioneer"]
 
     cfg = VersioneerConfig()
-    cfg.VCS = VCS
+    cfg.VCS = section['VCS']
     cfg.style = section.get("style", "")
     cfg.versionfile_source = section.get("versionfile_source")
     cfg.versionfile_build = section.get("versionfile_build")
     cfg.tag_prefix = section.get("tag_prefix")
-    if cfg.tag_prefix in ("''", '""'):
+    if cfg.tag_prefix in ("''", '""', None):
         cfg.tag_prefix = ""
     cfg.parentdir_prefix = section.get("parentdir_prefix")
     cfg.verbose = section.get("verbose")
@@ -430,8 +474,9 @@ LONG_VERSION_PY['git'] = r'''
 # directories (produced by setup.py build) will contain a much shorter file
 # that just contains the computed version number.
 
-# This file is released into the public domain. Generated by
-# versioneer-0.22 (https://github.com/python-versioneer/python-versioneer)
+# This file is released into the public domain.
+# Generated by versioneer-0.27
+# https://github.com/python-versioneer/python-versioneer
 
 """Git implementation of _version.py."""
 
@@ -673,19 +718,18 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     runner = functools.partial(runner, env=env)
 
     _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                   hide_stderr=True)
+                   hide_stderr=not verbose)
     if rc != 0:
         if verbose:
             print("Directory %%s not under git control" %% root)
         raise NotThisMethod("'git rev-parse --git-dir' returned error")
 
-    MATCH_ARGS = ["--match", "%%s*" %% tag_prefix] if tag_prefix else []
-
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
-                                     "--always", "--long", *MATCH_ARGS],
-                              cwd=root)
+    describe_out, rc = runner(GITS, [
+        "describe", "--tags", "--dirty", "--always", "--long",
+        "--match", f"{tag_prefix}[[:digit:]]*"
+    ], cwd=root)
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
@@ -774,8 +818,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
-        pieces["distance"] = int(count_out)  # total number of commits
+        out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root)
+        pieces["distance"] = len(out.split())  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
     date = runner(GITS, ["show", "-s", "--format=%%ci", "HEAD"], cwd=root)[0].strip()
@@ -871,7 +915,7 @@ def render_pep440_pre(pieces):
             tag_version, post_version = pep440_split_post(pieces["closest-tag"])
             rendered = tag_version
             if post_version is not None:
-                rendered += ".post%%d.dev%%d" %% (post_version+1, pieces["distance"])
+                rendered += ".post%%d.dev%%d" %% (post_version + 1, pieces["distance"])
             else:
                 rendered += ".post0.dev%%d" %% (pieces["distance"])
         else:
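The hunk above only adjusts spacing around `post_version + 1`, but the surrounding logic is worth spelling out: when the closest tag already carries a `.postN` suffix, the dev release bumps it to `.post(N+1)`; otherwise it starts at `.post0`. A simplified sketch (the real code uses a `pep440_split_post` helper and only takes this branch when `distance` is nonzero):

```python
def render_post_dev(closest_tag, distance):
    """Simplified version of the .postN.devM rendering in render_pep440_pre."""
    if ".post" in closest_tag:
        tag_version, post = closest_tag.split(".post")
        # An existing post release is bumped by one for the dev segment.
        return "%s.post%d.dev%d" % (tag_version, int(post) + 1, distance)
    return "%s.post0.dev%d" % (closest_tag, distance)
```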
@@ -1196,19 +1240,18 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     runner = functools.partial(runner, env=env)
 
     _, rc = runner(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                   hide_stderr=True)
+                   hide_stderr=not verbose)
     if rc != 0:
         if verbose:
             print("Directory %s not under git control" % root)
         raise NotThisMethod("'git rev-parse --git-dir' returned error")
 
-    MATCH_ARGS = ["--match", "%s*" % tag_prefix] if tag_prefix else []
-
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
-                                     "--always", "--long", *MATCH_ARGS],
-                              cwd=root)
+    describe_out, rc = runner(GITS, [
+        "describe", "--tags", "--dirty", "--always", "--long",
+        "--match", f"{tag_prefix}[[:digit:]]*"
+    ], cwd=root)
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
@@ -1297,8 +1340,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
-        pieces["distance"] = int(count_out)  # total number of commits
+        out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root)
+        pieces["distance"] = len(out.split())  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
     date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip()
@@ -1310,7 +1353,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
     return pieces
 
 
-def do_vcs_install(manifest_in, versionfile_source, ipy):
+def do_vcs_install(versionfile_source, ipy):
     """Git-specific installation logic for Versioneer.
 
     For Git, this means creating/changing .gitattributes to mark _version.py
@@ -1319,17 +1362,18 @@ def do_vcs_install(manifest_in, versionfile_source, ipy):
     GITS = ["git"]
     if sys.platform == "win32":
         GITS = ["git.cmd", "git.exe"]
-    files = [manifest_in, versionfile_source]
+    files = [versionfile_source]
     if ipy:
         files.append(ipy)
-    try:
-        my_path = __file__
-        if my_path.endswith(".pyc") or my_path.endswith(".pyo"):
-            my_path = os.path.splitext(my_path)[0] + ".py"
-        versioneer_file = os.path.relpath(my_path)
-    except NameError:
-        versioneer_file = "versioneer.py"
-    files.append(versioneer_file)
+    if "VERSIONEER_PEP518" not in globals():
+        try:
+            my_path = __file__
+            if my_path.endswith((".pyc", ".pyo")):
+                my_path = os.path.splitext(my_path)[0] + ".py"
+            versioneer_file = os.path.relpath(my_path)
+        except NameError:
+            versioneer_file = "versioneer.py"
+        files.append(versioneer_file)
     present = False
     try:
         with open(".gitattributes", "r") as fobj:
@@ -1372,7 +1416,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
 
 
 SHORT_VERSION_PY = """
-# This file was generated by 'versioneer.py' (0.22) from
+# This file was generated by 'versioneer.py' (0.27) from
 # revision-control system data, or from the parent directory name of an
 # unpacked source archive. Distribution tarballs contain a pre-generated copy
 # of this file.
@@ -1501,7 +1545,7 @@ def render_pep440_pre(pieces):
             tag_version, post_version = pep440_split_post(pieces["closest-tag"])
             rendered = tag_version
             if post_version is not None:
-                rendered += ".post%d.dev%d" % (post_version+1, pieces["distance"])
+                rendered += ".post%d.dev%d" % (post_version + 1, pieces["distance"])
             else:
                 rendered += ".post0.dev%d" % (pieces["distance"])
         else:
@@ -1753,7 +1797,7 @@ def get_version():
 
 
 def get_cmdclass(cmdclass=None):
-    """Get the custom setuptools/distutils subclasses used by Versioneer.
+    """Get the custom setuptools subclasses used by Versioneer.
 
     If the package uses a different cmdclass (e.g. one from numpy), it
     should be provide as an argument.
@@ -1775,11 +1819,8 @@ def get_cmdclass(cmdclass=None):
 
     cmds = {} if cmdclass is None else cmdclass.copy()
 
-    # we add "version" to both distutils and setuptools
-    try:
-        from setuptools import Command
-    except ImportError:
-        from distutils.core import Command
+    # we add "version" to setuptools
+    from setuptools import Command
 
     class cmd_version(Command):
         description = "report generated version string"
@@ -1802,7 +1843,7 @@ def get_cmdclass(cmdclass=None):
                 print(" error: %s" % vers["error"])
     cmds["version"] = cmd_version
 
-    # we override "build_py" in both distutils and setuptools
+    # we override "build_py" in setuptools
     #
     # most invocation pathways end up running build_py:
     #  distutils/build -> build_py
@@ -1817,13 +1858,14 @@ def get_cmdclass(cmdclass=None):
     #   then does setup.py bdist_wheel, or sometimes setup.py install
     #  setup.py egg_info -> ?
 
+    # pip install -e . and setuptool/editable_wheel will invoke build_py
+    # but the build_py command is not expected to copy any files.
+
     # we override different "build_py" commands for both environments
     if 'build_py' in cmds:
         _build_py = cmds['build_py']
-    elif "setuptools" in sys.modules:
-        from setuptools.command.build_py import build_py as _build_py
     else:
-        from distutils.command.build_py import build_py as _build_py
+        from setuptools.command.build_py import build_py as _build_py
 
     class cmd_build_py(_build_py):
         def run(self):
@@ -1831,6 +1873,10 @@ def get_cmdclass(cmdclass=None):
             cfg = get_config_from_root(root)
             versions = get_versions()
             _build_py.run(self)
+            if getattr(self, "editable_mode", False):
+                # During editable installs `.py` and data files are
+                # not copied to build_lib
+                return
             # now locate _version.py in the new build/ directory and replace
             # it with an updated value
             if cfg.versionfile_build:
@@ -1842,10 +1888,8 @@ def get_cmdclass(cmdclass=None):
 
     if 'build_ext' in cmds:
         _build_ext = cmds['build_ext']
-    elif "setuptools" in sys.modules:
-        from setuptools.command.build_ext import build_ext as _build_ext
     else:
-        from distutils.command.build_ext import build_ext as _build_ext
+        from setuptools.command.build_ext import build_ext as _build_ext
 
     class cmd_build_ext(_build_ext):
         def run(self):
@@ -1863,6 +1907,11 @@ def get_cmdclass(cmdclass=None):
             # it with an updated value
             target_versionfile = os.path.join(self.build_lib,
                                               cfg.versionfile_build)
+            if not os.path.exists(target_versionfile):
+                print(f"Warning: {target_versionfile} does not exist, skipping "
+                      "version update. This can happen if you are running build_ext "
+                      "without first running build_py.")
+                return
             print("UPDATING %s" % target_versionfile)
             write_to_version_file(target_versionfile, versions)
     cmds["build_ext"] = cmd_build_ext
@@ -1900,7 +1949,10 @@ def get_cmdclass(cmdclass=None):
         del cmds["build_py"]
 
     if 'py2exe' in sys.modules:  # py2exe enabled?
-        from py2exe.distutils_buildexe import py2exe as _py2exe
+        try:
+            from py2exe.setuptools_buildexe import py2exe as _py2exe
+        except ImportError:
+            from py2exe.distutils_buildexe import py2exe as _py2exe
 
         class cmd_py2exe(_py2exe):
             def run(self):
@@ -1924,13 +1976,48 @@ def get_cmdclass(cmdclass=None):
                              })
         cmds["py2exe"] = cmd_py2exe
 
+    # sdist farms its file list building out to egg_info
+    if 'egg_info' in cmds:
+        _egg_info = cmds['egg_info']
+    else:
+        from setuptools.command.egg_info import egg_info as _egg_info
+
+    class cmd_egg_info(_egg_info):
+        def find_sources(self):
+            # egg_info.find_sources builds the manifest list and writes it
+            # in one shot
+            super().find_sources()
+
+            # Modify the filelist and normalize it
+            root = get_root()
+            cfg = get_config_from_root(root)
+            self.filelist.append('versioneer.py')
+            if cfg.versionfile_source:
+                # There are rare cases where versionfile_source might not be
+                # included by default, so we must be explicit
+                self.filelist.append(cfg.versionfile_source)
+            self.filelist.sort()
+            self.filelist.remove_duplicates()
+
+            # The write method is hidden in the manifest_maker instance that
+            # generated the filelist and was thrown away
+            # We will instead replicate their final normalization (to unicode,
+            # and POSIX-style paths)
+            from setuptools import unicode_utils
+            normalized = [unicode_utils.filesys_decode(f).replace(os.sep, '/')
+                          for f in self.filelist.files]
+
+            manifest_filename = os.path.join(self.egg_info, 'SOURCES.txt')
+            with open(manifest_filename, 'w') as fobj:
+                fobj.write('\n'.join(normalized))
+
+    cmds['egg_info'] = cmd_egg_info
+
     # we override different "sdist" commands for both environments
     if 'sdist' in cmds:
         _sdist = cmds['sdist']
-    elif "setuptools" in sys.modules:
-        from setuptools.command.sdist import sdist as _sdist
     else:
-        from distutils.command.sdist import sdist as _sdist
+        from setuptools.command.sdist import sdist as _sdist
 
     class cmd_sdist(_sdist):
         def run(self):
@@ -2055,42 +2142,10 @@ def do_setup():
         print(" %s doesn't exist, ok" % ipy)
         ipy = None
 
-    # Make sure both the top-level "versioneer.py" and versionfile_source
-    # (PKG/_version.py, used by runtime code) are in MANIFEST.in, so
-    # they'll be copied into source distributions. Pip won't be able to
-    # install the package without this.
-    manifest_in = os.path.join(root, "MANIFEST.in")
-    simple_includes = set()
-    try:
-        with open(manifest_in, "r") as f:
-            for line in f:
-                if line.startswith("include "):
-                    for include in line.split()[1:]:
-                        simple_includes.add(include)
-    except OSError:
-        pass
-    # That doesn't cover everything MANIFEST.in can do
-    # (http://docs.python.org/2/distutils/sourcedist.html#commands), so
-    # it might give some false negatives. Appending redundant 'include'
-    # lines is safe, though.
-    if "versioneer.py" not in simple_includes:
-        print(" appending 'versioneer.py' to MANIFEST.in")
-        with open(manifest_in, "a") as f:
-            f.write("include versioneer.py\n")
-    else:
-        print(" 'versioneer.py' already in MANIFEST.in")
-    if cfg.versionfile_source not in simple_includes:
-        print(" appending versionfile_source ('%s') to MANIFEST.in" %
-              cfg.versionfile_source)
-        with open(manifest_in, "a") as f:
-            f.write("include %s\n" % cfg.versionfile_source)
-    else:
-        print(" versionfile_source already in MANIFEST.in")
-
     # Make VCS-specific changes. For git, this means creating/changing
     # .gitattributes to mark _version.py for export-subst keyword
     # substitution.
-    do_vcs_install(manifest_in, cfg.versionfile_source, ipy)
+    do_vcs_install(cfg.versionfile_source, ipy)
     return 0
 
 
@@ -2131,10 +2186,14 @@ def scan_setup_py():
     return errors
 
 
+def setup_command():
+    """Set up Versioneer and exit with appropriate error code."""
+    errors = do_setup()
+    errors += scan_setup_py()
+    sys.exit(1 if errors else 0)
+
+
 if __name__ == "__main__":
     cmd = sys.argv[1]
     if cmd == "setup":
-        errors = do_setup()
-        errors += scan_setup_py()
-        if errors:
-            sys.exit(1)
+        setup_command()



View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geotiepoints/-/compare/33a99ac3c3daffccceb7b5bac09a661b95604074...4c8477370b331ad78e9fbaa0148ebb0c83ad9dc5
