[Git][debian-gis-team/pyresample][upstream] New upstream version 1.26.0
Antonio Valentino (@antonio.valentino)
gitlab@salsa.debian.org
Thu Nov 24 21:32:51 GMT 2022
Antonio Valentino pushed to branch upstream at Debian GIS Project / pyresample
Commits:
2451f9b0 by Antonio Valentino at 2022-11-24T21:11:33+00:00
New upstream version 1.26.0
- - - - -
27 changed files:
- + .github/dependabot.yml
- .github/workflows/ci.yaml
- .github/workflows/deploy.yaml
- CHANGELOG.md
- pyresample/area_config.py
- pyresample/bilinear/__init__.py
- pyresample/bilinear/xarr.py
- pyresample/boundary.py
- pyresample/ewa/_legacy_dask_ewa.py
- pyresample/ewa/dask_ewa.py
- pyresample/ewa/ewa.py
- + pyresample/future/spherical/__init__.py
- + pyresample/future/spherical/point.py
- pyresample/geometry.py
- pyresample/gradient/_gradient_search.pyx
- pyresample/spherical.py
- pyresample/spherical_geometry.py
- + pyresample/test/test_area_config.py
- pyresample/test/test_bilinear.py
- + pyresample/test/test_boundary.py
- pyresample/test/test_geometry.py
- + pyresample/test/test_sgeom/__init__.py
- + pyresample/test/test_sgeom/test_point.py
- pyresample/test/test_spherical.py
- pyresample/test/test_utils.py
- pyresample/version.py
- versioneer.py
Changes:
=====================================
.github/dependabot.yml
=====================================
@@ -0,0 +1,11 @@
+# To get started with Dependabot version updates, you'll need to specify which
+# package ecosystems to update and where the package manifests are located.
+# Please see the documentation for all configuration options:
+# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
+
+version: 2
+updates:
+ - package-ecosystem: "github-actions" # See documentation for possible values
+ directory: "/" # Location of package manifests
+ schedule:
+ interval: "weekly"
=====================================
.github/workflows/ci.yaml
=====================================
@@ -13,7 +13,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout source
- uses: actions/checkout@v2
+ uses: actions/checkout@v3
with:
fetch-depth: 0
@@ -61,7 +61,7 @@ jobs:
steps:
- name: Checkout source
- uses: actions/checkout@v2
+ uses: actions/checkout@v3
- name: Setup Conda Environment
uses: conda-incubator/setup-miniconda@v2
@@ -111,7 +111,7 @@ jobs:
cd docs && mkdir doctest && sphinx-build -E -n -b doctest ./source ./doctest && cd ..
- name: Upload unittest coverage to Codecov
- uses: codecov/codecov-action@v1
+ uses: codecov/codecov-action@v3
with:
flags: unittests
file: ./coverage.xml
=====================================
.github/workflows/deploy.yaml
=====================================
@@ -17,7 +17,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout source
- uses: actions/checkout@v2
+ uses: actions/checkout@v3
- name: Create sdist
shell: bash -l {0}
@@ -26,7 +26,7 @@ jobs:
python -m build -s
- name: Upload sdist to build artifacts
- uses: actions/upload-artifact@v2
+ uses: actions/upload-artifact@v3
with:
name: sdist
path: dist/*.tar.gz
@@ -52,12 +52,12 @@ jobs:
docker-image: manylinux2014_i686
steps:
- - uses: actions/checkout@v2
+ - uses: actions/checkout@v3
- run: |
git fetch --prune --unshallow
- name: Set up Python ${{ matrix.python-version }}
- uses: actions/setup-python@v1
+ uses: actions/setup-python@v4
with:
python-version: '${{ matrix.python-version }}'
@@ -92,7 +92,7 @@ jobs:
python -c "import pyresample; assert 'unknown' not in pyresample.__version__, 'incorrect version found'"
- name: Upload wheel(s) as build artifacts
- uses: actions/upload-artifact@v2
+ uses: actions/upload-artifact@v3
with:
name: wheels
path: dist/*.whl
@@ -104,18 +104,18 @@ jobs:
if: github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags/v')
steps:
- name: Download sdist artifact
- uses: actions/download-artifact@v2
+ uses: actions/download-artifact@v3
with:
name: sdist
path: dist
- name: Download wheels artifact
- uses: actions/download-artifact@v2
+ uses: actions/download-artifact@v3
with:
name: wheels
path: dist
- name: Publish package to PyPI
if: github.event.action != 'published'
- uses: pypa/gh-action-pypi-publish@v1.4.1
+ uses: pypa/gh-action-pypi-publish@v1.5.1
with:
user: __token__
password: ${{ secrets.test_pypi_password }}
@@ -126,18 +126,18 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Download sdist artifact
- uses: actions/download-artifact@v2
+ uses: actions/download-artifact@v3
with:
name: sdist
path: dist
- name: Download wheels artifact
- uses: actions/download-artifact@v2
+ uses: actions/download-artifact@v3
with:
name: wheels
path: dist
- name: Publish package to PyPI
if: github.event.action == 'published'
- uses: pypa/gh-action-pypi-publish@v1.4.1
+ uses: pypa/gh-action-pypi-publish@v1.5.1
with:
user: __token__
password: ${{ secrets.pypi_password }}
=====================================
CHANGELOG.md
=====================================
@@ -1,3 +1,50 @@
+## Version 1.26.0 (2022/11/24)
+
+### Issues Closed
+
+* [Issue 474](https://github.com/pytroll/pyresample/issues/474) - get_geostationary_bounding_box* contains duplicated vertices at the equator ([PR 475](https://github.com/pytroll/pyresample/pull/475) by [@ghiggi](https://github.com/ghiggi))
+* [Issue 457](https://github.com/pytroll/pyresample/issues/457) - Pyresample 1.25.1 create_area_def return wrong lons with the .get_lonlats()
+* [Issue 453](https://github.com/pytroll/pyresample/issues/453) - Import Error using XArrayBilinearResampler missing failed import of dask ([PR 454](https://github.com/pytroll/pyresample/pull/454) by [@benjaminesse](https://github.com/benjaminesse))
+* [Issue 445](https://github.com/pytroll/pyresample/issues/445) - Release GIL in gradient search resampling ([PR 455](https://github.com/pytroll/pyresample/pull/455) by [@mraspaud](https://github.com/mraspaud))
+* [Issue 439](https://github.com/pytroll/pyresample/issues/439) - SwathDefinition.update_hash() raise error after slicing the swath object ([PR 462](https://github.com/pytroll/pyresample/pull/462) by [@mraspaud](https://github.com/mraspaud))
+
+In this release 5 issues were closed.
+
+### Pull Requests Merged
+
+#### Bugs fixed
+
+* [PR 479](https://github.com/pytroll/pyresample/pull/479) - Fix bbox creation for SwathDefinitions with NaNs
+* [PR 475](https://github.com/pytroll/pyresample/pull/475) - Fix for duplicate coordinates in bbox_lonlats for geostationary area ([474](https://github.com/pytroll/pyresample/issues/474))
+* [PR 463](https://github.com/pytroll/pyresample/pull/463) - Fix EWA default for 'weight_delta_max' to match docstring
+* [PR 462](https://github.com/pytroll/pyresample/pull/462) - Fix hashing of definitions for non contiguous arrays ([439](https://github.com/pytroll/pyresample/issues/439))
+* [PR 438](https://github.com/pytroll/pyresample/pull/438) - Fix using cached LUTs in bilinear resampler
+
+#### Features added
+
+* [PR 473](https://github.com/pytroll/pyresample/pull/473) - Add boundary method to AreaDefinition and SwathDefinition
+* [PR 465](https://github.com/pytroll/pyresample/pull/465) - [Future Spherical Class] Add SPoint and SMultiPoint
+* [PR 455](https://github.com/pytroll/pyresample/pull/455) - Use memoryviews and allow nogil in gradient search ([445](https://github.com/pytroll/pyresample/issues/445))
+* [PR 451](https://github.com/pytroll/pyresample/pull/451) - Refactor the area loading internal function
+
+#### Documentation changes
+
+* [PR 454](https://github.com/pytroll/pyresample/pull/454) - Fix import warning in bilinear resampler to mention dask ([453](https://github.com/pytroll/pyresample/issues/453))
+
+In this release 10 pull requests were closed.
+
+
+## Version 1.25.1 (2022/08/02)
+
+### Pull Requests Merged
+
+#### Bugs fixed
+
+* [PR 447](https://github.com/pytroll/pyresample/pull/447) - Fix handling of lon/lat coordinates on CRS with prime meridian != 0
+
+In this release 1 pull request was closed.
+
+
## Version 1.25.0 (2022/07/29)
### Issues Closed
=====================================
pyresample/area_config.py
=====================================
@@ -195,21 +195,28 @@ def _parse_yaml_area_file(area_file_name, *regions):
params = area_dict.get(area_name)
if params is None:
raise AreaNotFound('Area "{0}" not found in file "{1}"'.format(area_name, area_file_name))
- params.setdefault('area_id', area_name)
- # Optional arguments.
- params['shape'] = _capture_subarguments(params, 'shape', ['height', 'width'])
- params['upper_left_extent'] = _capture_subarguments(params, 'upper_left_extent', ['upper_left_extent', 'x', 'y',
- 'units'])
- params['center'] = _capture_subarguments(params, 'center', ['center', 'x', 'y', 'units'])
- params['area_extent'] = _capture_subarguments(params, 'area_extent', ['area_extent', 'lower_left_xy',
- 'upper_right_xy', 'units'])
- params['resolution'] = _capture_subarguments(params, 'resolution', ['resolution', 'dx', 'dy', 'units'])
- params['radius'] = _capture_subarguments(params, 'radius', ['radius', 'dx', 'dy', 'units'])
- params['rotation'] = _capture_subarguments(params, 'rotation', ['rotation', 'units'])
- res.append(create_area_def(**params))
+ area_def = _create_area_def_from_dict(area_name, params)
+ res.append(area_def)
return res
+def _create_area_def_from_dict(area_name, params):
+ """Create an area definition from a dictionary of parameters."""
+ params.setdefault('area_id', area_name)
+ # Optional arguments.
+ params['shape'] = _capture_subarguments(params, 'shape', ['height', 'width'])
+ params['upper_left_extent'] = _capture_subarguments(params, 'upper_left_extent', ['upper_left_extent', 'x', 'y',
+ 'units'])
+ params['center'] = _capture_subarguments(params, 'center', ['center', 'x', 'y', 'units'])
+ params['area_extent'] = _capture_subarguments(params, 'area_extent', ['area_extent', 'lower_left_xy',
+ 'upper_right_xy', 'units'])
+ params['resolution'] = _capture_subarguments(params, 'resolution', ['resolution', 'dx', 'dy', 'units'])
+ params['radius'] = _capture_subarguments(params, 'radius', ['radius', 'dx', 'dy', 'units'])
+ params['rotation'] = _capture_subarguments(params, 'rotation', ['rotation', 'units'])
+ area_def = create_area_def(**params)
+ return area_def
+
+
def _capture_subarguments(params, arg_name, sub_arg_list):
"""Capture :func:`~pyresample.utils.create_area_def` sub-arguments (i.e. units, height, dx, etc) from a yaml file.
=====================================
pyresample/bilinear/__init__.py
=====================================
@@ -47,7 +47,7 @@ try:
XArrayResamplerBilinear,
)
except ImportError:
- warnings.warn("XArray and/or zarr not found, XArrayBilinearResampler won't be available.")
+ warnings.warn("XArray, dask, and/or zarr not found, XArrayBilinearResampler won't be available.")
XArrayBilinearResampler = None
CACHE_INDICES = None
XArrayResamplerBilinear = None
=====================================
pyresample/bilinear/xarr.py
=====================================
@@ -164,7 +164,7 @@ class XArrayBilinearResampler(BilinearBase):
def from_delayed(delayeds, shp):
return [da.from_delayed(d, shp, np.float32) for d in delayeds]
- data = _check_data_shape(data, self._valid_input_index)
+ data = _check_data_shape(data, self._source_geo_def.shape)
if data.ndim == 2:
shp = self.bilinear_s.shape
else:
@@ -264,10 +264,10 @@ def _get_valid_input_index(source_geo_def,
return valid_input_index, source_lons, source_lats
-def _check_data_shape(data, input_idxs):
+def _check_data_shape(data, input_xy_shape):
"""Check data shape and adjust if necessary."""
# Handle multiple datasets
- if data.ndim > 2 and data.shape[0] * data.shape[1] == input_idxs.shape[0]:
+ if data.ndim > 2 and data.shape[0] * data.shape[1] == input_xy_shape[0]:
# Move the "channel" dimension first
data = da.moveaxis(data, -1, 0)
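In plain numpy terms (`np.moveaxis` behaves like `da.moveaxis` here), the channel-first reordering that the shape check above guards looks like this sketch; the array shapes are illustrative only:

```python
import numpy as np

# Illustrative shapes: data arrives as (rows, cols, channels) and the
# resampler wants the "channel" dimension first, i.e. (channels, rows, cols).
data = np.zeros((4, 5, 3))
data = np.moveaxis(data, -1, 0)
print(data.shape)  # (3, 4, 5)
```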
=====================================
pyresample/boundary.py
=====================================
@@ -18,6 +18,7 @@
"""The Boundary classes."""
import logging
+import warnings
import numpy as np
@@ -54,14 +55,52 @@ class Boundary(object):
class AreaBoundary(Boundary):
- """Area boundary objects."""
+ """Area boundary objects.
+
+ The inputs must be a (lon_coords, lat_coords) tuple for each of the 4 sides.
+ """
def __init__(self, *sides):
Boundary.__init__(self)
+ # Check 4 sides are provided
+ if len(sides) != 4:
+ raise ValueError("AreaBoundary expects 4 sides.")
+ # Retrieve sides
self.sides_lons, self.sides_lats = zip(*sides)
self.sides_lons = list(self.sides_lons)
self.sides_lats = list(self.sides_lats)
+ @classmethod
+ def from_lonlat_sides(cls, lon_sides, lat_sides):
+ """Define AreaBoundary from list of lon_sides and lat_sides.
+
+ For an area of shape (m, n), the sides must adhere to the format:
+
+ sides = [np.array([v00, v01, ..., v0n]),
+ np.array([v0n, v1n, ..., vmn]),
+ np.array([vmn, ..., vm1, vm0]),
+ np.array([vm0, ..., v10, v00])]
+ """
+ boundary = cls(*zip(lon_sides, lat_sides))
+ return boundary
+
+ def contour(self):
+ """Get the (lons, lats) tuple of the boundary object.
+
+ It excludes the last element of each side because it's included in the next side.
+ """
+ lons = np.concatenate([lns[:-1] for lns in self.sides_lons])
+ lats = np.concatenate([lts[:-1] for lts in self.sides_lats])
+ return lons, lats
+
+ @property
+ def vertices(self):
+ """Return boundary polygon vertices."""
+ lons, lats = self.contour()
+ vertices = np.vstack((lons, lats)).T
+ vertices = vertices.astype(np.float64, copy=False) # Important for spherical ops.
+ return vertices
+
def decimate(self, ratio):
"""Remove some points in the boundaries, but never the corners."""
for i in range(len(self.sides_lons)):
@@ -76,19 +115,15 @@ class AreaBoundary(Boundary):
self.sides_lons[i] = self.sides_lons[i][points]
self.sides_lats[i] = self.sides_lats[i][points]
- def contour(self):
- """Get the (lons, lats) tuple of the boundary object."""
- lons = np.concatenate([lns[:-1] for lns in self.sides_lons])
- lats = np.concatenate([lts[:-1] for lts in self.sides_lats])
-
- return lons, lats
-
class AreaDefBoundary(AreaBoundary):
"""Boundaries for area definitions (pyresample)."""
def __init__(self, area, frequency=1):
lons, lats = area.get_bbox_lonlats()
+ warnings.warn("'AreaDefBoundary' will be removed in the future. " +
+ "Use the Swath/AreaDefinition 'boundary' method instead.",
+ PendingDeprecationWarning)
AreaBoundary.__init__(self,
*zip(lons, lats))
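A standalone numpy sketch of the new `contour` logic (hypothetical side values, not calling pyresample): each side ends where the next begins, so dropping the last element of every side yields a closed ring with no duplicated corner vertices:

```python
import numpy as np

# Hypothetical sides of a square boundary; each side's last vertex equals
# the next side's first vertex, matching the from_lonlat_sides() convention.
lon_sides = [np.array([0.0, 1.0, 2.0]),   # top:    left -> right
             np.array([2.0, 2.0, 2.0]),   # right:  top -> bottom
             np.array([2.0, 1.0, 0.0]),   # bottom: right -> left
             np.array([0.0, 0.0, 0.0])]   # left:   bottom -> top
lat_sides = [np.array([2.0, 2.0, 2.0]),
             np.array([2.0, 1.0, 0.0]),
             np.array([0.0, 0.0, 0.0]),
             np.array([0.0, 1.0, 2.0])]

# contour(): drop each side's last element to avoid duplicated corners
lons = np.concatenate([s[:-1] for s in lon_sides])
lats = np.concatenate([s[:-1] for s in lat_sides])
vertices = np.vstack((lons, lats)).T  # (8, 2) ring of unique vertices
```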
=====================================
pyresample/ewa/_legacy_dask_ewa.py
=====================================
@@ -223,7 +223,7 @@ class LegacyDaskEWAResampler(BaseResampler):
def compute(self, data, cache_id=None, fill_value=0, weight_count=10000,
weight_min=0.01, weight_distance_max=1.0,
- weight_delta_max=1.0, weight_sum_min=-1.0,
+ weight_delta_max=10.0, weight_sum_min=-1.0,
maximum_weight_mode=False, grid_coverage=0, chunks=None,
**kwargs):
"""Resample the data according to the precomputed X/Y coordinates."""
=====================================
pyresample/ewa/dask_ewa.py
=====================================
@@ -380,7 +380,7 @@ class DaskEWAResampler(BaseResampler):
def compute(self, data, cache_id=None, rows_per_scan=None, chunks=None, fill_value=None,
weight_count=10000, weight_min=0.01, weight_distance_max=1.0,
- weight_delta_max=1.0, weight_sum_min=-1.0,
+ weight_delta_max=10.0, weight_sum_min=-1.0,
maximum_weight_mode=None, **kwargs):
"""Resample the data according to the precomputed X/Y coordinates."""
# not used in this step
@@ -454,7 +454,7 @@ class DaskEWAResampler(BaseResampler):
def resample(self, data, cache_dir=None, mask_area=None,
rows_per_scan=None, persist=False, chunks=None, fill_value=None,
weight_count=10000, weight_min=0.01, weight_distance_max=1.0,
- weight_delta_max=1.0, weight_sum_min=-1.0,
+ weight_delta_max=10.0, weight_sum_min=-1.0,
maximum_weight_mode=None):
"""Resample using an elliptical weighted averaging algorithm.
=====================================
pyresample/ewa/ewa.py
=====================================
@@ -149,7 +149,9 @@ def fornav(cols, rows, area_def, data_in,
"""
if isinstance(data_in, (tuple, list)):
# we can only support one data type per call at this time
- assert(in_arr.dtype == data_in[0].dtype for in_arr in data_in[1:])
+ for in_arr in data_in[1:]:
+ if in_arr.dtype != data_in[0].dtype:
+ raise ValueError("All input arrays must be the same dtype")
else:
# assume they gave us a single numpy array-like object
data_in = [data_in]
=====================================
pyresample/future/spherical/__init__.py
=====================================
@@ -0,0 +1,20 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+#
+# Copyright (c) 2021 Pyresample developers
+#
+# This program is free software: you can redistribute it and/or modify it under
+# the terms of the GNU Lesser General Public License as published by the Free
+# Software Foundation, either version 3 of the License, or (at your option) any
+# later version.
+#
+# This program is distributed in the hope that it will be useful, but WITHOUT
+# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
+# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
+# details.
+#
+# You should have received a copy of the GNU Lesser General Public License along
+# with this program. If not, see <http://www.gnu.org/licenses/>.
+"""Future features that are backwards incompatible with current functionality."""
+
+from .point import SMultiPoint, SPoint # noqa
=====================================
pyresample/future/spherical/point.py
=====================================
@@ -0,0 +1,88 @@
+#!/usr/bin/env python3
+# -*- coding: utf-8 -*-
+#
+# Copyright (c) 2022 Pyresample developers
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+"""Define single and multiple points on the sphere through SPoint and SMultiPoint classes."""
+import numpy as np
+
+from pyresample.spherical import SCoordinate
+
+
+class SPoint(SCoordinate):
+ """Object representing a single point on a sphere.
+
+ The ``lon`` and ``lat`` coordinates must be provided in radians.
+ """
+
+ def __init__(self, lon, lat):
+ lon = np.asarray(lon)
+ lat = np.asarray(lat)
+ if lon.size > 1 or lat.size > 1:
+ raise ValueError("Use SMultiPoint to define multiple points.")
+ super().__init__(lon, lat)
+
+ @classmethod
+ def from_degrees(cls, lon, lat):
+ """Create SPoint from lon/lat coordinates in degrees."""
+ return cls(np.deg2rad(lon), np.deg2rad(lat))
+
+ def __str__(self):
+ """Get simplified representation of lon/lat arrays in radians."""
+ return str((float(self.lon), float(self.lat)))
+
+ def __repr__(self):
+ """Get simplified representation of lon/lat arrays in radians."""
+ return str((float(self.lon), float(self.lat)))
+
+ def to_shapely(self):
+ """Convert the SPoint to a shapely Point (in lon/lat degrees)."""
+ from shapely.geometry import Point
+ point = Point(*self.vertices_in_degrees[0])
+ return point
+
+
+class SMultiPoint(SCoordinate):
+ """Object representing multiple points on a sphere."""
+
+ def __init__(self, lon, lat):
+ lon = np.asarray(lon)
+ lat = np.asarray(lat)
+ if lon.ndim == 0 or lat.ndim == 0:
+ raise ValueError("Use SPoint to define single points.")
+ super().__init__(lon, lat)
+
+ @classmethod
+ def from_degrees(cls, lon, lat):
+ """Create SMultiPoint from lon/lat coordinates in degrees."""
+ return cls(np.deg2rad(lon), np.deg2rad(lat))
+
+ def __eq__(self, other):
+ """Check equality."""
+ return np.allclose(self.lon, other.lon) and np.allclose(self.lat, other.lat)
+
+ def __str__(self):
+ """Get simplified representation of lon/lat arrays in radians."""
+ return str(self.vertices)
+
+ def __repr__(self):
+ """Get simplified representation of lon/lat arrays in radians."""
+ return str(self.vertices)
+
+ def to_shapely(self):
+ """Convert the SMultiPoint to a shapely MultiPoint (in lon/lat degrees)."""
+ from shapely.geometry import MultiPoint
+ point = MultiPoint(self.vertices_in_degrees)
+ return point
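The scalar-vs-array dispatch enforced by the two constructors above can be mirrored with a small standalone helper (`which_class` is illustrative only, not part of the pyresample API): 0-dimensional inputs belong to `SPoint`, array inputs to `SMultiPoint`.

```python
import numpy as np

def which_class(lon, lat):
    """Mirror the scalar-vs-array dispatch used by SPoint/SMultiPoint."""
    lon, lat = np.asarray(lon), np.asarray(lat)
    if lon.ndim == 0 and lat.ndim == 0:
        return "SPoint"
    return "SMultiPoint"
```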
=====================================
pyresample/geometry.py
=====================================
@@ -327,7 +327,23 @@ class BaseDefinition:
(s4_dim1.squeeze(), s4_dim2.squeeze())])
if hasattr(dim1[0], 'compute') and da is not None:
dim1, dim2 = da.compute(dim1, dim2)
- return dim1, dim2
+ clean_dim1, clean_dim2 = self._filter_bbox_nans(dim1, dim2)
+ return clean_dim1, clean_dim2
+
+ def _filter_bbox_nans(
+ self,
+ dim1_sides: list[np.ndarray],
+ dim2_sides: list[np.ndarray],
+ ) -> tuple[list[np.ndarray], list[np.ndarray]]:
+ new_dim1_sides = []
+ new_dim2_sides = []
+ for dim1_side, dim2_side in zip(dim1_sides, dim2_sides):
+ is_valid_mask = ~(np.isnan(dim1_side) | np.isnan(dim2_side))
+ if not is_valid_mask.any():
+ raise ValueError("Can't compute swath bounding coordinates. At least one side is completely invalid.")
+ new_dim1_sides.append(dim1_side[is_valid_mask])
+ new_dim2_sides.append(dim2_side[is_valid_mask])
+ return new_dim1_sides, new_dim2_sides
def _get_bbox_slices(self, frequency):
height, width = self.shape
@@ -391,6 +407,28 @@ class BaseDefinition:
x, y = self._get_bbox_elements(self.get_proj_coords, frequency)
return np.hstack(x), np.hstack(y)
+ def boundary(self, frequency=None, force_clockwise=False):
+ """Retrieve the AreaBoundary object.
+
+ Parameters
+ ----------
+ frequency:
+ The number of points to provide for each side. By default (None)
+ the full width and height will be provided.
+ force_clockwise:
+ Perform minimal checks and reordering of coordinates to ensure
+ that the returned coordinates follow a clockwise direction.
+ This is important for compatibility with
+ :class:`pyresample.spherical.SphPolygon` where operations depend
+ on knowing the inside versus the outside of a polygon. These
+ operations assume that coordinates are clockwise.
+ Default is False.
+ """
+ from pyresample.boundary import AreaBoundary
+ lon_sides, lat_sides = self.get_bbox_lonlats(frequency=frequency,
+ force_clockwise=force_clockwise)
+ return AreaBoundary.from_lonlat_sides(lon_sides, lat_sides)
+
def get_cartesian_coords(self, nprocs=None, data_slice=None, cache=False):
"""Retrieve cartesian coordinates of geometry definition.
@@ -708,7 +746,7 @@ def get_array_hashable(arr):
try:
return arr.name.encode('utf-8') # dask array
except AttributeError:
- return np.asarray(arr).view(np.uint8) # np array
+ return np.ascontiguousarray(arr).view(np.uint8) # np array
class SwathDefinition(CoordinateDefinition):
@@ -1507,6 +1545,59 @@ class AreaDefinition(_ProjectionDefinition):
return False
return 'geostationary' in coord_operation.method_name.lower()
+ def _get_geo_boundary_sides(self, frequency=None):
+ """Retrieve the boundary sides list for geostationary projections."""
+ # Define default frequency
+ if frequency is None:
+ frequency = 50
+ # Ensure at least 4 points are used
+ if frequency < 4:
+ frequency = 4
+ # Ensure an even number of vertices for side creation
+ if (frequency % 2) != 0:
+ frequency = frequency + 1
+ lons, lats = get_geostationary_bounding_box(self, nb_points=frequency)
+ # Retrieve dummy sides for GEO (side1 and side3 always of length 2)
+ side02_step = int(frequency / 2) - 1
+ lon_sides = [lons[slice(0, side02_step + 1)],
+ lons[slice(side02_step, side02_step + 1 + 1)],
+ lons[slice(side02_step + 1, side02_step * 2 + 1 + 1)],
+ np.append(lons[side02_step * 2 + 1], lons[0])
+ ]
+ lat_sides = [lats[slice(0, side02_step + 1)],
+ lats[slice(side02_step, side02_step + 1 + 1)],
+ lats[slice(side02_step + 1, side02_step * 2 + 1 + 1)],
+ np.append(lats[side02_step * 2 + 1], lats[0])
+ ]
+ return lon_sides, lat_sides
+
+ def boundary(self, frequency=None, force_clockwise=False):
+ """Retrieve the AreaBoundary object.
+
+ Parameters
+ ----------
+ frequency:
+ The number of points to provide for each side. By default (None)
+ the full width and height will be provided, except for geostationary
+ projection where by default only 50 points are selected.
+ force_clockwise:
+ Perform minimal checks and reordering of coordinates to ensure
+ that the returned coordinates follow a clockwise direction.
+ This is important for compatibility with
+ :class:`pyresample.spherical.SphPolygon` where operations depend
+ on knowing the inside versus the outside of a polygon. These
+ operations assume that coordinates are clockwise.
+ Default is False.
+ """
+ from pyresample.boundary import AreaBoundary
+ if self.is_geostationary:
+ lon_sides, lat_sides = self._get_geo_boundary_sides(frequency=frequency)
+ else:
+ lon_sides, lat_sides = self.get_bbox_lonlats(frequency=frequency,
+ force_clockwise=force_clockwise)
+ boundary = AreaBoundary.from_lonlat_sides(lon_sides, lat_sides)
+ return boundary
+
@property
def area_extent(self):
"""Tuple of this area's extent (xmin, ymin, xmax, ymax)."""
@@ -2691,24 +2782,30 @@ def get_geostationary_angle_extent(geos_area):
def get_geostationary_bounding_box_in_proj_coords(geos_area, nb_points=50):
"""Get the bbox in geos projection coordinates of the valid pixels inside `geos_area`.
- Args:
- nb_points: Number of points on the polygon
+ Notes:
+ - The first and last element of the output vectors are equal.
+ - If nb_points is even, it will return x and y vectors of length nb_points + 1.
+
+ Parameters
+ ----------
+ nb_points : Number of points on the polygon.
+
"""
xmax, ymax = get_geostationary_angle_extent(geos_area)
h = get_geostationary_height(geos_area.crs)
# generate points around the north hemisphere in satellite projection
# make it a bit smaller so that we stay inside the valid area
- x = np.cos(np.linspace(-np.pi, 0, int(nb_points / 2.0))) * (xmax - 0.0001)
- y = -np.sin(np.linspace(-np.pi, 0, int(nb_points / 2.0))) * (ymax - 0.0001)
+ x = np.cos(np.linspace(-np.pi, 0, int(nb_points / 2.0) + 1)) * (xmax - 0.0001)
+ y = -np.sin(np.linspace(-np.pi, 0, int(nb_points / 2.0) + 1)) * (ymax - 0.0001)
ll_x, ll_y, ur_x, ur_y = geos_area.area_extent
x *= h
y *= h
-
- x = np.clip(np.concatenate([x, x[::-1]]), min(ll_x, ur_x), max(ll_x, ur_x))
- y = np.clip(np.concatenate([y, -y]), min(ll_y, ur_y), max(ll_y, ur_y))
+ # We remove one element with [:-1] to avoid duplicate values at the equator
+ x = np.clip(np.concatenate([x[:-1], x[::-1]]), min(ll_x, ur_x), max(ll_x, ur_x))
+ y = np.clip(np.concatenate([y[:-1], -y]), min(ll_y, ur_y), max(ll_y, ur_y))
return x, y
@@ -2719,7 +2816,9 @@ def get_geostationary_bounding_box_in_lonlats(geos_area, nb_points=50):
Args:
nb_points: Number of points on the polygon
"""
- return get_geostationary_bounding_box(geos_area, nb_points)
+ x, y = get_geostationary_bounding_box_in_proj_coords(geos_area, nb_points)
+ lons, lats = Proj(geos_area.crs)(x, y, inverse=True)
+ return lons, lats
def get_geostationary_bounding_box(geos_area, nb_points=50):
@@ -2728,9 +2827,10 @@ def get_geostationary_bounding_box(geos_area, nb_points=50):
Args:
nb_points: Number of points on the polygon
"""
- x, y = get_geostationary_bounding_box_in_proj_coords(geos_area, nb_points)
-
- return Proj(geos_area.crs)(x, y, inverse=True)
+ warnings.warn("'get_geostationary_bounding_box' is deprecated. Please use "
+ "'get_geostationary_bounding_box_in_lonlats' instead.",
+ DeprecationWarning)
+ return get_geostationary_bounding_box_in_lonlats(geos_area, nb_points)
def combine_area_extents_vertical(area1, area2):
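The `get_array_hashable` change above (PR 462 / issue 439) can be reproduced in a minimal numpy sketch: slicing a swath's coordinate array yields a non-contiguous view, and reinterpreting such a view as `uint8` raises a `ValueError`, while `np.ascontiguousarray` makes the reinterpretation valid.

```python
import numpy as np

lons = np.linspace(0.0, 15.0, 16).reshape(4, 4)
sliced = lons[:, ::2]                  # slicing yields a non-contiguous view
assert not sliced.flags["C_CONTIGUOUS"]

# Old code path: np.asarray(sliced).view(np.uint8) raises ValueError here,
# because the last axis of the sliced array is no longer contiguous.
hashable = np.ascontiguousarray(sliced).view(np.uint8)  # fixed code path
```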
=====================================
pyresample/gradient/_gradient_search.pyx
=====================================
@@ -94,15 +94,15 @@ ctypedef void (*FN)(const DTYPE_t[:, :, :] data, int l0, int p0, double dl, doub
@cython.boundscheck(False)
@cython.wraparound(False)
-cpdef one_step_gradient_search(np.ndarray[DTYPE_t, ndim=3] data,
- np.ndarray[DTYPE_t, ndim=2] src_x,
- np.ndarray[DTYPE_t, ndim=2] src_y,
- np.ndarray[DTYPE_t, ndim=2] xl,
- np.ndarray[DTYPE_t, ndim=2] xp,
- np.ndarray[DTYPE_t, ndim=2] yl,
- np.ndarray[DTYPE_t, ndim=2] yp,
- np.ndarray[DTYPE_t, ndim=2] dst_x,
- np.ndarray[DTYPE_t, ndim=2] dst_y,
+cpdef one_step_gradient_search(const DTYPE_t [:, :, :] data,
+ DTYPE_t [:, :] src_x,
+ DTYPE_t [:, :] src_y,
+ DTYPE_t [:, :] xl,
+ DTYPE_t [:, :] xp,
+ DTYPE_t [:, :] yl,
+ DTYPE_t [:, :] yp,
+ DTYPE_t [:, :] dst_x,
+ DTYPE_t [:, :] dst_y,
method='bilinear'):
"""Gradient search, simple case variant."""
cdef FN fun
@@ -119,16 +119,17 @@ cpdef one_step_gradient_search(np.ndarray[DTYPE_t, ndim=3] data,
# output image array --> needs to be (lines, pixels) --> y,x
- cdef np.ndarray[DTYPE_t, ndim = 3] image = np.full([z_size, y_size, x_size], np.nan, dtype=DTYPE)
- cdef np.ndarray[size_t, ndim = 1] elements = np.arange(x_size, dtype=np.uintp)
-
- one_step_gradient_search_no_gil(data,
- src_x, src_y,
- xl, xp, yl, yp,
- dst_x, dst_y,
- x_size, y_size,
- fun, image,
- elements)
+ image = np.full([z_size, y_size, x_size], np.nan, dtype=DTYPE)
+ cdef DTYPE_t [:, :, :] image_view = image
+ cdef size_t [:] elements = np.arange(x_size, dtype=np.uintp)
+ with nogil:
+ one_step_gradient_search_no_gil(data,
+ src_x, src_y,
+ xl, xp, yl, yp,
+ dst_x, dst_y,
+ x_size, y_size,
+ fun, image_view,
+ elements)
# return the output image
return image
@@ -213,14 +214,14 @@ cdef void one_step_gradient_search_no_gil(const DTYPE_t[:, :, :] data,
@cython.boundscheck(False)
@cython.wraparound(False)
-cpdef one_step_gradient_indices(np.ndarray[DTYPE_t, ndim=2] src_x,
- np.ndarray[DTYPE_t, ndim=2] src_y,
- np.ndarray[DTYPE_t, ndim=2] xl,
- np.ndarray[DTYPE_t, ndim=2] xp,
- np.ndarray[DTYPE_t, ndim=2] yl,
- np.ndarray[DTYPE_t, ndim=2] yp,
- np.ndarray[DTYPE_t, ndim=2] dst_x,
- np.ndarray[DTYPE_t, ndim=2] dst_y):
+cpdef one_step_gradient_indices(DTYPE_t [:, :] src_x,
+ DTYPE_t [:, :] src_y,
+ DTYPE_t [:, :] xl,
+ DTYPE_t [:, :] xp,
+ DTYPE_t [:, :] yl,
+ DTYPE_t [:, :] yp,
+ DTYPE_t [:, :] dst_x,
+ DTYPE_t [:, :] dst_y):
"""Gradient search, simple case variant, returning float indices.
This is appropriate for monotonous gradients only, i.e. not modis or viirs in satellite projection.
@@ -232,17 +233,19 @@ cpdef one_step_gradient_indices(np.ndarray[DTYPE_t, ndim=2] src_x,
cdef size_t x_size = dst_x.shape[1]
# output indices arrays --> needs to be (lines, pixels) --> y,x
- cdef np.ndarray[DTYPE_t, ndim = 3] indices = np.full([2, y_size, x_size], np.nan, dtype=DTYPE)
- cdef np.ndarray[size_t, ndim = 1] elements = np.arange(x_size, dtype=np.uintp)
+ indices = np.full([2, y_size, x_size], np.nan, dtype=DTYPE)
+ cdef DTYPE_t [:, :, :] indices_view = indices
+ cdef size_t [:] elements = np.arange(x_size, dtype=np.uintp)
# fake_data is not going to be used anyway as we just fill in the indices
- cdef np.ndarray[DTYPE_t, ndim = 3] fake_data = np.full([1, 1, 1], np.nan, dtype=DTYPE)
-
- one_step_gradient_search_no_gil(fake_data,
- src_x, src_y,
- xl, xp, yl, yp,
- dst_x, dst_y,
- x_size, y_size,
- indices_xy, indices,
- elements)
+ cdef DTYPE_t [:, :, :] fake_data = np.full([1, 1, 1], np.nan, dtype=DTYPE)
+
+ with nogil:
+ one_step_gradient_search_no_gil(fake_data,
+ src_x, src_y,
+ xl, xp, yl, yp,
+ dst_x, dst_y,
+ x_size, y_size,
+ indices_xy, indices_view,
+ elements)
return indices
=====================================
pyresample/spherical.py
=====================================
@@ -34,22 +34,128 @@ def _unwrap_radians(val, mod=np.pi):
return (val + mod) % (2 * mod) - mod
+def _xyz_to_vertices(x, y, z):
+ """Create vertices array from x,y,z values or vectors.
+
+ If x, y, z are scalar arrays, it creates a 1x3 np.array.
+ If x, y, z are np.array with shape nx1, it creates a nx3 np.array.
+ """
+ if x.ndim == 0:
+ vertices = np.array([x, y, z])
+ else:
+ vertices = np.vstack([x, y, z]).T
+ return vertices
+
+
+def _ensure_is_array(arr):
+ """Ensure that a possible np.value input is converted to np.array."""
+ if arr.ndim == 0:
+ arr = np.asarray([arr])
+ return arr
+
+
+def _vincenty_matrix(lon, lat, lon_ref, lat_ref):
+ """Compute a distance matrix using Vincenty formula.
+
+ The lon/lat inputs must be provided in radians !
+ The output must be multiplied by the Earth radius to obtain the distance in m or km.
+ The returned distance matrix has shape (n x n_ref).
+ """
+ lon = _ensure_is_array(lon)
+ lat = _ensure_is_array(lat)
+ lon_ref = _ensure_is_array(lon_ref)
+ lat_ref = _ensure_is_array(lat_ref)
+ lon = lon[:, np.newaxis]
+ lat = lat[:, np.newaxis]
+ diff_lon = lon - lon_ref
+ num = ((np.cos(lat_ref) * np.sin(diff_lon)) ** 2 +
+ (np.cos(lat) * np.sin(lat_ref) -
+ np.sin(lat) * np.cos(lat_ref) * np.cos(diff_lon)) ** 2)
+ den = (np.sin(lat) * np.sin(lat_ref) +
+ np.cos(lat) * np.cos(lat_ref) * np.cos(diff_lon))
+ dist = np.arctan2(num ** .5, den)
+ return dist
+
+
+def _haversine_matrix(lon, lat, lon_ref, lat_ref):
+ """Compute a distance matrix using haversine formula.
+
+ The lon/lat inputs must be provided in radians !
+ The output must be multiplied by the Earth radius to obtain the distance in m or km.
+ The returned distance matrix has shape (n x n_ref).
+ """
+ lon = _ensure_is_array(lon)
+ lat = _ensure_is_array(lat)
+ lon_ref = _ensure_is_array(lon_ref)
+ lat_ref = _ensure_is_array(lat_ref)
+ lon = lon[:, np.newaxis]
+ lat = lat[:, np.newaxis]
+ diff_lon = lon - lon_ref # n x n_ref matrix
+ diff_lat = lat - lat_ref # n x n_ref matrix
+ a = np.sin(diff_lat / 2.0) ** 2.0 + np.cos(lat) * np.cos(lat_ref) * np.sin(diff_lon / 2.0) ** 2.0
+    dist = 2.0 * np.arcsin(a ** .5)  # equivalent to: 2.0 * np.arctan2(np.sqrt(a), np.sqrt(1 - a))
+ return dist
+
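As a quick sanity check of the vectorised formula, a standalone NumPy sketch mirroring `_haversine_matrix` (not the module function itself) gives the expected quarter-circle separations:

```python
import numpy as np

def haversine_matrix(lon, lat, lon_ref, lat_ref):
    # Sketch of the new _haversine_matrix: angles in radians; the result
    # is the great-circle angle, so multiply by the Earth radius to get
    # metres or kilometres. Output shape is (n, n_ref).
    lon = np.atleast_1d(lon)[:, np.newaxis]
    lat = np.atleast_1d(lat)[:, np.newaxis]
    lon_ref = np.atleast_1d(lon_ref)
    lat_ref = np.atleast_1d(lat_ref)
    a = (np.sin((lat - lat_ref) / 2.0) ** 2.0
         + np.cos(lat) * np.cos(lat_ref) * np.sin((lon - lon_ref) / 2.0) ** 2.0)
    return 2.0 * np.arcsin(a ** 0.5)

# One point at (0, 0) against two references a quarter circle away:
dist = haversine_matrix(0.0, 0.0,
                        np.array([np.pi / 2, 0.0]),
                        np.array([0.0, np.pi / 2]))
```

Both separations come out as pi/2 radians, i.e. a quarter of a great circle.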
+
+def _check_lon_validity(lon):
+ """Check longitude validity."""
+ if np.any(np.isinf(lon)):
+ raise ValueError("Longitude values can not contain inf values.")
+
+
+def _check_lat_validity(lat):
+ """Check latitude validity."""
+ if np.any(np.isinf(lat)):
+ raise ValueError("Latitude values can not contain inf values.")
+ if np.any(np.logical_or(lat > np.pi / 2, lat < -np.pi / 2)):
+ raise ValueError("Latitude values must range between [-pi/2, pi/2].")
+
+
+def _check_lon_lat(lon, lat):
+ """Check and format lon/lat values/arrays."""
+ lon = np.asarray(lon, dtype=np.float64)
+ lat = np.asarray(lat, dtype=np.float64)
+ _check_lon_validity(lon)
+ _check_lat_validity(lat)
+ return lon, lat
+
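The validation helpers above can be condensed into one standalone sketch (a combined stand-in for `_check_lon_lat` and the `_check_*_validity` functions, not the module code itself):

```python
import numpy as np

def check_lon_lat(lon, lat):
    # Inputs are radians; inf values and out-of-range latitudes raise,
    # mirroring the checks added in the diff above.
    lon = np.asarray(lon, dtype=np.float64)
    lat = np.asarray(lat, dtype=np.float64)
    if np.any(np.isinf(lon)) or np.any(np.isinf(lat)):
        raise ValueError("lon/lat values can not contain inf values.")
    if np.any(np.logical_or(lat > np.pi / 2, lat < -np.pi / 2)):
        raise ValueError("Latitude values must range between [-pi/2, pi/2].")
    return lon, lat
```

For example, `check_lon_lat(0.1, 2.0)` raises, since 2.0 rad exceeds pi/2.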
+
class SCoordinate(object):
"""Spherical coordinates.
- The ``lon`` and ``lat`` coordinates should be provided in radians.
+ The ``lon`` and ``lat`` coordinates must be provided in radians.
"""
def __init__(self, lon, lat):
- if np.isfinite(lon):
- self.lon = float(_unwrap_radians(lon))
- else:
- self.lon = float(lon)
+ lon, lat = _check_lon_lat(lon, lat)
+ self.lon = _unwrap_radians(lon)
self.lat = lat
+ @property
+ def vertices(self):
+ """Return point(s) radians vertices in a ndarray of shape [n,2]."""
+ # Single values
+ if self.lon.ndim == 0:
+ vertices = np.array([self.lon, self.lat])[np.newaxis, :]
+ # Array values
+ else:
+ vertices = np.vstack((self.lon, self.lat)).T
+ return vertices
+
+ @property
+ def vertices_in_degrees(self):
+ """Return point(s) degrees vertices in a ndarray of shape [n,2]."""
+ return np.rad2deg(self.vertices)
+
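The shape contract of the new `vertices` property can be illustrated with a small standalone helper (`lonlat_vertices` is hypothetical, mirroring the property rather than calling it):

```python
import numpy as np

def lonlat_vertices(lon, lat):
    # Sketch of the `vertices` property: a scalar pair yields a (1, 2)
    # array, length-n inputs yield an (n, 2) array of [lon, lat] rows
    # (radians); np.rad2deg of the result matches `vertices_in_degrees`.
    lon = np.asarray(lon, dtype=np.float64)
    lat = np.asarray(lat, dtype=np.float64)
    if lon.ndim == 0:
        return np.array([lon, lat])[np.newaxis, :]
    return np.vstack((lon, lat)).T
```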
def cross2cart(self, point):
- """Compute the cross product, and convert to cartesian coordinates."""
+ """Compute the cross product, and convert to cartesian coordinates.
+
+ Note:
+ - the cross product of the same point gives a zero vector.
+ - the cross product between points lying at the equator gives a zero vector.
+          - the cross product between points lying at the poles gives a zero vector.
+ """
lat1 = self.lat
lon1 = self.lon
lat2 = point.lat
@@ -62,35 +168,48 @@ class SCoordinate(object):
g = np.cos(lat1)
h = np.cos(lat2)
i = np.sin(lon2 - lon1)
- res = CCoordinate(np.array([-ad * c + be * f,
- ad * f + be * c,
- g * h * i]))
-
+ x = -ad * c + be * f
+ y = ad * f + be * c
+ z = g * h * i
+ vertices = _xyz_to_vertices(x, y, z)
+ res = CCoordinate(vertices)
return res
def to_cart(self):
"""Convert to cartesian."""
- return CCoordinate(np.array([np.cos(self.lat) * np.cos(self.lon),
- np.cos(self.lat) * np.sin(self.lon),
- np.sin(self.lat)]))
+ x = np.cos(self.lat) * np.cos(self.lon)
+ y = np.cos(self.lat) * np.sin(self.lon)
+ z = np.sin(self.lat)
+ vertices = _xyz_to_vertices(x, y, z)
+ return CCoordinate(vertices)
def distance(self, point):
- """Get distance using Vincenty formula."""
- dlambda = self.lon - point.lon
- num = ((np.cos(point.lat) * np.sin(dlambda)) ** 2 +
- (np.cos(self.lat) * np.sin(point.lat) -
- np.sin(self.lat) * np.cos(point.lat) *
- np.cos(dlambda)) ** 2)
- den = (np.sin(self.lat) * np.sin(point.lat) +
- np.cos(self.lat) * np.cos(point.lat) * np.cos(dlambda))
+ """Get distance using Vincenty formula.
- return np.arctan2(num ** .5, den)
+ The result must be multiplied by Earth radius to obtain distance in m or km.
+ """
+ lat = self.lat
+ lon = self.lon
+ lon_ref = point.lon
+ lat_ref = point.lat
+ dist = _vincenty_matrix(lon, lat, lon_ref, lat_ref)
+ if dist.size == 1: # single point case
+ dist = dist.item()
+ return dist
def hdistance(self, point):
- """Get distance using Haversine formula."""
- return 2 * np.arcsin((np.sin((point.lat - self.lat) / 2.0) ** 2.0 +
- np.cos(point.lat) * np.cos(self.lat) *
- np.sin((point.lon - self.lon) / 2.0) ** 2.0) ** .5)
+ """Get distance using Haversine formula.
+
+ The result must be multiplied by Earth radius to obtain distance in m or km.
+ """
+ lat = self.lat
+ lon = self.lon
+ lon_ref = point.lon
+ lat_ref = point.lat
+ dist = _haversine_matrix(lon, lat, lon_ref, lat_ref)
+ if dist.size == 1: # single point case
+ dist = dist.item()
+ return dist
def __ne__(self, other):
"""Check inequality."""
@@ -112,6 +231,47 @@ class SCoordinate(object):
"""Get iterator over lon/lat pairs."""
return zip([self.lon, self.lat]).__iter__()
+ def plot(self, ax=None,
+ projection_crs=None,
+ add_coastlines=True,
+ add_gridlines=True,
+ add_background=True,
+ **plot_kwargs):
+ """Plot the point(s) using Cartopy.
+
+        Assumes the vertices are in radians.
+ """
+ import matplotlib.pyplot as plt
+ try:
+ import cartopy.crs as ccrs
+ except ModuleNotFoundError:
+ raise ModuleNotFoundError(
+ "Install cartopy to plot spherical geometries. For example, 'pip install cartopy'.")
+
+ # Create figure if ax not provided
+ if ax is None:
+ if projection_crs is None:
+ projection_crs = ccrs.PlateCarree()
+ fig, ax = plt.subplots(subplot_kw=dict(projection=projection_crs))
+
+ # Plot Points
+ vertices = self.vertices_in_degrees
+ ax.scatter(x=vertices[:, 0],
+ y=vertices[:, 1],
+ **plot_kwargs)
+
+ # Beautify plots
+ if add_background:
+ ax.stock_img()
+ if add_coastlines:
+ ax.coastlines()
+        if add_gridlines:
+ gl = ax.gridlines(draw_labels=True, linestyle='--')
+ gl.xlabels_top = False
+ gl.ylabels_right = False
+
+ return ax
+
class CCoordinate(object):
"""Cartesian coordinates."""
@@ -123,14 +283,28 @@ class CCoordinate(object):
"""Get Euclidean norm of the vector."""
return np.sqrt(np.einsum('...i, ...i', self.cart, self.cart))
- def normalize(self):
- """Normalize the vector."""
- self.cart /= np.sqrt(np.einsum('...i, ...i', self.cart, self.cart))
+ def normalize(self, inplace=False):
+ """Normalize the vector.
- return self
+        If self.cart == [0, 0, 0], the norm is 0 and cart becomes [nan, nan, nan].
+        Note that self.cart == [0, 0, 0] can occur when computing:
+ - the cross product of the same point.
+ - the cross product between points lying at the equator.
+ - the cross product between points lying at the poles.
+ """
+ norm = self.norm()
+ norm = norm[..., np.newaxis] # enable vectorization
+ if inplace:
+ self.cart /= norm
+ return None
+ cart = self.cart / norm
+ return CCoordinate(cart)
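The vectorised, out-of-place behaviour introduced here can be sketched independently of the class (a stand-in for `CCoordinate.normalize`, including the documented NaN result for a zero vector):

```python
import numpy as np

def normalize_rows(cart, inplace=False):
    # `cart` may be a single (3,) vector or an (n, 3) stack; the trailing
    # np.newaxis lets the norm broadcast over the last axis, and a zero
    # vector normalizes to NaNs, as noted in the docstring above.
    norm = np.sqrt(np.einsum('...i, ...i', cart, cart))[..., np.newaxis]
    if inplace:
        cart /= norm
        return None
    return cart / norm

vecs = np.array([[3.0, 0.0, 4.0],
                 [0.0, 0.0, 0.0]])
with np.errstate(invalid="ignore"):
    unit = normalize_rows(vecs)
```

The first row becomes the unit vector [0.6, 0, 0.8]; the zero row becomes all-NaN.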
def cross(self, point):
- """Get cross product with another vector."""
+ """Get cross product with another vector.
+
+ The cross product of the same vector gives a zero vector.
+ """
return CCoordinate(np.cross(self.cart, point.cart))
def dot(self, point):
@@ -176,9 +350,11 @@ class CCoordinate(object):
return self.__mul__(other)
def to_spherical(self):
- """Convert to Spherical coordinate object."""
- return SCoordinate(np.arctan2(self.cart[1], self.cart[0]),
- np.arcsin(self.cart[2]))
+ """Convert to SPoint/SMultiPoint object."""
+        # TODO: in the future this should return an SPoint or SMultiPoint
+ lon = np.arctan2(self.cart[..., 1], self.cart[..., 0])
+ lat = np.arcsin(self.cart[..., 2])
+ return SCoordinate(lon, lat)
class Arc(object):
@@ -236,6 +412,7 @@ class Arc(object):
ub_ = a__.cross2cart(c__)
val = ua_.dot(ub_) / (ua_.norm() * ub_.norm())
+
if abs(val - 1) < EPSILON:
angle = 0
elif abs(val + 1) < EPSILON:
@@ -305,10 +482,10 @@ class Arc(object):
ab_ = a__.hdistance(b__)
cd_ = c__.hdistance(d__)
- if(((i in (a__, b__)) or
+ if (((i in (a__, b__)) or
(abs(a__.hdistance(i) + b__.hdistance(i) - ab_) < EPSILON)) and
- ((i in (c__, d__)) or
- (abs(c__.hdistance(i) + d__.hdistance(i) - cd_) < EPSILON))):
+ ((i in (c__, d__)) or
+ (abs(c__.hdistance(i) + d__.hdistance(i) - cd_) < EPSILON))):
return i
return None
=====================================
pyresample/spherical_geometry.py
=====================================
@@ -50,7 +50,7 @@ class Coordinate(object):
x__=None, y__=None, z__=None, R__=1):
self.R__ = R__
if lat is not None and lon is not None:
- if not(-180 <= lon <= 180 and -90 <= lat <= 90):
+ if not (-180 <= lon <= 180 and -90 <= lat <= 90):
raise ValueError('Illegal (lon, lat) coordinates: (%s, %s)'
% (lon, lat))
self.lat = math.radians(lat)
@@ -75,8 +75,8 @@ class Coordinate(object):
def __ne__(self, other):
"""Check inequality."""
- if(abs(self.lat - other.lat) < EPSILON and
- abs(self.lon - other.lon) < EPSILON):
+ if (abs(self.lat - other.lat) < EPSILON and
+ abs(self.lon - other.lon) < EPSILON):
return 0
else:
return 1
@@ -286,8 +286,8 @@ class Arc(object):
ab_ = a__.distance(b__)
cd_ = c__.distance(d__)
- if(abs(a__.distance(i) + b__.distance(i) - ab_) < EPSILON and
- abs(c__.distance(i) + d__.distance(i) - cd_) < EPSILON):
+ if (abs(a__.distance(i) + b__.distance(i) - ab_) < EPSILON and
+ abs(c__.distance(i) + d__.distance(i) - cd_) < EPSILON):
return i
return None
=====================================
pyresample/test/test_area_config.py
=====================================
@@ -0,0 +1,228 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+#
+# Copyright (c) 2015-2022 Pyresample developers
+#
+# This program is free software: you can redistribute it and/or modify it under
+# the terms of the GNU Lesser General Public License as published by the Free
+# Software Foundation, either version 3 of the License, or (at your option) any
+# later version.
+#
+# This program is distributed in the hope that it will be useful, but WITHOUT
+# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
+# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
+# details.
+#
+# You should have received a copy of the GNU Lesser General Public License along
+# with this program. If not, see <http://www.gnu.org/licenses/>.
+"""Test the area_config functions."""
+
+import io
+import os
+import pathlib
+import unittest
+
+import numpy as np
+
+
+class TestLegacyAreaParser(unittest.TestCase):
+ """Test legacy .cfg parsing."""
+
+ def test_area_parser_legacy(self):
+ """Test legacy area parser."""
+ from pyresample import parse_area_file
+ ease_nh, ease_sh = parse_area_file(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'),
+ 'ease_nh', 'ease_sh')
+
+ # pyproj 2.0+ adds some extra parameters
+ projection = ("{'R': '6371228', 'lat_0': '90', 'lon_0': '0', "
+ "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
+ "'units': 'm', 'x_0': '0', 'y_0': '0'}")
+ nh_str = """Area ID: ease_nh
+Description: Arctic EASE grid
+Projection ID: ease_nh
+Projection: {}
+Number of columns: 425
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(ease_nh.__str__(), nh_str)
+ self.assertIsInstance(ease_nh.proj_dict['lat_0'], (int, float))
+
+ projection = ("{'R': '6371228', 'lat_0': '-90', 'lon_0': '0', "
+ "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
+ "'units': 'm', 'x_0': '0', 'y_0': '0'}")
+ sh_str = """Area ID: ease_sh
+Description: Antarctic EASE grid
+Projection ID: ease_sh
+Projection: {}
+Number of columns: 425
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(ease_sh.__str__(), sh_str)
+ self.assertIsInstance(ease_sh.proj_dict['lat_0'], (int, float))
+
+ def test_load_area(self):
+ from pyresample import load_area
+ ease_nh = load_area(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'), 'ease_nh')
+ # pyproj 2.0+ adds some extra parameters
+ projection = ("{'R': '6371228', 'lat_0': '90', 'lon_0': '0', "
+ "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
+ "'units': 'm', 'x_0': '0', 'y_0': '0'}")
+ nh_str = """Area ID: ease_nh
+Description: Arctic EASE grid
+Projection ID: ease_nh
+Projection: {}
+Number of columns: 425
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(nh_str, ease_nh.__str__())
+
+ def test_area_file_not_found_exception(self):
+ from pyresample.area_config import load_area
+ self.assertRaises(FileNotFoundError, load_area,
+ "/this/file/does/not/exist.yaml")
+ self.assertRaises(FileNotFoundError, load_area,
+ pathlib.Path("/this/file/does/not/exist.yaml"))
+
+ def test_not_found_exception(self):
+ from pyresample.area_config import AreaNotFound, parse_area_file
+ self.assertRaises(AreaNotFound, parse_area_file,
+ os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'), 'no_area')
+
+ def test_commented(self):
+ from pyresample import parse_area_file
+ areas = parse_area_file(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'))
+ self.assertNotIn('commented', [area.name for area in areas])
+
+
+class TestYAMLAreaParser(unittest.TestCase):
+ """Test YAML parsing."""
+
+ def test_area_parser_yaml(self):
+ """Test YAML area parser."""
+ from pyresample import parse_area_file
+ test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
+ test_areas = parse_area_file(test_area_file, 'ease_nh', 'ease_sh', 'test_meters', 'test_degrees',
+ 'test_latlong')
+ ease_nh, ease_sh, test_m, test_deg, test_latlong = test_areas
+
+ # pyproj 2.0+ adds some extra parameters
+ projection = ("{'R': '6371228', 'lat_0': '-90', 'lon_0': '0', "
+ "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
+ "'units': 'm', 'x_0': '0', 'y_0': '0'}")
+ nh_str = """Area ID: ease_nh
+Description: Arctic EASE grid
+Projection: {}
+Number of columns: 425
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(ease_nh.__str__(), nh_str)
+
+ sh_str = """Area ID: ease_sh
+Description: Antarctic EASE grid
+Projection: {}
+Number of columns: 425
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(ease_sh.__str__(), sh_str)
+
+ m_str = """Area ID: test_meters
+Description: test_meters
+Projection: {}
+Number of columns: 850
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(test_m.__str__(), m_str)
+
+ deg_str = """Area ID: test_degrees
+Description: test_degrees
+Projection: {}
+Number of columns: 850
+Number of rows: 425
+Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
+ self.assertEqual(test_deg.__str__(), deg_str)
+
+ # pyproj 2.0+ adds some extra parameters
+ projection = ("{'ellps': 'WGS84', 'no_defs': 'None', "
+ "'pm': '-81.36', 'proj': 'longlat', "
+ "'type': 'crs'}")
+ latlong_str = """Area ID: test_latlong
+Description: Basic latlong grid
+Projection: {}
+Number of columns: 3473
+Number of rows: 4058
+Area extent: (-0.0812, 0.4039, 0.0812, 0.5428)""".format(projection)
+ self.assertEqual(test_latlong.__str__(), latlong_str)
+
+ def test_dynamic_area_parser_yaml(self):
+ """Test YAML area parser on dynamic areas."""
+ from pyresample import parse_area_file
+ from pyresample.geometry import DynamicAreaDefinition
+ test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
+ test_area = parse_area_file(test_area_file, 'test_dynamic_resolution')[0]
+
+ self.assertIsInstance(test_area, DynamicAreaDefinition)
+ self.assertTrue(hasattr(test_area, 'resolution'))
+ self.assertEqual(test_area.resolution, (1000.0, 1000.0))
+
+ # lat/lon
+ from pyresample import parse_area_file
+ from pyresample.geometry import DynamicAreaDefinition
+ test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
+ test_area = parse_area_file(test_area_file, 'test_dynamic_resolution_ll')[0]
+
+ self.assertIsInstance(test_area, DynamicAreaDefinition)
+ self.assertTrue(hasattr(test_area, 'resolution'))
+ np.testing.assert_allclose(test_area.resolution, (1.0, 1.0))
+
+ def test_dynamic_area_parser_passes_resolution(self):
+ """Test that the resolution from the file is passed to a dynamic area."""
+ from pyresample import parse_area_file
+ test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
+ test_area = parse_area_file(test_area_file, 'omerc_bb_1000')[0]
+ assert test_area.resolution == (1000, 1000)
+
+ def test_multiple_file_content(self):
+ from pyresample import parse_area_file
+ from pyresample.area_config import load_area_from_string
+ area_list = ["""ease_sh:
+ description: Antarctic EASE grid
+ projection:
+ a: 6371228.0
+ units: m
+ lon_0: 0
+ proj: laea
+ lat_0: -90
+ shape:
+ height: 425
+ width: 425
+ area_extent:
+ lower_left_xy: [-5326849.0625, -5326849.0625]
+ upper_right_xy: [5326849.0625, 5326849.0625]
+ units: m
+""",
+ """ease_sh2:
+ description: Antarctic EASE grid
+ projection:
+ a: 6371228.0
+ units: m
+ lon_0: 0
+ proj: laea
+ lat_0: -90
+ shape:
+ height: 425
+ width: 425
+ area_extent:
+ lower_left_xy: [-5326849.0625, -5326849.0625]
+ upper_right_xy: [5326849.0625, 5326849.0625]
+ units: m
+"""]
+ with self.assertWarns(DeprecationWarning):
+ results = parse_area_file(area_list)
+ self.assertEqual(len(results), 2)
+ self.assertIn(results[0].area_id, ('ease_sh', 'ease_sh2'))
+ self.assertIn(results[1].area_id, ('ease_sh', 'ease_sh2'))
+ results2 = parse_area_file([io.StringIO(ar) for ar in area_list])
+ results3 = load_area_from_string(area_list)
+ self.assertEqual(results, results2)
+ self.assertEqual(results, results3)
=====================================
pyresample/test/test_bilinear.py
=====================================
@@ -481,6 +481,22 @@ class TestXarrayBilinear(unittest.TestCase):
[-5326849.0625, -5326849.0625,
5326849.0625, 5326849.0625])
+ # Area that partially overlaps the source data
+ self.target_def_partial = geometry.AreaDefinition('area_partial_overlap',
+ 'Europe (3km, HRV, VTC)',
+ 'areaD',
+ {'a': '6378144.0',
+ 'b': '6356759.0',
+ 'lat_0': '50.00',
+ 'lat_ts': '50.00',
+ 'lon_0': '8.00',
+ 'proj': 'stere'},
+ 4, 4,
+ [59559.320999999996,
+ -909968.64000000001,
+ 2920503.401,
+ 1490031.3600000001])
+
# Input data around the target pixel at 0.63388324, 55.08234642,
in_shape = (100, 100)
self.data1 = DataArray(da.ones((in_shape[0], in_shape[1])), dims=('y', 'x'))
@@ -1199,6 +1215,33 @@ class TestXarrayBilinear(unittest.TestCase):
finally:
shutil.rmtree(tempdir, ignore_errors=True)
+ def test_get_sample_from_cached_bil_info(self):
+ """Test getting data using pre-calculated resampling info."""
+ import os
+ import shutil
+ from tempfile import mkdtemp
+
+ from pyresample.bilinear import XArrayBilinearResampler
+
+ resampler = XArrayBilinearResampler(self.source_def, self.target_def_partial,
+ self.radius)
+ resampler.get_bil_info()
+
+ try:
+ tempdir = mkdtemp()
+ filename = os.path.join(tempdir, "test.zarr")
+
+ resampler.save_resampling_info(filename)
+
+ assert os.path.exists(filename)
+
+ new_resampler = XArrayBilinearResampler(self.source_def, self.target_def_partial,
+ self.radius)
+ new_resampler.load_resampling_info(filename)
+ _ = new_resampler.get_sample_from_bil_info(self.data1)
+ finally:
+ shutil.rmtree(tempdir, ignore_errors=True)
+
def test_check_fill_value():
"""Test that fill_value replacement/adjustment works."""
=====================================
pyresample/test/test_boundary.py
=====================================
@@ -0,0 +1,110 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+# pyresample, Resampling of remote sensing image data in python
+#
+# Copyright (C) 2010-2022 Pyresample developers
+#
+# This program is free software: you can redistribute it and/or modify it under
+# the terms of the GNU Lesser General Public License as published by the Free
+# Software Foundation, either version 3 of the License, or (at your option) any
+# later version.
+#
+# This program is distributed in the hope that it will be useful, but WITHOUT
+# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
+# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
+# details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+"""Test the boundary objects."""
+import unittest
+
+import numpy as np
+import pytest
+
+from pyresample.boundary import AreaBoundary
+
+
+class TestAreaBoundary(unittest.TestCase):
+ """Test 'AreaBoundary' class."""
+
+ def test_creation_from_lonlat_sides(self):
+ """Test AreaBoundary creation from sides."""
+ lon_sides = [np.array([1.0, 1.5, 2.0]),
+ np.array([2.0, 3.0]),
+ np.array([3.0, 3.5, 4.0]),
+ np.array([4.0, 1.0])]
+ lat_sides = [np.array([6.0, 6.5, 7.0]),
+ np.array([7.0, 8.0]),
+ np.array([8.0, 8.5, 9.0]),
+ np.array([9.0, 6.0])]
+ # Define AreaBoundary
+ boundary = AreaBoundary.from_lonlat_sides(lon_sides, lat_sides)
+
+ # Assert sides coincides
+ for b_lon, src_lon in zip(boundary.sides_lons, lon_sides):
+ assert np.allclose(b_lon, src_lon)
+
+ for b_lat, src_lat in zip(boundary.sides_lats, lat_sides):
+ assert np.allclose(b_lat, src_lat)
+
+ def test_creation(self):
+ """Test AreaBoundary creation."""
+ list_sides = [(np.array([1., 1.5, 2.]), np.array([6., 6.5, 7.])),
+ (np.array([2., 3.]), np.array([7., 8.])),
+ (np.array([3., 3.5, 4.]), np.array([8., 8.5, 9.])),
+ (np.array([4., 1.]), np.array([9., 6.]))]
+        lon_sides = [side[0] for side in list_sides]
+        lat_sides = [side[1] for side in list_sides]
+
+ # Define AreaBoundary
+ boundary = AreaBoundary(*list_sides)
+
+ # Assert sides coincides
+ for b_lon, src_lon in zip(boundary.sides_lons, lon_sides):
+ assert np.allclose(b_lon, src_lon)
+
+ for b_lat, src_lat in zip(boundary.sides_lats, lat_sides):
+ assert np.allclose(b_lat, src_lat)
+
+ def test_number_sides_required(self):
+ """Test AreaBoundary requires 4 sides ."""
+ list_sides = [(np.array([1., 1.5, 2.]), np.array([6., 6.5, 7.])),
+ (np.array([2., 3.]), np.array([7., 8.])),
+ (np.array([3., 3.5, 4.]), np.array([8., 8.5, 9.])),
+ (np.array([4., 1.]), np.array([9., 6.]))]
+ with pytest.raises(ValueError):
+ AreaBoundary(*list_sides[0:3])
+
+ def test_vertices_property(self):
+ """Test AreaBoundary vertices property."""
+ lon_sides = [np.array([1.0, 1.5, 2.0]),
+ np.array([2.0, 3.0]),
+ np.array([3.0, 3.5, 4.0]),
+ np.array([4.0, 1.0])]
+ lat_sides = [np.array([6.0, 6.5, 7.0]),
+ np.array([7.0, 8.0]),
+ np.array([8.0, 8.5, 9.0]),
+ np.array([9.0, 6.0])]
+ # Define AreaBoundary
+ boundary = AreaBoundary.from_lonlat_sides(lon_sides, lat_sides)
+
+ # Assert vertices
+ expected_vertices = np.array([[1., 6.],
+ [1.5, 6.5],
+ [2., 7.],
+ [3., 8.],
+ [3.5, 8.5],
+ [4., 9.]])
+ assert np.allclose(boundary.vertices, expected_vertices)
+
+ def test_contour(self):
+ """Test that AreaBoundary.contour returns the correct (lon,lat) tuple."""
+ list_sides = [(np.array([1., 1.5, 2.]), np.array([6., 6.5, 7.])),
+ (np.array([2., 3.]), np.array([7., 8.])),
+ (np.array([3., 3.5, 4.]), np.array([8., 8.5, 9.])),
+ (np.array([4., 1.]), np.array([9., 6.]))]
+ boundary = AreaBoundary(*list_sides)
+ lons, lats = boundary.contour()
+ assert np.allclose(lons, np.array([1., 1.5, 2., 3., 3.5, 4.]))
+ assert np.allclose(lats, np.array([6., 6.5, 7., 8., 8.5, 9.]))
=====================================
pyresample/test/test_geometry.py
=====================================
@@ -2,7 +2,7 @@
# -*- coding: utf-8 -*-
# pyresample, Resampling of remote sensing image data in python
#
-# Copyright (C) 2010-2020 Pyresample developers
+# Copyright (C) 2010-2022 Pyresample developers
#
# This program is free software: you can redistribute it and/or modify it under
# the terms of the GNU Lesser General Public License as published by the Free
@@ -17,6 +17,7 @@
# You should have received a copy of the GNU Lesser General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""Test the geometry objects."""
+import contextlib
import random
import sys
import unittest
@@ -35,7 +36,11 @@ from pyresample.geometry import (
combine_area_extents_vertical,
concatenate_area_defs,
)
-from pyresample.test.utils import catch_warnings
+from pyresample.test.utils import (
+ catch_warnings,
+ create_test_latitude,
+ create_test_longitude,
+)
class Test(unittest.TestCase):
@@ -471,6 +476,16 @@ class Test(unittest.TestCase):
swath_def = geometry.SwathDefinition(xrlons, xrlats)
self.assertIsInstance(hash(swath_def), int)
+ def test_non_contiguous_swath_hash(self):
+ """Test swath hash."""
+ lons = np.array([[1.2, 1.3, 1.4, 1.5],
+ [1.2, 1.3, 1.4, 1.5]])
+ lats = np.array([[65.9, 65.86, 65.82, 65.78],
+ [65.9, 65.86, 65.82, 65.78]])
+ swath_def = geometry.SwathDefinition(lons, lats)
+ swath_def_subset = swath_def[:, slice(0, 2)]
+ self.assertIsInstance(hash(swath_def_subset), int)
+
def test_area_equal(self):
"""Test areas equality."""
area_def = geometry.AreaDefinition('areaD', 'Europe (3km, HRV, VTC)', 'areaD',
@@ -1596,7 +1611,7 @@ def assert_np_dict_allclose(dict1, dict2):
try:
np.testing.assert_allclose(val, dict2[key])
except TypeError:
- assert(val == dict2[key])
+ assert val == dict2[key]
class TestSwathDefinition(unittest.TestCase):
@@ -2593,19 +2608,21 @@ class TestCrop(unittest.TestCase):
lon, lat = geometry.get_geostationary_bounding_box(geos_area, 20)
         # This must be equal to lon.
- elon = np.array([-79.23372832, -77.9694809, -74.55229623, -67.32816598,
- -41.45591465, 41.45591465, 67.32816598, 74.55229623,
- 77.9694809, 79.23372832, 79.23372832, 77.9694809,
- 74.55229623, 67.32816598, 41.45591465, -41.45591465,
- -67.32816598, -74.55229623, -77.9694809, -79.23372832])
- elat = np.array([6.94302533e-15, 1.97333299e+01, 3.92114217e+01, 5.82244715e+01,
- 7.52409201e+01, 7.52409201e+01, 5.82244715e+01, 3.92114217e+01,
- 1.97333299e+01, -0.00000000e+00, -6.94302533e-15, -1.97333299e+01,
- -3.92114217e+01, -5.82244715e+01, -7.52409201e+01, -7.52409201e+01,
- -5.82244715e+01, -3.92114217e+01, -1.97333299e+01, 0.0])
-
- np.testing.assert_allclose(lon, elon)
- np.testing.assert_allclose(lat, elat)
+ elon = np.array([-79.23372832, -78.19662326, -75.42516215, -70.22636028,
+ -56.89851775, 0., 56.89851775, 70.22636028,
+ 75.42516215, 78.19662326, 79.23372832, 78.19662326,
+ 75.42516215, 70.22636028, 56.89851775, 0.,
+ -56.89851775, -70.22636028, -75.42516215, -78.19662326,
+ -79.23372832])
+ elat = np.array([0., 17.76879577, 35.34328897, 52.5978607,
+ 69.00533142, 79.14811219, 69.00533142, 52.5978607,
+ 35.34328897, 17.76879577, -0., -17.76879577,
+ -35.34328897, -52.5978607, -69.00533142, -79.14811219,
+ -69.00533142, -52.5978607, -35.34328897, -17.76879577,
+ 0.])
+
+ np.testing.assert_allclose(lon, elon, atol=1e-07)
+ np.testing.assert_allclose(lat, elat, atol=1e-07)
geos_area = MagicMock()
lon_0 = 10
@@ -2815,7 +2832,7 @@ class TestAreaDefGetAreaSlices(unittest.TestCase):
assert isinstance(slice_x.start, int)
assert isinstance(slice_y.start, int)
self.assertEqual(slice(46, 3667, None), slice_x)
- self.assertEqual(slice(52, 3663, None), slice_y)
+ self.assertEqual(slice(56, 3659, None), slice_y)
area_to_cover = geometry.AreaDefinition('areaD', 'Europe (3km, HRV, VTC)', 'areaD',
{'a': 6378144.0,
@@ -2873,7 +2890,7 @@ class TestAreaDefGetAreaSlices(unittest.TestCase):
assert isinstance(slice_x.start, int)
assert isinstance(slice_y.start, int)
self.assertEqual(slice_x, slice(46, 3667, None))
- self.assertEqual(slice_y, slice(52, 3663, None))
+ self.assertEqual(slice_y, slice(56, 3659, None))
def test_get_area_slices_nongeos(self):
"""Check area slicing for non-geos projections."""
@@ -2946,37 +2963,22 @@ class TestBboxLonlats:
@pytest.mark.parametrize("force_clockwise", [False, True])
@pytest.mark.parametrize("use_dask", [False, True])
@pytest.mark.parametrize("use_xarray", [False, True])
+ @pytest.mark.parametrize("nan_pattern", [None, "scan", "half", "whole"])
def test_swath_def_bbox(self, lon_start, lon_stop,
lat_start, lat_stop, exp_nonforced_clockwise,
force_clockwise,
- use_dask, use_xarray):
- from pyresample.geometry import SwathDefinition
-
- from .utils import create_test_latitude, create_test_longitude
+ use_dask, use_xarray, nan_pattern):
swath_shape = (50, 10)
lons = create_test_longitude(lon_start, lon_stop, swath_shape)
lats = create_test_latitude(lat_start, lat_stop, swath_shape)
-
- if use_dask:
- lons = da.from_array(lons)
- lats = da.from_array(lats)
- if use_xarray:
- lons = xr.DataArray(lons, dims=('y', 'x'))
- lats = xr.DataArray(lats, dims=('y', 'x'))
-
+ lons, lats = _add_nans_if_necessary(lons, lats, nan_pattern)
+ lons, lats = _convert_type_if_necessary(lons, lats, use_dask, use_xarray)
swath_def = SwathDefinition(lons, lats)
- bbox_lons, bbox_lats = swath_def.get_bbox_lonlats(force_clockwise=force_clockwise)
- assert len(bbox_lons) == len(bbox_lats)
- assert len(bbox_lons) == 4
- for side_lons, side_lats in zip(bbox_lons, bbox_lats):
- assert isinstance(side_lons, np.ndarray)
- assert isinstance(side_lats, np.ndarray)
- assert side_lons.shape == side_lats.shape
- is_cw = _is_clockwise(np.concatenate(bbox_lons), np.concatenate(bbox_lats))
- if exp_nonforced_clockwise or force_clockwise:
- assert is_cw
- else:
- assert not is_cw
+ with _raises_if(nan_pattern == "whole", ValueError):
+ bbox_lons, bbox_lats = swath_def.get_bbox_lonlats(force_clockwise=force_clockwise)
+ if nan_pattern != "whole":
+ _check_bbox_structure_and_values(bbox_lons, bbox_lats)
+ _check_bbox_clockwise(bbox_lons, bbox_lats, exp_nonforced_clockwise, force_clockwise)
def test_swath_def_bbox_decimated(self):
from pyresample.geometry import SwathDefinition
@@ -3001,6 +3003,52 @@ class TestBboxLonlats:
assert bbox_lons[0][-1] == bbox_lons[1][0]
+def _add_nans_if_necessary(lons, lats, nan_pattern):
+ if nan_pattern == "scan":
+ lons[20:30, -1] = np.nan
+ elif nan_pattern == "half":
+ lons[:25, -1] = np.nan
+ elif nan_pattern == "whole":
+ lons[:, -1] = np.nan
+ return lons, lats
+
+
+def _convert_type_if_necessary(lons, lats, use_dask, use_xarray):
+ if use_dask:
+ lons = da.from_array(lons)
+ lats = da.from_array(lats)
+ if use_xarray:
+ lons = xr.DataArray(lons, dims=('y', 'x'))
+ lats = xr.DataArray(lats, dims=('y', 'x'))
+ return lons, lats
+
+
+def _check_bbox_structure_and_values(bbox_lons, bbox_lats):
+ assert not any(np.isnan(side_lon).any() for side_lon in bbox_lons)
+ assert not any(np.isnan(side_lat).any() for side_lat in bbox_lats)
+ assert len(bbox_lons) == len(bbox_lats)
+ assert len(bbox_lons) == 4
+ for side_lons, side_lats in zip(bbox_lons, bbox_lats):
+ assert isinstance(side_lons, np.ndarray)
+ assert isinstance(side_lats, np.ndarray)
+ assert side_lons.shape == side_lats.shape
+
+
+def _check_bbox_clockwise(bbox_lons, bbox_lats, exp_nonforced_clockwise, force_clockwise):
+ is_cw = _is_clockwise(np.concatenate(bbox_lons), np.concatenate(bbox_lats))
+ if exp_nonforced_clockwise or force_clockwise:
+ assert is_cw
+ else:
+ assert not is_cw
+
+
+@contextlib.contextmanager
+def _raises_if(condition, exp_exception, *args, **kwargs):
+ expectation = pytest.raises(exp_exception, *args, **kwargs) if condition else contextlib.nullcontext()
+ with expectation:
+ yield
+
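The `_raises_if` helper added above wraps `pytest.raises` in a conditional expectation. A pytest-free sketch of the same pattern (the `raises_if` name and the plain-`AssertionError` failure mode here are illustrative, not the upstream helper):

```python
import contextlib


@contextlib.contextmanager
def raises_if(condition, exp_exception):
    """Expect exp_exception inside the block only when condition is true.

    When condition is false this behaves like contextlib.nullcontext().
    """
    if not condition:
        yield
        return
    try:
        yield
    except exp_exception:
        return  # expected exception seen: suppress it
    raise AssertionError(f"expected {exp_exception.__name__} was not raised")


# The same test body runs with or without an expected exception:
with raises_if(True, ValueError):
    raise ValueError("whole boundary side is NaN")
with raises_if(False, ValueError):
    pass  # no exception expected here
```

This lets a parametrized test share one body across failing and non-failing inputs instead of branching on `pytest.raises` manually.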
+
def _is_clockwise(lons, lats):
# https://stackoverflow.com/a/1165943/433202
prev_point = (lons[0], lats[0])
@@ -3013,6 +3061,238 @@ def _is_clockwise(lons, lats):
return edge_sum > 0
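The body of `_is_clockwise` is mostly elided by this hunk; the edge-sum (shoelace) test it references, per the linked StackOverflow answer, can be sketched as follows. `is_clockwise` here is an illustrative reimplementation, not the upstream function:

```python
def is_clockwise(xs, ys):
    """Shoelace-style edge sum over a closed ring of vertices.

    Sum (x2 - x1) * (y2 + y1) over every edge, wrapping from the last
    vertex back to the first; a positive total means the ring is
    traversed clockwise (with y increasing upward).
    """
    edge_sum = 0.0
    prev_x, prev_y = xs[-1], ys[-1]
    for x, y in zip(xs, ys):
        edge_sum += (x - prev_x) * (y + prev_y)
        prev_x, prev_y = x, y
    return edge_sum > 0


# Unit square: clockwise vs counter-clockwise traversal
print(is_clockwise([0, 0, 1, 1], [0, 1, 1, 0]))  # True
print(is_clockwise([0, 1, 1, 0], [0, 0, 1, 1]))  # False
```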
+class TestBoundary(unittest.TestCase):
+ """Test 'boundary' method for AreaDefinition and SwathDefinition classes."""
+
+ def test_polar_south_pole_projection(self):
+ """Test boundary for polar projection around the south pole."""
+ # Define polar projection
+ proj_dict_polar_sh = {
+ 'proj_id': "polar_sh_projection",
+ "area_id": 'polar_sh_projection',
+ "description": 'Antarctic EASE grid',
+ # projection : 'EPSG:3409',
+ "projection": {'proj': 'laea', 'lat_0': -90, 'lon_0': 0, 'a': 6371228.0, 'units': 'm'},
+ "width": 2,
+ "height": 2,
+ "area_extent": (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625),
+ }
+ # Define AreaDefinition and retrieve AreaBoundary
+ areadef = geometry.AreaDefinition(**proj_dict_polar_sh)
+ boundary = areadef.boundary(force_clockwise=False)
+
+ # Check boundary shape
+ height, width = areadef.shape
+ n_vertices = (width - 1) * 2 + (height - 1) * 2
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[-45., -55.61313895],
+ [45., -55.61313895],
+ [135., -55.61313895],
+ [-135., -55.61313895]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+ def test_north_pole_projection(self):
+ """Test boundary for polar projection around the north pole."""
+ # Define polar projection
+ proj_dict_polar_nh = {
+ 'proj_id': "polar_nh_projection",
+ "area_id": 'polar_nh_projection',
+ "description": 'Arctic EASE grid',
+ "projection": {'proj': 'laea', 'lat_0': 90, 'lon_0': 0, 'a': 6371228.0, 'units': 'm'},
+ "width": 2,
+ "height": 2,
+ "area_extent": (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625),
+ }
+ # Define AreaDefinition and retrieve AreaBoundary
+ areadef = geometry.AreaDefinition(**proj_dict_polar_nh)
+ boundary = areadef.boundary(force_clockwise=False)
+
+ # Check boundary shape
+ height, width = areadef.shape
+ n_vertices = (width - 1) * 2 + (height - 1) * 2
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[-135., 55.61313895],
+ [135., 55.61313895],
+ [45., 55.61313895],
+ [-45., 55.61313895]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+ def test_geostationary_projection(self):
+ """Test boundary for geostationary projection."""
+ # Define geostationary projection
+ proj_dict_geostationary = {
+ 'proj_id': "dummy_geo_projection",
+ "area_id": 'dummy_geo_projection',
+ "description": 'geostationary projection',
+ "projection": {'a': 6378169.00, 'b': 6356583.80, 'h': 35785831.00, 'lon_0': 0, 'proj': 'geos'},
+ "area_extent": (-5500000., -5500000., 5500000., 5500000.),
+ "width": 100,
+ "height": 100,
+ }
+ # Define AreaDefinition and retrieve AreaBoundary
+ areadef = geometry.AreaDefinition(**proj_dict_geostationary)
+
+ # Check default boundary shape
+ default_n_vertices = 50
+ boundary = areadef.boundary(frequency=None)
+ assert boundary.vertices.shape == (default_n_vertices, 2)
+
+ # Check minimum boundary vertices
+ n_vertices = 3
+ minimum_n_vertices = 4
+ boundary = areadef.boundary(frequency=n_vertices)
+ assert boundary.vertices.shape == (minimum_n_vertices, 2)
+
+ # Check odd frequency number
+ # - Rounded up to the next even number (to construct the sides)
+ n_odd_vertices = 5
+ boundary = areadef.boundary(frequency=n_odd_vertices)
+ assert boundary.vertices.shape == (n_odd_vertices + 1, 2)
+
+ # Check boundary vertices
+ n_vertices = 10
+ boundary = areadef.boundary(frequency=n_vertices, force_clockwise=False)
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[-7.92337283e+01, 6.94302533e-15],
+ [-7.54251621e+01, 3.53432890e+01],
+ [-5.68985178e+01, 6.90053314e+01],
+ [5.68985178e+01, 6.90053314e+01],
+ [7.54251621e+01, 3.53432890e+01],
+ [7.92337283e+01, -6.94302533e-15],
+ [7.54251621e+01, -3.53432890e+01],
+ [5.68985178e+01, -6.90053314e+01],
+ [-5.68985178e+01, -6.90053314e+01],
+ [-7.54251621e+01, -3.53432890e+01]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+ def test_global_plate_carree_projection(self):
+ """Test boundary for global plate carrée projection."""
+ # Define global projection
+ proj_dict_global_wgs84 = {
+ 'proj_id': "epsg4326",
+ 'area_id': 'epsg4326',
+ 'description': 'Global equal latitude/longitude grid for global sphere',
+ "projection": 'EPSG:4326',
+ "width": 4,
+ "height": 4,
+ "area_extent": (-180.0, -90.0, 180.0, 90.0),
+ }
+ # Define AreaDefinition and retrieve AreaBoundary
+ areadef = geometry.AreaDefinition(**proj_dict_global_wgs84)
+ boundary = areadef.boundary(force_clockwise=False)
+
+ # Check boundary shape
+ height, width = areadef.shape
+ n_vertices = (width - 1) * 2 + (height - 1) * 2
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[-135., 67.5],
+ [-45., 67.5],
+ [45., 67.5],
+ [135., 67.5],
+ [135., 22.5],
+ [135., -22.5],
+ [135., -67.5],
+ [45., -67.5],
+ [-45., -67.5],
+ [-135., -67.5],
+ [-135., -22.5],
+ [-135., 22.5]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+ def test_minimal_global_plate_carree_projection(self):
+ """Test boundary for a minimal global plate carrée projection."""
+ # Define minimal global projection
+ proj_dict_global_wgs84 = {
+ 'proj_id': "epsg4326",
+ 'area_id': 'epsg4326',
+ 'description': 'Global equal latitude/longitude grid for global sphere',
+ "projection": 'EPSG:4326',
+ "width": 2,
+ "height": 2,
+ "area_extent": (-180.0, -90.0, 180.0, 90.0),
+ }
+
+ # Define AreaDefinition and retrieve AreaBoundary
+ areadef = geometry.AreaDefinition(**proj_dict_global_wgs84)
+ boundary = areadef.boundary(force_clockwise=False)
+
+ # Check boundary shape
+ height, width = areadef.shape
+ n_vertices = (width - 1) * 2 + (height - 1) * 2
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[-90., 45.],
+ [90., 45.],
+ [90., -45.],
+ [-90., -45.]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+ def test_local_area_projection(self):
+ """Test local area projection in meters."""
+ # Define ch1903 projection (eastings, northings)
+ proj_dict_ch1903 = {
+ 'proj_id': "swiss_area",
+ 'area_id': 'swiss_area',
+ 'description': 'Swiss CH1903+ / LV95',
+ "projection": 'EPSG:2056',
+ "width": 2,
+ "height": 2,
+ "area_extent": (2_600_000.0, 1_050_000, 2_800_000.0, 1_170_000),
+ }
+
+ # Define AreaDefinition and retrieve AreaBoundary
+ areadef = geometry.AreaDefinition(**proj_dict_ch1903)
+ boundary = areadef.boundary(force_clockwise=False)
+
+ # Check boundary shape
+ height, width = areadef.shape
+ n_vertices = (width - 1) * 2 + (height - 1) * 2
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[8.08993639, 46.41074744],
+ [9.39028624, 46.39582417],
+ [9.37106733, 45.85619242],
+ [8.08352612, 45.87097006]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+ def test_swath_definition(self):
+ """Test boundary for swath definition."""
+ lons = np.array([[1.2, 1.3, 1.4, 1.5],
+ [1.2, 1.3, 1.4, 1.5]])
+ lats = np.array([[65.9, 65.86, 65.82, 65.78],
+ [65.89, 65.86, 65.82, 65.78]])
+
+ # Define SwathDefinition and retrieve AreaBoundary
+ swath_def = SwathDefinition(lons, lats)
+ boundary = swath_def.boundary(force_clockwise=False)
+
+ # Check boundary shape
+ height, width = swath_def.shape
+ n_vertices = (width - 1) * 2 + (height - 1) * 2
+ assert boundary.vertices.shape == (n_vertices, 2)
+
+ # Check boundary vertices are in the correct order
+ expected_vertices = np.array([[1.2, 65.9],
+ [1.3, 65.86],
+ [1.4, 65.82],
+ [1.5, 65.78],
+ [1.5, 65.78],
+ [1.4, 65.82],
+ [1.3, 65.86],
+ [1.2, 65.89]])
+ assert np.allclose(expected_vertices, boundary.vertices)
+
+
def _gen_swath_def_xarray_dask():
lons, lats = _gen_swath_lons_lats()
lons_dask = da.from_array(lons)
@@ -3042,7 +3322,6 @@ def _gen_swath_def_numpy():
def _gen_swath_lons_lats():
- from .utils import create_test_latitude, create_test_longitude
swath_shape = (50, 10)
lon_start, lon_stop, lat_start, lat_stop = (3.0, 12.0, 75.0, 26.0)
lons = create_test_longitude(lon_start, lon_stop, swath_shape)
=====================================
pyresample/test/test_sgeom/__init__.py
=====================================
@@ -0,0 +1,18 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+#
+# Copyright (c) 2021 Pyresample developers
+#
+# This program is free software: you can redistribute it and/or modify it under
+# the terms of the GNU Lesser General Public License as published by the Free
+# Software Foundation, either version 3 of the License, or (at your option) any
+# later version.
+#
+# This program is distributed in the hope that it will be useful, but WITHOUT
+# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
+# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
+# details.
+#
+# You should have received a copy of the GNU Lesser General Public License along
+# with this program. If not, see <http://www.gnu.org/licenses/>.
+"""Test for spherical geometries."""
=====================================
pyresample/test/test_sgeom/test_point.py
=====================================
@@ -0,0 +1,226 @@
+#!/usr/bin/env python3
+# -*- coding: utf-8 -*-
+# Copyright (c) 2013-2022 Pyresample Developers
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+"""Test cases for SPoint and SMultiPoint."""
+import unittest
+
+import numpy as np
+import pytest
+
+from pyresample.future.spherical import SMultiPoint, SPoint
+
+
+class TestSPoint(unittest.TestCase):
+ """Test SPoint."""
+
+ def test_latitude_validity(self):
+ """Check SPoint raises an error when given an invalid latitude."""
+ # Test latitude outside range
+ lon = 0
+ lat = np.pi
+ with pytest.raises(ValueError):
+ SPoint(lon, lat)
+ # Test inf
+ lon = 0
+ lat = np.inf
+ with pytest.raises(ValueError):
+ SPoint(lon, lat)
+
+ def test_longitude_validity(self):
+ """Check SPoint raises an error when given an invalid longitude."""
+ # Test inf
+ lon = np.inf
+ lat = 0
+ with pytest.raises(ValueError):
+ SPoint(lon, lat)
+
+ def test_creation_from_degrees(self):
+ """Check SPoint creation from lat/lon in degrees."""
+ lon = 0
+ lat = 20
+ p1 = SPoint.from_degrees(lon, lat)
+ p2 = SPoint(np.deg2rad(lon), np.deg2rad(lat))
+ assert p1 == p2
+
+ def test_vertices(self):
+ """Test vertices property."""
+ lons = 0
+ lats = np.pi / 2
+ p = SPoint(lons, lats)
+ res = np.array([[0., 1.57079633]])
+ assert np.allclose(p.vertices, res)
+
+ def test_vertices_in_degrees(self):
+ """Test vertices_in_degrees property."""
+ lons = 0
+ lats = np.pi / 2
+ p = SPoint(lons, lats)
+ res = np.array([[0., 90.]])
+ assert np.allclose(p.vertices_in_degrees, res)
+
+ def test_raise_error_if_multi_point(self):
+ """Check SPoint raises an error when given multiple points."""
+ lons = np.array([0, np.pi])
+ lats = np.array([-np.pi / 2, np.pi / 2])
+ with pytest.raises(ValueError):
+ SPoint(lons, lats)
+
+ def test_str(self):
+ """Check the string representation."""
+ d = SPoint(1.0, 0.5)
+ self.assertEqual(str(d), "(1.0, 0.5)")
+
+ def test_repr(self):
+ """Check the representation."""
+ d = SPoint(1.0, 0.5)
+ self.assertEqual(repr(d), "(1.0, 0.5)")
+
+ def test_to_shapely(self):
+ """Test conversion to shapely."""
+ from shapely.geometry import Point
+ lon = 0.0
+ lat = np.pi / 2
+ spherical_point = SPoint(lon, lat)
+ shapely_point = Point(0.0, 90.0)
+ assert shapely_point.equals_exact(spherical_point.to_shapely(), tolerance=1e-10)
+
+
+class TestSMultiPoint(unittest.TestCase):
+ """Test SMultiPoint."""
+
+ def test_single_point(self):
+ """Test behaviour when providing single lon,lat values."""
+ # Single values must raise error
+ with pytest.raises(ValueError):
+ SMultiPoint(2, 1)
+ # Array values must not raise error
+ p = SMultiPoint([2], [1])
+ assert p.lon.shape == (1,)
+ assert p.lat.shape == (1,)
+ assert p.vertices.shape == (1, 2)
+
+ def test_creation_from_degrees(self):
+ """Check SMultiPoint creation from lat/lon in degrees."""
+ lon = np.array([0, 10])
+ lat = np.array([20, 30])
+ p1 = SMultiPoint.from_degrees(lon, lat)
+ p2 = SMultiPoint(np.deg2rad(lon), np.deg2rad(lat))
+ assert p1 == p2
+
+ def test_vertices(self):
+ """Test vertices property."""
+ lons = np.array([0, np.pi])
+ lats = np.array([-np.pi / 2, np.pi / 2])
+ p = SMultiPoint(lons, lats)
+ res = np.array([[0., -1.57079633],
+ [-3.14159265, 1.57079633]])
+ assert np.allclose(p.vertices, res)
+
+ def test_vertices_in_degrees(self):
+ """Test vertices_in_degrees property."""
+ lons = np.array([0, np.pi])
+ lats = np.array([-np.pi / 2, np.pi / 2])
+ p = SMultiPoint(lons, lats)
+ res = np.array([[0., -90.],
+ [-180., 90.]])
+ assert np.allclose(p.vertices_in_degrees, res)
+
+ def test_distance(self):
+ """Test Vincenty formula."""
+ lons = np.array([0, np.pi])
+ lats = np.array([-np.pi / 2, np.pi / 2])
+ p1 = SMultiPoint(lons, lats)
+ lons = np.array([0, np.pi / 2, np.pi])
+ lats = np.array([-np.pi / 2, 0, np.pi / 2])
+ p2 = SMultiPoint(lons, lats)
+ d12 = p1.distance(p2)
+ d21 = p2.distance(p1)
+ self.assertEqual(d12.shape, (2, 3))
+ self.assertEqual(d21.shape, (3, 2))
+ res = np.array([[0., 1.57079633, 3.14159265],
+ [3.14159265, 1.57079633, 0.]])
+ assert np.allclose(d12, res)
+ # Special case with 1 point
+ p1 = SMultiPoint(lons[[0]], lats[[0]])
+ p2 = SMultiPoint(lons[[0]], lats[[0]])
+ d12 = p1.distance(p2)
+ assert isinstance(d12, float)
+
+ def test_hdistance(self):
+ """Test Haversine formula."""
+ lons = np.array([0, np.pi])
+ lats = np.array([-np.pi / 2, np.pi / 2])
+ p1 = SMultiPoint(lons, lats)
+ lons = np.array([0, np.pi / 2, np.pi])
+ lats = np.array([-np.pi / 2, 0, np.pi / 2])
+ p2 = SMultiPoint(lons, lats)
+ d12 = p1.hdistance(p2)
+ d21 = p2.hdistance(p1)
+ self.assertEqual(d12.shape, (2, 3))
+ self.assertEqual(d21.shape, (3, 2))
+ res = np.array([[0., 1.57079633, 3.14159265],
+ [3.14159265, 1.57079633, 0.]])
+ assert np.allclose(d12, res)
+
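The `test_distance` and `test_hdistance` cases above exercise the Vincenty and Haversine angular-distance formulas on the unit sphere. A self-contained sketch of the Vincenty form (the `vincenty_angular_distance` name is illustrative; upstream `SMultiPoint.distance` presumably broadcasts this over point arrays):

```python
import numpy as np


def vincenty_angular_distance(lon1, lat1, lon2, lat2):
    # Vincenty formula for the angular distance (radians) between two
    # points on a unit sphere; numerically stable at small and
    # antipodal separations.
    dlon = lon2 - lon1
    num = np.hypot(np.cos(lat2) * np.sin(dlon),
                   np.cos(lat1) * np.sin(lat2)
                   - np.sin(lat1) * np.cos(lat2) * np.cos(dlon))
    den = (np.sin(lat1) * np.sin(lat2)
           + np.cos(lat1) * np.cos(lat2) * np.cos(dlon))
    return np.arctan2(num, den)


# South pole to north pole: half a great circle, i.e. ~pi radians,
# matching the 3.14159265 entries in the expected matrices above
print(vincenty_angular_distance(0.0, -np.pi / 2, np.pi, np.pi / 2))
```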
+ def test_eq(self):
+ """Check the equality."""
+ lons = [0, np.pi]
+ lats = [-np.pi / 2, np.pi / 2]
+ p = SMultiPoint(lons, lats)
+ p1 = SMultiPoint(lons, lats)
+ assert p == p1
+
+ def test_eq_antimeridian(self):
+ """Check the equality with longitudes at -180/180 degrees."""
+ lons = [np.pi, np.pi]
+ lons1 = [-np.pi, -np.pi]
+ lats = [-np.pi / 2, np.pi / 2]
+ p = SMultiPoint(lons, lats)
+ p1 = SMultiPoint(lons1, lats)
+ assert p == p1
+
+ def test_neq(self):
+ """Check the inequality."""
+ lons = np.array([0, np.pi])
+ lats = [-np.pi / 2, np.pi / 2]
+ p = SMultiPoint(lons, lats)
+ p1 = SMultiPoint(lons + 0.1, lats)
+ assert p != p1
+
+ def test_str(self):
+ """Check the string representation."""
+ lons = [0, np.pi]
+ lats = [-np.pi / 2, np.pi / 2]
+ p = SMultiPoint(lons, lats)
+ expected_str = '[[ 0. -1.57079633]\n [-3.14159265 1.57079633]]'
+ self.assertEqual(str(p), expected_str)
+
+ def test_repr(self):
+ """Check the representation."""
+ lons = [0, np.pi]
+ lats = [-np.pi / 2, np.pi / 2]
+ p = SMultiPoint(lons, lats)
+ expected_repr = '[[ 0. -1.57079633]\n [-3.14159265 1.57079633]]'
+ self.assertEqual(repr(p), expected_repr)
+
+ def test_to_shapely(self):
+ """Test conversion to shapely."""
+ from shapely.geometry import MultiPoint
+ lons = np.array([0.0, np.pi])
+ lats = np.array([-np.pi / 2, np.pi / 2])
+ spherical_multipoint = SMultiPoint(lons, lats)
+ shapely_multipoint = MultiPoint([(0.0, -90.0), (-180.0, 90.0)])
+ assert shapely_multipoint.equals_exact(spherical_multipoint.to_shapely(), tolerance=1e-10)
=====================================
pyresample/test/test_spherical.py
=====================================
@@ -130,6 +130,31 @@ class TestSCoordinate(unittest.TestCase):
d = SCoordinate(0, 0).hdistance(SCoordinate(1, 1))
np.testing.assert_allclose(d, 1.2745557823062943)
+ def test_crosscart(self):
+ """Test cross product and conversion to cartesian."""
+ # Test crossproduct between poles
+ p = SCoordinate(np.deg2rad(50), np.deg2rad(90.0))
+ p1 = SCoordinate(np.deg2rad(50), np.deg2rad(-90.0))
+ cp = p.cross2cart(p1)
+ assert np.allclose(cp.cart, [0, 0, 0])
+
+ # Test crossproduct between points along the equator
+ p = SCoordinate(np.deg2rad(-180.0), np.deg2rad(0.0))
+ p1 = SCoordinate(np.deg2rad(0.0), np.deg2rad(0.0))
+ cp = p.cross2cart(p1)
+ assert np.allclose(cp.cart, [0, 0, 0])
+
+ # Test crossproduct between same points
+ p = SCoordinate(np.deg2rad(50), np.deg2rad(20.0))
+ p1 = SCoordinate(np.deg2rad(50), np.deg2rad(20.0))
+ cp = p.cross2cart(p1)
+ assert np.allclose(cp.cart, [0, 0, 0])
+
+ p = SCoordinate(np.deg2rad(-180), np.deg2rad(90.0))
+ p1 = SCoordinate(np.deg2rad(180), np.deg2rad(90.0))
+ cp = p.cross2cart(p1)
+ assert np.allclose(cp.cart, [0, 0, 0])
+
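The `test_crosscart` cases above all expect a zero vector: the cross product of two parallel (identical or antipodal) position vectors vanishes. A sketch of the underlying computation (`lonlat_to_cart` and `cross2cart` here are illustrative helpers, not the upstream `SCoordinate` API):

```python
import numpy as np


def lonlat_to_cart(lon, lat):
    # Unit-sphere cartesian coordinates from lon/lat in radians.
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])


def cross2cart(lon1, lat1, lon2, lat2):
    # Cross product of the two position vectors; (near) zero when the
    # points are identical or antipodal, since the vectors are parallel.
    return np.cross(lonlat_to_cart(lon1, lat1), lonlat_to_cart(lon2, lat2))


# North pole x south pole: antipodal, so the result is close to [0, 0, 0]
print(cross2cart(np.deg2rad(50), np.deg2rad(90), np.deg2rad(50), np.deg2rad(-90)))
```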
def test_str(self):
"""Check the string representation."""
d = SCoordinate(0, 0)
@@ -146,12 +171,6 @@ class TestSCoordinate(unittest.TestCase):
point_end = SCoordinate(np.pi, 0)
assert point_start == point_end
- def test_equality_of_infinites(self):
- """Test that infinite coordinates are equal."""
- coord1 = SCoordinate(np.inf, np.inf)
- coord2 = SCoordinate(np.inf, np.inf)
- assert coord1 == coord2
-
class TestCCoordinate(unittest.TestCase):
"""Test CCoordinate."""
=====================================
pyresample/test/test_utils.py
=====================================
@@ -17,9 +17,7 @@
# with this program. If not, see <http://www.gnu.org/licenses/>.
"""Test various utility functions."""
-import io
import os
-import pathlib
import unittest
import uuid
from tempfile import NamedTemporaryFile
@@ -41,209 +39,6 @@ def tmptiff(width=100, height=100, transform=None, crs=None, dtype=np.uint8):
return fname
-class TestLegacyAreaParser(unittest.TestCase):
- """Test legacy .cfg parsing."""
-
- def test_area_parser_legacy(self):
- """Test legacy area parser."""
- from pyresample import parse_area_file
- ease_nh, ease_sh = parse_area_file(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'),
- 'ease_nh', 'ease_sh')
-
- # pyproj 2.0+ adds some extra parameters
- projection = ("{'R': '6371228', 'lat_0': '90', 'lon_0': '0', "
- "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
- "'units': 'm', 'x_0': '0', 'y_0': '0'}")
- nh_str = """Area ID: ease_nh
-Description: Arctic EASE grid
-Projection ID: ease_nh
-Projection: {}
-Number of columns: 425
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(ease_nh.__str__(), nh_str)
- self.assertIsInstance(ease_nh.proj_dict['lat_0'], (int, float))
-
- projection = ("{'R': '6371228', 'lat_0': '-90', 'lon_0': '0', "
- "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
- "'units': 'm', 'x_0': '0', 'y_0': '0'}")
- sh_str = """Area ID: ease_sh
-Description: Antarctic EASE grid
-Projection ID: ease_sh
-Projection: {}
-Number of columns: 425
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(ease_sh.__str__(), sh_str)
- self.assertIsInstance(ease_sh.proj_dict['lat_0'], (int, float))
-
- def test_load_area(self):
- from pyresample import load_area
- ease_nh = load_area(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'), 'ease_nh')
- # pyproj 2.0+ adds some extra parameters
- projection = ("{'R': '6371228', 'lat_0': '90', 'lon_0': '0', "
- "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
- "'units': 'm', 'x_0': '0', 'y_0': '0'}")
- nh_str = """Area ID: ease_nh
-Description: Arctic EASE grid
-Projection ID: ease_nh
-Projection: {}
-Number of columns: 425
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(nh_str, ease_nh.__str__())
-
- def test_area_file_not_found_exception(self):
- from pyresample.area_config import load_area
- self.assertRaises(FileNotFoundError, load_area,
- "/this/file/does/not/exist.yaml")
- self.assertRaises(FileNotFoundError, load_area,
- pathlib.Path("/this/file/does/not/exist.yaml"))
-
- def test_not_found_exception(self):
- from pyresample.area_config import AreaNotFound, parse_area_file
- self.assertRaises(AreaNotFound, parse_area_file,
- os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'), 'no_area')
-
- def test_commented(self):
- from pyresample import parse_area_file
- areas = parse_area_file(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'))
- self.assertNotIn('commented', [area.name for area in areas])
-
-
-class TestYAMLAreaParser(unittest.TestCase):
- """Test YAML parsing."""
-
- def test_area_parser_yaml(self):
- """Test YAML area parser."""
- from pyresample import parse_area_file
- test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
- test_areas = parse_area_file(test_area_file, 'ease_nh', 'ease_sh', 'test_meters', 'test_degrees',
- 'test_latlong')
- ease_nh, ease_sh, test_m, test_deg, test_latlong = test_areas
-
- # pyproj 2.0+ adds some extra parameters
- projection = ("{'R': '6371228', 'lat_0': '-90', 'lon_0': '0', "
- "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', "
- "'units': 'm', 'x_0': '0', 'y_0': '0'}")
- nh_str = """Area ID: ease_nh
-Description: Arctic EASE grid
-Projection: {}
-Number of columns: 425
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(ease_nh.__str__(), nh_str)
-
- sh_str = """Area ID: ease_sh
-Description: Antarctic EASE grid
-Projection: {}
-Number of columns: 425
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(ease_sh.__str__(), sh_str)
-
- m_str = """Area ID: test_meters
-Description: test_meters
-Projection: {}
-Number of columns: 850
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(test_m.__str__(), m_str)
-
- deg_str = """Area ID: test_degrees
-Description: test_degrees
-Projection: {}
-Number of columns: 850
-Number of rows: 425
-Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection)
- self.assertEqual(test_deg.__str__(), deg_str)
-
- # pyproj 2.0+ adds some extra parameters
- projection = ("{'ellps': 'WGS84', 'no_defs': 'None', "
- "'pm': '-81.36', 'proj': 'longlat', "
- "'type': 'crs'}")
- latlong_str = """Area ID: test_latlong
-Description: Basic latlong grid
-Projection: {}
-Number of columns: 3473
-Number of rows: 4058
-Area extent: (-0.0812, 0.4039, 0.0812, 0.5428)""".format(projection)
- self.assertEqual(test_latlong.__str__(), latlong_str)
-
- def test_dynamic_area_parser_yaml(self):
- """Test YAML area parser on dynamic areas."""
- from pyresample import parse_area_file
- from pyresample.geometry import DynamicAreaDefinition
- test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
- test_area = parse_area_file(test_area_file, 'test_dynamic_resolution')[0]
-
- self.assertIsInstance(test_area, DynamicAreaDefinition)
- self.assertTrue(hasattr(test_area, 'resolution'))
- self.assertEqual(test_area.resolution, (1000.0, 1000.0))
-
- # lat/lon
- from pyresample import parse_area_file
- from pyresample.geometry import DynamicAreaDefinition
- test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
- test_area = parse_area_file(test_area_file, 'test_dynamic_resolution_ll')[0]
-
- self.assertIsInstance(test_area, DynamicAreaDefinition)
- self.assertTrue(hasattr(test_area, 'resolution'))
- np.testing.assert_allclose(test_area.resolution, (1.0, 1.0))
-
- def test_dynamic_area_parser_passes_resolution(self):
- """Test that the resolution from the file is passed to a dynamic area."""
- from pyresample import parse_area_file
- test_area_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.yaml')
- test_area = parse_area_file(test_area_file, 'omerc_bb_1000')[0]
- assert test_area.resolution == (1000, 1000)
-
- def test_multiple_file_content(self):
- from pyresample import parse_area_file
- from pyresample.area_config import load_area_from_string
- area_list = ["""ease_sh:
- description: Antarctic EASE grid
- projection:
- a: 6371228.0
- units: m
- lon_0: 0
- proj: laea
- lat_0: -90
- shape:
- height: 425
- width: 425
- area_extent:
- lower_left_xy: [-5326849.0625, -5326849.0625]
- upper_right_xy: [5326849.0625, 5326849.0625]
- units: m
-""",
- """ease_sh2:
- description: Antarctic EASE grid
- projection:
- a: 6371228.0
- units: m
- lon_0: 0
- proj: laea
- lat_0: -90
- shape:
- height: 425
- width: 425
- area_extent:
- lower_left_xy: [-5326849.0625, -5326849.0625]
- upper_right_xy: [5326849.0625, 5326849.0625]
- units: m
-"""]
- with self.assertWarns(DeprecationWarning):
- results = parse_area_file(area_list)
- self.assertEqual(len(results), 2)
- self.assertIn(results[0].area_id, ('ease_sh', 'ease_sh2'))
- self.assertIn(results[1].area_id, ('ease_sh', 'ease_sh2'))
- results2 = parse_area_file([io.StringIO(ar) for ar in area_list])
- results3 = load_area_from_string(area_list)
- self.assertEqual(results, results2)
- self.assertEqual(results, results3)
-
-
class TestPreprocessing(unittest.TestCase):
"""Tests for index generating functions."""
=====================================
pyresample/version.py
=====================================
@@ -6,7 +6,7 @@
# that just contains the computed version number.
# This file is released into the public domain. Generated by
-# versioneer-0.22 (https://github.com/python-versioneer/python-versioneer)
+# versioneer-0.23 (https://github.com/python-versioneer/python-versioneer)
"""Git implementation of _version.py."""
@@ -25,9 +25,9 @@ def get_keywords():
# setup.py/versioneer.py will grep for the variable names, so they must
# each be defined on a line of their own. _version.py will just call
# get_keywords().
- git_refnames = " (tag: v1.25.1)"
- git_full = "164b714a9d00a2b5e103a2214400248047dc8424"
- git_date = "2022-08-02 07:46:40 -0500"
+ git_refnames = " (HEAD -> main, tag: v1.26.0)"
+ git_full = "bc85902e8eac474a17e88206158a932835cf0a7a"
+ git_date = "2022-11-24 09:51:45 -0600"
keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
return keywords
@@ -254,13 +254,12 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
print("Directory %s not under git control" % root)
raise NotThisMethod("'git rev-parse --git-dir' returned error")
- MATCH_ARGS = ["--match", "%s*" % tag_prefix] if tag_prefix else []
-
# if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
# if there isn't one, this yields HEX[-dirty] (no NUM)
- describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
- "--always", "--long", *MATCH_ARGS],
- cwd=root)
+ describe_out, rc = runner(GITS, [
+ "describe", "--tags", "--dirty", "--always", "--long",
+ "--match", f"{tag_prefix}[[:digit:]]*"
+ ], cwd=root)
# --long was added in git-1.5.5
if describe_out is None:
raise NotThisMethod("'git describe' failed")
@@ -349,8 +348,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
else:
# HEX: no tags
pieces["closest-tag"] = None
- count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
- pieces["distance"] = int(count_out) # total number of commits
+ out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root)
+ pieces["distance"] = len(out.split()) # total number of commits
# commit date: see ISO-8601 comment in git_versions_from_keywords()
date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip()
@@ -446,7 +445,7 @@ def render_pep440_pre(pieces):
tag_version, post_version = pep440_split_post(pieces["closest-tag"])
rendered = tag_version
if post_version is not None:
- rendered += ".post%d.dev%d" % (post_version+1, pieces["distance"])
+ rendered += ".post%d.dev%d" % (post_version + 1, pieces["distance"])
else:
rendered += ".post0.dev%d" % (pieces["distance"])
else:
=====================================
versioneer.py
=====================================
@@ -1,5 +1,5 @@
-# Version: 0.22
+# Version: 0.23
"""The Versioneer - like a rocketeer, but for versions.
@@ -9,8 +9,8 @@ The Versioneer
* like a rocketeer, but for versions!
* https://github.com/python-versioneer/python-versioneer
* Brian Warner
-* License: Public Domain
-* Compatible with: Python 3.6, 3.7, 3.8, 3.9, 3.10 and pypy3
+* License: Public Domain (CC0-1.0)
+* Compatible with: Python 3.7, 3.8, 3.9, 3.10 and pypy3
* [![Latest Version][pypi-image]][pypi-url]
* [![Build Status][travis-image]][travis-url]
@@ -355,7 +355,7 @@ def get_config_from_root(root):
cfg.versionfile_source = section.get("versionfile_source")
cfg.versionfile_build = section.get("versionfile_build")
cfg.tag_prefix = section.get("tag_prefix")
- if cfg.tag_prefix in ("''", '""'):
+ if cfg.tag_prefix in ("''", '""', None):
cfg.tag_prefix = ""
cfg.parentdir_prefix = section.get("parentdir_prefix")
cfg.verbose = section.get("verbose")
@@ -431,7 +431,7 @@ LONG_VERSION_PY['git'] = r'''
# that just contains the computed version number.
# This file is released into the public domain. Generated by
-# versioneer-0.22 (https://github.com/python-versioneer/python-versioneer)
+# versioneer-0.23 (https://github.com/python-versioneer/python-versioneer)
"""Git implementation of _version.py."""
@@ -679,13 +679,12 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
print("Directory %%s not under git control" %% root)
raise NotThisMethod("'git rev-parse --git-dir' returned error")
- MATCH_ARGS = ["--match", "%%s*" %% tag_prefix] if tag_prefix else []
-
# if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
# if there isn't one, this yields HEX[-dirty] (no NUM)
- describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
- "--always", "--long", *MATCH_ARGS],
- cwd=root)
+ describe_out, rc = runner(GITS, [
+ "describe", "--tags", "--dirty", "--always", "--long",
+ "--match", f"{tag_prefix}[[:digit:]]*"
+ ], cwd=root)
# --long was added in git-1.5.5
if describe_out is None:
raise NotThisMethod("'git describe' failed")
@@ -774,8 +773,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
else:
# HEX: no tags
pieces["closest-tag"] = None
- count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
- pieces["distance"] = int(count_out) # total number of commits
+ out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root)
+ pieces["distance"] = len(out.split()) # total number of commits
# commit date: see ISO-8601 comment in git_versions_from_keywords()
date = runner(GITS, ["show", "-s", "--format=%%ci", "HEAD"], cwd=root)[0].strip()
@@ -871,7 +870,7 @@ def render_pep440_pre(pieces):
tag_version, post_version = pep440_split_post(pieces["closest-tag"])
rendered = tag_version
if post_version is not None:
- rendered += ".post%%d.dev%%d" %% (post_version+1, pieces["distance"])
+ rendered += ".post%%d.dev%%d" %% (post_version + 1, pieces["distance"])
else:
rendered += ".post0.dev%%d" %% (pieces["distance"])
else:
@@ -1202,13 +1201,12 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
print("Directory %s not under git control" % root)
raise NotThisMethod("'git rev-parse --git-dir' returned error")
- MATCH_ARGS = ["--match", "%s*" % tag_prefix] if tag_prefix else []
-
# if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
# if there isn't one, this yields HEX[-dirty] (no NUM)
- describe_out, rc = runner(GITS, ["describe", "--tags", "--dirty",
- "--always", "--long", *MATCH_ARGS],
- cwd=root)
+ describe_out, rc = runner(GITS, [
+ "describe", "--tags", "--dirty", "--always", "--long",
+ "--match", f"{tag_prefix}[[:digit:]]*"
+ ], cwd=root)
# --long was added in git-1.5.5
if describe_out is None:
raise NotThisMethod("'git describe' failed")
@@ -1297,8 +1295,8 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
else:
# HEX: no tags
pieces["closest-tag"] = None
- count_out, rc = runner(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
- pieces["distance"] = int(count_out) # total number of commits
+ out, rc = runner(GITS, ["rev-list", "HEAD", "--left-right"], cwd=root)
+ pieces["distance"] = len(out.split()) # total number of commits
# commit date: see ISO-8601 comment in git_versions_from_keywords()
date = runner(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[0].strip()
@@ -1310,7 +1308,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, runner=run_command):
return pieces
-def do_vcs_install(manifest_in, versionfile_source, ipy):
+def do_vcs_install(versionfile_source, ipy):
"""Git-specific installation logic for Versioneer.
For Git, this means creating/changing .gitattributes to mark _version.py
@@ -1319,7 +1317,7 @@ def do_vcs_install(manifest_in, versionfile_source, ipy):
GITS = ["git"]
if sys.platform == "win32":
GITS = ["git.cmd", "git.exe"]
- files = [manifest_in, versionfile_source]
+ files = [versionfile_source]
if ipy:
files.append(ipy)
try:
@@ -1372,7 +1370,7 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
SHORT_VERSION_PY = """
-# This file was generated by 'versioneer.py' (0.22) from
+# This file was generated by 'versioneer.py' (0.23) from
# revision-control system data, or from the parent directory name of an
# unpacked source archive. Distribution tarballs contain a pre-generated copy
# of this file.
@@ -1501,7 +1499,7 @@ def render_pep440_pre(pieces):
tag_version, post_version = pep440_split_post(pieces["closest-tag"])
rendered = tag_version
if post_version is not None:
- rendered += ".post%d.dev%d" % (post_version+1, pieces["distance"])
+ rendered += ".post%d.dev%d" % (post_version + 1, pieces["distance"])
else:
rendered += ".post0.dev%d" % (pieces["distance"])
else:
@@ -1753,7 +1751,7 @@ def get_version():
def get_cmdclass(cmdclass=None):
- """Get the custom setuptools/distutils subclasses used by Versioneer.
+ """Get the custom setuptools subclasses used by Versioneer.
If the package uses a different cmdclass (e.g. one from numpy), it
should be provide as an argument.
@@ -1775,11 +1773,8 @@ def get_cmdclass(cmdclass=None):
cmds = {} if cmdclass is None else cmdclass.copy()
- # we add "version" to both distutils and setuptools
- try:
- from setuptools import Command
- except ImportError:
- from distutils.core import Command
+ # we add "version" to setuptools
+ from setuptools import Command
class cmd_version(Command):
description = "report generated version string"
@@ -1802,7 +1797,7 @@ def get_cmdclass(cmdclass=None):
print(" error: %s" % vers["error"])
cmds["version"] = cmd_version
- # we override "build_py" in both distutils and setuptools
+ # we override "build_py" in setuptools
#
# most invocation pathways end up running build_py:
# distutils/build -> build_py
@@ -1817,13 +1812,14 @@ def get_cmdclass(cmdclass=None):
# then does setup.py bdist_wheel, or sometimes setup.py install
# setup.py egg_info -> ?
+ # pip install -e . and setuptool/editable_wheel will invoke build_py
+ # but the build_py command is not expected to copy any files.
+
# we override different "build_py" commands for both environments
if 'build_py' in cmds:
_build_py = cmds['build_py']
- elif "setuptools" in sys.modules:
- from setuptools.command.build_py import build_py as _build_py
else:
- from distutils.command.build_py import build_py as _build_py
+ from setuptools.command.build_py import build_py as _build_py
class cmd_build_py(_build_py):
def run(self):
@@ -1831,6 +1827,10 @@ def get_cmdclass(cmdclass=None):
cfg = get_config_from_root(root)
versions = get_versions()
_build_py.run(self)
+ if getattr(self, "editable_mode", False):
+ # During editable installs `.py` and data files are
+ # not copied to build_lib
+ return
# now locate _version.py in the new build/ directory and replace
# it with an updated value
if cfg.versionfile_build:
@@ -1842,10 +1842,8 @@ def get_cmdclass(cmdclass=None):
if 'build_ext' in cmds:
_build_ext = cmds['build_ext']
- elif "setuptools" in sys.modules:
- from setuptools.command.build_ext import build_ext as _build_ext
else:
- from distutils.command.build_ext import build_ext as _build_ext
+ from setuptools.command.build_ext import build_ext as _build_ext
class cmd_build_ext(_build_ext):
def run(self):
@@ -1863,6 +1861,11 @@ def get_cmdclass(cmdclass=None):
# it with an updated value
target_versionfile = os.path.join(self.build_lib,
cfg.versionfile_build)
+ if not os.path.exists(target_versionfile):
+ print(f"Warning: {target_versionfile} does not exist, skipping "
+ "version update. This can happen if you are running build_ext "
+ "without first running build_py.")
+ return
print("UPDATING %s" % target_versionfile)
write_to_version_file(target_versionfile, versions)
cmds["build_ext"] = cmd_build_ext
@@ -1925,13 +1928,48 @@ def get_cmdclass(cmdclass=None):
})
cmds["py2exe"] = cmd_py2exe
+ # sdist farms its file list building out to egg_info
+ if 'egg_info' in cmds:
+ _sdist = cmds['egg_info']
+ else:
+ from setuptools.command.egg_info import egg_info as _egg_info
+
+ class cmd_egg_info(_egg_info):
+ def find_sources(self):
+ # egg_info.find_sources builds the manifest list and writes it
+ # in one shot
+ super().find_sources()
+
+ # Modify the filelist and normalize it
+ root = get_root()
+ cfg = get_config_from_root(root)
+ self.filelist.append('versioneer.py')
+ if cfg.versionfile_source:
+ # There are rare cases where versionfile_source might not be
+ # included by default, so we must be explicit
+ self.filelist.append(cfg.versionfile_source)
+ self.filelist.sort()
+ self.filelist.remove_duplicates()
+
+ # The write method is hidden in the manifest_maker instance that
+ # generated the filelist and was thrown away
+ # We will instead replicate their final normalization (to unicode,
+ # and POSIX-style paths)
+ from setuptools import unicode_utils
+ normalized = [unicode_utils.filesys_decode(f).replace(os.sep, '/')
+ for f in self.filelist.files]
+
+ manifest_filename = os.path.join(self.egg_info, 'SOURCES.txt')
+ with open(manifest_filename, 'w') as fobj:
+ fobj.write('\n'.join(normalized))
+
+ cmds['egg_info'] = cmd_egg_info
+
# we override different "sdist" commands for both environments
if 'sdist' in cmds:
_sdist = cmds['sdist']
- elif "setuptools" in sys.modules:
- from setuptools.command.sdist import sdist as _sdist
else:
- from distutils.command.sdist import sdist as _sdist
+ from setuptools.command.sdist import sdist as _sdist
class cmd_sdist(_sdist):
def run(self):
@@ -2056,42 +2094,10 @@ def do_setup():
print(" %s doesn't exist, ok" % ipy)
ipy = None
- # Make sure both the top-level "versioneer.py" and versionfile_source
- # (PKG/_version.py, used by runtime code) are in MANIFEST.in, so
- # they'll be copied into source distributions. Pip won't be able to
- # install the package without this.
- manifest_in = os.path.join(root, "MANIFEST.in")
- simple_includes = set()
- try:
- with open(manifest_in, "r") as f:
- for line in f:
- if line.startswith("include "):
- for include in line.split()[1:]:
- simple_includes.add(include)
- except OSError:
- pass
- # That doesn't cover everything MANIFEST.in can do
- # (http://docs.python.org/2/distutils/sourcedist.html#commands), so
- # it might give some false negatives. Appending redundant 'include'
- # lines is safe, though.
- if "versioneer.py" not in simple_includes:
- print(" appending 'versioneer.py' to MANIFEST.in")
- with open(manifest_in, "a") as f:
- f.write("include versioneer.py\n")
- else:
- print(" 'versioneer.py' already in MANIFEST.in")
- if cfg.versionfile_source not in simple_includes:
- print(" appending versionfile_source ('%s') to MANIFEST.in" %
- cfg.versionfile_source)
- with open(manifest_in, "a") as f:
- f.write("include %s\n" % cfg.versionfile_source)
- else:
- print(" versionfile_source already in MANIFEST.in")
-
# Make VCS-specific changes. For git, this means creating/changing
# .gitattributes to mark _version.py for export-subst keyword
# substitution.
- do_vcs_install(manifest_in, cfg.versionfile_source, ipy)
+ do_vcs_install(cfg.versionfile_source, ipy)
return 0
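Two behavioral changes in this versioneer bump are easy to misread in diff form: the `git describe --match` glob is tightened from `<prefix>*` to `<prefix>[[:digit:]]*`, and the commit distance is now derived from `rev-list --left-right` output rather than `rev-list --count`. A minimal Python sketch of both (the tag names and command outputs below are hypothetical, not taken from the repository; `fnmatch` has no POSIX `[[:digit:]]` class, so `[0-9]` approximates it):

```python
from fnmatch import fnmatch

# (a) Tag matching: the old "v*" glob matched any tag starting with the
# prefix; the new "v[[:digit:]]*" pattern requires a digit right after it,
# filtering out non-version tags that merely share the prefix.
tags = ["v1.26.0", "v1.25.1", "version-notes", "v-experimental"]
old_matches = [t for t in tags if fnmatch(t, "v*")]       # all four tags
new_matches = [t for t in tags if fnmatch(t, "v[0-9]*")]  # version tags only

# (b) Commit distance: `rev-list HEAD --count` prints a single integer,
# while `rev-list HEAD --left-right` prints one (possibly marked) hash per
# line, so the new code counts whitespace-separated tokens instead of
# parsing an integer.
count_out = "3\n"                                # hypothetical output
left_right_out = "abc1234\ndef5678\n0a1b2c3\n"   # hypothetical output
assert int(count_out) == len(left_right_out.split()) == 3
```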
View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/-/commit/2451f9b0641983855590d309229596607aeb87e2