[Git][debian-gis-team/trollimage][upstream] New upstream version 1.23.0
Antonio Valentino (@antonio.valentino)
gitlab@salsa.debian.org
Wed Feb 14 18:54:11 GMT 2024
Antonio Valentino pushed to branch upstream at Debian GIS Project / trollimage
Commits:
4ecb1d98 by Antonio Valentino at 2024-02-14T18:40:52+00:00
New upstream version 1.23.0
- - - - -
12 changed files:
- .github/dependabot.yml
- .github/workflows/ci.yaml
- .github/workflows/deploy.yaml
- .pre-commit-config.yaml
- CHANGELOG.md
- pyproject.toml
- setup.cfg
- trollimage/colormap.py
- trollimage/tests/test_colormap.py
- trollimage/tests/test_image.py
- trollimage/version.py
- trollimage/xrimage.py
Changes:
=====================================
.github/dependabot.yml
=====================================
@@ -8,4 +8,4 @@ updates:
- package-ecosystem: "github-actions" # See documentation for possible values
directory: "/" # Location of package manifests
schedule:
- interval: "weekly"
+ interval: "monthly"
=====================================
.github/workflows/ci.yaml
=====================================
@@ -28,7 +28,7 @@ jobs:
uses: actions/checkout@v4
- name: Setup Conda Environment
- uses: conda-incubator/setup-miniconda@v2
+ uses: conda-incubator/setup-miniconda@v3
with:
miniforge-variant: Mambaforge
miniforge-version: latest
@@ -72,7 +72,7 @@ jobs:
pytest --cov=trollimage trollimage/tests --cov-report=xml --cov-report=
- name: Upload unittest coverage to Codecov
- uses: codecov/codecov-action@v3
+ uses: codecov/codecov-action@v4
with:
flags: unittests
file: ./coverage.xml
=====================================
.github/workflows/deploy.yaml
=====================================
@@ -22,7 +22,7 @@ jobs:
python -m build -s
- name: Upload sdist to build artifacts
- uses: actions/upload-artifact@v3
+ uses: actions/upload-artifact@v4
with:
name: sdist
path: dist/*.tar.gz
@@ -36,12 +36,16 @@ jobs:
include:
- os: windows-2019
cibw_archs: "AMD64 ARM64"
+ artifact_name: "win"
- os: macos-11
cibw_archs: "x86_64 arm64"
+ artifact_name: "mac"
- os: "ubuntu-20.04"
cibw_archs: "aarch64"
+ artifact_name: "ubuntu-aarch"
- os: "ubuntu-20.04"
cibw_archs: "x86_64"
+ artifact_name: "ubuntu-x86_64"
steps:
- uses: actions/checkout@v4
@@ -55,15 +59,15 @@ jobs:
platforms: all
- name: Build wheels
- uses: pypa/cibuildwheel@v2.16.2
+ uses: pypa/cibuildwheel@v2.16.5
env:
CIBW_SKIP: "cp36-* cp37-* cp38-* pp* *-manylinux_i686 *-musllinux_i686 *-musllinux_aarch64 *-win32"
CIBW_ARCHS: "${{ matrix.cibw_archs }}"
CIBW_TEST_SKIP: "*_arm64 *_universal2:arm64"
- - uses: actions/upload-artifact@v3
+ - uses: actions/upload-artifact@v4
with:
- name: wheels
+ name: wheels-${{ matrix.artifact_name }}
path: ./wheelhouse/*.whl
upload_to_pypi:
@@ -71,25 +75,40 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Download sdist artifact
- uses: actions/download-artifact@v3
+ uses: actions/download-artifact@v4
with:
name: sdist
path: dist
- - name: Download wheels artifact
- uses: actions/download-artifact@v3
+ - name: Download wheels artifact - win
+ uses: actions/download-artifact@v4
with:
- name: wheels
+ name: wheels-win
+ path: dist
+ - name: Download wheels artifact - mac
+ uses: actions/download-artifact@v4
+ with:
+ name: wheels-mac
+ path: dist
+ - name: Download wheels artifact - ubuntu aarch
+ uses: actions/download-artifact@v4
+ with:
+ name: wheels-ubuntu-aarch
+ path: dist
+ - name: Download wheels artifact - ubuntu x86_64
+ uses: actions/download-artifact@v4
+ with:
+ name: wheels-ubuntu-x86_64
path: dist
- name: Publish package to Test PyPI
if: github.event.action != 'published' && github.event_name == 'push' && startsWith(github.event.ref, 'refs/tags/v')
- uses: pypa/gh-action-pypi-publish@v1.8.10
+ uses: pypa/gh-action-pypi-publish@v1.8.11
with:
user: __token__
password: ${{ secrets.test_pypi_password }}
repository_url: https://test.pypi.org/legacy/
- name: Publish package to PyPI
if: github.event.action == 'published'
- uses: pypa/gh-action-pypi-publish@v1.8.10
+ uses: pypa/gh-action-pypi-publish@v1.8.11
with:
user: __token__
password: ${{ secrets.pypi_password }}
=====================================
.pre-commit-config.yaml
=====================================
@@ -6,3 +6,8 @@ repos:
hooks:
- id: flake8
additional_dependencies: [flake8-docstrings, flake8-debugger, flake8-bugbear]
+
+ci:
+ # To trigger manually, comment on a pull request with "pre-commit.ci autofix"
+ autofix_prs: false
+ autoupdate_schedule: "monthly"
\ No newline at end of file
=====================================
CHANGELOG.md
=====================================
@@ -1,3 +1,25 @@
+## Version 1.23.0 (2024/02/14)
+
+### Issues Closed
+
+* [Issue 162](https://github.com/pytroll/trollimage/issues/162) - Versions above 1.21.0 will not produce IR Images ([PR 163](https://github.com/pytroll/trollimage/pull/163) by [@mraspaud](https://github.com/mraspaud))
+
+In this release 1 issue was closed.
+
+### Pull Requests Merged
+
+#### Bugs fixed
+
+* [PR 157](https://github.com/pytroll/trollimage/pull/157) - Fix most warnings during tests
+* [PR 149](https://github.com/pytroll/trollimage/pull/149) - Make sure no cast warning is issued when saving
+
+#### Features added
+
+* [PR 163](https://github.com/pytroll/trollimage/pull/163) - Allow linear alpha stretching in certain conditions ([162](https://github.com/pytroll/trollimage/issues/162))
+
+In this release 3 pull requests were closed.
+
+
## Version 1.22.2 (2023/11/26)
### Pull Requests Merged
=====================================
pyproject.toml
=====================================
@@ -13,3 +13,19 @@ build-backend = "setuptools.build_meta"
relative_files = true
plugins = ["Cython.Coverage"]
omit = ["trollimage/version.py", "versioneer.py"]
+
+[tool.pytest.ini_options]
+minversion = "6.0"
+addopts = ["-ra", "--showlocals", "--strict-markers", "--strict-config"]
+xfail_strict = true
+filterwarnings = [
+ "error",
+ "ignore::rasterio.errors.NotGeoreferencedWarning",
+ # dateutil needs a new release
+ # https://github.com/dateutil/dateutil/issues/1314
+ 'ignore:datetime.datetime.utcfromtimestamp\(\) is deprecated and scheduled for removal:DeprecationWarning:dateutil',
+]
+log_cli_level = "info"
+testpaths = [
+ "trollimage/tests",
+]
\ No newline at end of file
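The new `[tool.pytest.ini_options]` table above escalates every warning to a test failure via `filterwarnings = ["error", ...]`, with targeted ignores. The stdlib equivalent of that escalation can be sketched as follows (a minimal illustration, not part of the diff):

```python
import warnings

# pytest's filterwarnings "error" entry corresponds to
# warnings.simplefilter("error"): any warning raised under this
# filter is turned into an exception instead of being printed.
with warnings.catch_warnings():
    warnings.simplefilter("error")
    try:
        warnings.warn("colors should be floating point", UserWarning)
        raised = False
    except UserWarning:
        raised = True

assert raised  # the warning surfaced as an exception
```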
=====================================
setup.cfg
=====================================
@@ -1,10 +1,3 @@
-[bdist_rpm]
-requires=numpy python-pillow
-release=1
-
-[bdist_wheel]
-universal=1
-
[flake8]
max-line-length = 120
exclude =
=====================================
trollimage/colormap.py
=====================================
@@ -239,11 +239,11 @@ class Colormap(object):
f"{self.values.shape[0]} and {self.colors.shape[0]}.")
def _validate_colors(self, colors):
- colors = np.array(colors)
+ colors = np.asarray(colors)
if colors.ndim != 2 or colors.shape[-1] not in (3, 4):
raise ValueError("Colormap 'colors' must be RGB or RGBA. Got unexpected shape: {}".format(colors.shape))
if not np.issubdtype(colors.dtype, np.floating):
- warnings.warn("Colormap 'colors' should be flotaing point numbers between 0 and 1.", stacklevel=3)
+ warnings.warn("Colormap 'colors' should be floating point numbers between 0 and 1.", stacklevel=3)
colors = colors.astype(np.float64)
return colors
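The switch from `np.array` to `np.asarray` above avoids copying input that is already a suitable ndarray. A minimal sketch of the difference:

```python
import numpy as np

colors = np.linspace(0.0, 1.0, 12).reshape((4, 3))

# np.array copies by default; np.asarray returns the input object
# unchanged when it is already an ndarray of a compatible dtype.
copied = np.array(colors)
aliased = np.asarray(colors)

assert copied is not colors   # a new array was allocated
assert aliased is colors      # no copy was made
```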
=====================================
trollimage/tests/test_colormap.py
=====================================
@@ -184,19 +184,20 @@ class TestColormap:
def test_nonfloat_colors(self):
"""Pass integer colors to colormap."""
- colormap.Colormap(
- colors=np.arange(5 * 3, dtype=np.uint8).reshape((5, 3)),
- values=np.linspace(0, 1, 5),
- )
+ with pytest.warns(UserWarning, match="should be floating point"):
+ colormap.Colormap(
+ colors=np.arange(5 * 3, dtype=np.uint8).reshape((5, 3)),
+ values=np.linspace(0, 1, 5),
+ )
def test_merge_nonmonotonic(self):
"""Test that merged colormaps must have monotonic values."""
cmap1 = colormap.Colormap(
- colors=np.arange(5 * 3).reshape((5, 3)),
+ colors=np.arange(5 * 3.0).reshape((5, 3)),
values=np.linspace(2, 3, 5),
)
cmap2 = colormap.Colormap(
- colors=np.arange(5 * 3).reshape((5, 3)),
+ colors=np.arange(5 * 3.0).reshape((5, 3)),
values=np.linspace(0, 1, 5),
)
with pytest.raises(ValueError, match=r".*monotonic.*"):
@@ -274,11 +275,11 @@ class TestColormap:
def test_merge_equal_values(self):
"""Test that merged colormaps can have equal values at the merge point."""
cmap1 = colormap.Colormap(
- colors=np.arange(5 * 3).reshape((5, 3)),
+ colors=np.arange(5 * 3.0).reshape((5, 3)),
values=np.linspace(0, 1, 5),
)
cmap2 = colormap.Colormap(
- colors=np.arange(5 * 3).reshape((5, 3)),
+ colors=np.arange(5 * 3.0).reshape((5, 3)),
values=np.linspace(1, 2, 5),
)
assert cmap1.values[-1] == cmap2.values[0]
@@ -288,11 +289,11 @@ class TestColormap:
def test_merge_monotonic_decreasing(self):
"""Test that merged colormaps can be monotonically decreasing."""
cmap1 = colormap.Colormap(
- colors=np.arange(5 * 3).reshape((5, 3)),
+ colors=np.arange(5 * 3.0).reshape((5, 3)),
values=np.linspace(2, 1, 5),
)
cmap2 = colormap.Colormap(
- colors=np.arange(5 * 3).reshape((5, 3)),
+ colors=np.arange(5 * 3.0).reshape((5, 3)),
values=np.linspace(1, 0, 5),
)
_assert_monotonic_values(cmap1, increasing=False)
@@ -557,7 +558,7 @@ class TestFromFileCreation:
else:
res = orig_cmap.to_csv(None, color_scale=color_scale)
assert isinstance(res, str)
- new_cmap = colormap.Colormap.from_file(res, color_scale=color_scale)
+ new_cmap = colormap.Colormap.from_string(res, color_scale=color_scale)
np.testing.assert_allclose(orig_cmap.values, new_cmap.values)
np.testing.assert_allclose(orig_cmap.colors, new_cmap.colors)
@@ -636,23 +637,27 @@ class TestFromFileCreation:
with pytest.raises(ValueError):
colormap.Colormap.from_file(cmap_filename)
- def test_cmap_from_np(self, tmp_path):
+ @pytest.mark.parametrize("color_scale", [None, 1.0])
+ def test_cmap_from_np(self, tmp_path, color_scale):
"""Test creating a colormap from a numpy file."""
- cmap_data = _generate_cmap_test_data(None, "RGB")
+ cmap_data = _generate_cmap_test_data(color_scale, "RGB")
fnp = tmp_path / "test.npy"
np.save(fnp, cmap_data)
- cmap = colormap.Colormap.from_np(fnp, color_scale=1)
+ cmap = colormap.Colormap.from_np(fnp, color_scale=color_scale or 255)
np.testing.assert_allclose(cmap.values, [0, 0.33333333, 0.6666667, 1])
- np.testing.assert_array_equal(cmap.colors, cmap_data)
+ exp_data = cmap_data if color_scale == 1.0 else cmap_data / 255.0
+ np.testing.assert_array_equal(cmap.colors, exp_data)
- def test_cmap_from_csv(self, tmp_path, color_scale=1):
+ @pytest.mark.parametrize("color_scale", [None, 1.0])
+ def test_cmap_from_csv(self, tmp_path, color_scale):
"""Test creating a colormap from a CSV file."""
- cmap_data = _generate_cmap_test_data(None, "RGB")
+ cmap_data = _generate_cmap_test_data(color_scale, "RGB")
fnp = tmp_path / "test.csv"
np.savetxt(fnp, cmap_data, delimiter=",")
- cmap = colormap.Colormap.from_csv(fnp, color_scale=1)
+ cmap = colormap.Colormap.from_csv(fnp, color_scale=color_scale or 255)
np.testing.assert_allclose(cmap.values, [0, 0.33333333, 0.66666667, 1])
- np.testing.assert_array_equal(cmap.colors, cmap_data)
+ exp_data = cmap_data if color_scale == 1.0 else cmap_data / 255.0
+ np.testing.assert_array_equal(cmap.colors, exp_data)
def test_cmap_from_string():
@@ -663,12 +668,14 @@ def test_cmap_from_string():
np.testing.assert_array_equal(cmap.colors, [[0, 0, 0], [1, 1, 1], [2, 2, 2]])
-def test_cmap_from_ndarray():
+@pytest.mark.parametrize("color_scale", [None, 255, 1.0])
+def test_cmap_from_ndarray(color_scale):
"""Test creating a colormap from a numpy array."""
- cmap_data = _generate_cmap_test_data(None, "RGB")
- cmap = colormap.Colormap.from_ndarray(cmap_data, color_scale=1)
+ cmap_data = _generate_cmap_test_data(color_scale, "RGB")
+ cmap = colormap.Colormap.from_ndarray(cmap_data, color_scale=color_scale or 255)
np.testing.assert_allclose(cmap.values, [0, 0.33333333, 0.66666667, 1])
- np.testing.assert_array_equal(cmap.colors, cmap_data)
+ exp_data = cmap_data if color_scale == 1.0 else cmap_data / 255.0
+ np.testing.assert_array_equal(cmap.colors, exp_data)
def test_cmap_from_name():
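The tests above now round-trip a CSV string through `Colormap.from_string` rather than writing it to disk for `from_file`. A sketch of parsing a value,R,G,B layout like the one these tests exercise (the exact column layout is an assumption inferred from the tests, not a statement of trollimage's parser):

```python
import io
import numpy as np

# Hypothetical value,R,G,B CSV text: first column is the control-point
# value, remaining columns are the color components.
csv_text = "0.0,0.0,0.0,0.0\n0.5,0.5,0.5,0.5\n1.0,1.0,1.0,1.0\n"
data = np.loadtxt(io.StringIO(csv_text), delimiter=",")

values, colors = data[:, 0], data[:, 1:]
assert values.shape == (3,) and colors.shape == (3, 3)
```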
=====================================
trollimage/tests/test_image.py
=====================================
@@ -22,6 +22,8 @@ import random
import sys
import tempfile
import unittest
+from datetime import timezone, datetime
+import warnings
from unittest import mock
from collections import OrderedDict
from tempfile import NamedTemporaryFile
@@ -879,160 +881,216 @@ class TestXRImage:
with NamedTemporaryFile(suffix='.png') as tmp:
img.save(tmp.name)
- @pytest.mark.skipif(sys.platform.startswith('win'),
- reason="'NamedTemporaryFile' not supported on Windows")
- def test_save_geotiff_float(self):
+ def test_save_geotiff_float_numpy_array(self, tmp_path):
"""Test saving geotiffs when input data is float."""
# numpy array image - scale to 0 to 1 first
- data = xr.DataArray(np.arange(75).reshape(5, 5, 3) / 75.,
+ data = xr.DataArray(np.arange(75).reshape((5, 5, 3)) / 75.,
dims=['y', 'x', 'bands'],
coords={'bands': ['R', 'G', 'B']})
img = xrimage.XRImage(data)
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (4, 5, 5) # alpha band added
- exp = (np.arange(75.).reshape(5, 5, 3) / 75. * 255).round()
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
- np.testing.assert_allclose(file_data[3], 255) # completely opaque
-
- data = xr.DataArray(da.from_array(np.arange(75.).reshape(5, 5, 3) / 75., chunks=5),
+ filename = tmp_path / "image.tif"
+
+ img.save(filename)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (4, 5, 5) # alpha band added
+ exp = (np.arange(75.).reshape(5, 5, 3) / 75. * 255).round()
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+ np.testing.assert_allclose(file_data[3], 255) # completely opaque
+
+ def test_save_geotiff_float_dask_array(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
dims=['y', 'x', 'bands'],
coords={'bands': ['R', 'G', 'B']})
img = xrimage.XRImage(data)
- # Regular default save
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (4, 5, 5) # alpha band added
- exp = (np.arange(75.).reshape(5, 5, 3) / 75. * 255).round()
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
- np.testing.assert_allclose(file_data[3], 255) # completely opaque
-
- # with NaNs
+ filename = tmp_path / "image.tif"
+
+ img.save(filename)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (4, 5, 5) # alpha band added
+ exp = (np.arange(75.).reshape(5, 5, 3) / 75. * 255).round()
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+ np.testing.assert_allclose(file_data[3], 255) # completely opaque
+
+ def test_save_geotiff_float_dask_array_with_nans(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
data = data.where(data > 10. / 75.)
img = xrimage.XRImage(data)
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (4, 5, 5) # alpha band added
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- exp[exp <= 10. / 75.] = 0 # numpy converts NaNs to 0s
- exp = (exp * 255).round()
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
- is_null = (exp == 0).all(axis=2)
- np.testing.assert_allclose(file_data[3][~is_null], 255) # completely opaque
- np.testing.assert_allclose(file_data[3][is_null], 0) # completely transparent
-
- # with fill value
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name, fill_value=128)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (3, 5, 5) # no alpha band
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- exp2 = (exp * 255).round()
- exp2[exp <= 10. / 75.] = 128
- np.testing.assert_allclose(file_data[0], exp2[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp2[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp2[:, :, 2])
-
- # float type - floats can't have alpha channel
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name, dtype=np.float32)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (3, 5, 5) # no alpha band
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- # fill value is forced to 0
- exp[exp <= 10. / 75.] = 0
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
-
- # float type with NaN fill value
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name, dtype=np.float32, fill_value=np.nan)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (3, 5, 5) # no alpha band
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- exp[exp <= 10. / 75.] = np.nan
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
-
- # float type with non-NaN fill value
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name, dtype=np.float32, fill_value=128)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (3, 5, 5) # no alpha band
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- exp[exp <= 10. / 75.] = 128
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
-
- # float input with fill value saved to int16 (signed!)
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name, dtype=np.int16, fill_value=-128)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (3, 5, 5) # no alpha band
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- exp2 = (exp * (2 ** 16 - 1) - (2 ** 15)).round()
- exp2[exp <= 10. / 75.] = -128.
- np.testing.assert_allclose(file_data[0], exp2[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp2[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp2[:, :, 2])
+ filename = tmp_path / "image.tif"
+
+ with warnings.catch_warnings():
+ warnings.simplefilter("error", RuntimeWarning)
+ img.save(filename)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (4, 5, 5) # alpha band added
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ exp[exp <= 10. / 75.] = 0 # numpy converts NaNs to 0s
+ exp = (exp * 255).round()
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+ is_null = (exp == 0).all(axis=2)
+ np.testing.assert_allclose(file_data[3][~is_null], 255) # completely opaque
+ np.testing.assert_allclose(file_data[3][is_null], 0) # completely transparent
+
+ def test_save_geotiff_float_dask_array_with_nans_and_fill_value(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
+ img = xrimage.XRImage(data)
+ filename = tmp_path / "image.tif"
+
+ with pytest.warns(UserWarning, match="fill value will overlap with valid data"):
+ img.save(filename, fill_value=128)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (3, 5, 5) # no alpha band
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ exp2 = (exp * 255).round()
+ exp2[exp <= 10. / 75.] = 128
+ np.testing.assert_allclose(file_data[0], exp2[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp2[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp2[:, :, 2])
+
+ def test_save_geotiff_float_dask_array_to_float(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
+ img = xrimage.XRImage(data)
+ filename = tmp_path / "image.tif"
+
+ img.save(filename, dtype=np.float32)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (3, 5, 5) # no alpha band
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ # fill value is forced to 0
+ exp[exp <= 10. / 75.] = 0
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+
+ def test_save_geotiff_float_dask_array_to_float_with_nans_fill_value(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
+ img = xrimage.XRImage(data)
+ filename = tmp_path / "image.tif"
+
+ img.save(filename, dtype=np.float32, fill_value=np.nan)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (3, 5, 5) # no alpha band
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ exp[exp <= 10. / 75.] = np.nan
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+
+ def test_save_geotiff_float_dask_array_to_float_with_numeric_fill_value(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
+ img = xrimage.XRImage(data)
+ filename = tmp_path / "image.tif"
+
+ img.save(filename, dtype=np.float32, fill_value=128)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (3, 5, 5) # no alpha band
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ exp[exp <= 10. / 75.] = 128
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+
+ def test_save_geotiff_float_dask_array_to_signed_int(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
+ img = xrimage.XRImage(data)
+ filename = tmp_path / "image.tif"
+
+ with pytest.warns(UserWarning, match="fill value will overlap with valid data"):
+ img.save(filename, dtype=np.int16, fill_value=-128)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (3, 5, 5) # no alpha band
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ exp2 = (exp * (2 ** 16 - 1) - (2 ** 15)).round()
+ exp2[exp <= 10. / 75.] = -128.
+ np.testing.assert_allclose(file_data[0], exp2[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp2[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp2[:, :, 2])
+
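The expected values in the signed-integer test above spread float data in [0, 1] across the full int16 range; the same arithmetic in isolation:

```python
import numpy as np

# Float data in [0, 1] scaled to the full signed 16-bit range:
# 0.0 maps to -32768 and 1.0 maps to 32767.
data = np.array([0.0, 0.5, 1.0])
scaled = (data * (2 ** 16 - 1) - 2 ** 15).round()

assert scaled[0] == -32768.0
assert scaled[-1] == 32767.0
```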
+ def test_delayed_save_geotiff_float_dask_array(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
+ img = xrimage.XRImage(data)
+ filename = tmp_path / "image.tif"
- # dask delayed save
- with NamedTemporaryFile(suffix='.tif') as tmp:
- delay = img.save(tmp.name, compute=False)
- assert isinstance(delay, tuple)
- assert isinstance(delay[0], da.Array)
- assert isinstance(delay[1], RIODataset)
- da.store(*delay)
- delay[1].close()
+ delay = img.save(filename, compute=False)
+ assert isinstance(delay, tuple)
+ assert isinstance(delay[0], da.Array)
+ assert isinstance(delay[1], RIODataset)
+ da.store(*delay)
+ delay[1].close()
- # float RGBA input to uint8
+ def test_save_geotiff_float_dask_array_with_alpha(self, tmp_path):
+ """Test saving geotiffs when input data is float."""
+ data = xr.DataArray(da.from_array(np.arange(75.).reshape((5, 5, 3)) / 75., chunks=5),
+ dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B']})
+ data = data.where(data > 10. / 75.)
alpha = xr.ones_like(data[:, :, 0])
alpha = alpha.where(data.notnull().all(dim='bands'), 0)
alpha['bands'] = 'A'
# make a float version of a uint8 RGBA
rgb_data = xr.concat((data, alpha), dim='bands')
img = xrimage.XRImage(rgb_data)
- with NamedTemporaryFile(suffix='.tif') as tmp:
- img.save(tmp.name)
- with rio.open(tmp.name) as f:
- file_data = f.read()
- assert file_data.shape == (4, 5, 5) # alpha band already existed
- exp = np.arange(75.).reshape(5, 5, 3) / 75.
- exp[exp <= 10. / 75.] = 0 # numpy converts NaNs to 0s
- exp = (exp * 255.).round()
- np.testing.assert_allclose(file_data[0], exp[:, :, 0])
- np.testing.assert_allclose(file_data[1], exp[:, :, 1])
- np.testing.assert_allclose(file_data[2], exp[:, :, 2])
- not_null = (alpha != 0).values
- np.testing.assert_allclose(file_data[3][not_null], 255) # completely opaque
- np.testing.assert_allclose(file_data[3][~not_null], 0) # completely transparent
+ filename = tmp_path / "image.tif"
+
+ img.save(filename)
+ with rio.open(filename) as f:
+ file_data = f.read()
+ assert file_data.shape == (4, 5, 5) # alpha band already existed
+ exp = np.arange(75.).reshape(5, 5, 3) / 75.
+ exp[exp <= 10. / 75.] = 0 # numpy converts NaNs to 0s
+ exp = (exp * 255.).round()
+ np.testing.assert_allclose(file_data[0], exp[:, :, 0])
+ np.testing.assert_allclose(file_data[1], exp[:, :, 1])
+ np.testing.assert_allclose(file_data[2], exp[:, :, 2])
+ not_null = (alpha != 0).values
+ np.testing.assert_allclose(file_data[3][not_null], 255) # completely opaque
+ np.testing.assert_allclose(file_data[3][~not_null], 0) # completely transparent
@pytest.mark.skipif(sys.platform.startswith('win'),
reason="'NamedTemporaryFile' not supported on Windows")
def test_save_geotiff_datetime(self):
"""Test saving geotiffs when start_time is in the attributes."""
- import datetime as dt
-
data = xr.DataArray(np.arange(75).reshape(5, 5, 3), dims=[
'y', 'x', 'bands'], coords={'bands': ['R', 'G', 'B']})
@@ -1042,7 +1100,7 @@ class TestXRImage:
assert "TIFFTAG_DATETIME" not in tags
# Valid datetime
- data.attrs['start_time'] = dt.datetime.utcnow()
+ data.attrs['start_time'] = datetime.now(timezone.utc)
tags = _get_tags_after_writing_to_geotiff(data)
assert "TIFFTAG_DATETIME" in tags
@@ -1639,6 +1697,81 @@ class TestXRImage:
np.testing.assert_allclose(img.data.values, res, atol=1.e-6)
+ @pytest.mark.parametrize("dtype", (np.float32, np.float64, float))
+ def test_linear_stretch_does_not_affect_alpha_with_partial_cutoffs(self, dtype):
+ """Test linear stretching with cutoffs."""
+ arr = np.arange(100, dtype=dtype).reshape(5, 5, 4) / 74.
+ arr[:, :, -1] = 1 # alpha channel, fully opaque
+ data = xr.DataArray(arr.copy(), dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B', 'A']})
+ img = xrimage.XRImage(data)
+ img.stretch_linear([(0.005, 0.005), (0.005, 0.005), (0.005, 0.005)])
+ assert img.data.dtype == dtype
+ res = np.array([[[-0.005051, -0.005051, -0.005051, 1.],
+ [0.037037, 0.037037, 0.037037, 1.],
+ [0.079125, 0.079125, 0.079125, 1.],
+ [0.121212, 0.121212, 0.121212, 1.],
+ [0.1633, 0.1633, 0.1633, 1.]],
+ [[0.205387, 0.205387, 0.205387, 1.],
+ [0.247475, 0.247475, 0.247475, 1.],
+ [0.289562, 0.289562, 0.289562, 1.],
+ [0.33165, 0.33165, 0.33165, 1.],
+ [0.373737, 0.373737, 0.373737, 1.]],
+ [[0.415825, 0.415825, 0.415825, 1.],
+ [0.457912, 0.457912, 0.457912, 1.],
+ [0.5, 0.5, 0.5, 1.],
+ [0.542088, 0.542088, 0.542088, 1.],
+ [0.584175, 0.584175, 0.584175, 1.]],
+ [[0.626263, 0.626263, 0.626263, 1.],
+ [0.66835, 0.66835, 0.66835, 1.],
+ [0.710438, 0.710438, 0.710438, 1.],
+ [0.752525, 0.752525, 0.752525, 1.],
+ [0.794613, 0.794613, 0.794613, 1.]],
+ [[0.8367, 0.8367, 0.8367, 1.],
+ [0.878788, 0.878788, 0.878788, 1.],
+ [0.920875, 0.920875, 0.920875, 1.],
+ [0.962963, 0.962963, 0.962963, 1.],
+ [1.005051, 1.005051, 1.005051, 1.]]], dtype=dtype)
+
+ np.testing.assert_allclose(img.data.values, res, atol=1.e-6)
+
+ @pytest.mark.parametrize("dtype", (np.float32, np.float64, float))
+ def test_linear_stretch_does_affect_alpha_with_explicit_cutoffs(self, dtype):
+ """Test linear stretching with full explicit cutoffs."""
+ arr = np.arange(100, dtype=dtype).reshape(5, 5, 4) / 74.
+ data = xr.DataArray(arr.copy(), dims=['y', 'x', 'bands'],
+ coords={'bands': ['R', 'G', 'B', 'A']})
+ img = xrimage.XRImage(data)
+ img.stretch_linear([(0.005, 0.005), (0.005, 0.005), (0.005, 0.005), (0.005, 0.005)])
+ assert img.data.dtype == dtype
+ res = np.array([[[-0.005051, -0.005051, -0.005051, -0.005051],
+ [0.037037, 0.037037, 0.037037, 0.037037],
+ [0.079125, 0.079125, 0.079125, 0.079125],
+ [0.121212, 0.121212, 0.121212, 0.121212],
+ [0.1633, 0.1633, 0.1633, 0.1633]],
+ [[0.205387, 0.205387, 0.205387, 0.205387],
+ [0.247475, 0.247475, 0.247475, 0.247475],
+ [0.289562, 0.289562, 0.289562, 0.289562],
+ [0.33165, 0.33165, 0.33165, 0.33165],
+ [0.373737, 0.373737, 0.373737, 0.373737]],
+ [[0.415825, 0.415825, 0.415825, 0.415825],
+ [0.457912, 0.457912, 0.457912, 0.457912],
+ [0.5, 0.5, 0.5, 0.5],
+ [0.542088, 0.542088, 0.542088, 0.542088],
+ [0.584175, 0.584175, 0.584175, 0.584175]],
+ [[0.626263, 0.626263, 0.626263, 0.626263],
+ [0.66835, 0.66835, 0.66835, 0.66835],
+ [0.710438, 0.710438, 0.710438, 0.710438],
+ [0.752525, 0.752525, 0.752525, 0.752525],
+ [0.794613, 0.794613, 0.794613, 0.794613]],
+ [[0.8367, 0.8367, 0.8367, 0.8367],
+ [0.878788, 0.878788, 0.878788, 0.878788],
+ [0.920875, 0.920875, 0.920875, 0.920875],
+ [0.962963, 0.962963, 0.962963, 0.962963],
+ [1.005051, 1.005051, 1.005051, 1.005051]]], dtype=dtype)
+
+ np.testing.assert_allclose(img.data.values, res, atol=1.e-6)
+
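The cutoff-based stretching exercised by the two tests above can be sketched as a percentile stretch (an assumption about the behavior under test, not trollimage's exact implementation): the values at the low and high percentile cutoffs map to 0 and 1, so data beyond the cutoffs lands slightly outside [0, 1].

```python
import numpy as np

def stretch_linear(band, cutoffs=(0.005, 0.005)):
    # Map the [0.5th, 99.5th] percentile range of the band onto [0, 1].
    low = np.percentile(band, 100 * cutoffs[0])
    high = np.percentile(band, 100 * (1 - cutoffs[1]))
    return (band - low) / (high - low)

band = np.linspace(0.0, 1.0, 25)
stretched = stretch_linear(band)

# Endpoints fall just outside [0, 1], matching the -0.005051 and
# 1.005051 corner values in the expected arrays above.
assert stretched[0] < 0 < stretched[1]
assert stretched[-2] < 1 < stretched[-1]
```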
@pytest.mark.parametrize(("dtype", "max_val", "exp_min", "exp_max"),
((np.uint8, 255, -0.005358012691140175, 1.0053772069513798),
(np.int8, 127, -0.004926108196377754, 1.0058689523488282),
@@ -1762,42 +1895,39 @@ class TestXRImage:
from trollimage import xrimage
arr = np.arange(75., dtype=dtype).reshape(5, 5, 3) / 74.
- data = xr.DataArray(arr.copy(), dims=['y', 'x', 'bands'],
+ data = xr.DataArray(arr.copy() + 0.1, dims=['y', 'x', 'bands'],
coords={'bands': ['R', 'G', 'B']})
img = xrimage.XRImage(data)
img.stretch_weber_fechner(2.5, 0.2)
enhs = img.data.attrs['enhancement_history'][0]
assert enhs == {'weber_fechner': (2.5, 0.2)}
assert img.data.dtype == dtype
- res = np.array([[[-np.inf, -6.73656795, -5.0037],
- [-3.99003723, -3.27083205, -2.71297317],
- [-2.25716928, -1.87179258, -1.5379641],
- [-1.24350651, -0.98010522, -0.74182977],
- [-0.52430133, -0.32419456, -0.13892463]],
-
- [[0.03355755, 0.19490385, 0.34646541],
- [0.48936144, 0.6245295, 0.75276273],
- [0.87473814, 0.99103818, 1.10216759],
- [1.20856662, 1.31062161, 1.40867339],
- [1.50302421, 1.59394332, 1.68167162]],
-
- [[1.7664255, 1.84840006, 1.92777181],
- [2.00470095, 2.07933336, 2.1518022],
- [2.22222939, 2.29072683, 2.35739745],
- [2.42233616, 2.48563068, 2.54736221],
- [2.60760609, 2.66643234, 2.72390613]],
-
- [[2.78008827, 2.83503554, 2.88880105],
- [2.94143458, 2.99298279, 3.04348956],
- [3.09299613, 3.14154134, 3.18916183],
- [3.23589216, 3.28176501, 3.32681127],
- [3.37106022, 3.41453957, 3.45727566]],
-
- [[3.49929345, 3.54061671, 3.58126801],
- [3.62126886, 3.66063976, 3.69940022],
- [3.7375689, 3.7751636, 3.81220131],
- [3.84869831, 3.88467015, 3.92013174],
- [3.95509735, 3.98958065, 4.02359478]]], dtype=dtype)
+ res = np.array([
+ [[0., 0., 0.],
+ [0., 0., 0.],
+ [0., 0., 0.0993509],
+ [0.25663552, 0.40460747, 0.54430866],
+ [0.6766145, 0.8022693, 0.92190945]],
+ [[1.0360844, 1.1452721, 1.2498899],
+ [1.350305, 1.446842, 1.5397894],
+ [1.629405, 1.7159187, 1.7995386],
+ [1.8804514, 1.9588281, 2.0348217],
+ [2.1085734, 2.1802118, 2.2498538]],
+ [[2.3176088, 2.3835757, 2.4478464],
+ [2.5105066, 2.571634, 2.631303],
+ [2.689581, 2.7465308, 2.8022122],
+ [2.8566809, 2.9099874, 2.9621818],
+ [3.013308, 3.0634098, 3.1125278]],
+ [[3.1606987, 3.207959, 3.2543423],
+ [3.299881, 3.344605, 3.3885431],
+ [3.431722, 3.4741676, 3.515905],
+ [3.5569568, 3.597345, 3.6370919],
+ [3.6762161, 3.714738, 3.7526748]],
+ [[3.7900448, 3.826864, 3.863149],
+ [3.8989153, 3.934177, 3.968948],
+ [4.0032415, 4.037072, 4.07045],
+ [4.103389, 4.135899, 4.1679916],
+ [4.199678, 4.2309675, 4.2618704]]], dtype=dtype)
np.testing.assert_allclose(img.data.values, res, atol=1.e-6)
@@ -2277,7 +2407,7 @@ class TestXRImageColorize:
assert "colormap" not in metadata
else:
assert "colormap" in metadata
- loaded_brbg = Colormap.from_file(metadata["colormap"])
+ loaded_brbg = Colormap.from_string(metadata["colormap"])
np.testing.assert_allclose(new_brbg.values, loaded_brbg.values)
np.testing.assert_allclose(new_brbg.colors, loaded_brbg.colors)
@@ -2466,7 +2596,7 @@ class TestXRImagePalettize:
assert "colormap" not in metadata
else:
assert "colormap" in metadata
- loaded_brbg = Colormap.from_file(metadata["colormap"])
+ loaded_brbg = Colormap.from_string(metadata["colormap"])
np.testing.assert_allclose(new_brbg.values, loaded_brbg.values)
np.testing.assert_allclose(new_brbg.colors, loaded_brbg.colors)
=====================================
trollimage/version.py
=====================================
@@ -26,9 +26,9 @@ def get_keywords():
# setup.py/versioneer.py will grep for the variable names, so they must
# each be defined on a line of their own. _version.py will just call
# get_keywords().
- git_refnames = " (tag: v1.22.2)"
- git_full = "f77379d8554f074407e0bdb6cdec6cf551625b22"
- git_date = "2023-11-26 10:35:12 -0600"
+ git_refnames = " (HEAD -> main, tag: v1.23.0)"
+ git_full = "a57ac0bb9ee95a6088d8d0bfaa5877ae91390940"
+ git_date = "2024-02-14 17:12:30 +0100"
keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
return keywords
=====================================
trollimage/xrimage.py
=====================================
@@ -683,6 +683,8 @@ class XRImage:
data = data.clip(0, 1) * scale + offset
attrs.setdefault('enhancement_history', list()).append({'scale': scale, 'offset': offset})
data = data.round()
+ if fill_value is None:
+ data = data.fillna(np.iinfo(dtype).min)
data.attrs = attrs
return data
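The hunk above fills NaN pixels with the smallest value of the target integer dtype when no `fill_value` is supplied. A minimal standalone sketch of that behavior (plain NumPy, illustrative names, not trollimage's API):

```python
import numpy as np

dtype = np.uint8
data = np.array([0.0, np.nan, 255.0])

# Mirror the change: NaNs become np.iinfo(dtype).min, i.e. 0 for
# uint8 and -128 for int8, before the cast to the integer dtype.
filled = np.where(np.isnan(data), np.iinfo(dtype).min, data).astype(dtype)
```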
@@ -1002,30 +1004,20 @@ class XRImage:
else:
raise TypeError("Stretch parameter must be a string or a tuple.")
- @staticmethod
- def _compute_quantile(data, dims, cutoffs):
- """Compute quantile for stretch_linear.
-
- Dask delayed functions need to be non-internal functions (created
- inside a function) to be serializable on a multi-process scheduler.
-
- Quantile requires the data to be loaded since it not supported on
- dask arrays yet.
-
- """
- # numpy doesn't get a 'quantile' function until 1.15
- # for better backwards compatibility we use xarray's version
- data_arr = xr.DataArray(data, dims=dims)
- # delayed will provide us the fully computed xarray with ndarray
- left, right = data_arr.quantile([cutoffs[0], 1. - cutoffs[1]], dim=['x', 'y'])
- logger.debug("Interval: left=%s, right=%s", str(left), str(right))
- return left.data, right.data
-
def stretch_linear(self, cutoffs=(0.005, 0.005)):
"""Stretch linearly the contrast of the current image.
Use *cutoffs* for left and right trimming.
+ If the cutoffs are just a tuple or list of two scalars, all the
+ channels except the alpha channel will be stretched with the cutoffs.
+ If the cutoffs are a sequence of tuples/lists of two scalars then:
+
+ - if there are the same number of tuples/lists as channels, each channel will be stretched with the respective
+ cutoff.
+ - if there is one less tuple/list than there are channels, the same applies, except for the alpha channel, which
+ will not be stretched.
+
"""
logger.debug("Perform a linear contrast stretch.")
@@ -1045,12 +1037,20 @@ class XRImage:
cutoff_type = self.data.dtype
data = self.data
- if 'A' in self.data.coords['bands'].values:
+ nb_bands = len(data.coords["bands"])
+
+ dont_stretch_alpha = ('A' in self.data.coords['bands'].values and
+ (np.isscalar(cutoffs[0]) or len(cutoffs) == nb_bands - 1))
+
+ if np.isscalar(cutoffs[0]):
+ cutoffs = [cutoffs] * nb_bands
+
+ if dont_stretch_alpha:
data = self.data.sel(bands=self.data.coords['bands'].values[:-1])
left_data, right_data = self._get_left_and_right_quantiles_without_alpha(data, cutoffs, cutoff_type)
- if 'A' in self.data.coords['bands'].values:
+ if dont_stretch_alpha:
left_data = np.hstack([left_data, np.array([0])])
right_data = np.hstack([right_data, np.array([1])])
left = xr.DataArray(left_data, dims=('bands',),
@@ -1069,6 +1069,33 @@ class XRImage:
dtype=cutoff_type)
return left_data, right_data
+ @staticmethod
+ def _compute_quantile(data, dims, cutoffs):
+ """Compute quantile for stretch_linear.
+
+ Dask delayed functions need to be non-internal functions (created
+ inside a function) to be serializable on a multi-process scheduler.
+
+ Quantile requires the data to be loaded since it is not supported on
+ dask arrays yet.
+
+ """
+ # numpy doesn't get a 'quantile' function until 1.15
+ # for better backwards compatibility we use xarray's version
+ data_arr = xr.DataArray(data, dims=dims)
+ # delayed will provide us the fully computed xarray with ndarray
+ nb_bands = len(data_arr.coords["bands"])
+
+ left = []
+ right = []
+ for i in range(nb_bands):
+ left_i, right_i = data_arr.isel(bands=i).quantile([cutoffs[i][0], 1-cutoffs[i][1]])
+ left.append(left_i)
+ right.append(right_i)
+
+ logger.debug("Interval: left=%s, right=%s", str(left), str(right))
+ return np.array(left), np.array(right)
+
def crude_stretch(self, min_stretch=None, max_stretch=None):
"""Perform simple linear stretching.
@@ -1240,11 +1267,12 @@ class XRImage:
p = k.ln(S/S0)
p is perception, S is the stimulus, S0 is the stimulus threshold (the
- highest unpercieved stimulus), and k is the factor.
+ highest unperceived stimulus), and k is the factor.
"""
attrs = self.data.attrs
- self.data = k * np.log(self.data / s0)
+ clipped_stimuli = np.clip(self.data, s0, None)
+ self.data = k * np.log(clipped_stimuli / s0)
self.data.attrs = attrs
self.data.attrs.setdefault('enhancement_history', []).append({'weber_fechner': (k, s0)})
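The Weber-Fechner fix above clips the stimuli at the threshold `s0` before taking the logarithm, so `p = k * ln(S / S0)` can no longer produce `-inf` (for S = 0) or negative perceptions (for S < S0). A standalone sketch of the clipped formula (name and defaults are illustrative):

```python
import numpy as np

def weber_fechner_stretch(stimuli, k=2.5, s0=0.2):
    """Perceptual stretch p = k * ln(S / S0) with S clipped at s0.

    Clipping keeps the log finite and non-negative, mirroring the
    change in this commit.
    """
    clipped = np.clip(stimuli, s0, None)
    return k * np.log(clipped / s0)
```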
View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/-/commit/4ecb1d98abfd76d921b55c00e5fbed45d262333b