[Git][debian-gis-team/pystac][master] 4 commits: New upstream version 1.14.2
Antonio Valentino (@antonio.valentino)
gitlab@salsa.debian.org
Fri Dec 19 07:27:34 GMT 2025
Antonio Valentino pushed to branch master at Debian GIS Project / pystac
Commits:
2e4ea56e by Antonio Valentino at 2025-12-19T07:22:23+00:00
New upstream version 1.14.2
- - - - -
d774682b by Antonio Valentino at 2025-12-19T07:22:30+00:00
Update upstream source from tag 'upstream/1.14.2'
Update to upstream version '1.14.2'
with Debian dir a05157bb3a5dd12c130f8a36fe377f6a496ca4e0
- - - - -
a10bc1f8 by Antonio Valentino at 2025-12-19T07:22:58+00:00
New upstream release
- - - - -
dccc3169 by Antonio Valentino at 2025-12-19T07:23:54+00:00
Set distribution to unstable
- - - - -
25 changed files:
- .github/pull_request_template.md
- .github/workflows/continuous-integration.yml
- + .github/workflows/pr.yml
- + .github/workflows/release-please.yml
- .github/workflows/release.yml
- + .release-please-manifest.json
- CHANGELOG.md
- − RELEASING.md
- debian/changelog
- docs/contributing.rst
- pystac/__init__.py
- pystac/asset.py
- pystac/extensions/datacube.py
- pystac/extensions/projection.py
- pystac/item.py
- pystac/utils.py
- pystac/version.py
- + release-please-config.json
- + tests/data-files/projection/example-with-version-1.2.json
- tests/extensions/test_datacube.py
- tests/extensions/test_projection.py
- tests/test_asset.py
- tests/test_catalog.py
- tests/test_item.py
- tests/test_utils.py
Changes:
=====================================
.github/pull_request_template.md
=====================================
@@ -9,5 +9,5 @@
- [ ] Pre-commit hooks pass (run `pre-commit run --all-files`)
- [ ] Tests pass (run `pytest`)
- [ ] Documentation has been updated to reflect changes, if applicable
-- [ ] This PR maintains or improves overall codebase code coverage.
-- [ ] Changes are added to the [CHANGELOG](https://github.com/stac-utils/pystac/blob/main/CHANGELOG.md). See [the docs](https://pystac.readthedocs.io/en/latest/contributing.html#changelog) for information about adding to the changelog.
+- [ ] This PR maintains or improves overall codebase code coverage
+- [ ] This PR's title is formatted per [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/)
=====================================
.github/workflows/continuous-integration.yml
=====================================
@@ -32,8 +32,8 @@ jobs:
- windows-latest
- macos-latest
steps:
- - uses: actions/checkout@v5
- - uses: astral-sh/setup-uv@v6
+ - uses: actions/checkout@v6
+ - uses: astral-sh/setup-uv@v7
with:
python-version: ${{ matrix.python-version }}
- name: Sync
@@ -55,8 +55,8 @@ jobs:
name: coverage
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v5
- - uses: astral-sh/setup-uv@v6
+ - uses: actions/checkout@v6
+ - uses: astral-sh/setup-uv@v7
- name: Install with dependencies
run: uv sync --all-extras
- name: Run coverage with orjson
@@ -84,8 +84,8 @@ jobs:
without-orjson:
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v5
- - uses: astral-sh/setup-uv@v6
+ - uses: actions/checkout@v6
+ - uses: astral-sh/setup-uv@v7
- name: Sync
run: uv sync
- name: Uninstall orjson
@@ -99,11 +99,11 @@ jobs:
# appropriate for CI on Github actions.
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- uses: actions/setup-python@v6
with:
python-version: "3.10"
- - uses: astral-sh/setup-uv@v6
+ - uses: astral-sh/setup-uv@v7
with:
enable-cache: true
- name: Sync
@@ -116,8 +116,8 @@ jobs:
docs:
runs-on: ubuntu-latest
steps:
- - uses: actions/checkout@v5
- - uses: astral-sh/setup-uv@v6
+ - uses: actions/checkout@v6
+ - uses: astral-sh/setup-uv@v7
- name: Install pandoc
run: sudo apt-get install pandoc
- name: Sync
=====================================
.github/workflows/pr.yml
=====================================
@@ -0,0 +1,19 @@
+name: PR
+
+on:
+ pull_request_target:
+ types:
+ - opened
+ - edited
+ - reopened
+
+jobs:
+ lint:
+ name: Lint
+ runs-on: ubuntu-latest
+ permissions:
+ pull-requests: read
+ steps:
+ - uses: amannn/action-semantic-pull-request@v6
+ env:
+ GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
=====================================
.github/workflows/release-please.yml
=====================================
@@ -0,0 +1,56 @@
+on:
+ push:
+ branches:
+ - main
+
+permissions:
+ contents: write
+ issues: write
+ pull-requests: write
+
+name: release-please
+
+jobs:
+ release-please:
+ runs-on: ubuntu-latest
+ outputs:
+ prs: ${{ steps.release-please.outputs.prs }}
+ steps:
+ - uses: actions/create-github-app-token@v2
+ id: generate-token
+ with:
+ app-id: ${{ vars.RELEASE_BOT_CLIENT_ID }}
+ private-key: ${{ secrets.RELEASE_BOT_PRIVATE_KEY }}
+ - uses: googleapis/release-please-action@v4
+ id: release-please
+ with:
+ token: ${{ steps.generate-token.outputs.token }}
+ release-type: python
+
+ update-static-files:
+ runs-on: ubuntu-latest
+ needs: release-please
+ if: ${{ needs.release-please.outputs.prs }}
+ steps:
+ - uses: actions/create-github-app-token@v2
+ id: generate-token
+ with:
+ app-id: ${{ vars.RELEASE_BOT_CLIENT_ID }}
+ private-key: ${{ secrets.RELEASE_BOT_PRIVATE_KEY }}
+ - name: Checkout PR branch
+ uses: actions/checkout@v6
+ with:
+ token: ${{ steps.generate-token.outputs.token }}
+ ref: ${{ fromJSON(needs.release-please.outputs.prs)[0].headBranchName }}
+ - uses: astral-sh/setup-uv@v7
+ - name: Run pull-static script
+ run: uv run scripts/pull-static
+ - name: Run pytest with rewrite mode
+ run: uv run pytest --record-mode rewrite
+ - name: Commit and push changes
+ run: |
+ git config user.name "github-actions[bot]"
+ git config user.email "github-actions[bot]@users.noreply.github.com"
+ git add -A
+ git diff --staged --quiet || git commit -m "chore: update static files and test recordings"
+ git push
=====================================
.github/workflows/release.yml
=====================================
@@ -16,7 +16,7 @@ jobs:
id-token: write
if: ${{ github.repository }} == 'stac-utils/pystac'
steps:
- - uses: actions/checkout@v5
+ - uses: actions/checkout@v6
- name: Set up Python 3.x
uses: actions/setup-python@v6
with:
=====================================
.release-please-manifest.json
=====================================
@@ -0,0 +1,6 @@
+{
+ ".": "1.14.2",
+ "extra-files": [
+ "pystac/version.py"
+ ]
+}
=====================================
CHANGELOG.md
=====================================
@@ -1,6 +1,16 @@
# Changelog
-## [Unreleased]
+## [1.14.2](https://github.com/stac-utils/pystac/compare/v1.14.1...v1.14.2) (2025-12-17)
+
+
+### Bug Fixes
+
+* Remove unused pystac.validation import ([#1583](https://github.com/stac-utils/pystac/pull/1583))
+* clone extra_fields for Item ([#1601](https://github.com/stac-utils/pystac/issues/1601)) ([6ba7da1](https://github.com/stac-utils/pystac/commit/6ba7da1796488c8de30eedf972dce07fbbec248f))
+* make release-please two separate jobs ([#1607](https://github.com/stac-utils/pystac/issues/1607)) ([bb6d289](https://github.com/stac-utils/pystac/commit/bb6d2892675bbc49249e8c06a4634610bd826f53))
+* Make `extent` not required for `VerticalSpatialDimension` ([#1596](https://github.com/stac-utils/pystac/pull/1596))
+* `Asset.get_absolute_href()` now properly resolves root relative hrefs ([#1599](https://github.com/stac-utils/pystac/pull/1599))
+* Clone extra fields on `Item` ([#1601](https://github.com/stac-utils/pystac/pull/1601))
## [v1.14.1] - 2025-09-18
=====================================
RELEASING.md deleted
=====================================
@@ -1,28 +0,0 @@
-# Releasing
-
-This is a checklist to use when releasing a new PySTAC version.
-
-1. Determine the next version. We do not currently have a versioning guide, but <https://github.com/radiantearth/stac-spec/discussions/1184> has some discussion around the topic.
-2. Create a release branch with the name `release/vX.Y.Z`, where `X.Y.Z` is the next version (e.g. `1.7.0`).
-3. Pull fields-normalized.json from cdn: run `scripts/pull-static`. Note you will need to have [jq](https://stedolan.github.io/jq/) installed.
-4. Update the `__version__` attribute in `pystac/version.py` with the new version.
-5. Update all cassettes: `pytest --record-mode rewrite`
-6. Update the CHANGELOG.
- - Create a new header below `## [Unreleased]` with the new version.
- - Remove any unused header sections.
- - Update the links at the bottom of the page for the new header.
- - Audit the CHANGELOG for correctness and readability.
-7. Audit the changes.
- Use the CHANGELOG, your favorite diff tool, and the merged Github pull requests to ensure that:
- - All notable changes are captured in the CHANGELOG.
- - The type of release is appropriate for the new version number, i.e. if there are breaking changes, the MAJOR version number must be increased.
- - All deprecated items that were marked for removal in this version are removed.
-8. Commit your changes, push your branch to Github, and request a review.
-9. Once approved, merge the PR.
-10. Once the PR is merged, create a tag with the version name, e.g. `vX.Y.Z`.
- Prefer a signed tag, if possible.
- Push the tag to Github.
-11. Use the tag to finish your release notes, and publish those.
- The "auto generate" feature is your friend, here.
- When the release is published, this will trigger the build and release on PyPI.
-12. Announced the release in [Gitter](https://matrix.to/#/#SpatioTemporal-Asset-Catalog_python:gitter.im) and on any relevant social media.
=====================================
debian/changelog
=====================================
@@ -1,9 +1,12 @@
-pystac (1.14.1-2) UNRELEASED; urgency=medium
+pystac (1.14.2-1) unstable; urgency=medium
- * Team upload.
+ [ Bas Couwenberg ]
* Use test-build-validate-cleanup instead of test-build-twice.
- -- Bas Couwenberg <sebastic@debian.org> Sat, 25 Oct 2025 13:05:05 +0200
+ [ Antonio Valentino ]
+ * New upstream release.
+
+ -- Antonio Valentino <antonio.valentino@tiscali.it> Fri, 19 Dec 2025 07:23:34 +0000
pystac (1.14.1-1) unstable; urgency=medium
=====================================
docs/contributing.rst
=====================================
@@ -131,21 +131,9 @@ CHANGELOG
^^^^^^^^^
PySTAC maintains a `changelog <https://github.com/stac-utils/pystac/blob/develop/CHANGELOG.md>`_
-to track changes between releases. All PRs should make a changelog entry unless
-the change is trivial (e.g. fixing typos) or is entirely invisible to users who may
-be upgrading versions (e.g. an improvement to the CI system).
-
-For changelog entries, please link to the PR of that change. This needs to happen in a
-few steps:
-
-- Make a PR to PySTAC with your changes
-- Record the link to the PR
-- Push an additional commit to your branch with the changelog entry with the link to the
- PR.
-
-For more information on changelogs and how to write a good entry, see `keep a changelog
-<https://keepachangelog.com/en/1.0.0/>`_.
-
+to track changes between releases. This changelog is automatically kept up-to-date by
+`release-please <https://github.com/googleapis/release-please>`_, specifically on an unmerged
+release PR.
Style
^^^^^
=====================================
pystac/__init__.py
=====================================
@@ -86,7 +86,6 @@ from pystac.item_assets import ItemAssetDefinition
from pystac.item_collection import ItemCollection
from pystac.provider import ProviderRole, Provider
from pystac.utils import HREF
-import pystac.validation
import pystac.extensions.hooks
import pystac.extensions.classification
@@ -239,3 +238,18 @@ def read_dict(
if stac_io is None:
stac_io = StacIO.default()
return stac_io.stac_object_from_dict(d, href, root)
+
+
+def __getattr__(name: str) -> Any:
+ if name == "validation":
+ import warnings
+ import pystac.validation
+
+ warnings.warn(
+ "pystac.validation will not be automatically imported to the package in "
+ "pystac v2.0. Instead, import it directly: `import pystac.validation`",
+ DeprecationWarning,
+ stacklevel=2,
+ )
+ return pystac.validation
+ raise AttributeError(f"module '{__name__}' has no attribute '{name}'")
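
As an illustration of the deprecation shim above, a minimal sketch (not part of the commit; it assumes a fresh interpreter in which nothing has imported pystac.validation yet):

    # Hypothetical usage sketch: attribute access goes through the module-level
    # __getattr__ added above and is expected to emit a DeprecationWarning,
    # while an explicit import remains the forward-compatible spelling.
    import warnings

    import pystac

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        _ = pystac.validation  # lazy import plus DeprecationWarning
    print([w.category.__name__ for w in caught])  # expected to include 'DeprecationWarning'

    import pystac.validation  # explicit import, no warning
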
=====================================
pystac/asset.py
=====================================
@@ -105,13 +105,12 @@ class Asset:
str: The absolute HREF of this asset, or None if an absolute HREF could not
be determined.
"""
- if utils.is_absolute_href(self.href):
+ item_self = self.owner.get_self_href() if self.owner is not None else None
+ if utils.is_absolute_href(self.href, item_self):
return self.href
else:
- if self.owner is not None:
- item_self = self.owner.get_self_href()
- if item_self is not None:
- return utils.make_absolute_href(self.href, item_self)
+ if item_self is not None:
+ return utils.make_absolute_href(self.href, item_self)
return None
def to_dict(self) -> dict[str, Any]:
@@ -341,7 +340,7 @@ class Assets(Protocol):
"""
self_href = self.get_self_href()
for asset in self.assets.values():
- if is_absolute_href(asset.href):
+ if is_absolute_href(asset.href, self_href):
if self_href is None:
raise STACError(
"Cannot make asset HREFs relative if no self_href is set."
@@ -360,7 +359,7 @@ class Assets(Protocol):
"""
self_href = self.get_self_href()
for asset in self.assets.values():
- if not is_absolute_href(asset.href):
+ if not is_absolute_href(asset.href, self_href):
if self_href is None:
raise STACError(
"Cannot make relative asset HREFs absolute "
@@ -380,10 +379,10 @@ class Assets(Protocol):
def _absolute_href(href: str, owner: Assets | None, action: str = "access") -> str:
- if utils.is_absolute_href(href):
+ item_self = owner.get_self_href() if owner else None
+ if utils.is_absolute_href(href, item_self):
return href
else:
- item_self = owner.get_self_href() if owner else None
if item_self is None:
raise ValueError(
f"Cannot {action} file if asset href ('{href}') is relative "
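
A short usage sketch of the behaviour this change targets (identifiers are invented; the expected result matches the new parametrized case in tests/test_asset.py below):

    # Sketch only: a root-relative asset href ("/asset.data") is now resolved
    # against an HTTP self href instead of being treated as an absolute local path.
    from datetime import datetime, timezone

    import pystac

    item = pystac.Item(
        id="example-item",
        geometry=None,
        bbox=None,
        datetime=datetime(2025, 1, 1, tzinfo=timezone.utc),
        properties={},
    )
    item.set_self_href("http://test.com/stac/catalog/myitem.json")
    item.add_asset("data", pystac.Asset(href="/asset.data"))

    # Resolves to "http://test.com/asset.data"
    print(item.assets["data"].get_absolute_href())
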
=====================================
pystac/extensions/datacube.py
=====================================
@@ -142,15 +142,18 @@ class Dimension(ABC):
class SpatialDimension(Dimension):
@property
- def extent(self) -> list[float]:
+ def extent(self) -> list[float] | None:
"""Extent (lower and upper bounds) of the dimension as two-dimensional array.
Open intervals with ``None`` are not allowed."""
- return get_required(
- self.properties.get(DIM_EXTENT_PROP), "cube:dimension", DIM_EXTENT_PROP
+ return cast(
+ list[float],
+ get_required(
+ self.properties.get(DIM_EXTENT_PROP), "cube:dimension", DIM_EXTENT_PROP
+ ),
)
@extent.setter
- def extent(self, v: list[float]) -> None:
+ def extent(self, v: list[float] | None) -> None:
self.properties[DIM_EXTENT_PROP] = v
@property
@@ -228,6 +231,19 @@ class VerticalSpatialDimension(SpatialDimension):
def axis(self, v: VerticalSpatialDimensionAxis) -> None:
self.properties[DIM_AXIS_PROP] = v
+ @property
+ def extent(self) -> list[float] | None:
+ """Extent (lower and upper bounds) of the dimension as two-dimensional array.
+ Open intervals with ``None`` are not allowed."""
+ return self.properties.get(DIM_EXTENT_PROP)
+
+ @extent.setter
+ def extent(self, v: list[float] | None) -> None:
+ if v is None:
+ self.properties.pop(DIM_EXTENT_PROP, None)
+ else:
+ self.properties[DIM_EXTENT_PROP] = v
+
@property
def unit(self) -> str | None:
"""The unit of measurement for the data, preferably compliant to `UDUNITS-2
=====================================
pystac/extensions/projection.py
=====================================
@@ -30,6 +30,7 @@ SCHEMA_URI: str = "https://stac-extensions.github.io/projection/v2.0.0/schema.js
SCHEMA_URIS: list[str] = [
"https://stac-extensions.github.io/projection/v1.0.0/schema.json",
"https://stac-extensions.github.io/projection/v1.1.0/schema.json",
+ "https://stac-extensions.github.io/projection/v1.2.0/schema.json",
SCHEMA_URI,
]
PREFIX: str = "proj:"
@@ -467,6 +468,7 @@ class ProjectionExtensionHooks(ExtensionHooks):
"projection",
"https://stac-extensions.github.io/projection/v1.0.0/schema.json",
"https://stac-extensions.github.io/projection/v1.1.0/schema.json",
+ "https://stac-extensions.github.io/projection/v1.2.0/schema.json",
}
stac_object_types = {pystac.STACObjectType.ITEM}
@@ -478,12 +480,27 @@ class ProjectionExtensionHooks(ExtensionHooks):
# proj:epsg moved to proj:code
if epsg := obj["properties"].pop("proj:epsg", None):
- obj["properties"]["proj:code"] = f"EPSG:{epsg}"
+ if obj["properties"].get("proj:code", None) is None:
+ obj["properties"]["proj:code"] = f"EPSG:{epsg}"
+ elif not obj["properties"]["proj:code"] == f"EPSG:{epsg}":
+ warnings.warn(
+ "Both proj:code and proj:epsg are specified and they have "
+ "conflicting values. This might lead to surprising behavior.",
+ UserWarning,
+ )
for key in ["assets", "item_assets"]:
for asset in obj.get(key, {}).values():
if epsg := asset.pop("proj:epsg", None):
- asset["proj:code"] = f"EPSG:{epsg}"
+ if asset.get("proj:code", None) is None:
+ asset["proj:code"] = f"EPSG:{epsg}"
+ elif not asset["proj:code"] == f"EPSG:{epsg}":
+ warnings.warn(
+ "Both proj:code and proj:epsg are specified and they "
+ "have conflicting values. This might lead to surprising "
+ "behavior.",
+ UserWarning,
+ )
super().migrate(obj, version, info)
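
A hand-built sketch of the conflict handling added above (the dictionary below is invented, but it follows the same pattern as the new example-with-version-1.2.json fixture):

    # Sketch: when proj:epsg and proj:code disagree, migration keeps proj:code
    # and emits a UserWarning about the conflicting values.
    import warnings

    import pystac

    item_dict = {
        "type": "Feature",
        "stac_version": "1.0.0",
        "id": "migration-demo",
        "geometry": None,
        "properties": {
            "datetime": "2025-01-01T00:00:00Z",
            "proj:epsg": 32614,
            "proj:code": "EPSG:32613",  # deliberately conflicting
        },
        "links": [],
        "assets": {},
        "stac_extensions": [
            "https://stac-extensions.github.io/projection/v1.2.0/schema.json"
        ],
    }

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        item = pystac.Item.from_dict(item_dict)  # migration runs by default

    print(item.properties["proj:code"])  # "EPSG:32613" wins
    print(any("conflicting values" in str(w.message) for w in caught))
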
=====================================
pystac/item.py
=====================================
@@ -398,6 +398,7 @@ class Item(STACObject, Assets):
stac_extensions=deepcopy(self.stac_extensions),
collection=self.collection_id,
assets={k: asset.clone() for k, asset in self.assets.items()},
+ extra_fields=deepcopy(self.extra_fields),
)
for link in self.links:
clone.add_link(link.clone())
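
For illustration, a small self-contained sketch of the fix (the extra field name is arbitrary):

    # Sketch: extra_fields now survives Item.clone() as an independent deep copy.
    from datetime import datetime, timezone

    import pystac

    item = pystac.Item(
        id="clone-demo",
        geometry=None,
        bbox=None,
        datetime=datetime(2025, 1, 1, tzinfo=timezone.utc),
        properties={},
        extra_fields={"custom:flag": True},
    )
    clone = item.clone()
    clone.extra_fields["custom:flag"] = False
    print(item.extra_fields["custom:flag"])  # still True; the copy is independent
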
=====================================
pystac/utils.py
=====================================
@@ -378,19 +378,27 @@ def make_absolute_href(
return _make_absolute_href_path(parsed_source, parsed_start, start_is_dir)
-def is_absolute_href(href: str) -> bool:
+def is_absolute_href(href: str, start_href: str | None = None) -> bool:
"""Determines if an HREF is absolute or not.
May be used on either local file paths or URLs.
Args:
href : The HREF to consider.
+ start_href : The HREF that will be used as the basis for checking if
+ ``source_href`` is a relative path. Defaults to None.
Returns:
bool: ``True`` if the given HREF is absolute, ``False`` if it is relative.
"""
parsed = safe_urlparse(href)
- return parsed.scheme not in ["", "file"] or os.path.isabs(parsed.path)
+ if parsed.scheme not in ["", "file"]:
+ return True
+ else:
+ parsed_start_scheme = (
+ "" if start_href is None else safe_urlparse(start_href).scheme
+ )
+ return parsed_start_scheme in ["", "file"] and os.path.isabs(parsed.path)
def datetime_to_str(dt: datetime, timespec: str = "auto") -> str:
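
For quick reference, a sketch of the new start_href parameter (it mirrors the updated cases in tests/test_utils.py below):

    # Sketch: a root-relative path is no longer reported as absolute when the
    # reference href lives on an http(s) URL, so it can be joined against it.
    from pystac.utils import is_absolute_href

    assert is_absolute_href("http://stacspec.org/item.json")
    assert not is_absolute_href("/item.json", start_href="http://stacspec.org/")
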
=====================================
pystac/version.py
=====================================
@@ -1,6 +1,6 @@
import os
-__version__ = "1.14.1"
+__version__ = "1.14.2" # x-release-please-version
"""Library version"""
=====================================
release-please-config.json
=====================================
@@ -0,0 +1,13 @@
+{
+ "packages": {
+ ".": {
+ "changelog-path": "CHANGELOG.md",
+ "release-type": "python",
+ "bump-minor-pre-major": false,
+ "bump-patch-for-minor-pre-major": false,
+ "draft": false,
+ "prerelease": false
+ }
+ },
+ "$schema": "https://raw.githubusercontent.com/googleapis/release-please/main/schemas/config.json"
+}
=====================================
tests/data-files/projection/example-with-version-1.2.json
=====================================
@@ -0,0 +1,435 @@
+{
+ "type": "Feature",
+ "stac_version": "1.0.0",
+ "id": "LC81530252014153LGN00",
+ "properties": {
+ "datetime": "2018-10-01T01:08:32.033000Z",
+ "proj:epsg": 32614,
+ "proj:code": "EPSG:32613",
+ "proj:wkt2": "PROJCS[\"WGS 84 / UTM zone 14N\",GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",-99],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32614\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",NORTH]]",
+ "proj:projjson": {
+ "$schema": "https://proj.org/schemas/v0.2/projjson.schema.json",
+ "type": "ProjectedCRS",
+ "name": "WGS 84 / UTM zone 14N",
+ "base_crs": {
+ "name": "WGS 84",
+ "datum": {
+ "type": "GeodeticReferenceFrame",
+ "name": "World Geodetic System 1984",
+ "ellipsoid": {
+ "name": "WGS 84",
+ "semi_major_axis": 6378137,
+ "inverse_flattening": 298.257223563
+ }
+ },
+ "coordinate_system": {
+ "subtype": "ellipsoidal",
+ "axis": [
+ {
+ "name": "Geodetic latitude",
+ "abbreviation": "Lat",
+ "direction": "north",
+ "unit": "degree"
+ },
+ {
+ "name": "Geodetic longitude",
+ "abbreviation": "Lon",
+ "direction": "east",
+ "unit": "degree"
+ }
+ ]
+ },
+ "id": {
+ "authority": "EPSG",
+ "code": 4326
+ }
+ },
+ "conversion": {
+ "name": "UTM zone 14N",
+ "method": {
+ "name": "Transverse Mercator",
+ "id": {
+ "authority": "EPSG",
+ "code": 9807
+ }
+ },
+ "parameters": [
+ {
+ "name": "Latitude of natural origin",
+ "value": 0,
+ "unit": "degree",
+ "id": {
+ "authority": "EPSG",
+ "code": 8801
+ }
+ },
+ {
+ "name": "Longitude of natural origin",
+ "value": -99,
+ "unit": "degree",
+ "id": {
+ "authority": "EPSG",
+ "code": 8802
+ }
+ },
+ {
+ "name": "Scale factor at natural origin",
+ "value": 0.9996,
+ "unit": "unity",
+ "id": {
+ "authority": "EPSG",
+ "code": 8805
+ }
+ },
+ {
+ "name": "False easting",
+ "value": 500000,
+ "unit": "metre",
+ "id": {
+ "authority": "EPSG",
+ "code": 8806
+ }
+ },
+ {
+ "name": "False northing",
+ "value": 0,
+ "unit": "metre",
+ "id": {
+ "authority": "EPSG",
+ "code": 8807
+ }
+ }
+ ]
+ },
+ "coordinate_system": {
+ "subtype": "Cartesian",
+ "axis": [
+ {
+ "name": "Easting",
+ "abbreviation": "E",
+ "direction": "east",
+ "unit": "metre"
+ },
+ {
+ "name": "Northing",
+ "abbreviation": "N",
+ "direction": "north",
+ "unit": "metre"
+ }
+ ]
+ },
+ "area": "World - N hemisphere - 102\u00b0W to 96\u00b0W - by country",
+ "bbox": {
+ "south_latitude": 0,
+ "west_longitude": -102,
+ "north_latitude": 84,
+ "east_longitude": -96
+ },
+ "id": {
+ "authority": "EPSG",
+ "code": 32614
+ }
+ },
+ "proj:geometry": {
+ "coordinates": [
+ [
+ [
+ 169200.0,
+ 3712800.0
+ ],
+ [
+ 403200.0,
+ 3712800.0
+ ],
+ [
+ 403200.0,
+ 3951000.0
+ ],
+ [
+ 169200.0,
+ 3951000.0
+ ],
+ [
+ 169200.0,
+ 3712800.0
+ ]
+ ]
+ ],
+ "type": "Polygon"
+ },
+ "proj:bbox": [
+ 169200.0,
+ 3712800.0,
+ 403200.0,
+ 3951000.0
+ ],
+ "proj:centroid": {
+ "lat": 34.595302781575604,
+ "lon": -101.34448382627504
+ },
+ "proj:shape": [
+ 8391,
+ 8311
+ ],
+ "proj:transform": [
+ 30.0,
+ 0.0,
+ 224985.0,
+ 0.0,
+ -30.0,
+ 6790215.0,
+ 0.0,
+ 0.0,
+ 1.0
+ ]
+ },
+ "geometry": {
+ "type": "Polygon",
+ "coordinates": [
+ [
+ [
+ 152.52758,
+ 60.63437
+ ],
+ [
+ 149.1755,
+ 61.19016
+ ],
+ [
+ 148.13933,
+ 59.51584
+ ],
+ [
+ 151.33786,
+ 58.97792
+ ],
+ [
+ 152.52758,
+ 60.63437
+ ]
+ ]
+ ]
+ },
+ "links": [
+ {
+ "rel": "collection",
+ "href": "https://example.com/landsat/collection.json"
+ }
+ ],
+ "assets": {
+ "B1": {
+ "href": "https://landsat-pds.s3.amazonaws.com/c1/L8/107/018/LC08_L1TP_107018_20181001_20181001_01_RT/LC08_L1TP_107018_20181001_20181001_01_RT_B1.TIF",
+ "type": "image/tiff; application=geotiff",
+ "title": "Band 1 (coastal)",
+ "eo:bands": [
+ {
+ "name": "B1",
+ "common_name": "coastal",
+ "center_wavelength": 0.44,
+ "full_width_half_max": 0.02
+ }
+ ]
+ },
+ "B8": {
+ "href": "https://landsat-pds.s3.amazonaws.com/c1/L8/107/018/LC08_L1TP_107018_20181001_20181001_01_RT/LC08_L1TP_107018_20181001_20181001_01_RT_B8.TIF",
+ "type": "image/tiff; application=geotiff",
+ "title": "Band 8 (panchromatic)",
+ "eo:bands": [
+ {
+ "name": "B8",
+ "center_wavelength": 0.59,
+ "full_width_half_max": 0.18
+ }
+ ],
+ "proj:epsg": 9999,
+ "proj:code": "EPSG:9998",
+ "proj:wkt2": "PROJCS[\"WGS 84 / UTM zone 14N\",GEOGCS[\"WGS 84\",DATUM[\"WGS_1984\",SPHEROID[\"WGS 84\",6378137,298.257223563,AUTHORITY[\"EPSG\",\"7030\"]],AUTHORITY[\"EPSG\",\"6326\"]],PRIMEM[\"Greenwich\",0,AUTHORITY[\"EPSG\",\"8901\"]],UNIT[\"degree\",0.01745329251994328,AUTHORITY[\"EPSG\",\"9122\"]],AUTHORITY[\"EPSG\",\"4326\"]],UNIT[\"metre\",1,AUTHORITY[\"EPSG\",\"9001\"]],PROJECTION[\"Transverse_Mercator\"],PARAMETER[\"latitude_of_origin\",0],PARAMETER[\"central_meridian\",-99],PARAMETER[\"scale_factor\",0.9996],PARAMETER[\"false_easting\",500000],PARAMETER[\"false_northing\",0],AUTHORITY[\"EPSG\",\"32614\"],AXIS[\"Easting\",EAST],AXIS[\"Northing\",TEST_TEXT]]",
+ "proj:projjson": {
+ "$schema": "https://proj.org/schemas/v0.2/projjson.schema.json",
+ "type": "ProjectedCRS",
+ "name": "WGS 84 / UTM zone 14N",
+ "base_crs": {
+ "name": "WGS 84",
+ "datum": {
+ "type": "GeodeticReferenceFrame",
+ "name": "World Geodetic System 1984",
+ "ellipsoid": {
+ "name": "WGS 84",
+ "semi_major_axis": 6378137,
+ "inverse_flattening": 298.257223563
+ }
+ },
+ "coordinate_system": {
+ "subtype": "ellipsoidal",
+ "axis": [
+ {
+ "name": "Geodetic latitude",
+ "abbreviation": "Lat",
+ "direction": "north",
+ "unit": "degree"
+ },
+ {
+ "name": "Geodetic longitude",
+ "abbreviation": "Lon",
+ "direction": "east",
+ "unit": "degree"
+ }
+ ]
+ },
+ "id": {
+ "authority": "EPSG",
+ "code": 4326
+ }
+ },
+ "conversion": {
+ "name": "UTM zone 14N",
+ "method": {
+ "name": "Transverse Mercator",
+ "id": {
+ "authority": "EPSG",
+ "code": 9807
+ }
+ },
+ "parameters": [
+ {
+ "name": "Latitude of natural origin",
+ "value": 0,
+ "unit": "degree",
+ "id": {
+ "authority": "EPSG",
+ "code": 8801
+ }
+ },
+ {
+ "name": "Longitude of natural origin",
+ "value": -99,
+ "unit": "degree",
+ "id": {
+ "authority": "EPSG",
+ "code": 8802
+ }
+ },
+ {
+ "name": "Scale factor at natural origin",
+ "value": 0.9996,
+ "unit": "unity",
+ "id": {
+ "authority": "EPSG",
+ "code": 8805
+ }
+ },
+ {
+ "name": "False easting",
+ "value": 500000,
+ "unit": "metre",
+ "id": {
+ "authority": "EPSG",
+ "code": 8806
+ }
+ },
+ {
+ "name": "False northing",
+ "value": 0,
+ "unit": "metre",
+ "id": {
+ "authority": "EPSG",
+ "code": 8807
+ }
+ }
+ ]
+ },
+ "coordinate_system": {
+ "subtype": "Cartesian",
+ "axis": [
+ {
+ "name": "Easting",
+ "abbreviation": "E",
+ "direction": "east",
+ "unit": "metre"
+ },
+ {
+ "name": "Northing",
+ "abbreviation": "N",
+ "direction": "north",
+ "unit": "metre"
+ }
+ ]
+ },
+ "area": "World - N hemisphere - 102\u00b0W to 96\u00b0W - by country",
+ "bbox": {
+ "south_latitude": 0,
+ "west_longitude": -102,
+ "north_latitude": 84,
+ "east_longitude": -96
+ },
+ "id": {
+ "authority": "EPSG",
+ "code": 9999
+ }
+ },
+ "proj:geometry": {
+ "coordinates": [
+ [
+ [
+ 0.0,
+ 0.0
+ ],
+ [
+ 1.0,
+ 0.0
+ ],
+ [
+ 1.0,
+ 1.0
+ ],
+ [
+ 1.0,
+ 0.0
+ ],
+ [
+ 0.0,
+ 0.0
+ ]
+ ]
+ ],
+ "type": "Polygon"
+ },
+ "proj:bbox": [
+ 1.0,
+ 2.0,
+ 3.0,
+ 4.0
+ ],
+ "proj:centroid": {
+ "lat": 0.5,
+ "lon": 0.3
+ },
+ "proj:shape": [
+ 16781,
+ 16621
+ ],
+ "proj:transform": [
+ 15.0,
+ 0.0,
+ 224992.5,
+ 0.0,
+ -15.0,
+ 6790207.5,
+ 0.0,
+ 0.0,
+ 1.0
+ ]
+ }
+ },
+ "bbox": [
+ 148.13933,
+ 59.51584,
+ 152.52758,
+ 60.63437
+ ],
+ "stac_extensions": [
+ "https://stac-extensions.github.io/eo/v1.1.0/schema.json",
+ "https://stac-extensions.github.io/projection/v1.1.0/schema.json"
+ ],
+ "collection": "landsat-8-l1"
+ }
\ No newline at end of file
=====================================
tests/extensions/test_datacube.py
=====================================
@@ -110,6 +110,20 @@ def test_temporal_dimension_description(
assert "description" not in temporal_dimension.properties
+def test_vertical_dimension_extent_not_required() -> None:
+ props: dict[str, list[float]] = {}
+ dim = dc.VerticalSpatialDimension(props)
+ assert dim.extent is None
+
+
+def test_vertical_dimension_setting_extent_to_none_pops_it() -> None:
+ props: dict[str, list[float]] = {"extent": [10, 100]}
+ dim = dc.VerticalSpatialDimension(props)
+ assert dim.extent == [10, 100]
+ dim.extent = None
+ assert props == {}
+
+
def test_stac_extensions(ext_item: Item) -> None:
assert dc.DatacubeExtension.has_extension(ext_item)
=====================================
tests/extensions/test_projection.py
=====================================
@@ -574,7 +574,28 @@ def test_get_set_code(projection_landsat8_item: Item) -> None:
assert proj_item.properties["proj:code"] == "IAU_2015:30100"
-def test_migrate_item() -> None:
+def test_migrate_item_on_1_2() -> None:
+ old = "https://stac-extensions.github.io/projection/v1.2.0/schema.json"
+ current = "https://stac-extensions.github.io/projection/v2.0.0/schema.json"
+
+ path = TestCases.get_path("data-files/projection/example-with-version-1.2.json")
+ with pytest.warns(UserWarning, match="surprising behavior"):
+ item = Item.from_file(path)
+
+ assert old not in item.stac_extensions
+ assert current in item.stac_extensions
+
+ assert item.ext.proj.epsg == 32613
+ assert item.ext.proj.code == "EPSG:32613"
+
+ assert item.assets["B1"].ext.proj.epsg == 32613
+ assert item.assets["B1"].ext.proj.code == "EPSG:32613"
+
+ assert item.assets["B8"].ext.proj.epsg == 9998
+ assert item.assets["B8"].ext.proj.code == "EPSG:9998"
+
+
+def test_migrate_item_on_1_1() -> None:
old = "https://stac-extensions.github.io/projection/v1.1.0/schema.json"
current = "https://stac-extensions.github.io/projection/v2.0.0/schema.json"
=====================================
tests/test_asset.py
=====================================
@@ -102,3 +102,112 @@ def test_delete_asset_relative_no_owner_fails(tmp_asset: pystac.Asset) -> None:
assert asset.href in str(e.value)
assert os.path.exists(href)
+
+
+@pytest.mark.parametrize(
+ "self_href, asset_href, expected_href",
+ (
+ (
+ "http://test.com/stac/catalog/myitem.json",
+ "asset.data",
+ "http://test.com/stac/catalog/asset.data",
+ ),
+ (
+ "http://test.com/stac/catalog/myitem.json",
+ "/asset.data",
+ "http://test.com/asset.data",
+ ),
+ ),
+)
+def test_asset_get_absolute_href(
+ tmp_asset: pystac.Asset,
+ self_href: str,
+ asset_href: str,
+ expected_href: str,
+) -> None:
+ asset = tmp_asset
+ item = asset.owner
+
+ if not isinstance(item, pystac.Item):
+ raise TypeError("Asset must belong to an Item")
+
+ # Set the item HREF as per test
+ item.set_self_href(self_href)
+ assert item.get_self_href() == self_href
+
+ # Set the asset HREF as per test and check expected output
+ asset.href = asset_href
+ assert asset.get_absolute_href() == expected_href
+
+
+@pytest.mark.skipif(os.name == "nt", reason="Unix only test")
+@pytest.mark.parametrize(
+ "self_href, asset_href, expected_href",
+ (
+ (
+ "/local/myitem.json",
+ "asset.data",
+ "/local/asset.data",
+ ),
+ (
+ "/local/myitem.json",
+ "subdir/asset.data",
+ "/local/subdir/asset.data",
+ ),
+ (
+ "/local/myitem.json",
+ "/absolute/asset.data",
+ "/absolute/asset.data",
+ ),
+ ),
+)
+def test_asset_get_absolute_href_unix(
+ tmp_asset: pystac.Asset,
+ self_href: str,
+ asset_href: str,
+ expected_href: str,
+) -> None:
+ test_asset_get_absolute_href(tmp_asset, self_href, asset_href, expected_href)
+
+
+@pytest.mark.skipif(os.name != "nt", reason="Windows only test")
+@pytest.mark.parametrize(
+ "self_href, asset_href, expected_href",
+ (
+ (
+ "{tmpdir}/myitem.json",
+ "asset.data",
+ "{tmpdir}/asset.data",
+ ),
+ (
+ "{tmpdir}/myitem.json",
+ "subdir/asset.data",
+ "{tmpdir}/subdir/asset.data",
+ ),
+ (
+ "{tmpdir}/myitem.json",
+ "c:/absolute/asset.data",
+ "c:/absolute/asset.data",
+ ),
+ (
+ "{tmpdir}/myitem.json",
+ "d:\\absolute\\asset.data",
+ "d:\\absolute\\asset.data",
+ ),
+ ),
+)
+def test_asset_get_absolute_href_windows(
+ tmp_path: Path,
+ tmp_asset: pystac.Asset,
+ self_href: str,
+ asset_href: str,
+ expected_href: str,
+) -> None:
+ # For windows, we need an actual existing temporary directory
+ tmpdir = tmp_path.as_posix()
+ test_asset_get_absolute_href(
+ tmp_asset,
+ self_href.format(tmpdir=tmpdir),
+ asset_href.format(tmpdir=tmpdir),
+ expected_href.format(tmpdir=tmpdir),
+ )
=====================================
tests/test_catalog.py
=====================================
@@ -2024,3 +2024,9 @@ def test_get_root_link_cares_about_media_type(catalog: pystac.Catalog) -> None:
)
root_link = catalog.get_root_link()
assert root_link and root_link.target != "./self.json"
+
+
+def test_clone_extra_fields(catalog: Catalog) -> None:
+ catalog.extra_fields["foo"] = "bar"
+ cloned = catalog.clone()
+ assert cloned.extra_fields["foo"] == "bar"
=====================================
tests/test_item.py
=====================================
@@ -686,3 +686,9 @@ def test_migrate_by_default() -> None:
data = json.load(f)
item = pystac.Item.from_dict(data) # default used to be migrate=False
assert item.ext.proj.code == "EPSG:32614"
+
+
+def test_clone_extra_fields(item: Item) -> None:
+ item.extra_fields["foo"] = "bar"
+ cloned = item.clone()
+ assert cloned.extra_fields["foo"] == "bar"
=====================================
tests/test_utils.py
=====================================
@@ -197,14 +197,15 @@ def test_make_absolute_href_windows(
def test_is_absolute_href() -> None:
# Test cases of (href, expected)
test_cases = [
- ("item.json", False),
- ("./item.json", False),
- ("../item.json", False),
- ("http://stacspec.org/item.json", True),
+ ("item.json", False, None),
+ ("./item.json", False, None),
+ ("../item.json", False, None),
+ ("http://stacspec.org/item.json", True, None),
+ ("http://stacspec.org/item.json", True, "http://stacspec.org/"),
]
- for href, expected in test_cases:
- actual = is_absolute_href(href)
+ for href, expected, start_href in test_cases:
+ actual = is_absolute_href(href, start_href)
assert actual == expected
@@ -214,15 +215,16 @@ def test_is_absolute_href_os_aware() -> None:
is_windows = os.name == "nt"
incl_drive_letter = path_includes_drive_letter()
test_cases = [
- ("/item.json", not incl_drive_letter),
- ("/home/someuser/Downloads/item.json", not incl_drive_letter),
- ("file:///home/someuser/Downloads/item.json", not incl_drive_letter),
- ("d:/item.json", is_windows),
- ("c:/files/more_files/item.json", is_windows),
+ ("/item.json", not incl_drive_letter, None),
+ ("/item.json", False, "http://stacspec.org/"),
+ ("/home/someuser/Downloads/item.json", not incl_drive_letter, None),
+ ("file:///home/someuser/Downloads/item.json", not incl_drive_letter, None),
+ ("d:/item.json", is_windows, None),
+ ("c:/files/more_files/item.json", is_windows, None),
]
- for href, expected in test_cases:
- actual = is_absolute_href(href)
+ for href, expected, start_href in test_cases:
+ actual = is_absolute_href(href, start_href)
assert actual == expected
@@ -231,15 +233,15 @@ def test_is_absolute_href_windows() -> None:
# Test cases of (href, expected)
test_cases = [
- ("item.json", False),
- (".\\item.json", False),
- ("..\\item.json", False),
- ("c:\\item.json", True),
- ("http://stacspec.org/item.json", True),
+ ("item.json", False, None),
+ (".\\item.json", False, None),
+ ("..\\item.json", False, None),
+ ("c:\\item.json", True, None),
+ ("http://stacspec.org/item.json", True, None),
]
- for href, expected in test_cases:
- actual = is_absolute_href(href)
+ for href, expected, start_href in test_cases:
+ actual = is_absolute_href(href, start_href)
assert actual == expected
View it on GitLab: https://salsa.debian.org/debian-gis-team/pystac/-/compare/f87fe92c714c8abb833aa4bc48ce56377929dfde...dccc316934052adcd4aaed3db53a268c16293330