[Git][debian-gis-team/asf-search][upstream] New upstream version 10.1.1

Antonio Valentino (@antonio.valentino) gitlab@salsa.debian.org
Sun Oct 12 09:00:29 BST 2025



Antonio Valentino pushed to branch upstream at Debian GIS Project / asf-search


Commits:
4272fa26 by Antonio Valentino at 2025-10-12T07:30:30+00:00
New upstream version 10.1.1
- - - - -


14 changed files:

- .github/workflows/changelog.yml
- .github/workflows/label-prod-pr.yml
- .github/workflows/lint.yml
- .github/workflows/prod-request-merged.yml
- .github/workflows/pypi-publish.yml
- .github/workflows/run-pytest.yml
- CHANGELOG.md
- asf_search/CMR/datasets.py
- asf_search/Products/OPERAS1Product.py
- asf_search/constants/PRODUCT_TYPE.py
- + tests/ASFProduct/test_ASFSubproduct.py
- tests/pytest-config.yml
- tests/pytest-managers.py
- + tests/yml_tests/test_ASFSubproduct.yml


Changes:

=====================================
.github/workflows/changelog.yml
=====================================
@@ -16,7 +16,7 @@ jobs:
       - uses: actions/checkout@v1
 
       - name: Changelog check
-        uses: Zomzog/changelog-checker@v1.0.0
+        uses: Zomzog/changelog-checker@v1.3.0
         with:
           fileName: CHANGELOG.md
           noChangelogLabel: bumpless


=====================================
.github/workflows/label-prod-pr.yml
=====================================
@@ -17,7 +17,7 @@ jobs:
     if: github.event.pull_request.state == 'open'
     steps:
       - name: Require Version Label
-        uses: mheap/github-action-required-labels@v1
+        uses: mheap/github-action-required-labels@v5.5.0
         with:
           mode: exactly
           count: 1


=====================================
.github/workflows/lint.yml
=====================================
@@ -5,7 +5,7 @@ jobs:
     runs-on: ubuntu-latest
 
     steps:
-      - uses: actions/checkout@v4
-      - uses: chartboost/ruff-action@v1
+      - uses: actions/checkout@v5
+      - uses: astral-sh/ruff-action@v3
         with:
           src: './asf_search'
\ No newline at end of file


=====================================
.github/workflows/prod-request-merged.yml
=====================================
@@ -20,7 +20,7 @@ jobs:
         contains(github.event.pull_request.labels.*.name, 'major')
       )
     steps:
-    - uses: actions/checkout@v2
+    - uses: actions/checkout@v5
 
     - name: Save version type
       # Whichever one return's true, will let their 'echo' statement run:


=====================================
.github/workflows/pypi-publish.yml
=====================================
@@ -14,7 +14,7 @@ jobs:
   DeployToPypi:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
+      - uses: actions/checkout@v5
 
       - name: Install dependencies
         run: python3 -m pip install --upgrade pip build


=====================================
.github/workflows/run-pytest.yml
=====================================
@@ -5,8 +5,8 @@ jobs:
   run-tests:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
-      - uses: actions/setup-python@v5
+      - uses: actions/checkout@v5
+      - uses: actions/setup-python@v6
         with:
             python-version: '3.10'
       - name: Install Dependencies
@@ -18,8 +18,9 @@ jobs:
         run: python3 -m pytest -n auto --cov=asf_search --cov-report=xml --dont-run-file test_known_bugs .
 
       - name: Upload coverage to Codecov
-        uses: codecov/codecov-action@v3
+        uses: codecov/codecov-action@v5
         with:
+          token: ${{ secrets.CODECOV_TOKEN }}
           fail_ci_if_error: false
           files: ./coverage.xml
           flags: unittests


=====================================
CHANGELOG.md
=====================================
@@ -25,6 +25,11 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 -
 
 -->
+------
+## [v10.1.1](https://github.com/asfadmin/Discovery-asf_search/compare/v10.1.0...v10.1.1)
+### Added
+- Adds `TROPO_ZENITH` OPERA-S1 product type constant to `PRODUCT_TYPE.py` and concept-id to dataset
+
 ------
 ## [v10.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v10.0.5...v10.1.0)
 ### Changed


=====================================
asf_search/CMR/datasets.py
=====================================
@@ -461,6 +461,7 @@ dataset_collections = {
         'OPERA_L2_RTC-S1-STATIC_V1': ['C1259981910-ASF', 'C2795135174-ASF'],
         'OPERA_L2_RTC-S1_PROVISIONAL_V0': ['C1257995186-ASF'],
         'OPERA_L3_DISP-S1_V1': ['C3294057315-ASF', 'C1271830354-ASF'],
+        'OPERA_L4_TROPO-ZENITH_V1': ['C3717139408-ASF'],
     },
     'OPERA-S1-CALVAL': {
         'OPERA_L2_CSLC-S1_CALVAL_V1': ['C1260721945-ASF', 'C2803501758-ASF'],
@@ -1402,6 +1403,9 @@ collections_by_processing_level = {
         'C3294057315-ASF',
         'C1271830354-ASF'
     ],
+    'TROPO-ZENITH': [
+        'C3717139408-ASF',
+    ],
     'GRD_FD': [
         'C1214471197-ASF',
         'C1212200781-ASF',
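
For readers unfamiliar with how these tables are consumed: a minimal standalone sketch (plain Python, not the asf_search API; the dict below is an excerpt of the mapping in the hunk above) of resolving a processing level to its CMR collection concept-ids:

```python
# Excerpt of the collections_by_processing_level mapping from the diff above.
# A processing level such as 'TROPO-ZENITH' resolves to the CMR collection
# concept-ids that searches are scoped to.
collections_by_processing_level = {
    'DISP-S1': ['C3294057315-ASF', 'C1271830354-ASF'],
    'TROPO-ZENITH': ['C3717139408-ASF'],
}

def concept_ids_for(level: str) -> list[str]:
    """Return the CMR collection concept-ids for a processing level."""
    return collections_by_processing_level.get(level, [])

print(concept_ids_for('TROPO-ZENITH'))  # ['C3717139408-ASF']
```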


=====================================
asf_search/Products/OPERAS1Product.py
=====================================
@@ -3,7 +3,6 @@ from asf_search import ASFSearchOptions, ASFSession
 from asf_search.CMR.translate import try_parse_date, try_parse_int
 from asf_search.Products import S1Product
 
-
 class OPERAS1Product(S1Product):
     """
     ASF Dataset Documentation Page: https://asf.alaska.edu/datasets/daac/opera/
@@ -64,9 +63,20 @@ class OPERAS1Product(S1Product):
             for entry in self.properties['bytes']
         }
 
-        center = self.centroid()
-        self.properties['centerLat'] = center.y
-        self.properties['centerLon'] = center.x
+        if self.properties['processingLevel'] is None:
+            self.properties['processingLevel'] = self.umm_get(self.umm, 'AdditionalAttributes', ('Name', 'PRODUCT_TYPE'), 'Values', 0)
+        
+        # if self.properties['processingLevel'] == 'TROPO-ZENITH':
+        #     west,north,east, south = self.umm['SpatialExtent']['HorizontalSpatialDomain']['Geometry']['BoundingRectangles'][0].values()
+
+        #     self.geometry = {'coordinates': [[[west, north], [east,north], [east, south], [west, south], [west, north]]], 'type': 'Polygon'}
+        if self.properties['processingLevel'] == 'TROPO-ZENITH':
+            self.properties['centerLat'] = None
+            self.properties['centerLon'] = None
+        else:
+            center = self.centroid()
+            self.properties['centerLat'] = center.y
+            self.properties['centerLon'] = center.x
 
         self.properties.pop('frameNumber')
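
A standalone sketch of the behaviour this hunk introduces (simplified stand-ins, not the real ASFProduct classes; a naive vertex average stands in for the library's centroid computation): TROPO-ZENITH granules carry no polygon footprint, so the centroid step is skipped and the center coordinates stay None:

```python
def ring_centroid(ring):
    """Naive vertex-average centroid of a closed polygon ring [(lon, lat), ...]."""
    # Drop the duplicated closing vertex so it is not double-counted.
    pts = ring[:-1] if ring[0] == ring[-1] else ring
    lon = sum(p[0] for p in pts) / len(pts)
    lat = sum(p[1] for p in pts) / len(pts)
    return lon, lat

def set_center(properties, geometry):
    """Mirror the conditional from the hunk above: no centroid for TROPO-ZENITH."""
    if properties.get('processingLevel') == 'TROPO-ZENITH':
        properties['centerLat'] = None
        properties['centerLon'] = None
    else:
        lon, lat = ring_centroid(geometry['coordinates'][0])
        properties['centerLon'] = lon
        properties['centerLat'] = lat
    return properties

square = {'coordinates': [[(0, 0), (2, 0), (2, 2), (0, 2), (0, 0)]]}
print(set_center({'processingLevel': 'RTC'}, square))
print(set_center({'processingLevel': 'TROPO-ZENITH'}, None))
```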
 


=====================================
asf_search/constants/PRODUCT_TYPE.py
=====================================
@@ -101,6 +101,7 @@ CSLC = 'CSLC'
 RTC_STATIC = 'RTC-STATIC'
 CSLC_STATIC = 'CSLC-STATIC'
 DISP_S1 = 'DISP-S1'
+TROPO_ZENITH = 'TROPO-ZENITH'
 
 # NISAR
 L0B = 'L0B'
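
A hypothetical usage sketch of the new constant (the constant is mirrored inline so the snippet stands alone; the `dataset` name and the `asf_search.search` call in the comment are assumptions based on the diff above, not a confirmed invocation):

```python
# In real use this would be:
#   import asf_search
#   asf_search.search(dataset='OPERA-S1',
#                     processingLevel=asf_search.PRODUCT_TYPE.TROPO_ZENITH)
# Mirrored inline here so the snippet runs without the library.
TROPO_ZENITH = 'TROPO-ZENITH'  # as added to asf_search/constants/PRODUCT_TYPE.py

search_kwargs = {
    'dataset': 'OPERA-S1',
    'processingLevel': TROPO_ZENITH,
}
print(search_kwargs['processingLevel'])  # TROPO-ZENITH
```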


=====================================
tests/ASFProduct/test_ASFSubproduct.py
=====================================
@@ -0,0 +1,50 @@
+from asf_search import ASFProduct, Products, granule_search
+
+def run_test_ASFSubproduct(scene_names: list[str], expected_subclass: str):
+    scenes = granule_search(scene_names)
+
+    assert sorted([scene.properties['fileID'] for scene in scenes]) == sorted(scene_names)
+
+    for scene in scenes:
+        assert expected_subclass.upper() == scene.__class__.__name__ .upper(), f'Expected scene "{scene.properties["fileID"]}" to be of ASFProduct subclass {expected_subclass}. Got {scene.__class__.__name__}'
+        if isinstance(scene, Products.OPERAS1Product):
+            _test_OPERAS1Product(scene)
+        if isinstance(scene, Products.S1BurstProduct):
+            _test_S1BurstProduct(scene)
+
+def _test_OPERAS1Product(scene: Products.OPERAS1Product):
+    processing_level = scene.properties['processingLevel']
+
+    if processing_level in ['RTC', 'RTC-STATIC']:
+        _check_properties_set(scene.properties, ['bistaticDelayCorrection'])
+
+        if processing_level == 'RTC':
+            _check_properties_set(scene.properties,['noiseCorrection', 'postProcessingFilter'])
+
+    elif processing_level == 'DISP-S1':
+        _check_properties_set(scene.properties, [
+            'frameNumber', 'OperaDispStackID', 'zarrUri', 'zarrStackUri',
+        ])
+    
+    if processing_level == 'TROPO-ZENITH':
+        assert scene.properties['centerLat'] is None
+        assert scene.properties['centerLon'] is None
+    
+def _test_S1BurstProduct(scene: Products.S1BurstProduct):
+    burst_properties = [
+        "absoluteBurstID",
+        "relativeBurstID",
+        "fullBurstID",
+        "burstIndex",
+        "samplesPerBurst",
+        "subswath",
+        "azimuthTime",
+        "azimuthAnxTime",
+    ]
+
+    _check_properties_set(scene.properties['burst'], burst_properties)
+
+
+def _check_properties_set(properties: dict, properties_list: list[str]):
+    for prop in properties_list:
+        assert properties[prop] is not None


=====================================
tests/pytest-config.yml
=====================================
@@ -5,6 +5,11 @@ test_types:
     required_keys: products
     method: test_ASFProduct
 
+- For running ASFProduct Subproduct tests:
+    required_in_title: ASFSubproduct
+    required_keys: ['scenes', 'expected_subclass']
+    method: test_ASFSubproduct
+
 - For running ASFProduct_Stack tests:
     required_keys: ["product", "preprocessed_stack", "processed_stack"]
     required_in_title: ASFProduct_Stack


=====================================
tests/pytest-managers.py
=====================================
@@ -80,6 +80,8 @@ from Serialization.test_serialization import run_test_serialization
 import nbformat
 from nbconvert.preprocessors import ExecutePreprocessor
 
+from ASFProduct.test_ASFSubproduct import run_test_ASFSubproduct
+
 
 # asf_search.ASFProduct Tests
 def test_ASFProduct(**args) -> None:
@@ -90,7 +92,16 @@ def test_ASFProduct(**args) -> None:
     geographic_response = get_resource(test_info['products'])
     run_test_ASFProduct(geographic_response)
 
+def test_ASFSubproduct(**args) -> None:
+    """
+    Tests ASFProduct subclasses for properties and basic functionality
+    """
+    test_info = args['test_info']
+    scene_names = test_info['scenes']
+    expected_subclass = test_info['expected_subclass']
 
+    run_test_ASFSubproduct(scene_names=scene_names, expected_subclass=expected_subclass)
+    
 def test_ASFProduct_Stack(**args) -> None:
     """
     Tests ASFProduct.stack() with reference and corresponding stack


=====================================
tests/yml_tests/test_ASFSubproduct.yml
=====================================
@@ -0,0 +1,16 @@
+tests:
+  - Test OPERA-S1 ASFSubproduct:
+      scenes:
+        [
+          "OPERA_L2_RTC-S1_T160-342208-IW3_20221221T161230Z_20250302T093113Z_S1A_30_v1.0",
+          "OPERA_L2_CSLC-S1_T160-342208-IW3_20221127T161232Z_20240801T232256Z_S1A_VV_v1.1",
+          "OPERA_L2_RTC-S1-STATIC_T160-342208-IW3_20140403_S1B_30_v1.0",
+          "OPERA_L2_CSLC-S1-STATIC_T160-342208-IW3_20140403_S1B_v1.0",
+          "OPERA_L3_DISP-S1_IW_F42776_VV_20180504T161139Z_20180516T161139Z_v1.0_20250829T201146Z",
+          "OPERA_L4_TROPO-ZENITH_20250930T180000Z_20251003T000713Z_HRES_v1.0",
+        ]
+      expected_subclass: OPERAS1Product
+
+  - Test S1Burst ASFSubproduct:
+      scenes: ["S1_055219_EW1_20250418T163543_HH_1D57-BURST"]
+      expected_subclass: S1BurstProduct
\ No newline at end of file



View it on GitLab: https://salsa.debian.org/debian-gis-team/asf-search/-/commit/4272fa2629a635cce43172fe1cfb808a43c2b5f9


