[Git][debian-gis-team/asf-search][upstream] New upstream version 10.2.0
Antonio Valentino (@antonio.valentino)
gitlab@salsa.debian.org
Sat Nov 15 17:47:15 GMT 2025
Antonio Valentino pushed to branch upstream at Debian GIS Project / asf-search
Commits:
1b157633 by Antonio Valentino at 2025-11-15T17:39:22+00:00
New upstream version 10.2.0
- - - - -
19 changed files:
- + .github/workflows/run-pytest-authenticated.yml
- .github/workflows/run-pytest.yml
- CHANGELOG.md
- README.md
- asf_search/CMR/datasets.py
- asf_search/CMR/translate.py
- asf_search/Products/NISARProduct.py
- asf_search/Products/SEASATProduct.py
- asf_search/constants/PRODUCT_TYPE.py
- asf_search/export/jsonlite.py
- asf_search/export/jsonlite2.py
- asf_search/export/metalink.py
- asf_search/search/search_generator.py
- + conftest.py
- tests/ASFProduct/test_ASFSubproduct.py
- tests/pytest-managers.py
- tests/yml_tests/test_ASFSubproduct.yml
- + tests/yml_tests/test_authenticated/test_ASFSubproduct_Auth.yml
- tests/yml_tests/test_search.yml
Changes:
=====================================
.github/workflows/run-pytest-authenticated.yml
=====================================
@@ -0,0 +1,40 @@
+name: authenticated tests
+permissions:
+ contents: read
+# For tests that require authenticated searches
+
+on:
+ push:
+ branches:
+ - master
+
+
+jobs:
+ run-tests:
+ runs-on: ubuntu-latest
+ environment: pre-release
+ steps:
+ - uses: actions/checkout@v5
+ - uses: actions/setup-python@v6
+
+ with:
+ python-version: '3.10'
+ - name: Install Dependencies
+ run: |
+ python3 -m pip install --upgrade pip
+ python3 -m pip install .[extras,test,asf-enumeration]
+
+ - name: Run Tests
+ env:
+ EDL_TOKEN: ${{ secrets.EDL_TOKEN }}
+ run: python3 -m pytest --should_auth_session TRUE
+
+ # - name: Upload coverage to Codecov
+ # uses: codecov/codecov-action@v5
+ # with:
+ # token: ${{ secrets.CODECOV_TOKEN }}
+ # fail_ci_if_error: false
+ # files: ./coverage.xml
+ # flags: unittests
+ # name: asf_admin pytest
+ # verbose: true
=====================================
.github/workflows/run-pytest.yml
=====================================
@@ -15,7 +15,7 @@ jobs:
python3 -m pip install .[extras,test,asf-enumeration]
- name: Run Tests
- run: python3 -m pytest -n auto --cov=asf_search --cov-report=xml --dont-run-file test_known_bugs .
+ run: python3 -m pytest -n auto --cov=asf_search --cov-report=xml --dont-run-file test_known_bugs --ignore=tests/yml_tests/test_authenticated .
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v5
=====================================
CHANGELOG.md
=====================================
@@ -25,6 +25,18 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
-
-->
+------
+## [v10.2.0](https://github.com/asfadmin/Discovery-asf_search/compare/v10.1.2...v10.2.0)
+### Changed
+- `SEASAT 1` collections in CMR have been merged into a single collection. Filtering by processing level for the same scene is no longer necessary; the different file URLs, S3 URIs, sizes, and md5sums of each scene are now accessible via the `additionalUrls`, `s3Urls`, `bytes`, and `md5sum` keys in each scene's `properties` dict.
+- NISAR products now include the collection concept ID and collection name in product properties and in the jsonlite export
+
+### Added
+- Added support for authenticated test cases on the `master` branch, plus `--auth_with_creds` and `--auth_with_token` options for authenticating local test sessions
+
+### Fixed
+- Removed `CRSD` from the NISAR dataset and relevant constants; it is no longer part of the `L0B` science product type
+
------
## [v10.1.2](https://github.com/asfadmin/Discovery-asf_search/compare/v10.1.1...v10.1.2)
### Added
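The consolidated-collection change above is easiest to see on a live scene. A minimal sketch, assuming network access to CMR and that a plain `dataset='SEASAT'` search returns results (the new collection is exercised by the authenticated test suite, so an authenticated session may be needed):

```python
import asf_search as asf

results = asf.search(dataset='SEASAT', maxResults=1)
scene = results[0]

# Per-file sizes and checksums are now dicts keyed by file name.
for file_name, info in scene.properties['bytes'].items():
    md5 = scene.properties['md5sum'].get(file_name)
    print(file_name, info['bytes'], info['format'], md5)

print(scene.properties['additionalUrls'])
print(scene.properties['s3Urls'])
```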
=====================================
README.md
=====================================
@@ -160,7 +160,12 @@ For more configure options on `logging`, please visit [their howto page](https:/
After installing asf-search's test requirement (see `INSTALL` section above) you can run the test suite locally. Run the following command from your terminal in the root project directory:
```bash
-python3 -m pytest -n auto .
+python3 -m pytest tests/yml_tests --ignore=tests/yml_tests/test_authenticated
+```
+
+For test cases that require authentication, you can use your EDL credentials:
+```bash
+python3 -m pytest tests/yml_tests/test_authenticated -s --auth_with_creds
```
Tests should be written to relevant subfolder & files in `/tests`
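The conftest.py added later in this commit also registers an `--auth_with_token` option, which prompts for an EDL token via `getpass`. A plausible invocation, assuming the option takes an explicit true/false value like the `--should_auth_session TRUE` form used in CI (the options are declared with `action="store"` and a converter `type`):

```bash
python3 -m pytest tests/yml_tests/test_authenticated -s --auth_with_token TRUE
```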
=====================================
asf_search/CMR/datasets.py
=====================================
@@ -3,7 +3,7 @@ from typing import List
from asf_search.constants import PRODUCT_TYPE
NISAR_PRODUCT_TYPES = [
- PRODUCT_TYPE.CRSD, PRODUCT_TYPE.RRSD, #L0
+ PRODUCT_TYPE.RRSD, #L0
PRODUCT_TYPE.SME2, # L3
PRODUCT_TYPE.GSLC, PRODUCT_TYPE.GCOV, PRODUCT_TYPE.GUNW, PRODUCT_TYPE.GOFF, # L2
PRODUCT_TYPE.RSLC, PRODUCT_TYPE.RIFG, PRODUCT_TYPE.RUNW, PRODUCT_TYPE.ROFF, # L1
@@ -63,21 +63,6 @@ dataset_collections = {
'C1257349115-ASF',
'C3622265756-ASF',
],
- 'NISAR_L0B_CRSD_BETA_V1': [
- 'C1261815276-ASFDEV',
- 'C1273831262-ASF',
- 'C2850225137-ASF',
- ],
- 'NISAR_L0B_CRSD_PROVISIONAL_V1': [
- 'C1261832632-ASFDEV',
- 'C1261832671-ASF',
- 'C2853091612-ASF',
- ],
- 'NISAR_L0B_CRSD_V1': [
- 'C1256358463-ASFDEV',
- 'C1257349114-ASF',
- 'C3622254588-ASF',
- ],
'NISAR_L1_RSLC_BETA_V1': [
'C1261813489-ASFDEV',
'C1273831203-ASF',
@@ -622,8 +607,7 @@ dataset_collections = {
'AIRSAR_INT': ['C1208652494-ASF'],
},
'SEASAT': {
- 'SEASAT_SAR_L1_TIFF': ['C1206500826-ASF', 'C1206752770-ASF'],
- 'SEASAT_SAR_L1_HDF5': ['C1206500991-ASF', 'C1206144699-ASF'],
+ 'SEASAT_L1_SAR': ['C3576379529-ASF', 'C1271768606-ASF'],
},
}
@@ -966,10 +950,8 @@ collections_per_platform = {
'C1208703384-ASF',
],
'SEASAT 1': [
- 'C1206500826-ASF',
- 'C1206500991-ASF',
- 'C1206752770-ASF',
- 'C1206144699-ASF',
+ 'C3576379529-ASF', # prod
+ 'C1271768606-ASF', # uat
],
'SMAP': [
'C1243122884-ASF',
@@ -1090,9 +1072,6 @@ collections_per_platform = {
'C1261815274-ASFDEV',
'C1261832497-ASFDEV',
'C1256358262-ASFDEV',
- 'C1261815276-ASFDEV',
- 'C1261832632-ASFDEV',
- 'C1256358463-ASFDEV',
'C1261813489-ASFDEV',
'C1261832868-ASFDEV',
'C1273095154-ASFDEV',
@@ -1143,9 +1122,6 @@ collections_per_platform = {
'C1273831320-ASF',
'C1261832659-ASF',
'C1257349115-ASF',
- 'C1273831262-ASF',
- 'C1261832671-ASF',
- 'C1257349114-ASF',
'C1273831203-ASF',
'C1261833052-ASF',
'C1273831205-ASF',
@@ -1196,9 +1172,6 @@ collections_per_platform = {
'C3622228339-ASF',
'C2853089814-ASF',
'C3622265756-ASF',
- 'C2850225137-ASF',
- 'C2853091612-ASF',
- 'C3622254588-ASF',
'C2850225585-ASF',
'C2853145197-ASF',
'C3622236985-ASF',
@@ -1491,8 +1464,6 @@ collections_by_processing_level = {
'C1207177736-ASF',
'C1206936391-ASF',
'C1205181982-ASF',
- 'C1206500991-ASF',
- 'C1206144699-ASF',
],
'3FP': ['C1213921661-ASF', 'C1213928843-ASF', 'C1205256880-ASF', 'C1208713702-ASF'],
'JPG': ['C1213921626-ASF', 'C1000000306-ASF'],
@@ -1504,7 +1475,6 @@ collections_by_processing_level = {
'LSTOKES': ['C1213927939-ASF'],
'PSTOKES': ['C1213928209-ASF'],
'ATI': ['C1208652494-ASF'],
- 'GEOTIFF': ['C1206500826-ASF', 'C1206752770-ASF'],
'L1A_Radar_RO_ISO_XML': [
'C1243122884-ASF',
'C1243141638-ASF',
@@ -1631,16 +1601,6 @@ collections_by_processing_level = {
'C1258125097-ASFDEV',
'C1258836794-ASF',
'C2887469134-ASF',
-
- 'C1261815276-ASFDEV', # CRSD
- 'C1273831262-ASF',
- 'C2850225137-ASF',
- 'C1261832632-ASFDEV', # Provisional
- 'C1261832671-ASF',
- 'C2853091612-ASF',
- 'C1256358463-ASFDEV', # Validate
- 'C1257349114-ASF',
- 'C3622254588-ASF',
],
'RRSD': [
'C1261815274-ASFDEV', # RRSD BETA
@@ -1657,17 +1617,6 @@ collections_by_processing_level = {
'C1258836794-ASF',
'C2887469134-ASF',
],
- 'CRSD': [
- 'C1261815276-ASFDEV', # CRSD
- 'C1273831262-ASF',
- 'C2850225137-ASF',
- 'C1261832632-ASFDEV', # Provisional
- 'C1261832671-ASF',
- 'C2853091612-ASF',
- 'C1256358463-ASFDEV', # Validate
- 'C1257349114-ASF',
- 'C3622254588-ASF',
- ],
'RSLC': [
'C1261813489-ASFDEV', # Beta
'C1273831203-ASF',
=====================================
asf_search/CMR/translate.py
=====================================
@@ -134,7 +134,7 @@ def fix_cmr_shapes(fixed_params: Dict[str, Any]) -> Dict[str, Any]:
def should_use_asf_frame(cmr_opts):
- asf_frame_platforms = ['SENTINEL-1A', 'SENTINEL-1B', 'SENTINEL-1C', 'ALOS', 'ALOS-2', 'NISAR']
+ asf_frame_platforms = ['SENTINEL-1A', 'SENTINEL-1B', 'SENTINEL-1C', 'ALOS', 'ALOS-2', 'NISAR', 'SEASAT 1']
asf_frame_collections = get_concept_id_alias(asf_frame_platforms, collections_per_platform)
=====================================
asf_search/Products/NISARProduct.py
=====================================
@@ -23,6 +23,7 @@ class NISARProduct(ASFStackableProduct):
'productionConfiguration': {'path': ['AdditionalAttributes', ('Name', 'PRODUCTION_PIPELINE'), 'Values', 0]},
'processingLevel': {'path': ['AdditionalAttributes', ('Name', 'PRODUCT_TYPE'), 'Values', 0]},
'bytes': {'path': ['DataGranule', 'ArchiveAndDistributionInformation']},
+ 'collectionName': {'path': ["CollectionReference", "ShortName"]},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
@@ -39,6 +40,7 @@ class NISARProduct(ASFStackableProduct):
entry['Name']: {'bytes': entry['SizeInBytes'], 'format': entry['Format']}
for entry in self.properties['bytes']
}
+ self.properties["conceptID"] = self.umm_get(self.meta, "collection-concept-id")
@staticmethod
def get_default_baseline_product_type() -> Union[str, None]:
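A short sketch of reading the two new NISAR fields mapped above; the search parameters are illustrative and assume at least one NISAR granule is returned:

```python
import asf_search as asf

results = asf.search(dataset='NISAR', maxResults=1)
if results:
    p = results[0].properties
    print(p.get('collectionName'))  # CollectionReference ShortName
    print(p.get('conceptID'))       # collection-concept-id from the UMM meta
```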
=====================================
asf_search/Products/SEASATProduct.py
=====================================
@@ -1,6 +1,6 @@
from typing import Dict
from asf_search import ASFSession, ASFProduct
-from asf_search.CMR.translate import try_round_float
+from asf_search.CMR.translate import try_parse_int, try_round_float
class SEASATProduct(ASFProduct):
@@ -10,13 +10,30 @@ class SEASATProduct(ASFProduct):
_base_properties = {
**ASFProduct._base_properties,
- 'bytes': {
- 'path': ['AdditionalAttributes', ('Name', 'BYTES'), 'Values', 0],
- 'cast': try_round_float,
- },
- 'insarStackId': {'path': ['AdditionalAttributes', ('Name', 'INSAR_STACK_ID'), 'Values', 0]},
'md5sum': {'path': ['AdditionalAttributes', ('Name', 'MD5SUM'), 'Values', 0]},
+ 'frameNumber': {'path': ['AdditionalAttributes', ('Name', 'FRAME_NUMBER'), 'Values', 0], 'cast': try_parse_int}, # for consolidated collection
+ 'bytes': {'path': ['DataGranule', 'ArchiveAndDistributionInformation']},
}
def __init__(self, args: Dict = {}, session: ASFSession = ASFSession()):
super().__init__(args, session)
+
+ bytes_mapping = {
+ entry['Name']: {'bytes': entry['SizeInBytes'], 'format': entry['Format']}
+ for entry in self.properties['bytes']
+ }
+ md5sum_mapping = {
+ entry['Name']: entry['Checksum']['Value']
+ for entry in self.properties['bytes']
+ }
+
+ self.properties['bytes'] = bytes_mapping
+ self.properties['md5sum'] = md5sum_mapping
+
+ self.properties['additionalUrls'] = self._get_additional_urls()
+ self.properties['browse'] = [url for url in self._get_urls() if url.endswith(('.png', '.jpg', '.jpeg'))]
+ self.properties['s3Urls'] = self._get_s3_uris()
+
+ center = self.centroid()
+ self.properties['centerLat'] = center.y
+ self.properties['centerLon'] = center.x
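The reshaping in `SEASATProduct.__init__` above can be demonstrated standalone. A sketch with a fabricated `ArchiveAndDistributionInformation` entry (the file name and checksum are hypothetical):

```python
archive_info = [
    {
        'Name': 'SS_01502_STD_F2536.h5',  # hypothetical file name
        'SizeInBytes': 123456789,
        'Format': 'HDF5',
        'Checksum': {'Value': '0123456789abcdef0123456789abcdef', 'Algorithm': 'MD5'},
    },
]

# File name -> size/format, and file name -> md5, as built in __init__.
bytes_mapping = {
    entry['Name']: {'bytes': entry['SizeInBytes'], 'format': entry['Format']}
    for entry in archive_info
}
md5sum_mapping = {entry['Name']: entry['Checksum']['Value'] for entry in archive_info}

print(bytes_mapping)   # {'SS_01502_STD_F2536.h5': {'bytes': 123456789, 'format': 'HDF5'}}
print(md5sum_mapping)  # {'SS_01502_STD_F2536.h5': '0123...'}
```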
=====================================
asf_search/constants/PRODUCT_TYPE.py
=====================================
@@ -91,10 +91,6 @@ CSTOKES = 'CSTOKES'
DEM = 'DEM'
THREEFP = '3FP'
-# SEASAT
-GEOTIFF = 'GEOTIFF'
-# L1 provided by RADARSAT
-
# OPERA-S1
RTC = 'RTC'
CSLC = 'CSLC'
@@ -105,9 +101,8 @@ TROPO_ZENITH = 'TROPO-ZENITH'
# NISAR
L0B = 'L0B'
-"""Convenient alias for CRSD and RRSD Level Zero B product types"""
+"""alias for RRSD Level Zero B product types"""
-CRSD = 'CRSD'
RRSD = 'RRSD'
RSLC = 'RSLC'
=====================================
asf_search/export/jsonlite.py
=====================================
@@ -247,6 +247,13 @@ class JSONLiteStreamArray(list):
'rangeBandwidth': p.get('rangeBandwidth'),
'sizeMB': p.get('bytes'),
}
+ result["collectionName"] = p.get("collectionName")
+ result["conceptID"] = p.get("conceptID")
+ elif p.get('platform') == 'SEASAT 1':
+ result['additionalUrls'] = p.get('additionalUrls', [])
+ result['s3Urls'] = p.get('s3Urls', [])
+ result['sizeMB'] = p.get('bytes', {})
+
elif result.get('productID', result.get('fileName', '')).startswith('S1-GUNW'):
result.pop("perpendicularBaseline", None)
if p.get('ariaVersion') is None:
=====================================
asf_search/export/jsonlite2.py
=====================================
@@ -63,6 +63,8 @@ class JSONLite2StreamArray(JSONLiteStreamArray):
"w": p.get("wkt"),
"wu": p.get("wkt_unwrapped"),
"pge": p.get("pgeVersion"),
+ "adu": p.get("additionalUrls"),
+ "s3u": p.get("s3Urls"),
}
if 'temporalBaseline' in p.keys():
@@ -79,6 +81,8 @@ class JSONLite2StreamArray(JSONLiteStreamArray):
if p.get('nisar') is not None:
result['nsr'] = p['nisar']
+ result["cnm"] = p["collectionName"]
+ result["cid"] = p["conceptID"]
if p.get('ariaVersion') is not None:
result['ariav'] = p.get('ariaVersion')
=====================================
asf_search/export/metalink.py
=====================================
@@ -71,9 +71,18 @@ class MetalinkStreamArray(list):
if p.get('md5sum') and p.get('md5sum') != 'NA':
verification = ETree.Element('verification')
- h = ETree.Element('hash', {'type': 'md5'})
- h.text = p['md5sum']
- verification.append(h)
+ if isinstance(p.get('md5sum'), dict):
+ a = parse.urlparse(p['url'])
+ file_name = os.path.basename(a.path)
+ md5_entry = p['md5sum'].get(file_name)
+ h = ETree.Element('hash', {'type': 'md5'})
+ if md5_entry is not None:
+ h.text = md5_entry
+ verification.append(h)
+ else:
+ h = ETree.Element('hash', {'type': 'md5'})
+ h.text = p['md5sum']
+ verification.append(h)
file.append(verification)
if p['bytes'] and p['bytes'] != 'NA':
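The dict branch above keys the md5 lookup on the file name extracted from the product URL. A self-contained sketch of that extraction, with a hypothetical URL:

```python
import os
from urllib import parse

url = 'https://example.asf.alaska.edu/SEASAT/SS_01502_STD_F2536.h5'  # hypothetical
md5sums = {'SS_01502_STD_F2536.h5': '0123456789abcdef0123456789abcdef'}

file_name = os.path.basename(parse.urlparse(url).path)
print(md5sums.get(file_name))  # the per-file md5, or None if absent
```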
=====================================
asf_search/search/search_generator.py
=====================================
@@ -454,7 +454,7 @@ def set_science_product_alias(opts: ASFSearchOptions):
"""Alias certain product types (primarily NISAR L0B)"""
if opts.processingLevel is not None:
processingLevelAliases = {
- 'L0B': ['CRSD', 'RRSD']
+ 'L0B': ['RRSD']
}
processing_levels = []
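With `CRSD` removed, the `L0B` alias now expands to `RRSD` alone. A sketch of the observable effect, assuming a NISAR search returns results:

```python
import asf_search as asf

# 'L0B' is aliased to ['RRSD'] by set_science_product_alias.
results = asf.search(dataset='NISAR', processingLevel='L0B', maxResults=5)
print({scene.properties['processingLevel'] for scene in results})  # expected: {'RRSD'}
```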
=====================================
conftest.py
=====================================
@@ -0,0 +1,63 @@
+import argparse
+import pytest
+import os
+from asf_search.ASFSession import ASFSession
+from getpass import getpass
+
+def string_to_session(user_input: str) -> ASFSession:
+ session = ASFSession()
+
+ if user_input is not None and len(user_input):
+ session.auth_with_token(user_input)
+
+ return session
+
+def set_should_auth_session(user_input: str) -> ASFSession:
+ should_auth = string_to_bool(user_input)
+ session = ASFSession()
+ if should_auth:
+ if (token:=os.environ.get('EDL_TOKEN')) is not None:
+ try:
+ session.auth_with_token(token=token)
+ except Exception as exc:
+ raise argparse.ArgumentTypeError(f"Unabled to authenticate with the given environment's `EDL_TOKEN` (token may need to be refreshed). Original exception: {str(exc)}")
+ else:
+ raise argparse.ArgumentTypeError("ERROR: Environment variable `EDL_TOKEN` token not set, cannot create authenticated session for tests. Are you running this in the correct local/github action environment?")
+
+ return session
+
+def set_should_auth_session_with_creds(user_input: str) -> ASFSession:
+ should_auth = string_to_bool(user_input)
+ session = ASFSession()
+ if should_auth:
+ session.auth_with_creds(input('EDL Username: '), getpass('EDL Password: '))
+
+ return session
+
+def set_should_auth_session_with_token(user_input: str) -> ASFSession:
+ should_auth = string_to_bool(user_input)
+ session = ASFSession()
+ if should_auth:
+ session.auth_with_token(getpass('EDL Token: '))
+
+ return session
+
+def string_to_bool(user_input: str) -> bool:
+ user_input = str(user_input).upper()
+ if user_input and 'TRUE'.startswith(user_input):
+ return True
+ elif user_input and 'FALSE'.startswith(user_input):
+ return False
+ else:
+ raise argparse.ArgumentTypeError(f"ERROR: Could not convert '{user_input}' to bool (true/false/t/f).")
+
+def pytest_addoption(parser: pytest.Parser):
+ parser.addoption("--should_auth_session", action="store", dest="authenticated_session", type=set_should_auth_session, default='FALSE',
+ help = "'should_auth_session': Set if the test case requires authentication (pull from `EDL_TOKEN` environment variable)"
+ )
+
+ parser.addoption("--auth_with_creds", action="store", dest="authenticated_session", type=set_should_auth_session_with_creds, default='FALSE',
+ help = "'auth_with_creds': Use EDL username and password to authenticate session for relevant tests")
+
+ parser.addoption("--auth_with_token", action="store", dest="authenticated_session", type=set_should_auth_session_with_token, default='FALSE',
+ help = "'auth_with_creds': Use EDL token to authenticate session for relevant tests")
=====================================
tests/ASFProduct/test_ASFSubproduct.py
=====================================
@@ -1,7 +1,10 @@
-from asf_search import ASFProduct, Products, granule_search
+from asf_search import Products, search, ASFSearchOptions
+from asf_search.ASFSearchResults import ASFSearchResults
+import json
+import pytest
-def run_test_ASFSubproduct(scene_names: list[str], expected_subclass: str):
- scenes = granule_search(scene_names)
+def run_test_ASFSubproduct(scene_names: list[str], expected_subclass: str, opts: ASFSearchOptions):
+ scenes = search(granule_list=scene_names, opts=opts)
assert sorted([scene.properties['fileID'] for scene in scenes]) == sorted(scene_names)
@@ -11,6 +14,14 @@ def run_test_ASFSubproduct(scene_names: list[str], expected_subclass: str):
_test_OPERAS1Product(scene)
if isinstance(scene, Products.S1BurstProduct):
_test_S1BurstProduct(scene)
+ if isinstance(scene, Products.SEASATProduct):
+ _test_SEASATProduct(scene)
+
+ for output_format in ['geojson', 'json', 'jsonlite', 'jsonlite2', 'csv', 'metalink', 'kml']:
+ try:
+ _get_output(scenes, output_format)
+ except BaseException as exc:
+ pytest.fail(f'Failed to serialize scenes {[scene.properties["fileID"] for scene in scenes]} to output format {output_format}. Original exception: {str(exc)}')
def _test_OPERAS1Product(scene: Products.OPERAS1Product):
processing_level = scene.properties['processingLevel']
@@ -30,6 +41,13 @@ def _test_OPERAS1Product(scene: Products.OPERAS1Product):
assert scene.properties['centerLat'] is None
assert scene.properties['centerLon'] is None
+def _test_SEASATProduct(scene: Products.SEASATProduct):
+ assert isinstance(scene.properties['md5sum'], dict)
+ assert isinstance(scene.properties['bytes'], dict)
+
+ bytes_entries = scene.properties['bytes'].keys()
+ _check_properties_set(scene.properties['md5sum'], bytes_entries)
+
def _test_S1BurstProduct(scene: Products.S1BurstProduct):
burst_properties = [
"absoluteBurstID",
@@ -48,3 +66,21 @@ def _test_S1BurstProduct(scene: Products.S1BurstProduct):
def _check_properties_set(properties: dict, properties_list: list[str]):
for prop in properties_list:
assert properties[prop] is not None
+
+def _get_output(scenes: ASFSearchResults, output_format: str):
+ match output_format.lower():
+ case 'geojson':
+ return scenes.geojson()
+ case 'json':
+ return json.loads(''.join(scenes.json()))
+ case 'jsonlite':
+ return json.loads(''.join(scenes.jsonlite()))
+ case 'jsonlite2':
+ return json.loads(''.join(scenes.jsonlite2()))
+ case 'csv':
+ return ''.join(scenes.csv())
+ case 'metalink':
+ return ''.join(scenes.metalink())
+ case 'kml':
+ return scenes.kml()
+
=====================================
tests/pytest-managers.py
=====================================
@@ -96,11 +96,14 @@ def test_ASFSubproduct(**args) -> None:
"""
Tests ASFProduct subclasses for properties and basic functionality
"""
+ session = args["config"].getoption("authenticated_session")
+
test_info = args['test_info']
scene_names = test_info['scenes']
expected_subclass = test_info['expected_subclass']
+ opts = ASFSearchOptions(**test_info.get('opts', {}), session=session)
- run_test_ASFSubproduct(scene_names=scene_names, expected_subclass=expected_subclass)
+ run_test_ASFSubproduct(scene_names=scene_names, expected_subclass=expected_subclass, opts=opts)
def test_ASFProduct_Stack(**args) -> None:
"""
=====================================
tests/yml_tests/test_ASFSubproduct.yml
=====================================
@@ -13,4 +13,4 @@ tests:
- Test S1Burst ASFSubproduct:
scenes: ["S1_055219_EW1_20250418T163543_HH_1D57-BURST"]
- expected_subclass: S1BurstProduct
\ No newline at end of file
+ expected_subclass: S1BurstProduct
=====================================
tests/yml_tests/test_authenticated/test_ASFSubproduct_Auth.yml
=====================================
@@ -0,0 +1,6 @@
+tests:
+ - Test SEASAT ASFSubproduct:
+ scenes: ["SS_01502_STD_F2536"]
+ opts:
+ dataset: SEASAT
+ expected_subclass: SEASATProduct
=====================================
tests/yml_tests/test_search.yml
=====================================
@@ -187,9 +187,6 @@ nisar_collections: &nisar_collections
'C1261815274-ASFDEV',
'C1261832497-ASFDEV',
'C1256358262-ASFDEV',
- 'C1261815276-ASFDEV',
- 'C1261832632-ASFDEV',
- 'C1256358463-ASFDEV',
'C1261813489-ASFDEV',
'C1261832868-ASFDEV',
'C1273095154-ASFDEV',
@@ -240,9 +237,6 @@ nisar_collections: &nisar_collections
'C1273831320-ASF',
'C1261832659-ASF',
'C1257349115-ASF',
- 'C1273831262-ASF',
- 'C1261832671-ASF',
- 'C1257349114-ASF',
'C1273831203-ASF',
'C1261833052-ASF',
'C1273831205-ASF',
@@ -293,9 +287,6 @@ nisar_collections: &nisar_collections
'C3622228339-ASF',
'C2853089814-ASF',
'C3622265756-ASF',
- 'C2850225137-ASF',
- 'C2853091612-ASF',
- 'C3622254588-ASF',
'C2850225585-ASF',
'C2853145197-ASF',
'C3622236985-ASF',
@@ -405,10 +396,6 @@ tests:
dataset:
- NISAR
expected: [
- {
- processingLevel: ['CRSD'],
- collections: *nisar_collections
- },
{
processingLevel: ['RRSD'],
collections: *nisar_collections
View it on GitLab: https://salsa.debian.org/debian-gis-team/asf-search/-/commit/1b157633e27901b27ae4760a861d6c856d544f9c