[Git][debian-gis-team/asf-search][upstream] New upstream version 10.1.0
Antonio Valentino (@antonio.valentino)
gitlab at salsa.debian.org
Thu Oct 2 07:26:57 BST 2025
Antonio Valentino pushed to branch upstream at Debian GIS Project / asf-search
Commits:
0934dd75 by Antonio Valentino at 2025-10-02T06:12:28+00:00
New upstream version 10.1.0
- - - - -
11 changed files:
- .github/ISSUE_TEMPLATE/bug_report.md
- + .github/PULL_REQUEST_TEMPLATE.md
- .github/workflows/prod-request-merged.yml
- CHANGELOG.md
- README.md
- asf_search/constants/INTERNAL.py
- asf_search/export/csv.py
- asf_search/export/kml.py
- asf_search/export/metalink.py
- examples/Advanced-Custom-ASFProduct-Subclassing.ipynb
- tests/yml_tests/Resources/ARIAS1GUNW_stack.yml
Changes:
=====================================
.github/ISSUE_TEMPLATE/bug_report.md
=====================================
@@ -11,11 +11,24 @@ assignees: ''
A clear and concise description of what the bug is.
**To Reproduce**
-Steps to reproduce the behavior:
-1. Go to '...'
-2. Click on '....'
-3. Scroll down to '....'
-4. See error
+Provide a minimal Python snippet that reproduces the behavior.
+
+*Reminder: if authentication is required, **do not** leave any sensitive credentials in the snippet; use the [`getpass` module](https://docs.python.org/3/library/getpass.html).*
+
+Example snippet:
+``` python
+import asf_search as asf
+from getpass import getpass
+
+granule_list = ['S1A_IW_GRDH_1SDV_20250922T162824_20250922T162849_061103_079DCA_9515']
+response = asf.search(granule_list=granule_list)
+
+session = asf.ASFSession()
+session.auth_with_token(getpass('Earth Data Login Token'))
+
+# The line below raises an error for some reason
+response[0].download('./', session=session)
+```
**Expected behavior**
A clear and concise description of what you expected to happen.
=====================================
.github/PULL_REQUEST_TEMPLATE.md
=====================================
@@ -0,0 +1,40 @@
+# Merge Requirements:
+The following requirements must be met for your pull request to be considered for review and merging. Until these requirements are met, please mark the pull request as a draft.
+
+## Purpose
+Why is this pull request necessary? Provide a reference to a related issue in this repository that your pull request addresses (if applicable).
+
+## Description
+A brief description of the changes proposed in the pull request. If there are any changes to packaging requirements, please list them.
+
+## Snippet
+If the pull request provides a new feature, include an example demonstrating its use-case(s) (if applicable).
+
+Example:
+``` python
+import asf_search as asf
+
+response = asf.search(dataset=asf.DATASET.SENTINEL1, maxResults=250)
+
+useful_data = response.new_feature()
+```
+
+## Error/Warning/Regression Free
+Your code runs without any unhandled errors, warnings, or regressions.
+
+## Unit Tests
+You have added unit tests to the test suite; see the [README Testing section](https://github.com/asfadmin/Discovery-asf_search?tab=readme-ov-file#testing) for an overview of adding tests to the test suite.
+
+## Target Merge Branch
+Your pull request targets the `master` branch.
+
+
+***
+
+### Checklist
+- [ ] Purpose
+- [ ] Description
+- [ ] Snippet
+- [ ] Error/Warning/Regression Free
+- [ ] Unit Tests
+- [ ] Target Merge Branch
\ No newline at end of file
=====================================
.github/workflows/prod-request-merged.yml
=====================================
@@ -31,7 +31,7 @@ jobs:
(${{ contains(github.event.pull_request.labels.*.name, 'minor') }} && echo "version_type=minor" >> $GITHUB_ENV) || true
(${{ contains(github.event.pull_request.labels.*.name, 'major') }} && echo "version_type=major" >> $GITHUB_ENV) || true
- name: Create a Release
- uses: zendesk/action-create-release@v1
+ uses: zendesk/action-create-release@v3
env:
# NOT built in token, so this can trigger other actions:
GITHUB_TOKEN: ${{ secrets.DISCO_GITHUB_MACHINE_USER }}
=====================================
CHANGELOG.md
=====================================
@@ -25,6 +25,20 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
-
-->
+------
+## [v10.1.0](https://github.com/asfadmin/Discovery-asf_search/compare/v10.0.5...v10.1.0)
+### Changed
+- Updated the publish `action-create-release` GitHub action to v3
+
+### Fixed
+- Updated ARIA test case
+
+------
+## [v10.0.5](https://github.com/asfadmin/Discovery-asf_search/compare/v10.0.4...v10.0.5)
+### Fixed
+- Updates `ASFSession.auth_with_creds` to check for `asf-urs` in cookies
+- Cleans up NISAR size display for csv/kml/metalink formats
+
------
## [v10.0.4](https://github.com/asfadmin/Discovery-asf_search/compare/v10.0.3...v10.0.4)
### Fixed
=====================================
README.md
=====================================
@@ -154,3 +154,56 @@ ASF_LOGGER.error("This is only a drill. Please do not panic.")
```
For more configuration options for `logging`, please visit [their howto page](https://docs.python.org/3/howto/logging.html).
+
+### Testing
+
+After installing asf-search's test requirements (see the `INSTALL` section above), you can run the test suite locally. Run the following command from your terminal in the root project directory:
+
+```bash
+python3 -m pytest -n auto .
+```
+
+Tests should be added to the relevant subfolders and files in `/tests`.
+
+The test suite uses the `pytest-automation` pytest plugin, which allows us to define and re-use input for test cases in YAML format. Test cases are written to files in `tests/yml_tests/`, and reusable resources for those tests live in `tests/yml_tests/Resources/`.
+
+```yaml
+tests:
+- Test Nisar Product L1 RSLC: # this is a test case
+    product: NISAR_L1_PR_RSLC_087_039_D_114_2005_DHDH_A_20251102T222008_20251102T222017_T00407_N_P_J_001.yml # this file should live in `tests/yml_tests/Resources/`; see the other yml files in that folder for how to structure the yml object
+    product_level: L1
+
+- Test Nisar Product L2 GSLC: # this is another test case
+    product: NISAR_L2_PR_GSLC_087_039_D_112_2005_DHDH_A_20251102T221859_20251102T221935_T00407_N_F_J_001.yml
+    product_level: L2
+```
+
+We can create the mapping for our yaml test cases in `tests/pytest-config.yml`, which is used to call the desired Python function in `tests/pytest-managers.py`.
+
+In `tests/pytest-config.yml`:
+```yaml
+- For running ASFProduct tests:
+    required_keys: ['product', 'product_level'] # the keys the test case requires
+    method: test_NISARProduct # the python function in pytest-managers.py that will be called
+    required_in_title: Test Nisar Product # (OPTIONAL) only runs test cases with `Test Nisar Product` in their name, so both test cases above would be picked up
+```
+
+In `tests/pytest-managers.py`:
+```python
+def test_NISARProduct(**args) -> None: # must match the name given for `method` in pytest-config.yml above
+ """
+    Test loading a NISAR product resource and validating it against the expected product level
+ """
+ test_info = args['test_info'] # these are the args defined in our test case (in this case [`product`, `product_level`])
+ product_level = test_info['product_level']
+
+ product_yml_file = test_info['product']
+    product = get_resource(product_yml_file) # `get_resource()` is a helper function that reads yml files from `tests/yml_tests/Resources/`
+
+ # `run_[test_name]` should contain your actual test logic
+ run_test_NISARProduct(product, product_level)
+```
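
The snippet above leaves `run_test_NISARProduct` to the test author. A minimal hypothetical body, assuming the resource yml deserializes to a dict with a `properties` mapping (the assertions below are illustrative, not the actual suite's):

```python
def run_test_NISARProduct(product: dict, product_level: str) -> None:
    # Hypothetical checks: the resource should describe a NISAR product
    # at the expected processing level
    assert product['properties']['platform'] == 'NISAR'
    assert product['properties']['processingLevel'].startswith(product_level)
```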
=====================================
asf_search/constants/INTERNAL.py
=====================================
@@ -17,6 +17,6 @@ EDL_CLIENT_ID = 'BO_n7nTIlMljdvU6kRRB3g'
DEFAULT_PROVIDER = 'ASF'
AUTH_DOMAINS = ['asf.alaska.edu', 'earthdata.nasa.gov'] #, 'earthdatacloud.nasa.gov']
-AUTH_COOKIES = ['urs_user_already_logged', 'uat_urs_user_already_logged']
+AUTH_COOKIES = ['urs_user_already_logged', 'uat_urs_user_already_logged', 'asf-urs']
ERROR_REPORTING_ENDPOINT = 'search-error-report.asf.alaska.edu'
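
Per the v10.0.5 changelog entry above, `ASFSession.auth_with_creds` checks the session's cookie jar for one of these names after login; `asf-urs` is the new addition. A minimal sketch of that kind of check, assuming a `requests`-style session (`looks_authenticated` is an illustrative helper, not the library's API):

```python
import requests

AUTH_COOKIES = ['urs_user_already_logged', 'uat_urs_user_already_logged', 'asf-urs']

def looks_authenticated(session: requests.Session) -> bool:
    # Treat the login as successful if any known auth cookie
    # has landed in the session's cookie jar
    return any(cookie.name in AUTH_COOKIES for cookie in session.cookies)
```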
=====================================
asf_search/export/csv.py
=====================================
@@ -1,5 +1,7 @@
import csv
+import os
from types import GeneratorType
+from urllib import parse
from asf_search import ASF_LOGGER
from asf_search.export.export_translators import ASFSearchResults_to_properties_list
@@ -129,6 +131,15 @@ class CSVStreamArray(list):
ASF_LOGGER.info('Finished streaming csv results')
def getItem(self, p):
+ if p.get('sizeMB') is None and p.get('platform') == 'NISAR':
+ if isinstance(p.get('bytes'), dict):
+ a = parse.urlparse(p['url'])
+ file_name = os.path.basename(a.path)
+ bytes_entry = p['bytes'].get(file_name)
+ if bytes_entry is not None:
+ size_mb = bytes_entry['bytes'] / 1000000
+ p['sizeMB'] = str(size_mb) if size_mb < 0.01 else "{:10.2f}".format(size_mb)
+
return {
'Granule Name': p.get('sceneName'),
'Platform': p.get('platform'),
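
For NISAR products, `bytes` can be a dict keyed by file name rather than a flat number, so the CSV export now derives `sizeMB` from the entry matching the file name in the product's URL. The same derivation in isolation, with hypothetical property values:

```python
import os
from urllib import parse

# Hypothetical NISAR-style properties: `bytes` maps file names to byte counts
p = {
    'platform': 'NISAR',
    'url': 'https://example.com/path/NISAR_GRANULE.h5',
    'bytes': {'NISAR_GRANULE.h5': {'bytes': 123456789}},
    'sizeMB': None,
}

file_name = os.path.basename(parse.urlparse(p['url']).path)
bytes_entry = p['bytes'].get(file_name)
if bytes_entry is not None:
    size_mb = bytes_entry['bytes'] / 1000000
    # Sub-0.01 MB sizes keep full precision; everything else is fixed to two decimals
    p['sizeMB'] = str(size_mb) if size_mb < 0.01 else '{:10.2f}'.format(size_mb)

print(p['sizeMB'])  # '    123.46'
```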
=====================================
asf_search/export/kml.py
=====================================
@@ -117,13 +117,23 @@ class KMLStreamArray(MetalinkStreamArray):
li.text = text + str(value)
ul.append(li)
+ if p.get('platform') == 'NISAR':
+ h3.text = "Files"
+ div.append(h3)
+ ul_files = ETree.Element("ul")
+ div.append(ul_files)
+ for url in p.get('additionalUrls'):
+ li = ETree.Element("li")
+ li.text = url
+ ul_files.append(li)
+
d = ETree.Element(
"div", attrib={"style": "position:absolute;left:300px;top:250px"}
)
description.append(d)
a = ETree.Element("a")
- if p.get("browse") is not None:
+ if p.get("browse") is not None and len(p.get("browse")):
a.set("href", p.get("browse")[0])
else:
a.set("href", "")
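
The added `len(p.get("browse"))` matters because `is not None` alone lets an empty browse list through to the `[0]` index, raising `IndexError`. The guard in isolation, with a hypothetical properties dict:

```python
# Hypothetical properties: a product with no browse imagery
p = {'browse': []}

# The old check, `p.get('browse') is not None`, is True for an empty
# list, so p.get('browse')[0] would raise IndexError
if p.get('browse') is not None and len(p.get('browse')):
    href = p.get('browse')[0]
else:
    href = ''

print(repr(href))  # ''
```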
=====================================
asf_search/export/metalink.py
=====================================
@@ -1,6 +1,9 @@
import inspect
+import os
from types import GeneratorType
+from urllib import parse
import xml.etree.ElementTree as ETree
+
from asf_search import ASF_LOGGER
from asf_search.export.export_translators import ASFSearchResults_to_properties_list
@@ -75,7 +78,16 @@ class MetalinkStreamArray(list):
if p['bytes'] and p['bytes'] != 'NA':
size = ETree.Element('size')
- size.text = str(p['bytes'])
+ if isinstance(p.get('bytes'), dict):
+ a = parse.urlparse(p['url'])
+ file_name = os.path.basename(a.path)
+ bytes_entry = p['bytes'].get(file_name)
+ if bytes_entry is not None:
+                    size.text = str(bytes_entry['bytes'])
+ else:
+ size.text = str(p['bytes'])
+ else:
+ size.text = str(p['bytes'])
file.append(size)
return '\n' + (8 * ' ') + ETree.tostring(file, encoding='unicode')
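
The metalink export applies the same per-file lookup as csv.py above but emits the raw byte count. A sketch of the resulting element for a dict-valued `bytes` field, with hypothetical values:

```python
import xml.etree.ElementTree as ETree

# Hypothetical file name and per-file byte count, looked up as in the hunk above
file = ETree.Element('file', attrib={'name': 'NISAR_GRANULE.h5'})
size = ETree.Element('size')
size.text = str(123456789)
file.append(size)

print(ETree.tostring(file, encoding='unicode'))
# <file name="NISAR_GRANULE.h5"><size>123456789</size></file>
```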
=====================================
examples/Advanced-Custom-ASFProduct-Subclassing.ipynb
=====================================
@@ -33,7 +33,7 @@
"outputs": [],
"source": [
"import asf_search as asf\n",
- "products = ['S1A_IW_SLC__1SDV_20231226T162948_20231226T163016_051828_0642C6_272F-SLC', 'S1_185682_IW2_20210224T161634_VV_035E-BURST','S1-GUNW-D-R-087-tops-20190301_20190223-161540-20645N_18637N-PP-7a85-v2_0_1-unwrappedPhase','ALPSRP111041130-RTC_HI_RES', 'UA_newyor_03204_22005-013_22010-002_0014d_s01_L090_01-INTERFEROMETRY']\n",
+ "products = ['S1A_IW_SLC__1SDV_20231226T162948_20231226T163016_051828_0642C6_272F-SLC', 'S1_185682_IW2_20210224T161634_VV_035E-BURST','S1-GUNW-A-R-035-tops-20241214_20241108-020814-00124W_00037N-PP-677a-v3_0_1', 'ALPSRP111041130-RTC_HI_RES', 'UA_newyor_03204_22005-013_22010-002_0014d_s01_L090_01-INTERFEROMETRY']\n",
"results = asf.product_search(product_list=products)\n",
"results"
]
@@ -54,7 +54,7 @@
"metadata": {},
"outputs": [],
"source": [
- "s1, uavsar, s1Burst, ariaGunw, alos = results\n",
+ "ariaGunw, s1, uavsar, s1Burst, alos = results\n",
"\n",
"def compare_properties(lhs: asf.ASFProduct, rhs: asf.ASFProduct):\n",
" # Compares properties of two ASFProduct objects in a color coded table\n",
@@ -344,7 +344,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "asf-search-env-current",
+ "display_name": "3.11.5",
"language": "python",
"name": "python3"
},
=====================================
tests/yml_tests/Resources/ARIAS1GUNW_stack.yml
=====================================
The diff for this file was not included because it is too large.
View it on GitLab: https://salsa.debian.org/debian-gis-team/asf-search/-/commit/0934dd75ea7926f9fda82575518002f646da6bbf