[Git][debian-gis-team/mintpy][master] 5 commits: New upstream version 1.6.3

Antonio Valentino (@antonio.valentino) gitlab@salsa.debian.org
Thu Nov 27 07:37:48 GMT 2025



Antonio Valentino pushed to branch master at Debian GIS Project / mintpy


Commits:
989eae1a by Antonio Valentino at 2025-11-27T07:29:53+00:00
New upstream version 1.6.3
- - - - -
6618207e by Antonio Valentino at 2025-11-27T07:29:57+00:00
Update upstream source from tag 'upstream/1.6.3'

Update to upstream version '1.6.3'
with Debian dir d1bce2e42d297d95c937f15d1b5897829e363c45
- - - - -
327c03cc by Antonio Valentino at 2025-11-27T07:30:34+00:00
New upstream release

- - - - -
d62d5a59 by Antonio Valentino at 2025-11-27T07:33:08+00:00
Refresh all patches

- - - - -
73a35be9 by Antonio Valentino at 2025-11-27T07:33:34+00:00
Set distribution to unstable

- - - - -


30 changed files:

- .circleci/config.yml
- .github/workflows/build-docker.yml
- .github/workflows/build-n-publish-to-pypi.yml
- .pre-commit-config.yaml
- debian/changelog
- debian/patches/0001-Fix-privacy-breachs.patch
- docs/QGIS.md
- docs/api/colormaps.md
- docs/installation.md
- pyproject.toml
- src/mintpy/cli/plate_motion.py
- src/mintpy/cli/tsview.py
- src/mintpy/cli/view.py
- src/mintpy/objects/colors.py
- src/mintpy/objects/progress.py
- src/mintpy/objects/sensor.py
- src/mintpy/plate_motion.py
- src/mintpy/prep_hyp3.py
- src/mintpy/tsview.py
- src/mintpy/utils/arg_utils.py
- src/mintpy/utils/plot.py
- src/mintpy/utils/readfile.py
- src/mintpy/utils/utils.py
- src/mintpy/view.py
- + tests/conftest.py
- + tests/data/S1AC_20251001T204513_20251007T204359_HHR006_INT40_G_ueF_1DBE.txt
- + tests/data/S1_044_000000s1n00-093117s2n01-093118s3n01_IW_20250718_20250730_VV_INT80_B4FA.txt
- + tests/data/S1_056072_IW2_20220814_20220907_VV_INT80_E09B.txt
- tests/requirements.txt
- + tests/test_prep_hyp3.py


Changes:

=====================================
.circleci/config.yml
=====================================
@@ -11,7 +11,8 @@ jobs:
     resource_class: large
 
     steps:
-      - checkout
+      - checkout:
+          method: blobless
       - run:
           name: Setting Up Environment with Miniforge
           command: |
@@ -34,7 +35,7 @@ jobs:
           command: |
             export PYTHONUNBUFFERED=1
             # install dependencies and source code
-            mamba install --verbose --yes --file ${MINTPY_HOME}/requirements.txt gdal libgdal-netcdf
+            mamba install --verbose --yes --file ${MINTPY_HOME}/requirements.txt gdal libgdal-netcdf pytest
             python -m pip install ${MINTPY_HOME}
             # test installation
             smallbaselineApp.py -h
@@ -48,6 +49,7 @@ jobs:
             ${MINTPY_HOME}/tests/objects/ionex.py
             ${MINTPY_HOME}/tests/asc_desc2horz_vert.py
             ${MINTPY_HOME}/tests/dem_error.py
+            pytest ${MINTPY_HOME}/tests/
 
       - run:
           name: Integration Test 1/6 - FernandinaSenDT128 (ISCE2/topsStack)


=====================================
.github/workflows/build-docker.yml
=====================================
@@ -15,7 +15,7 @@ jobs:
     name: Build Docker image and push to GitHub Container Registry
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
         with:
           fetch-depth: 0
 


=====================================
.github/workflows/build-n-publish-to-pypi.yml
=====================================
@@ -16,12 +16,12 @@ jobs:
     runs-on: ubuntu-latest
     steps:
 
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v5
         with:
           fetch-depth: 0
 
       - name: Set up Python 3.10
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v6
         with:
           python-version: "3.10"
 
@@ -41,7 +41,7 @@ jobs:
           --outdir dist/
           .
 
-      - uses: actions/upload-artifact@v4
+      - uses: actions/upload-artifact@v5
         with:
           path: |
             dist/*.tar.gz
@@ -54,7 +54,7 @@ jobs:
     if: github.repository_owner == 'insarlab' && github.event_name == 'push'
     steps:
 
-      - uses: actions/download-artifact@v4
+      - uses: actions/download-artifact@v6
         with:
           # unpacks default artifact into dist/
           # if `name: artifact` is omitted, the action will create extra parent dir


=====================================
.pre-commit-config.yaml
=====================================
@@ -7,7 +7,7 @@ fail_fast: true
 
 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: "v5.0.0"
+    rev: "v6.0.0"
     hooks:
       - id: check-added-large-files
         args: ['--maxkb=20']
@@ -27,7 +27,7 @@ repos:
         exclude: tests/data/
 
   - repo: https://github.com/PyCQA/isort
-    rev: "6.0.1"
+    rev: "7.0.0"
     hooks:
       - id: isort
         name: sort imports
@@ -36,7 +36,7 @@ repos:
                '--combine-as']
 
   - repo: https://github.com/asottile/pyupgrade
-    rev: "v3.20.0"
+    rev: "v3.21.1"
     hooks:
       - id: pyupgrade
         name: modernize python


=====================================
debian/changelog
=====================================
@@ -1,11 +1,16 @@
-mintpy (1.6.2-2) UNRELEASED; urgency=medium
+mintpy (1.6.3-1) unstable; urgency=medium
 
-  * Team upload.
+  [ Bas Couwenberg ]
   * Update lintian overrides.
   * Drop Rules-Requires-Root: no, default since dpkg 1.22.13.
   * Use test-build-validate-cleanup instead of test-build-twice.
 
- -- Bas Couwenberg <sebastic@debian.org>  Fri, 12 Sep 2025 17:40:51 +0200
+  [ Antonio Valentino ]
+  * New upstream release.
+  * debian/patches:
+    - Refresh all patches.
+
+ -- Antonio Valentino <antonio.valentino@tiscali.it>  Thu, 27 Nov 2025 07:33:19 +0000
 
 mintpy (1.6.2-1) unstable; urgency=medium
 


=====================================
debian/patches/0001-Fix-privacy-breachs.patch
=====================================
@@ -16,10 +16,10 @@ Forwarded: not-needed
  9 files changed, 43 insertions(+), 50 deletions(-)
 
 diff --git a/docs/QGIS.md b/docs/QGIS.md
-index 11451a4..416b346 100644
+index fc56868..879a1d5 100644
 --- a/docs/QGIS.md
 +++ b/docs/QGIS.md
-@@ -24,7 +24,9 @@ save_explorer.py geo_timeseries.h5 -v geo_velocity.h5 -o geo_maskTempCoh.h5 -o t
+@@ -22,7 +22,9 @@ save_explorer.py geo_timeseries.h5 -v geo_velocity.h5 -o geo_maskTempCoh.h5 -o t
  3. Launch InSAR Explorer and click on any point to plot the time series.
  
  <p align="left">
@@ -30,7 +30,7 @@ index 11451a4..416b346 100644
  </p>
  
  #### c. Using shapefile ###
-@@ -42,14 +44,16 @@ save_qgis.py geo/geo_timeseries_ERA5_ramp_demErr.h5 -g geo/geo_geometryRadar.h5
+@@ -40,14 +42,16 @@ save_qgis.py geo/geo_timeseries_ERA5_ramp_demErr.h5 -g geo/geo_geometryRadar.h5
  ramp_color('RdBu', scale_linear(VEL, -20, 20, 0, 1))
  ```
  
@@ -49,8 +49,8 @@ index 11451a4..416b346 100644
 +  </a>
  </p>
  
- ### d. More information ###
-@@ -83,11 +87,13 @@ ramp_color('RdBu', scale_linear(VEL, -20, 20, 0, 1))
+ ### 2. QGIS with [PS Time Series Viewer](https://plugins.qgis.org/plugins/pstimeseries/)
+@@ -75,11 +79,13 @@ ramp_color('RdBu', scale_linear(VEL, -20, 20, 0, 1))
  ```
  
  <p align="left">
@@ -119,7 +119,7 @@ index 9c26b3d..0540c31 100644
          <section id="downloads" class="clearfix">
            {% if site.github.is_project_page %}
 diff --git a/docs/api/colormaps.md b/docs/api/colormaps.md
-index a405a1b..e1bd7b2 100644
+index 765d1db..778cfa5 100644
 --- a/docs/api/colormaps.md
 +++ b/docs/api/colormaps.md
 @@ -8,8 +8,8 @@ MintPy support the following colormaps:
@@ -148,7 +148,7 @@ index a405a1b..e1bd7b2 100644
 +  https://docs.generic-mapping-tools.org/5.4/_images/GMT_App_M_1b.png</a>
  </p>
  
- ### Colormaps from [cpt-city](http://soliton.vm.bytemark.co.uk/pub/cpt-city/views/totp-cpt.html) ###
+ ### Colormaps from [cpt-city](http://seaviewsensing.com/pub/cpt-city/views/totp-cpt.html) ###
 @@ -63,8 +66,8 @@ The following colormaps is included by default:
  + vikO (cyclic diverging)
  + More at [Scientific Color-Maps](http://www.fabiocrameri.ch/colourmaps.php) ([Crameri, 2018](https://doi.org/10.5194/gmd-11-2541-2018))


=====================================
docs/QGIS.md
=====================================
@@ -2,14 +2,12 @@ The displacement time-series result can be exported as a QGIS-compatible format
 
 ### 1. QGIS with [InSAR Explorer](https://insar-explorer.readthedocs.io/) ###
 
-The InSAR Explorer plugin supports both the GRD format and the shapefile format.
+The InSAR Explorer plugin supports both the GRD format and the shapefile format. For more details, please refer to the [InSAR Explorer documentation](https://insar-explorer.readthedocs.io/).
 
 #### a. Setup ####
 
 1. Download and install [QGIS](https://qgis.org/en/site/) if you have not done so.
-2. Install the plugin:
-  - Install within QGIS via "Plugins -> Manage and Install Plugins", then search for "InSAR Explorer".
-  - Alternatively, download the plugin as a *.zip file from [the plugin page](https://plugins.qgis.org/plugins/insar_explorer-dev/), and install it through "Plugins -> Manage and Install Plugins -> Install from ZIP".
+2. Install the plugin: Install within QGIS via "Plugins -> Manage and Install Plugins", then search for "InSAR Explorer". Alternatively, download the plugin as a *.zip file from [the plugin page](https://plugins.qgis.org/plugins/insar_explorer-dev/), and install it through "Plugins -> Manage and Install Plugins -> Install from ZIP".
 3. Launch the plugin: Access it from the toolbar or through "Plugins -> InSAR Explorer -> InSAR Explorer".
 
 #### b. Usage for GRD files ####
@@ -52,10 +50,6 @@ ramp_color('RdBu', scale_linear(VEL, -20, 20, 0, 1))
   <img width="1000" src="https://insarlab.github.io/figs/docs/mintpy/QGIS-InSAR-Explorer-point.png">
 </p>
 
-### d. More information ###
-
-For more details on using the plugin, please refer to the [InSAR Explorer documentation](https://insar-explorer.readthedocs.io/).
-
 ### 2. QGIS with [PS Time Series Viewer](https://plugins.qgis.org/plugins/pstimeseries/)
 
 The PS Time Series Viewer plugin supports the shapefile format only.
@@ -63,9 +57,7 @@ The PS Time Series Viewer plugin supports the shapefile format only.
 #### a. Setup
 
 1. Download and install [QGIS](https://qgis.org/en/site/) if you have not done so.
-2. Install the plugin:
-  - Install within QGIS via "Plugins -> Manage and Install Plugins", then search "PS Time Series Viewer".
-  - Alternatively, download the plugin as a *.zip file from [the plugin page](https://plugins.qgis.org/plugins/pstimeseries/), and install it through “Plugins -> Manage and Install Plugins -> Install from ZIP”.
+2. Install the plugin: Install within QGIS via "Plugins -> Manage and Install Plugins", then search "PS Time Series Viewer". Alternatively, download the plugin as a *.zip file from [the plugin page](https://plugins.qgis.org/plugins/pstimeseries/), and install it through “Plugins -> Manage and Install Plugins -> Install from ZIP”.
 
 #### b. Usage
 


=====================================
docs/api/colormaps.md
=====================================
@@ -34,7 +34,7 @@ All GMT cpt files, e.g. the 20 built-in colormaps shown below, can be recognized
   <img width="600" src="https://docs.generic-mapping-tools.org/5.4/_images/GMT_App_M_1b.png">
 </p>
 
-### Colormaps from [cpt-city](http://soliton.vm.bytemark.co.uk/pub/cpt-city/views/totp-cpt.html) ###
+### Colormaps from [cpt-city](http://seaviewsensing.com/pub/cpt-city/views/totp-cpt.html) ###
 
 The following colormaps is included by default:
 
@@ -49,7 +49,7 @@ The following colormaps is included by default:
 + wiki-2.0
 + wiki-schwarzwald-d050
 + wiki-scotland
-+ More at [cpt-city](http://soliton.vm.bytemark.co.uk/pub/cpt-city/views/totp-cpt.html)
++ More at [cpt-city](http://seaviewsensing.com/pub/cpt-city/views/totp-cpt.html)
 
 ### Colormaps from [Scientific Color-Maps](http://www.fabiocrameri.ch/colourmaps.php) by Fabio Crameri ###
 


=====================================
docs/installation.md
=====================================
@@ -10,7 +10,7 @@ MintPy is available on the <a href="https://anaconda.org/conda-forge/mintpy">con
 conda install -c conda-forge mintpy
 ```
 
-or via <code>mamba</code> as:
+or via <code>mamba</code> [recommended] as:
 
 ```bash
 mamba install -c conda-forge mintpy
@@ -73,7 +73,7 @@ Install <a href="https://github.com/conda-forge/miniforge">miniforge</a> if you
 # for macOS, use Miniforge3-MacOSX-x86_64.sh instead.
 wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh
 bash Miniforge3-Linux-x86_64.sh -b -p ~/tools/miniforge
-~/tools/miniforge/bin/mamba init bash
+~/tools/miniforge/bin/conda init bash
 ```
 
 Install dependencies into a new environment, e.g. named "insar":


=====================================
pyproject.toml
=====================================
@@ -1,5 +1,5 @@
 [build-system]
-requires = ["setuptools>=64.0", "setuptools_scm[toml]>=8"]
+requires = ["setuptools>=77.0", "setuptools_scm[toml]>=8"]
 build-backend = "setuptools.build_meta"
 
 [project]
@@ -12,12 +12,11 @@ authors = [
 requires-python = ">=3.8"
 
 keywords = ["InSAR", "deformation", "time-series", "volcano", "tectonics", "geodesy", "geophysics", "remote-sensing"]
-license = {text = "GPL-3.0-or-later"}
+license = "GPL-3.0-or-later"
 classifiers=[
     "Development Status :: 4 - Beta",
     "Intended Audience :: Science/Research",
     "Topic :: Scientific/Engineering",
-    "License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)",
     "Operating System :: OS Independent",
     "Programming Language :: Python :: 3",
 ]


=====================================
src/mintpy/cli/plate_motion.py
=====================================
@@ -22,18 +22,14 @@ REFERENCE = """reference:
   Stephenson, O. L., Liu, Y. K., Yunjun, Z., Simons, M., Rosen, P. and Xu, X., (2022),
     The Impact of Plate Motions on Long-Wavelength InSAR-Derived Velocity Fields,
     Geophys. Res. Lett. 49, e2022GL099835, doi:10.1029/2022GL099835.
+  Liu, Y.-K., Yunjun, Z., & Simons, M. (2025). Inferring Tectonic Plate Rotations From
+    InSAR Time Series. Geophys. Res. Lett., 52(12), e2025GL115137, doi:10.1029/2025GL115137.
 
   # list of no-net-rotation (NNR) plate motion models (PMMs):
   # ONLY ITRF14 should be used, as Sentinel-1's orbit is in ITRF2014 reference frame.
   # Other values, e.g. MORVEL56, should be converted into ITR2014 before use.
-  ITRF14 - Table 1 of Altamimi et al. (2017) - 11 plates
-    Altamimi, Z., Métivier, L., Rebischung, P., Rouby, H., & Collilieux, X. (2017).
-    ITRF2014 plate motion model. Geophysical Journal International, 209(3), 1906-1912.
-    doi:10.1093/gji/ggx136
-  MORVEL - Table 1 of Argus et al. (2011) - 56 plates
-    Argus, D. F., Gordon, R. G., & DeMets, C. (2011). Geologically current motion of 56
-    plates relative to the no-net-rotation reference frame. Geochemistry, Geophysics,
-    Geosystems, 12(11). doi:10.1029/2011GC003751
+  ITRF14 - Table 1 of Altamimi et al. (2017, GJI) - 11 plates
+  MORVEL - Table 1 of Argus et al. (2011, G3) - 56 plates
 """
 
 EXAMPLE = """example:


=====================================
src/mintpy/cli/tsview.py
=====================================
@@ -108,6 +108,7 @@ def create_parser(subparsers=None):
     parser = arg_utils.add_map_argument(parser)
     parser = arg_utils.add_memory_argument(parser)
     parser = arg_utils.add_reference_argument(parser)
+    parser = arg_utils.add_shape_argument(parser)
     parser = arg_utils.add_save_argument(parser)
     parser = arg_utils.add_subset_argument(parser)
 


=====================================
src/mintpy/cli/view.py
=====================================
@@ -40,8 +40,9 @@ EXAMPLE = """example:
   view.py geo_velocity_msk.h5 velocity --show-gnss --gnss-comp enu2los --ref-gnss GV01 --gnss-source ESESES
   view.py geo_timeseries_ERA5_ramp_demErr.h5 20180619 --ref-date 20141213 --show-gnss --gnss-comp enu2los --ref-gnss GV01
 
-  # Faults
-  view.py filt_dense_offsets.bil range --faultline simple_fault_confident.lonlat
+  # Polygon/lines, e.g. faults
+  view.py geo_velocity.h5 velocity --shp-file highway.shp ringroad.shp
+  view.py filt_dense_offsets.bil range --shp-file simple_fault_confident.lonlat
 
   # Save and Output
   view.py velocity.h5 --save
@@ -89,6 +90,7 @@ def create_parser(subparsers=None):
     parser = arg_utils.add_map_argument(parser)
     parser = arg_utils.add_memory_argument(parser)
     parser = arg_utils.add_point_argument(parser)
+    parser = arg_utils.add_shape_argument(parser)
     parser = arg_utils.add_reference_argument(parser)
     parser = arg_utils.add_save_argument(parser)
     parser = arg_utils.add_subset_argument(parser)
@@ -155,7 +157,7 @@ def cmd_line_parse(iargs=None):
             print('WARNING: --cbar-ext is NOT compatible with --dem-blend, ignore --cbar-ext and continue.')
 
     # check: conflicted options (geo-only options if input file is in radar-coordinates)
-    geo_opt_names = ['--coord', '--show-gnss', '--coastline', '--lalo-label', '--lalo-step', '--scalebar', '--faultline']
+    geo_opt_names = ['--coord', '--show-gnss', '--coastline', '--lalo-label', '--lalo-step', '--scalebar', '--shp-file']
     geo_opt_names = list(set(geo_opt_names) & set(inps.argv))
     if geo_opt_names and 'Y_FIRST' not in readfile.read_attribute(inps.file).keys():
         for opt_name in geo_opt_names:


=====================================
src/mintpy/objects/colors.py
=====================================
@@ -5,7 +5,7 @@
 # Author: Zhang Yunjun, 2019                               #
 ############################################################
 # Recommend import:
-#     from mintpy.objects.colors import ColormapExt
+#     from mintpy.objects.colors import ColormapExt, save_cpt_file
 #     from mintpy.utils import plot as pp
 #     cmap = pp.ColormapExt('cmy').colormap
 
@@ -18,7 +18,7 @@ import re
 import numpy as np
 from matplotlib import pyplot as plt
 from matplotlib.cm import ScalarMappable
-from matplotlib.colors import LinearSegmentedColormap, to_rgb
+from matplotlib.colors import LinearSegmentedColormap, Normalize, to_rgb
 
 import mintpy
 
@@ -200,8 +200,6 @@ class ColormapExt(ScalarMappable):
                      '#bf86f6', '#c67ff6', '#cc79f6', '#d273f6', '#d86df6', '#de67f6',
                      '#e561f6', '#e967ec', '#ed6de2', '#f173d7']
             colormap = LinearSegmentedColormap.from_list('dismph', clist, N=cmap_lut)
-            #colormap = self.cmap_map(lambda x: x/2 + 0.5, colormap)  # brighten colormap
-            #colormap = self.cmap_map(lambda x: x*0.75, colormap)     # darken colormap
             colormap.set_bad('w', 0.0)
 
         elif cmap_name == 'cmy':
@@ -273,143 +271,197 @@ class ColormapExt(ScalarMappable):
             if os.path.isfile(cpt_file):
                 break
 
-        colormap = self.read_cpt_file(cpt_file, cmap_lut=cmap_lut)
+        colormap = read_cpt_file(cpt_file, cmap_lut=cmap_lut)
 
         if reverse_colormap:
             colormap = colormap.reversed()
         return colormap
 
+################################## ColormapExt class end ########################################
 
-    @staticmethod
-    def read_cpt_file(cpt_file, cmap_lut=256):
-        """Read *.cpt file into colorDict
-        Modified from Scipy Cookbook originally written by James Boyle.
-        Link: http://scipy-cookbook.readthedocs.io/items/Matplotlib_Loading_a_colormap_dynamically.html
-        """
-        if not os.path.isfile(cpt_file):
-            raise FileNotFoundError(f"file {cpt_file} not found")
-
-        # read file into list of strings
-        with open(cpt_file) as f:
-            lines = f.readlines()
 
-        # list of string --> x/r/g/b
-        x, r, g, b = [], [], [], []
-        colorModel = "RGB"
-        for line in lines:
-            ls = re.split(' |\t|\n|/', line)
 
-            # skip empty lines
-            if not ls:
-                continue
+################################## Utility Functions ############################################
 
-            # remove empty element
-            ls = [i for i in ls if i]
+def read_cpt_file(cpt_file, cmap_lut=256, print_msg=False):
+    """Read *.cpt file into matplotlib colormap object.
+    Modified from Scipy Cookbook originally written by James Boyle.
+    Link: http://scipy-cookbook.readthedocs.io/items/Matplotlib_Loading_a_colormap_dynamically.html
 
-            # parse header info
-            if line[0] == "#":
-                if ls[-1] == "HSV":
-                    colorModel = "HSV"
-                    continue
-                else:
-                    continue
+    Parameters: cpt_file - str, path to the GMT-compatible CPT file
+                cmap_lut - int, number of color steps
+    Returns:    colormap - matplotlib colormap object
+    """
 
-            # skip BFN info
-            if ls[0] in ["B", "F", "N"]:
+    if not os.path.isfile(cpt_file):
+        raise FileNotFoundError(f"file {cpt_file} not found")
+
+    # read file into list of strings
+    if print_msg:
+        print(f'read CPT file: {cpt_file}')
+    with open(cpt_file) as f:
+        lines = f.readlines()
+
+    # list of string --> x/r/g/b
+    x, r, g, b = [], [], [], []
+    colorModel = "RGB"
+    for line in lines:
+        # skip lines containing only whitespace or tabs
+        if not line.strip():
+            continue
+        ls = re.split(' |\t|\n|/', line)
+
+        # remove empty element
+        ls = [i for i in ls if i]
+
+        # parse header info
+        if line[0] == "#":
+            if ls[-1] == "HSV":
+                colorModel = "HSV"
+                continue
+            else:
                 continue
 
-            # convert color name (in GMT cpt file sometimes) to rgb values
-            if not isnumber(ls[1]):
-                ls0 = list(ls) + [0,0]
-                ls0[1:4] = [i*255. for i in to_rgb(ls[1])]
-                ls0[4:] = ls[2:]
-                ls = list(ls0)
-
-            if not isnumber(ls[5]):
-                ls0 = list(ls) + [0,0]
-                ls0[5:8] = [i*255. for i in to_rgb(ls[5])]
-                ls = list(ls0)
-
-            # convert str to float
-            ls = [float(i) for i in ls]
-
-            # parse color vectors
-            x.append(ls[0])
-            r.append(ls[1])
-            g.append(ls[2])
-            b.append(ls[3])
-
-            # save last row
-            xtemp = ls[4]
-            rtemp = ls[5]
-            gtemp = ls[6]
-            btemp = ls[7]
-        x.append(xtemp)
-        r.append(rtemp)
-        g.append(gtemp)
-        b.append(btemp)
-
-        x = np.array(x, np.float32)
-        r = np.array(r, np.float32)
-        g = np.array(g, np.float32)
-        b = np.array(b, np.float32)
-
-        if colorModel == "HSV":
-            # convert HSV to RGB
-            for i in range(r.shape[0]):
-                r[i], g[i], b[i] = colorsys.hsv_to_rgb(r[i]/360., g[i], b[i])
-        elif colorModel == "RGB":
-            r /= 255.
-            g /= 255.
-            b /= 255.
-
-        # x/r/g/b --> colorDict
-        red, blue, green = [], [], []
-        xNorm = (x - x[0]) / (x[-1] - x[0])
-        for i in range(len(x)):
-            red.append((xNorm[i], r[i], r[i]))
-            green.append((xNorm[i], g[i], g[i]))
-            blue.append((xNorm[i], b[i], b[i]))
-
-        # return colormap
-        cmap_name = os.path.splitext(os.path.basename(cpt_file))[0]
-        colorDict = {"red":tuple(red), "green":tuple(green), "blue":tuple(blue)}
-        colormap = LinearSegmentedColormap(cmap_name, colorDict, N=cmap_lut)
-        return colormap
-
-
-    @staticmethod
-    def cmap_map(function, cmap):
-        """ Applies function (which should operate on vectors of shape 3: [r, g, b]), on colormap cmap.
-        This routine will break any discontinuous points in a colormap.
-        Link: http://scipy-cookbook.readthedocs.io/items/Matplotlib_ColormapTransformations.html
-        """
-        cdict = cmap._segmentdata
-        step_dict = {}
-
-        # First get the list of points where the segments start or end
-        for key in ('red', 'green', 'blue'):
-            step_dict[key] = list(map(lambda x: x[0], cdict[key]))
-        step_list = sum(step_dict.values(), [])
-        step_list = np.array(list(set(step_list)))
-
-        # Then compute the LUT, and apply the function to the LUT
-        reduced_cmap = lambda step : np.array(cmap(step)[0:3])
-        old_LUT = np.array(list(map(reduced_cmap, step_list)))
-        new_LUT = np.array(list(map(function, old_LUT)))
-
-        # Now try to make a minimal segment definition of the new LUT
-        cdict = {}
-        for i, key in enumerate(['red','green','blue']):
-            this_cdict = {}
-            for j, step in enumerate(step_list):
-                if step in step_dict[key]:
-                    this_cdict[step] = new_LUT[j, i]
-                elif new_LUT[j,i] != old_LUT[j, i]:
-                    this_cdict[step] = new_LUT[j, i]
-            colorvector = list(map(lambda x: x + (x[1], ), this_cdict.items()))
-            colorvector.sort()
-            cdict[key] = colorvector
-        return LinearSegmentedColormap('colormap',cdict,1024)
-
-################################## ColormapExt class end ########################################
+        # skip BFN info
+        if ls[0] in ["B", "F", "N"]:
+            continue
+
+        # convert color name (in GMT cpt file sometimes) to rgb values
+        if not isnumber(ls[1]):
+            ls0 = list(ls) + [0,0]
+            ls0[1:4] = [i*255. for i in to_rgb(ls[1])]
+            ls0[4:] = ls[2:]
+            ls = list(ls0)
+        if not isnumber(ls[5]):
+            ls0 = list(ls) + [0,0]
+            ls0[5:8] = [i*255. for i in to_rgb(ls[5])]
+            ls = list(ls0)
+
+        # convert str to float
+        ls = [float(i) for i in ls]
+
+        # parse color vectors
+        x.append(ls[0])
+        r.append(ls[1])
+        g.append(ls[2])
+        b.append(ls[3])
+
+        # save last row
+        xtemp = ls[4]
+        rtemp = ls[5]
+        gtemp = ls[6]
+        btemp = ls[7]
+
+    x.append(xtemp)
+    r.append(rtemp)
+    g.append(gtemp)
+    b.append(btemp)
+    x = np.array(x, np.float32)
+    r = np.array(r, np.float32)
+    g = np.array(g, np.float32)
+    b = np.array(b, np.float32)
+
+    if colorModel == "HSV":
+        # convert HSV to RGB
+        for i in range(r.shape[0]):
+            r[i], g[i], b[i] = colorsys.hsv_to_rgb(r[i]/360., g[i], b[i])
+    elif colorModel == "RGB":
+        r /= 255.
+        g /= 255.
+        b /= 255.
+
+    # x/r/g/b --> color_dict
+    red, blue, green = [], [], []
+    xNorm = (x - x[0]) / (x[-1] - x[0])
+    for i in range(len(x)):
+        red.append((xNorm[i], r[i], r[i]))
+        green.append((xNorm[i], g[i], g[i]))
+        blue.append((xNorm[i], b[i], b[i]))
+
+    # return colormap
+    cmap_name = os.path.splitext(os.path.basename(cpt_file))[0]
+    color_dict = {"red":tuple(red), "green":tuple(green), "blue":tuple(blue)}
+    colormap = LinearSegmentedColormap(cmap_name, color_dict, N=cmap_lut)
+
+    return colormap
+
+
+def save_cpt_file(colormap, cpt_file, cmap_lut=256, vmin=0, vmax=1, print_msg=True):
+    """Save matplotlib colormap object into GMT-compatible CPT file.
+
+    Parameters: colormap - matplotlib colormap object
+                cpt_file - str, path to the output CPT file
+                cmap_lut - int, number of color steps
+                vmin/max - float, data range for the CPT file
+    Returns:    cpt_file - str, path to the output CPT file
+    Examples:   save_cpt_file("viridis", "viridis.cpt", cmap_lut=256, vmin=0, vmax=1)
+    """
+    # check inputs
+    if vmin >= vmax:
+        raise ValueError(f"Invalid vmin/vmax: vmin ({vmin}) must be less than vmax ({vmax}).")
+
+    # Get the colormap object
+    colormap = plt.get_cmap(colormap) if isinstance(colormap, str) else colormap
+
+    # Sample values evenly between vmin and vmax
+    values = np.linspace(vmin, vmax, cmap_lut)
+    norm = Normalize(vmin=vmin, vmax=vmax)
+
+    cpt_file = os.path.abspath(cpt_file)
+    if print_msg:
+        print(f'save colormap to CPT file: {cpt_file}')
+
+    with open(cpt_file, 'w') as f:
+        f.write("# COLOR_MODEL = RGB\n")
+        for i in range(cmap_lut - 1):
+            # Normalize to [0, 1] and get RGBA
+            c1 = np.array(colormap(norm(values[i]))[:3]) * 255
+            c2 = np.array(colormap(norm(values[i + 1]))[:3]) * 255
+            f.write(f"{values[i]:.6f} {int(round(c1[0]))} {int(round(c1[1]))} {int(round(c1[2]))} "
+                    f"{values[i+1]:.6f} {int(round(c2[0]))} {int(round(c2[1]))} {int(round(c2[2]))}\n")
+
+        # Optional: NaN color
+        f.write("B 0 0 0\n")       # Background
+        f.write("F 255 255 255\n") # Foreground
+        f.write("N 128 128 128\n") # NaN color
+
+    return cpt_file
+
+
+def cmap_map(function, cmap):
+    """ Applies function (which should operate on vectors of shape 3: [r, g, b]), on colormap cmap.
+    This routine will break any discontinuous points in a colormap.
+    Link: http://scipy-cookbook.readthedocs.io/items/Matplotlib_ColormapTransformations.html
+
+    Examples:
+        colormap = cmap_map(lambda x: x/2 + 0.5, colormap)  # brighten colormap
+        colormap = cmap_map(lambda x: x*0.75, colormap)     # darken   colormap
+    """
+    cdict = cmap._segmentdata
+    step_dict = {}
+
+    # First get the list of points where the segments start or end
+    for key in ('red', 'green', 'blue'):
+        step_dict[key] = list(map(lambda x: x[0], cdict[key]))
+    step_list = sum(step_dict.values(), [])
+    step_list = np.array(list(set(step_list)))
+
+    # Then compute the LUT, and apply the function to the LUT
+    reduced_cmap = lambda step : np.array(cmap(step)[0:3])
+    old_LUT = np.array(list(map(reduced_cmap, step_list)))
+    new_LUT = np.array(list(map(function, old_LUT)))
+
+    # Now try to make a minimal segment definition of the new LUT
+    cdict = {}
+    for i, key in enumerate(['red','green','blue']):
+        this_cdict = {}
+        for j, step in enumerate(step_list):
+            if step in step_dict[key]:
+                this_cdict[step] = new_LUT[j, i]
+            elif new_LUT[j,i] != old_LUT[j, i]:
+                this_cdict[step] = new_LUT[j, i]
+        colorvector = list(map(lambda x: x + (x[1], ), this_cdict.items()))
+        colorvector.sort()
+        cdict[key] = colorvector
+
+    return LinearSegmentedColormap('colormap',cdict,1024)
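
A minimal usage sketch of the functions refactored above (read_cpt_file and save_cpt_file are now module-level, per the updated import hint at the top of colors.py); the output file name and the vmin/vmax values below are illustrative only:

    from mintpy.objects.colors import read_cpt_file, save_cpt_file

    # write matplotlib's built-in 'viridis' colormap into a GMT-compatible CPT file ...
    cpt_file = save_cpt_file('viridis', 'viridis.cpt', cmap_lut=256, vmin=-2, vmax=2)

    # ... then load it back as a matplotlib colormap object
    cmap = read_cpt_file(cpt_file, cmap_lut=256, print_msg=True)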


=====================================
src/mintpy/objects/progress.py
=====================================
@@ -107,7 +107,10 @@ class progressBar:
         # Figure out the new percent done (round to an integer)
         diffFromMin = float(self.amount - self.min)
         percentDone = (diffFromMin / float(self.span)) * 100.0
-        percentDone = int(np.round(percentDone))
+        # prevent ZeroDivisionError when span is zero
+        percentDone = 100 if self.span == 0 else percentDone
+        # clamp to the [0,100] range
+        percentDone = max(0, min(100, round(percentDone)))
 
         # Figure out how many hash bars the percentage should be
         allFull = self.width - 2 - 18


=====================================
src/mintpy/objects/sensor.py
=====================================
@@ -24,6 +24,7 @@ SENSOR_NAME_VARIATION = {
     'ksat5' : ['ksat5', 'kompsat5', 'kompsat', 'kmps5'],
     'lt1'   : ['lt1', 'lt', 'lutan', 'lutan1'],
     'ni'    : ['ni', 'nisar'],
+    'saocom': ['saocom', 'sao', 'saocom1a', 'saocom1b'],
     'rs1'   : ['rs1', 'rsat', 'rsat1', 'radarsat', 'radarsat1'],
     'rs2'   : ['rs2', 'rsat2', 'radarsat2'],
     'rcm'   : ['rcm', 'rsatc', 'radarsat-constellation', 'radarsat-constellation-mission'],
@@ -129,7 +130,7 @@ def get_unavco_mission_name(meta_dict):
         return mission_name
 
     # Convert to UNAVCO Mission name
-    ## ERS, ENV, S1, RS1, RS2, CSK, TSX, JERS, ALOS, ALOS2
+    ## ERS, ENV, S1, RS1, RS2, CSK, TSX, JERS, ALOS, ALOS2, SAOCOM
     if value.startswith(('alos', 'palsar')):
         if value.endswith('2'):
             mission_name = 'ALOS2'
@@ -161,6 +162,9 @@ def get_unavco_mission_name(meta_dict):
     elif value.startswith(('tsx', 'tdx', 'terra', 'tandem')):
         mission_name = 'TSX'
 
+    elif value.startswith(('saocom', 'sao')):
+        mission_name = 'SAOCOM'
+
     elif value.startswith('uav'):
         mission_name = 'UAV'
 
@@ -221,6 +225,7 @@ SWOT = {
 # end    date: operational
 # from Table 1 in Jung et al. (2014)
 # https://www.eoportal.org/satellite-missions/terrasar-x
+# Bachmann et al. (2010, IEEE-TGRS), https://doi.org/10.1109/TGRS.2009.2033934
 TSX = {
     # orbit
     'altitude'                   : 514.8e3,   # m, mean value, 505-533 km
@@ -228,8 +233,8 @@ TSX = {
     'repeat_cycle'               : 11,        # day
     # sar / antenna
     'carrier_frequency'          : 9.65e9,    # Hz
-    'antenna_length'             : 4.8,       # m
-    'antenna_width'              : 0.8,       # m
+    'antenna_length'             : 4.8,       # m, Bachmann et al. (2010)
+    'antenna_width'              : 0.7,       # m, Bachmann et al. (2010)
     'doppler_bandwidth'          : 2770,      # Hz
     'pulse_repetition_frequency' : 3800,      # Hz
     'chirp_bandwidth'            : 100e6,     # Hz
@@ -298,6 +303,18 @@ ICEYE = {
     'chirp_bandwidth'            : [37.6e6, 299e6], # Hz
 }
 
+# TeLEOS-2
+# launch date: 2023-04-22
+# end    date: operational
+# https://geo-insights.ai/wp-content/uploads/2024/06/ST-Engineering-Geo-Insights-TeLEOS-2-Datasheet.pdf
+# https://www.eoportal.org/satellite-missions/teleos-1
+TELEOS2 = {
+    # orbit
+    'altitude'                   : 574e3,           # m, near-equatorial
+    'orbit_inclination'          : 10,              # deg
+    # sar / antenna
+}
+
 
 ##--------------------  C-band  --------------------##
 
@@ -410,6 +427,7 @@ RCM = {
 # https://www.eoportal.org/satellite-missions/gaofen-3
 # Li et al. (2018, RS) at https://doi.org/10.3390/rs10121929
 # Table I in Yang et al. (2023, IEEE-TGRS) at https://doi.org/10.1109/TGRS.2023.3238707
+# Table 2 in Sun et al. (2017, RS) at https://doi.org/10.3390/s17102419
 GF3 = {
     # orbit
     'altitude'                   : 755e3,     # m
@@ -418,7 +436,7 @@ GF3 = {
     # sar / antenna
     'carrier_frequency'          : 5.4e9,     # Hz
     'antenna_length'             : 15,        # m
-    'antenna_width'              : 1.232,     # m
+    'antenna_width'              : 1.232,     # m, Table 2 in Sun et al. (2017)
     'pulse_repetition_frequency' : 1412.18,   # Hz
     'chirp_bandwidth'            : 60.00e6,   # Hz
     'sampling_frequency'         : 533.33e6,  # Hz, IF sampling
@@ -469,18 +487,21 @@ SEN = {
 # https://www.eoportal.org/satellite-missions/hj-1
 # Liu et al. (2014, J Radar), doi: 10.3724/SP.J.1300.2013.13050
 # Zhang et al. (2014, J Radar), doi: https://doi.org/10.3724/SP.J.1300.2014.13135
+# Yu et al. (2014, J Radar), doi: https://doi.org/10.3724/sp.J.1300.2013.13050
 # spatial resolution: 10 m (4 looks)
 # swath width: 100 km
 HJ1C = {
     # orbit
     'altitude'                   : 502e3,     # m
-    'orbit_inclination'          : 97.3,      # deg
+    'orbit_inclination'          : 97.3671,   # deg, Yu et al. (2014)
     'repeat_cycle'               : 31,        # day
     # sar / antenna
     'carrier_frequency'          : 3.13e9,    # Hz
+    'antenna_length'             : 6.0,       # m, Yu et al. (2014)
+    'antenna_width'              : 2.8,       # m, Yu et al. (2014)
     'pulse_repetition_frequency' : 2600,      # Hz, 2600-3700
     'chirp_bandwidth'            : 60.0e6,    # Hz
-    'noise_equivalent_sigma_zero': -22,       # dB
+    'noise_equivalent_sigma_zero': -19,       # dB, Yu et al. (2014)
 }
 
 # NISAR S-band
@@ -505,8 +526,10 @@ NISAR_S = {
 # Seasat
 # launch date: 1978-06-27
 # end    date: 1978-10-10
-# from Table 6-1 in Kim and Jordan (2006)
+# References:
+# Table 6-1 in Kim and Jordan (2006)
 # https://www.eoportal.org/satellite-missions/seasat
+# Table 1.2 in Curlander & Mcdonough (1991)
 SEASAT = {
     # orbit
     'altitude'                   : 787e3,     # m, mean value, 775-799 km
@@ -518,6 +541,7 @@ SEASAT = {
     'antenna_width'              : 2.16,      # m
     'pulse_repetition_frequency' : 1555,      # Hz, 1463-1647
     'chirp_bandwidth'            : 19e6,      # Hz
+    'noise_equivalent_sigma_zero': -18,       # dB, Table 1.2 in Curlander & Mcdonough (1991)
 }
 
 # JERS-1
@@ -613,6 +637,7 @@ ALOS2 = {
 # end    date: operational
 # https://www.eorc.jaxa.jp/ALOS/en/alos-4/a4_about_e.htm
 # https://www.eorc.jaxa.jp/ALOS/en/alos-4/a4_sensor_e.htm
+# https://www.eorc.jaxa.jp/ALOS/jp/activity/kyoto/pdf/2-07_KC25_ALOS4_Eng_A.pdf
 # using stripmap 200km at 3m mode as reference
 ALOS4 = {
     # orbit (same as ALOS-2)
@@ -621,6 +646,8 @@ ALOS4 = {
     'repeat_cycle'               : 14,        # day, (15-3/14 rev/day)
     # sar / antenna
     'carrier_frequency'          : 1257.5e6,  # Hz (spotlight, 3m SM), 1236.5/1257.5/1278.5 MHz
+    'antenna_length'             : 3.7,       # m
+    'antenna_width'              : 10.0,      # m
     'chirp_bandwidth'            : 84e6,      # Hz, 84/42/28
     'range_resolution'           : 3,         # m
     'noise_equivalent_sigma_zero': -20,       # dB, -20/-24/-28
@@ -638,7 +665,7 @@ SAOCOM = {
     'orbit_inclination'          : 97.86,     # deg
     'repeat_cycle'               : 16,        # day, single satellite
     # sar / antenna
-    'carrrier_frequency'         : 1.27414e9, # Hz
+    'carrier_frequency'          : 1.27414e9, # Hz
     'antenna_length'             : 10,        # m
     'pulse_repetition_frequency' : 4545,      # Hz
     'sampling_frequency'         : 50.0e6,    # Hz
@@ -769,6 +796,7 @@ SENSOR_DICT = {
     'lt1'   : LT1,
     'uav'   : UAV_L,
     'ni'    : NISAR_L,
+    'saocom': SAOCOM,
     # P-band
     'bio'   : BIOMASS,
 }
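
A small sketch of how the new 'saocom' entry in SENSOR_DICT can be used; the wavelength is simply SPEED_OF_LIGHT divided by the carrier frequency listed above:

    from mintpy.constants import SPEED_OF_LIGHT
    from mintpy.objects import sensor

    # look up the newly registered SAOCOM entry and derive its radar wavelength
    saocom = sensor.SENSOR_DICT['saocom']
    wavelength = SPEED_OF_LIGHT / saocom['carrier_frequency']   # 299792458 / 1.27414e9
    print(f'SAOCOM wavelength: {wavelength:.4f} m')             # ~0.2353 m (L-band)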


=====================================
src/mintpy/plate_motion.py
=====================================
@@ -10,6 +10,8 @@
 #   Stephenson, O. L., Liu, Y. K., Yunjun, Z., Simons, M., Rosen, P. and Xu, X., (2022),
 #     The Impact of Plate Motions on Long-Wavelength InSAR-Derived Velocity Fields,
 #     Geophys. Res. Lett. 49, e2022GL099835, doi:10.1029/2022GL099835.
+#   Liu, Y.-K., Yunjun, Z., & Simons, M. (2025). Inferring Tectonic Plate Rotations From
+#     InSAR Time Series. Geophys. Res. Lett., 52(12), e2025GL115137, doi:10.1029/2025GL115137.
 
 
 import numpy as np


=====================================
src/mintpy/prep_hyp3.py
=====================================
@@ -7,55 +7,70 @@
 
 import datetime as dt
 import os
+import re
 
 from mintpy.constants import SPEED_OF_LIGHT
 from mintpy.objects import sensor
 from mintpy.utils import readfile, utils1 as ut, writefile
 
-
 #########################################################################
+
+def _get_product_name_and_type(filename: str) -> tuple[str, str]:
+    if match := re.match(
+        r'S1_\d{6}_IW[123](_\d{8}){2}_(VV|HH)_INT\d{2}_[0-9A-F]{4}',
+        filename,
+    ):
+        job_type = 'INSAR_ISCE_BURST'
+
+    elif match := re.match(
+        r'S1_\d{3}_\d{6}s1n\d{2}-\d{6}s2n\d{2}-\d{6}s3n\d{2}_IW(_\d{8}){2}_(VV|HH)_INT\d{2}_[0-9A-F]{4}',
+        filename,
+    ):
+        job_type = 'INSAR_ISCE_MULTI_BURST'
+
+    elif match := re.match(
+        r'S1[ABC]{2}(_\d{8}T\d{6}){2}_(VV|HH)[PRO]\d{3}_INT\d{2}_G_[uw][ec][123F]_[0-9A-F]{4}',
+        filename,
+    ):
+        job_type = 'INSAR_GAMMA'
+
+    else:
+        raise ValueError(f'Failed to parse product name from filename: {filename}')
+
+    return match.group(), job_type
+
+
 def add_hyp3_metadata(fname, meta, is_ifg=True):
-    '''Read/extract metadata from HyP3 metadata file and add to metadata dictionary.
+    """Read/extract metadata from HyP3 metadata file and add to metadata dictionary.
 
-    Two types of ASF HyP3 products are supported: isce2_burst, gamma_scene
-    1. isce2_burst (burst-wide product using ISCE2) metadata file:
-        format: {SAT}_{FRAME}_{SUBSWATH}_{DATE1}_{DATE2}_{POL}_{RES}_{IDS}.txt
+    Three types of ASF HyP3 products are supported:
+
+    1. INSAR_ISCE_BURST (legacy single-burst product using ISCE2) metadata file:
+        format: https://hyp3-docs.asf.alaska.edu/guides/burst_insar_product_guide/#naming-convention-insar_isce_burst
         example: S1_213524_IW1_20170411_20170517_VV_INT80_8E81.txt
-        content:
-            Reference Granule: S1_213524_IW1_20170411T133605_VV_BD30-BURST
-            ...
-    2. gamma_scene (scene-wide product using Gamma) metadata file:
-        format: {SAT}_{DATE1}_{DATE2}_{POL}_{RES}_{SOFT}_{PROC}_{IDS}.txt
+
+    2. INSAR_ISCE_MULTI_BURST (multi-burst product using ISCE2) metadata file:
+        format: https://hyp3-docs.asf.alaska.edu/guides/burst_insar_product_guide/#naming-convention-insar_isce_multi_burst
+        example: S1_064_000000s1n00-136231s2n02-000000s3n00_IW_20200604_20200616_VV_INT80_77F1
+
+    3. INSAR_GAMMA (scene-wide product using GAMMA) metadata file:
+        format: https://hyp3-docs.asf.alaska.edu/guides/insar_product_guide/#naming-convention
         example: S1AA_20190610T135156_20190622T135157_VVP012_INT80_G_ueF_F8BF.txt
-        content:
-            Reference Granule: S1A_IW_SLC__1SDV_20190704T135158_20190704T135225_027968_032877_1C4D
-            ...
 
     Parameters: fname  - str, path to the hyp3 data file, e.g. *unw_phase_clip*.tif, *dem_clip*.tif
                 meta   - dict, existing metadata
                 is_ifg - bool, is the data file interferogram (unw/corr) or geometry (dem/angles)
+
     Returns:    meta   - dict, return metadata
-    '''
-
-    # job_id -> prod_type and date1/2 objects
-    job_id = '_'.join(os.path.basename(fname).split('_')[:8])
-    if job_id.split('_')[2].startswith('IW'):
-        # burst-wide product using ISCE2
-        prod_type = 'isce2_burst'
-        date1, date2 = (dt.datetime.strptime(x,'%Y%m%d') for x in job_id.split('_')[3:5])
-    else:
-        # scene-wide product using Gamma
-        prod_type = 'gamma_scene'
-        date1, date2 = (dt.datetime.strptime(x,'%Y%m%dT%H%M%S') for x in job_id.split('_')[1:3])
+    """
+    product_name, job_type = _get_product_name_and_type(os.path.basename(fname))
 
-    # read hyp3 metadata file
-    meta_file = os.path.join(os.path.dirname(fname), f'{job_id}.txt')
+    meta_file = os.path.join(os.path.dirname(fname), f'{product_name}.txt')
     hyp3_meta = {}
     with open(meta_file) as f:
         for line in f:
             key, value = line.strip().replace(' ','').split(':')[:2]
             hyp3_meta[key] = value
-    ref_granule = hyp3_meta['ReferenceGranule']
 
     # add universal hyp3 metadata
     meta['PROCESSOR'] = 'hyp3'
@@ -94,59 +109,66 @@ def add_hyp3_metadata(fname, meta, is_ifg=True):
         meta['LON_REF4'] = str(W)
 
     # hard-coded metadata for Sentinel-1
-    if ref_granule.startswith('S1'):
-        meta['PLATFORM'] = 'Sen'
-        meta['ANTENNA_SIDE'] = -1
-        meta['WAVELENGTH'] = SPEED_OF_LIGHT / sensor.SEN['carrier_frequency']
-        meta['RANGE_PIXEL_SIZE'] = sensor.SEN['range_pixel_size'] * int(meta['RLOOKS'])
-        meta['AZIMUTH_PIXEL_SIZE'] = sensor.SEN['azimuth_pixel_size'] * int(meta['ALOOKS'])
+    meta['PLATFORM'] = 'Sen'
+    meta['ANTENNA_SIDE'] = -1
+    meta['WAVELENGTH'] = SPEED_OF_LIGHT / sensor.SEN['carrier_frequency']
+    meta['RANGE_PIXEL_SIZE'] = sensor.SEN['range_pixel_size'] * int(meta['RLOOKS'])
+    meta['AZIMUTH_PIXEL_SIZE'] = sensor.SEN['azimuth_pixel_size'] * int(meta['ALOOKS'])
 
     # HyP3 (incidence, azimuth) angle datasets are in the unit of radian,
     # which is different from the isce-2 convention of degree
     if any(x in os.path.basename(fname) for x in ['lv_theta', 'lv_phi']):
         meta['UNIT'] = 'radian'
 
-    # interferogram related metadata
-    if is_ifg:
-        meta['DATE12'] = f'{date1.strftime("%y%m%d")}-{date2.strftime("%y%m%d")}'
-        meta['P_BASELINE_TOP_HDR'] = hyp3_meta['Baseline']
-        meta['P_BASELINE_BOTTOM_HDR'] = hyp3_meta['Baseline']
-
-    # [optional] HDF-EOS5 metadata, including:
+    # HDF-EOS5 metadata, including:
     # beam_mode/swath, relative_orbit, first/last_frame, unwrap_method
-    if ref_granule.startswith('S1'):
-        # beam_mode
-        meta['beam_mode'] = 'IW'
 
-        if prod_type == 'isce2_burst':
-            # burst-wide product using ISCE2
-            meta['beam_swath'] = job_id.split('_')[2][2:]
+    meta['beam_mode'] = 'IW'
+    meta['unwrap_method'] = hyp3_meta['Unwrappingtype']
+
+    if job_type == 'INSAR_ISCE_BURST':
+        date1, date2 = (dt.datetime.strptime(x,'%Y%m%d') for x in product_name.split('_')[3:5])
+        meta['beam_swath'] = product_name.split('_')[2][2]
+
+        # relative_orbit [to be added]
+        # first/last_frame [to be added]
+
+    elif job_type == 'INSAR_ISCE_MULTI_BURST':
+        date1, date2 = (dt.datetime.strptime(x, '%Y%m%d') for x in product_name.split('_')[4:6])
+        swath_tokens = product_name.split('_')[2].split('-')
+        meta['beam_swath'] = ''.join(s[7] for s in swath_tokens if not s.startswith('000000s'))
+
+    else:
+        assert job_type == 'INSAR_GAMMA'
+
+        date1, date2 = (dt.datetime.strptime(x,'%Y%m%dT%H%M%S') for x in product_name.split('_')[1:3])
+        meta['beam_swath'] = '123'
 
-            # relative_orbit [to be added]
-            # first/last_frame [to be added]
+        ref_granule = hyp3_meta['ReferenceGranule']
+        assert ref_granule.startswith('S1')
 
+        abs_orbit = int(hyp3_meta['ReferenceOrbitNumber'])
+        if ref_granule.startswith('S1A'):
+            meta['relative_orbit'] = ((abs_orbit - 73) % 175) + 1
+        elif ref_granule.startswith('S1B'):
+            meta['relative_orbit'] = ((abs_orbit - 202) % 175) + 1
+        elif ref_granule.startswith('S1C'):
+            meta['relative_orbit'] = ((abs_orbit - 172) % 175) + 1
         else:
-            # scene-wide product using Gamma
-            meta['beam_swath'] = '123'
-
-            # relative_orbit
-            abs_orbit = int(hyp3_meta['ReferenceOrbitNumber'])
-            if ref_granule.startswith('S1A'):
-                meta['relative_orbit'] = ((abs_orbit - 73) % 175) + 1
-            elif ref_granule.startswith('S1B'):
-                meta['relative_orbit'] = ((abs_orbit - 202) % 175) + 1
-            else:
-                # add equation for Sentinel-C/D in the future
-                raise ValueError('Un-recognized Sentinel-1 satellite from {ref_granule}!')
-
-            # first/last_frame [to be completed]
-            t0, t1 = ref_granule.split('_')[-5:-3]
-            meta['startUTC'] = dt.datetime.strptime(t0, '%Y%m%dT%H%M%S').strftime('%Y-%m-%d %H:%M:%S.%f')
-            meta['stopUTC']  = dt.datetime.strptime(t1, '%Y%m%dT%H%M%S').strftime('%Y-%m-%d %H:%M:%S.%f')
-            # ascendingNodeTime [to be added]
-
-    # unwrap_method
-    meta['unwrap_method'] = hyp3_meta['Unwrappingtype']
+            # add equation for Sentinel-D in the future
+            raise ValueError(f'Un-recognized Sentinel-1 satellite from {ref_granule}!')
+
+        # first/last_frame [to be completed]
+        t0, t1 = ref_granule.split('_')[-5:-3]
+        meta['startUTC'] = dt.datetime.strptime(t0, '%Y%m%dT%H%M%S').strftime('%Y-%m-%d %H:%M:%S.%f')
+        meta['stopUTC']  = dt.datetime.strptime(t1, '%Y%m%dT%H%M%S').strftime('%Y-%m-%d %H:%M:%S.%f')
+        # ascendingNodeTime [to be added]
+
+    # interferogram related metadata
+    if is_ifg:
+        meta['DATE12'] = f'{date1.strftime("%y%m%d")}-{date2.strftime("%y%m%d")}'
+        meta['P_BASELINE_TOP_HDR'] = hyp3_meta['Baseline']
+        meta['P_BASELINE_BOTTOM_HDR'] = hyp3_meta['Baseline']
 
     return meta
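
For illustration, a minimal sketch of how the new regex-based classification behaves on file names matching the three supported HyP3 job types (the names below follow the examples in the docstring and the new tests/data files; the *_unw_phase_clip.tif suffixes are made up, and _get_product_name_and_type is an internal helper):

    from mintpy.prep_hyp3 import _get_product_name_and_type

    fnames = [
        'S1_056072_IW2_20220814_20220907_VV_INT80_E09B_unw_phase_clip.tif',
        'S1_044_000000s1n00-093117s2n01-093118s3n01_IW_20250718_20250730_VV_INT80_B4FA_unw_phase_clip.tif',
        'S1AC_20251001T204513_20251007T204359_HHR006_INT40_G_ueF_1DBE_unw_phase_clip.tif',
    ]
    for fname in fnames:
        name, job_type = _get_product_name_and_type(fname)
        print(f'{job_type:22s} {name}')
    # expected order: INSAR_ISCE_BURST, INSAR_ISCE_MULTI_BURST, INSAR_GAMMA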
 


=====================================
src/mintpy/tsview.py
=====================================
@@ -185,11 +185,11 @@ def read_init_info(inps):
 
     # do not plot native reference point if it's out of the coverage due to subset
     if (inps.ref_yx and 'Y_FIRST' in atr.keys()
-        and inps.ref_yx == (int(atr.get('REF_Y',-999)), int(atr.get('REF_X',-999)))
+        and is_native_reference_point(inps.ref_yx, atr)
         and not (    inps.pix_box[0] <= inps.ref_yx[1] < inps.pix_box[2]
                  and inps.pix_box[1] <= inps.ref_yx[0] < inps.pix_box[3])):
         inps.disp_ref_pixel = False
-        vprint('the native REF_Y/X is out of subset box, thus do not display')
+        vprint('WARNING: the native REF_Y/X is out of subset box, thus do not display')
 
     ## initial pixel coord
     if inps.lalo:
@@ -260,6 +260,25 @@ def subset_and_multilook_yx(yx, pix_box=None, multilook_num=1):
     return (y, x)
 
 
+def is_native_reference_point(ref_yx, atr, max_err=2):
+    """Check if the given ref_yx is the native reference point or not.
+
+    Parameters: ref_yx  - list of int, input reference point in row/col
+                atr     - dict, attributes, to retrieve the native REF_Y/X
+                max_err - int, maximum allowable error to account for
+                          potential geo2radar coordinate conversion error
+    """
+    if 'REF_Y' not in atr.keys():
+        return False
+
+    ref_x = int(atr['REF_X'])
+    ref_y = int(atr['REF_Y'])
+    x0, x1 = ref_x - max_err, ref_x + max_err
+    y0, y1 = ref_y - max_err, ref_y + max_err
+
+    return x0 <= ref_yx[1] < x1 and y0 <= ref_yx[0] < y1
+
+
 def read_exclude_date(input_ex_date, dateListAll):
     """Read exclude list of dates
     Parameters: input_ex_date : list of string in YYYYMMDD or filenames for excluded dates
@@ -308,7 +327,7 @@ def read_timeseries_data(inps):
             vprint('input data is complex, calculate its amplitude and continue')
             data = np.abs(data)
 
-        if inps.ref_yx and inps.ref_yx != (int(atr.get('REF_Y', -1)), int(atr.get('REF_X', -1))):
+        if inps.ref_yx and not is_native_reference_point(inps.ref_yx, atr):
             (ry, rx) = subset_and_multilook_yx(inps.ref_yx, inps.pix_box, inps.multilook_num)
             ref_phase = data[:, ry, rx]
             data -= np.tile(ref_phase.reshape(-1, 1, 1), (1, data.shape[-2], data.shape[-1]))
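
A short sketch of the new is_native_reference_point() helper with made-up attribute values, showing the +/- 2 pixel tolerance for geo-to-radar conversion error:

    from mintpy.tsview import is_native_reference_point

    atr = {'REF_Y': '120', 'REF_X': '350'}               # illustrative metadata
    print(is_native_reference_point((121, 351), atr))    # True  (within max_err=2 of the native point)
    print(is_native_reference_point((130, 351), atr))    # False (row is too far from REF_Y)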


=====================================
src/mintpy/utils/arg_utils.py
=====================================
@@ -347,16 +347,6 @@ def add_map_argument(parser):
                       metavar='NUM', type=float, default=1,
                       help='Coastline linewidth (default: %(default)s).')
 
-    # faultline
-    mapg.add_argument('--faultline', dest='faultline_file', type=str,
-                      help='Draw fault line using specified GMT lonlat file.')
-    mapg.add_argument('--faultline-lw', '--faultline-linewidth', dest='faultline_linewidth',
-                      metavar='NUM', type=float, default=0.5,
-                      help='Faultline linewidth (default: %(default)s).')
-    mapg.add_argument('--faultline-min-dist','--faultline-min-len', dest='faultline_min_dist',
-                      metavar='NUM', type=float, default=0.1,
-                      help='Show fault segments with length >= X km (default: %(default)s).')
-
     # lalo label
     mapg.add_argument('--lalo-label', dest='lalo_label', action='store_true',
                       help='Show N, S, E, W tick label for plot in geo-coordinate.\n'
@@ -484,6 +474,22 @@ def add_save_argument(parser):
     return parser
 
 
+def add_shape_argument(parser):
+    """Argument group parser to plot shapes (line, polygon) in ESRI shapefile or GMT lonlat format."""
+    shp = parser.add_argument_group('Shapes', 'Plot various shapes (line, polygon) in ESRI or GMT format.')
+    shp.add_argument('--shp-file','--faultline', dest='shp_file', type=str, nargs='*', metavar='FILE',
+                     help='Shape files in ESRI shapefile or GMT lonlat format.')
+    shp.add_argument('--shp-color', dest='shp_color', type=str, default='k', metavar='COLOR',
+                     help='Shape color (default: %(default)s).')
+    shp.add_argument('--shp-lw','--shp-linewidth','--faultline-lw', dest='shp_linewidth',
+                     default=0.5, type=float, metavar='NUM',
+                     help='Shape linewidth (default: %(default)s).')
+    shp.add_argument('--shp-min-dist','--faultline-min-dist', dest='shp_min_dist', type=float,
+                     default=0.1, metavar='NUM',
+                     help='Do NOT plot line segments with length < X km (default: %(default)s).')
+    return parser
+
+
 def add_subset_argument(parser, geo=True):
     """Argument group parser for subset options"""
     sub = parser.add_argument_group('Subset', 'Display dataset in subset range')
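
A minimal parsing sketch for the new shape argument group defined above (the --faultline* options are kept as aliases of the --shp-* options); the input file names are hypothetical:

    import argparse
    from mintpy.utils import arg_utils

    parser = argparse.ArgumentParser()
    parser = arg_utils.add_shape_argument(parser)

    inps = parser.parse_args(['--shp-file', 'highway.shp', 'fault.lonlat', '--shp-lw', '1.0'])
    print(inps.shp_file)        # ['highway.shp', 'fault.lonlat']
    print(inps.shp_color)       # 'k'  (default)
    print(inps.shp_linewidth)   # 1.0
    print(inps.shp_min_dist)    # 0.1  (default)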


=====================================
src/mintpy/utils/plot.py
=====================================
@@ -388,9 +388,23 @@ def auto_colormap_name(metadata, cmap_name=None, datasetName=None, print_msg=Tru
 
 
 def auto_adjust_colormap_lut_and_disp_limit(data, num_multilook=1, max_discrete_num_step=20, print_msg=True):
+    """Auto adjust the colormap lookup table and display limit for the given 2D/3D matrix.
+
+    Parameters: data                  - 2D/3D np.ndarray, data to be displayed
+                num_multilook         - int, number of looks applied to avoid occasional large values
+                max_discrete_num_step - int, maximum number of color steps allowed for discrete colormaps
+    Returns:    cmap_lut              - int, number of colors in the colormap lookup table
+                vlim                  - list(float), min/max value for display
+                unique_values         - np.ndarray, unique values of the given data
+    """
+    # prevent empty input data
+    finite_values = np.ma.masked_invalid(data).compressed()
+    if finite_values.size == 0:
+        warnings.warn('NO pixel with finite value found!')
+        return 256, [0.0, 0.0], None
 
     # max step size / min step number for a uniform colormap
-    unique_values = np.unique(data[~np.isnan(data) * np.isfinite(data)])
+    unique_values = np.unique(finite_values)
     min_val = np.min(unique_values).astype(float)
     max_val = np.max(unique_values).astype(float)
 
@@ -1512,47 +1526,150 @@ def plot_colorbar(inps, im, cax):
     return inps, cbar
 
 
-def plot_faultline(ax, faultline_file, SNWE, linewidth=0.5, min_dist=0.1, print_msg=True):
-    """Plot fault lines.
+def plot_shape(ax, shp_files, SNWE, color='k', linewidth=0.5, min_dist=0.1, print_msg=True):
+    """Plot shapes (line, polygon) in ESRI shapefile or GMT lonlat format.
 
-    Parameters: ax             - matplotlib.axes object
-                faultline_file - str, path to the fault line file in GMT lonlat format
-                SNWE           - tuple of 4 float, for south, north, west and east
-    Returns:    ax             - matplotlib.axes object
-                faults         - list of 2D np.ndarray in size of [num_point, 2] in float32
-                                 with each row for one point in [lon, lat] in degrees
+    Parameters: ax        - matplotlib.axes object
+                shp_files - list(str), path(s) to the shape file in ESRI or GMT format
+                SNWE      - tuple of 4 float, for south, north, west and east
+                color     - str, line color
+                linewidth - float, linewidth in points
+                min_dist  - float, minimum segment distance (for GMT format only)
+    Returns:    ax        - matplotlib.axes object
     """
+    num_file = len(shp_files)
+    kwargs = dict(color=color, linewidth=linewidth, print_msg=print_msg)
 
-    if print_msg:
-        print(f'plot fault lines from GMT lonlat file: {faultline_file}')
+    for i, shp_file in enumerate(shp_files):
+        if print_msg:
+            print(f'plotting shapes from {i+1}/{num_file} files: {shp_file}')
+
+        if shp_file.endswith('.shp'):
+            plot_shapefile(ax, shp_file, **kwargs)
+
+        elif shp_file.endswith('.lonlat'):
+            plot_gmt_lonlat_file(ax, shp_file, SNWE, min_dist=min_dist, **kwargs)
+
+    # keep the same axis limit
+    S, N, W, E = SNWE
+    ax.set_xlim(W, E)
+    ax.set_ylim(S, N)
 
-    # read faults
+    return ax
+
+
+def plot_shapefile(ax, shp_file, color='k', linewidth=0.5, print_msg=True):
+    """Plot shapes (line or polygon) in ESRI shapefile format.
+
+    Parameters: ax        - matplotlib.axes object
+                shp_file  - str, path to the shape file in ESRI shapefile format
+                color     - str, line color
+                linewidth - float, linewidth in points
+    Returns:    ax       - matplotlib.axes object
+    """
+
+    from osgeo import ogr, osr
+
+    # read shapefile using GDAL
+    driver = ogr.GetDriverByName("ESRI Shapefile")
+    ds = driver.Open(shp_file, 0)
+    if ds is None:
+        raise RuntimeError(f"Could not open {shp_file} using GDAL/OGR!")
+    layer = ds.GetLayer()
+
+    # convert to lat/lon
+    source_srs = layer.GetSpatialRef()
+    if source_srs is None:
+        if print_msg:
+            print("⚠️ No CRS found in shapefile (.prj missing). Assuming WGS84.")
+        source_srs = osr.SpatialReference()
+        source_srs.ImportFromEPSG(4326)
+
+    target_srs = osr.SpatialReference()
+    target_srs.ImportFromEPSG(4326)   # WGS84 (lat/lon)
+    target_srs.SetAxisMappingStrategy(osr.OAMS_TRADITIONAL_GIS_ORDER)   # set to lon/lat order
+
+    transform = osr.CoordinateTransformation(source_srs, target_srs)
+
+    # plot: loop through each feature
+    kwargs = dict(color=color, linewidth=linewidth)
+    for feature in layer:
+        geom = feature.GetGeometryRef()
+        if not source_srs.IsGeographic():
+            if print_msg:
+                print("The shapefile is projected (e.g. UTM). Converting to lat/lon...")
+            geom.Transform(transform)  # convert to lat/lon
+        geom_type = geom.GetGeometryType()
+
+        def draw_polygon(polygon):
+            """Helper to draw single polygon."""
+            for i in range(polygon.GetGeometryCount()):
+                ring = polygon.GetGeometryRef(i)
+                x = [ring.GetX(j) for j in range(ring.GetPointCount())]
+                y = [ring.GetY(j) for j in range(ring.GetPointCount())]
+                ax.plot(x, y, **kwargs)
+
+        # handle different geometry types
+        if geom_type in (ogr.wkbPolygon, ogr.wkbPolygon25D):
+            draw_polygon(geom)
+
+        elif geom_type in (ogr.wkbMultiPolygon, ogr.wkbMultiPolygon25D):
+            for i in range(geom.GetGeometryCount()):
+                draw_polygon(geom.GetGeometryRef(i))
+
+        elif geom_type in (ogr.wkbLineString, ogr.wkbLineString25D):
+            x = [geom.GetX(i) for i in range(geom.GetPointCount())]
+            y = [geom.GetY(i) for i in range(geom.GetPointCount())]
+            ax.plot(x, y, **kwargs)
+
+        elif geom_type in (ogr.wkbMultiLineString, ogr.wkbMultiLineString25D):
+            for i in range(geom.GetGeometryCount()):
+                line = geom.GetGeometryRef(i)
+                x = [line.GetX(j) for j in range(line.GetPointCount())]
+                y = [line.GetY(j) for j in range(line.GetPointCount())]
+                ax.plot(x, y, **kwargs)
+
+        elif geom_type == ogr.wkbPoint:
+            ax.plot(geom.GetX(), geom.GetY(), "o", **kwargs)
+
+        else:
+            warnings.warn(f'Un-recognized geometry type: {geom_type}! Ignore and continue.')
+
+    return ax
+
+
+def plot_gmt_lonlat_file(ax, shp_file, SNWE, min_dist=0.1, color='k', linewidth=0.5, print_msg=True):
+    """Plot lines in GMT lonlat format.
+
+    Parameters: ax        - matplotlib.axes object
+                shp_file  - str, path to the line file in GMT lonlat format
+                SNWE      - tuple of 4 float, for south, north, west and east
+                min_dist  - float, minimum segment length in km to be plotted
+                color     - str, line color
+                linewidth - float, linewidth in points
+    Returns:    ax        - matplotlib.axes object
+    """
+    # read
     faults = readfile.read_gmt_lonlat_file(
-        faultline_file,
+        shp_file,
         SNWE=SNWE,
         min_dist=min_dist,
         print_msg=print_msg,
     )
 
     if len(faults) == 0:
-        warnings.warn(f'No fault lines found within {SNWE} with length >= {min_dist} km!')
-        print('  continue without fault lines.')
+        warnings.warn(f'No lines found within {SNWE} with length >= {min_dist} km! Skip plotting.')
         return ax, faults
 
     # plot
     print_msg = False if len(faults) < 1000 else print_msg
     prog_bar = ptime.progressBar(maxValue=len(faults), print_msg=print_msg)
     for i, fault in enumerate(faults):
-        ax.plot(fault[:,0], fault[:,1], 'k-', lw=linewidth)
+        ax.plot(fault[:,0], fault[:,1], color=color, linewidth=linewidth)
         prog_bar.update(i+1, every=10)
     prog_bar.close()
 
-    # keep the same axis limit
-    S, N, W, E = SNWE
-    ax.set_xlim(W, E)
-    ax.set_ylim(S, N)
-
-    return ax, faults
+    return ax
 
 
 def add_arrow(line, position=None, direction='right', size=15, color=None):


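For reviewers, a minimal usage sketch of the new plot_shape() helper introduced above; the file names and SNWE values are illustrative only and not part of this commit:

    import matplotlib.pyplot as plt
    from mintpy.utils import plot as pp

    # hypothetical inputs: one ESRI shapefile and one GMT lonlat file
    shp_files = ['faults.shp', 'coastline.lonlat']
    SNWE = (31.0, 32.5, 130.0, 131.5)   # south, north, west, east in degrees

    fig, ax = plt.subplots()
    # draws every supported geometry type, then restores the axis limits to SNWE
    pp.plot_shape(ax, shp_files=shp_files, SNWE=SNWE, color='r', linewidth=0.8)
    plt.show()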
=====================================
src/mintpy/utils/readfile.py
=====================================
@@ -353,6 +353,8 @@ def read(fname, box=None, datasetName=None, print_msg=True, xstep=1, ystep=1, da
     length, width = int(atr['LENGTH']), int(atr['WIDTH'])
     if not box:
         box = (0, 0, width, length)
+    elif box[0] < 0 or box[1] < 0 or box[2] > width or box[3] > length:
+        raise ValueError(f'Input box {tuple(box)} is NOT within the data size range (0, 0, {width}, {length})!')
 
     # read data
     kwargs = dict(


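The added check makes readfile.read() fail fast on an out-of-range subset box instead of reading past the data extent. A minimal sketch, assuming a file named velocity.h5 of 500 x 400 pixels (WIDTH x LENGTH) purely for illustration:

    from mintpy.utils import readfile

    # box is (x0, y0, x1, y1) in pixel coordinates
    data, atr = readfile.read('velocity.h5', box=(0, 0, 500, 400))     # OK: full extent
    data, atr = readfile.read('velocity.h5', box=(100, 50, 600, 400))  # now raises ValueError (x1 > WIDTH)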
=====================================
src/mintpy/utils/utils.py
=====================================
@@ -325,7 +325,7 @@ def transect_yx(z, atr, start_yx, end_yx, interpolation='nearest'):
 
     # Calculate Distance along the line
     earth_radius = 6.3781e6    # in meter
-    dist_unit = 'm'
+    #dist_unit = 'm'
     if 'Y_FIRST' in atr.keys():
         y_step = float(atr['Y_STEP'])
         x_step = float(atr['X_STEP'])
@@ -341,7 +341,7 @@ def transect_yx(z, atr, start_yx, end_yx, interpolation='nearest'):
         except KeyError:
             x_step = 1
             y_step = 1
-            dist_unit = 'pixel'
+            #dist_unit = 'pixel'
     dist_line = np.hypot((xs - x0) * x_step,
                          (ys - y0) * y_step)
 
@@ -355,7 +355,7 @@ def transect_yx(z, atr, start_yx, end_yx, interpolation='nearest'):
     transect['X'] = xs[mask]
     transect['value'] = z_line[mask]
     transect['distance'] = dist_line[mask]
-    transect['distance_unit'] = dist_unit
+    #transect['distance_unit'] = dist_unit
 
     return transect
 


=====================================
src/mintpy/view.py
=====================================
@@ -574,14 +574,15 @@ def plot_slice(ax, data, metadata, inps):
         else:
             raise ValueError(f'Un-recognized plotting style: {inps.style}!')
 
-        # Draw faultline using GMT lonlat file
-        if inps.faultline_file:
-            pp.plot_faultline(
+        # Draw shapes (line, polygon) from ESRI shapefile or GMT lonlat file
+        if inps.shp_file:
+            pp.plot_shape(
                 ax=ax,
-                faultline_file=inps.faultline_file,
+                shp_files=inps.shp_file,
                 SNWE=SNWE,
-                linewidth=inps.faultline_linewidth,
-                min_dist=inps.faultline_min_dist,
+                color=inps.shp_color,
+                linewidth=inps.shp_linewidth,
+                min_dist=inps.shp_min_dist,
                 print_msg=inps.print_msg,
             )
 


=====================================
tests/conftest.py
=====================================
@@ -0,0 +1,8 @@
+from pathlib import Path
+
+import pytest
+
+
+ at pytest.fixture()
+def test_data_dir() -> Path:
+    return Path(__file__).parent / 'data'


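Because conftest.py sits at the top of tests/, pytest injects the test_data_dir fixture into any test that names it as an argument. A hypothetical example (not part of the commit):

    from pathlib import Path

    def test_data_dir_contains_hyp3_metadata(test_data_dir: Path):
        # the fixture resolves to tests/data/ next to conftest.py
        assert test_data_dir.is_dir()
        assert any(test_data_dir.glob('*.txt'))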
=====================================
tests/data/S1AC_20251001T204513_20251007T204359_HHR006_INT40_G_ueF_1DBE.txt
=====================================
@@ -0,0 +1,33 @@
+Reference Granule: S1A_IW_SLC__1SDH_20251001T204513_20251001T204540_061237_07A332_A16A
+Secondary Granule: S1C_IW_SLC__1SDH_20251007T204359_20251007T204426_004461_008D5C_0C40
+Reference Pass Direction: ASCENDING
+Reference Orbit Number: 61237
+Secondary Pass Direction: ASCENDING
+Secondary Orbit Number: 4461
+Baseline: 48.0195
+UTC time: 74714.869576
+Heading: -18.1505111
+Spacecraft height: 705239.4029000001
+Earth radius at nadir: 6362008.3766
+Slant range near: 803591.1723
+Slant range center: 882714.7457
+Slant range far: 961838.3190
+Range looks: 10
+Azimuth looks: 2
+INSAR phase filter: none
+Phase filter parameter: 0.0
+Resolution of output (m): 40
+Range bandpass filter: no
+Azimuth bandpass filter: no
+DEM source: GLO-30
+DEM resolution (m): 80
+Unwrapping type: mcf
+Phase at reference point: 3.32844
+Azimuth line of the reference point in SAR space: 497
+Range pixel of the reference point in SAR space: 5724
+Y coordinate of the reference point in the map projection: 6703535.9184
+X coordinate of the reference point in the map projection: 515679.7018
+Latitude of the reference point (WGS84): 60.46772014
+Longitude of the reference point (WGS84): -44.7148542
+Unwrapping threshold: none
+Speckle filter: no


=====================================
tests/data/S1_044_000000s1n00-093117s2n01-093118s3n01_IW_20250718_20250730_VV_INT80_B4FA.txt
=====================================
@@ -0,0 +1,25 @@
+Reference Granule: S1_093118_IW3_20250718T172125_VV_238F-BURST, S1_093117_IW2_20250718T172122_VV_238F-BURST
+Secondary Granule: S1_093118_IW3_20250730T172125_VV_6465-BURST, S1_093117_IW2_20250730T172121_VV_6465-BURST
+Reference Pass Direction: DESCENDING
+Reference Orbit Number: 60141
+Secondary Pass Direction: DESCENDING
+Secondary Orbit Number: 60316
+Baseline: 33.88969537726557
+UTC time: 62482.201944
+Heading: -164.275514873119
+Spacecraft height: 693000.0
+Earth radius at nadir: 6337286.638938101
+Slant range near: 849199.1730339259
+Slant range center: 877732.8145960167
+Slant range far: 906266.4561581073
+Range looks: 20
+Azimuth looks: 4
+INSAR phase filter: yes
+Phase filter parameter: 0.5
+Range bandpass filter: no
+Azimuth bandpass filter: no
+DEM source: GLO_30
+DEM resolution (m): 30
+Unwrapping type: snaphu_mcf
+Speckle filter: yes
+Water mask: yes


=====================================
tests/data/S1_056072_IW2_20220814_20220907_VV_INT80_E09B.txt
=====================================
@@ -0,0 +1,34 @@
+Reference Granule: S1_056072_IW2_20220814T125829_VV_67BC-BURST
+Secondary Granule: S1_056072_IW2_20220907T125830_VV_97A5-BURST
+Reference Pass Direction: ASCENDING
+Reference Orbit Number: 44549
+Secondary Pass Direction: ASCENDING
+Secondary Orbit Number: 44899
+Baseline: 158.57439820410497
+UTC time: 46709.112304
+Heading: -13.3101122688203
+Spacecraft height: 693000.0
+Earth radius at nadir: 6337286.638938101
+Slant range near: 846099.1914484155
+Slant range center: 875670.6529326118
+Slant range far: 905242.1144168081
+Range looks: 20
+Azimuth looks: 4
+INSAR phase filter: yes
+Phase filter parameter: 0.5
+Range bandpass filter: no
+Azimuth bandpass filter: no
+DEM source: GLO_30
+DEM resolution (m): 30
+Unwrapping type: snaphu_mcf
+Speckle filter: yes
+Water mask: yes
+Radar n lines: 376
+Radar n samples: 1269
+Radar first valid line: 7
+Radar n valid lines: 363
+Radar first valid sample: 23
+Radar n valid samples: 1218
+Multilook azimuth time interval: 0.008222225199999992
+Multilook range pixel size: 46.59124229430646
+Radar sensing stop: 2022-08-14T12:58:32.201805


=====================================
tests/requirements.txt
=====================================
@@ -3,4 +3,5 @@ libgdal-netcdf  # for users of GMTSAR users
 isce2           # for users of ISCE-2 users
 pre-commit      # for developers
 pyfftw
+pytest
 setuptools_scm


=====================================
tests/test_prep_hyp3.py
=====================================
@@ -0,0 +1,180 @@
+import pytest
+
+from mintpy.prep_hyp3 import _get_product_name_and_type, add_hyp3_metadata
+
+
+def test_get_product_name_and_type():
+    assert _get_product_name_and_type(
+        'S1_136231_IW2_20200604_20200616_VV_INT80_10C1_foo.tif'
+    ) == ('S1_136231_IW2_20200604_20200616_VV_INT80_10C1', 'INSAR_ISCE_BURST')
+
+    assert _get_product_name_and_type(
+        'S1_064_000000s1n00-136231s2n02-000000s3n00_IW_20200604_20200616_VV_INT80_77F1_foo.tif'
+    ) == ('S1_064_000000s1n00-136231s2n02-000000s3n00_IW_20200604_20200616_VV_INT80_77F1', 'INSAR_ISCE_MULTI_BURST')
+
+    assert _get_product_name_and_type(
+        'S1AA_20150504T120217_20150621T120220_VVP048_INT80_G_ueF_5CED_foo.tif'
+    ) == ('S1AA_20150504T120217_20150621T120220_VVP048_INT80_G_ueF_5CED', 'INSAR_GAMMA')
+
+    # Old INSAR_ISCE_MULTI_BURST naming convention
+    with pytest.raises(
+        ValueError,
+        match=r'^Failed to parse product name from filename: '
+              r'S1A_064_E053_1_N27_3_E054_1_N27_8_20200604_20200616_VV_INT80_3FBF_foo\.tif$',
+    ):
+        _get_product_name_and_type(
+            'S1A_064_E053_1_N27_3_E054_1_N27_8_20200604_20200616_VV_INT80_3FBF_foo.tif'
+        )
+
+    with pytest.raises(ValueError, match=r'^Failed to parse product name from filename: foo$'):
+        _get_product_name_and_type('foo')
+
+
+def test_add_hyp3_metadata_insar_isce_burst(test_data_dir):
+    assert add_hyp3_metadata(
+        fname=str(test_data_dir / 'S1_056072_IW2_20220814_20220907_VV_INT80_E09B_corr.tif'),
+        meta={
+            'WIDTH': 1335,
+            'LENGTH': 485,
+            'X_STEP': 80.0,
+            'Y_STEP': -80.0,
+            'X_FIRST': 445520.0,
+            'Y_FIRST': 4289840.0
+        },
+        is_ifg=True,
+    ) == {
+       'WIDTH': 1335,
+       'LENGTH': 485,
+       'X_STEP': 80.0,
+       'Y_STEP': -80.0,
+       'X_FIRST': 445520.0,
+       'Y_FIRST': 4289840.0,
+       'PROCESSOR': 'hyp3',
+       'CENTER_LINE_UTC': '46709.112304',
+       'ALOOKS': '4',
+       'RLOOKS': '20',
+       'EARTH_RADIUS': '6337286.638938101',
+       'HEIGHT': '693000.0',
+       'STARTING_RANGE': '846099.1914484155',
+       'HEADING': -13.310112268820319,
+       'ORBIT_DIRECTION': 'ASCENDING',
+       'LAT_REF1': '4251040.0',
+       'LAT_REF2': '4251040.0',
+       'LAT_REF3': '4289840.0',
+       'LAT_REF4': '4289840.0',
+       'LON_REF1': '445520.0',
+       'LON_REF2': '552320.0',
+       'LON_REF3': '445520.0',
+       'LON_REF4': '552320.0',
+       'PLATFORM': 'Sen',
+       'ANTENNA_SIDE': -1,
+       'WAVELENGTH': 0.055465764662349676,
+       'RANGE_PIXEL_SIZE': 46.0,
+       'AZIMUTH_PIXEL_SIZE': 56.4,
+       'DATE12': '220814-220907',
+       'P_BASELINE_TOP_HDR': '158.57439820410497',
+       'P_BASELINE_BOTTOM_HDR': '158.57439820410497',
+       'beam_mode': 'IW',
+       'beam_swath': '2',
+       'unwrap_method': 'snaphu_mcf'
+   }
+
+
+def test_add_hyp3_metadata_insar_isce_multi_burst(test_data_dir):
+    assert add_hyp3_metadata(
+        fname=str(test_data_dir / 'S1_044_000000s1n00-093117s2n01-093118s3n01_IW_20250718_20250730_VV_INT80_B4FA_unw_phase.tif'),
+        meta={
+            'WIDTH': 2314,
+            'LENGTH': 718,
+            'X_STEP': 80.0,
+            'Y_STEP': -80.0,
+            'X_FIRST': 660960.0,
+            'Y_FIRST': 5950880.0,
+        },
+        is_ifg=True,
+    ) == {
+       'WIDTH': 2314,
+       'LENGTH': 718,
+       'X_STEP': 80.0,
+       'Y_STEP': -80.0,
+       'X_FIRST': 660960.0,
+       'Y_FIRST': 5950880.0,
+       'PROCESSOR': 'hyp3',
+       'CENTER_LINE_UTC': '62482.201944',
+       'ALOOKS': '4',
+       'RLOOKS': '20',
+       'EARTH_RADIUS': '6337286.638938101',
+       'HEIGHT': '693000.0',
+       'STARTING_RANGE': '849199.1730339259',
+       'HEADING': -164.275514873119,
+       'ORBIT_DIRECTION': 'DESCENDING',
+       'LAT_REF1': '5950880.0',
+       'LAT_REF2': '5950880.0',
+       'LAT_REF3': '5893440.0',
+       'LAT_REF4': '5893440.0',
+       'LON_REF1': '846080.0',
+       'LON_REF2': '660960.0',
+       'LON_REF3': '846080.0',
+       'LON_REF4': '660960.0',
+       'PLATFORM': 'Sen',
+       'ANTENNA_SIDE': -1,
+       'WAVELENGTH': 0.055465764662349676,
+       'RANGE_PIXEL_SIZE': 46.0,
+       'AZIMUTH_PIXEL_SIZE': 56.4,
+       'DATE12': '250718-250730',
+       'P_BASELINE_TOP_HDR': '33.88969537726557',
+       'P_BASELINE_BOTTOM_HDR': '33.88969537726557',
+       'beam_mode': 'IW',
+       'beam_swath': '23',
+       'unwrap_method': 'snaphu_mcf',
+   }
+
+
+def test_add_hyp3_metadata_insar_gamma(test_data_dir):
+    assert add_hyp3_metadata(
+        fname=str(test_data_dir / 'S1AC_20251001T204513_20251007T204359_HHR006_INT40_G_ueF_1DBE_dem.tif'),
+        meta={
+            'WIDTH': 6829,
+            'LENGTH': 3735,
+            'X_STEP': 40.0,
+            'Y_STEP': -40.0,
+            'X_FIRST': 282180.0,
+            'Y_FIRST': 6802380.0,
+        },
+        is_ifg=False,
+    ) == {
+       'WIDTH': 6829,
+       'LENGTH': 3735,
+       'X_STEP': 40.0,
+       'Y_STEP': -40.0,
+       'X_FIRST': 282180.0,
+       'Y_FIRST': 6802380.0,
+       'PROCESSOR': 'hyp3',
+       'CENTER_LINE_UTC': '74714.869576',
+       'ALOOKS': '2',
+       'RLOOKS': '10',
+       'EARTH_RADIUS': '6362008.3766',
+       'HEIGHT': '705239.4029000001',
+       'STARTING_RANGE': '803591.1723',
+       'HEADING': -18.150511100000017,
+       'ORBIT_DIRECTION': 'ASCENDING',
+       'LAT_REF1': '6652980.0',
+       'LAT_REF2': '6652980.0',
+       'LAT_REF3': '6802380.0',
+       'LAT_REF4': '6802380.0',
+       'LON_REF1': '282180.0',
+       'LON_REF2': '555340.0',
+       'LON_REF3': '282180.0',
+       'LON_REF4': '555340.0',
+       'PLATFORM': 'Sen',
+       'ANTENNA_SIDE': -1,
+       'WAVELENGTH': 0.055465764662349676,
+       'RANGE_PIXEL_SIZE': 23.0,
+       'AZIMUTH_PIXEL_SIZE': 28.2,
+       'beam_mode': 'IW',
+       'beam_swath': '123',
+       'relative_orbit': 90,
+       'startUTC': '2025-10-01 20:45:13.000000',
+       'stopUTC': '2025-10-01 20:45:40.000000',
+       'unwrap_method': 'mcf'
+   }


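With pytest installed (see tests/requirements.txt above), the new unit tests can be run locally from the source tree with, for example:

    python -m pytest tests/test_prep_hyp3.py -v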

View it on GitLab: https://salsa.debian.org/debian-gis-team/mintpy/-/compare/e32ae6f9c56335176e284789dc6c1873c0aa94c9...73a35be96f72c7b163c2265c0540945618a811d9
