[Git][debian-gis-team/pytroll-schedule][upstream] New upstream version 0.6.0

Antonio Valentino (@antonio.valentino) gitlab@salsa.debian.org
Sat Dec 11 16:40:36 GMT 2021



Antonio Valentino pushed to branch upstream at Debian GIS Project / pytroll-schedule


Commits:
94565e50 by Antonio Valentino at 2021-12-11T14:25:49+00:00
New upstream version 0.6.0
- - - - -


25 changed files:

- .github/PULL_REQUEST_TEMPLATE.md
- + .github/workflows/ci.yaml
- + .stickler.yml
- .travis.yml
- CHANGELOG.md
- README.md
- RELEASING.md
- + codecov.yml
- generate_schedule_xmlpage.py
- setup.cfg
- setup.py
- trollsched/__init__.py
- trollsched/boundary.py
- trollsched/combine.py
- trollsched/compare.py
- trollsched/drawing.py
- trollsched/graph.py
- trollsched/helper_functions.py
- trollsched/satpass.py
- trollsched/schedule.py
- trollsched/tests/test_satpass.py
- trollsched/tests/test_schedule.py
- trollsched/tests/test_spherical.py
- trollsched/utils.py
- trollsched/version.py


Changes:

=====================================
.github/PULL_REQUEST_TEMPLATE.md
=====================================
@@ -3,5 +3,5 @@
  - [ ] Closes #xxxx <!-- remove if there is no corresponding issue, which should only be the case for minor changes -->
  - [ ] Tests added <!-- for all bug fixes or enhancements -->
  - [ ] Tests passed <!-- for all non-documentation changes) -->
- - [ ] Passes ``git diff origin/master **/*py | flake8 --diff`` <!-- remove if you did not edit any Python files -->
+ - [ ] Passes ``git diff origin/main **/*py | flake8 --diff`` <!-- remove if you did not edit any Python files -->
  - [ ] Fully documented <!-- remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later -->


=====================================
.github/workflows/ci.yaml
=====================================
@@ -0,0 +1,35 @@
+name: Run tests
+
+on:
+  - push
+  - pull_request
+
+jobs:
+  build:
+    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: true
+      matrix:
+        python-version: ["3.8", "3.9", "3.10"]
+        experimental: [false]
+    steps:
+      - name: Checkout source
+        uses: actions/checkout@v2
+      - name: Set up Python ${{ matrix.python-version }}
+        uses: actions/setup-python@v2
+        with:
+          python-version: ${{ matrix.python-version }}
+      - name: Install dependencies
+        run: |
+          pip install -U pytest pytest-cov numpy pyresample pyorbital six pyyaml
+      - name: Install pytroll-collectors
+        run: |
+          pip install --no-deps -e .
+      - name: Run tests
+        run: |
+          pytest --cov=trollsched trollsched/tests --cov-report=xml
+      - name: Upload coverage to Codecov
+        uses: codecov/codecov-action@v1
+        with:
+          file: ./coverage.xml
+          env_vars: PYTHON_VERSION


=====================================
.stickler.yml
=====================================
@@ -0,0 +1,12 @@
+linters:
+  flake8:
+    python: 3
+    config: setup.cfg
+    fixer: true
+fixers:
+  enable: true
+
+files:
+    ignore:
+        - 'docs/Makefile'
+        - 'docs/make.bat'


=====================================
.travis.yml
=====================================
@@ -5,13 +5,13 @@ env:
   - PYTHON_VERSION=$TRAVIS_PYTHON_VERSION
   - NUMPY_VERSION=stable
   - MAIN_CMD='python setup.py'
-  - CONDA_DEPENDENCIES='scipy coveralls coverage codecov mock six appdirs pykdtree pyresample docutils pyyaml matplotlib xarray'
+  - CONDA_DEPENDENCIES='sphinx cartopy scipy coveralls coverage codecov behave mock pycoast pydecorate six appdirs pykdtree pyresample docutils pyyaml matplotlib'
   - PIP_DEPENDENCIES=''
   - SETUP_XVFB=False
   - EVENT_TYPE='push pull_request cron'
   - SETUP_CMD='test'
   - CONDA_CHANNELS='conda-forge'
-  - CONDA_CHANNEL_PRIORITY='True'
+  - CONDA_CHANNEL_PRIORITY='strict'
 matrix:
   include:
     - env: PYTHON_VERSION=2.7
@@ -25,11 +25,12 @@ matrix:
       os: osx
       language: generic
 install:
-    - git clone --depth 1 git://github.com/astropy/ci-helpers.git
+    #- git clone --depth 1 git://github.com/astropy/ci-helpers.git
+    - git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git
     - source ci-helpers/travis/setup_conda.sh
     # reactivate environment to set proj environment variables
-    - conda deactivate
-    - conda activate test
+    #- conda deactivate
+    #- conda activate test
 script: coverage run --source=trollsched setup.py test
 after_success:
 - if [[ $PYTHON_VERSION == 3.6 ]]; then coveralls; fi


=====================================
CHANGELOG.md
=====================================
@@ -1,3 +1,47 @@
+## Version 0.6.0 (2021/12/09)
+
+### Issues Closed
+
+* [Issue 62](https://github.com/pytroll/pytroll-schedule/issues/62) - Remove remnants of Python 2 support ([PR 67](https://github.com/pytroll/pytroll-schedule/pull/67) by [@pnuu](https://github.com/pnuu))
+* [Issue 60](https://github.com/pytroll/pytroll-schedule/issues/60) - Deprecated import of Mapping
+* [Issue 59](https://github.com/pytroll/pytroll-schedule/issues/59) - Failures in Schedule tests ([PR 61](https://github.com/pytroll/pytroll-schedule/pull/61) by [@pnuu](https://github.com/pnuu))
+* [Issue 54](https://github.com/pytroll/pytroll-schedule/issues/54) - Deprecated use of abstract base classes ([PR 57](https://github.com/pytroll/pytroll-schedule/pull/57) by [@pnuu](https://github.com/pnuu))
+* [Issue 53](https://github.com/pytroll/pytroll-schedule/issues/53) - The unittests are not run automatically ([PR 55](https://github.com/pytroll/pytroll-schedule/pull/55) by [@pnuu](https://github.com/pnuu))
+* [Issue 52](https://github.com/pytroll/pytroll-schedule/issues/52) - Boundary calculations are broken ([PR 56](https://github.com/pytroll/pytroll-schedule/pull/56) by [@pnuu](https://github.com/pnuu))
+* [Issue 49](https://github.com/pytroll/pytroll-schedule/issues/49) - Three unit tests failed.
+
+In this release 7 issues were closed.
+
+### Pull Requests Merged
+
+#### Bugs fixed
+
+* [PR 61](https://github.com/pytroll/pytroll-schedule/pull/61) - Allow `mersi-2` as instrument name ([59](https://github.com/pytroll/pytroll-schedule/issues/59))
+* [PR 56](https://github.com/pytroll/pytroll-schedule/pull/56) - Remove a bug introduced in PR38 ([52](https://github.com/pytroll/pytroll-schedule/issues/52))
+* [PR 51](https://github.com/pytroll/pytroll-schedule/pull/51) - Remove some redundant code and fix a failed unit test. 
+* [PR 45](https://github.com/pytroll/pytroll-schedule/pull/45) - Use recent ssl protocol for older python versions
+* [PR 38](https://github.com/pytroll/pytroll-schedule/pull/38) - Fix S3 olci scan duration
+
+#### Features added
+
+* [PR 67](https://github.com/pytroll/pytroll-schedule/pull/67) - Refactor remove legacy code support ([62](https://github.com/pytroll/pytroll-schedule/issues/62))
+* [PR 66](https://github.com/pytroll/pytroll-schedule/pull/66) - Change tested Python versions to 3.8, 3.9 and 3.10
+* [PR 64](https://github.com/pytroll/pytroll-schedule/pull/64) - Use safe loading for YAML config file
+* [PR 61](https://github.com/pytroll/pytroll-schedule/pull/61) - Allow `mersi-2` as instrument name ([59](https://github.com/pytroll/pytroll-schedule/issues/59))
+* [PR 58](https://github.com/pytroll/pytroll-schedule/pull/58) - Fix a test failure on Python 3.7
+* [PR 57](https://github.com/pytroll/pytroll-schedule/pull/57) - Fix an import raising deprecation warning ([54](https://github.com/pytroll/pytroll-schedule/issues/54))
+* [PR 55](https://github.com/pytroll/pytroll-schedule/pull/55) - Add GitHub actions to run unittests ([53](https://github.com/pytroll/pytroll-schedule/issues/53))
+* [PR 50](https://github.com/pytroll/pytroll-schedule/pull/50) - Add a southern hemisphere pass test.
+* [PR 46](https://github.com/pytroll/pytroll-schedule/pull/46) - Give the option to plot multiple polygons
+* [PR 45](https://github.com/pytroll/pytroll-schedule/pull/45) - Use recent ssl protocol for older python versions
+* [PR 44](https://github.com/pytroll/pytroll-schedule/pull/44) - Make plot filename more complete, including the instrument name
+* [PR 42](https://github.com/pytroll/pytroll-schedule/pull/42) - Make it possible to tell cartopy to use offline shapefiles
+* [PR 41](https://github.com/pytroll/pytroll-schedule/pull/41) - Fix nasa ftp retrieval
+* [PR 38](https://github.com/pytroll/pytroll-schedule/pull/38) - Fix S3 olci scan duration
+
+In this release 19 pull requests were closed.
+
+
 ## Version 0.5.2 (2019/03/19)
 
 


=====================================
README.md
=====================================
@@ -2,9 +2,9 @@ pytroll-schedule
 ================
 
 [![Codacy Badge](https://api.codacy.com/project/badge/Grade/9f039d7d640846ca89be8a78fa11e1f6)](https://www.codacy.com/app/adybbroe/pytroll-schedule?utm_source=github.com&utm_medium=referral&utm_content=pytroll/pytroll-schedule&utm_campaign=badger)
-[![Build Status](https://travis-ci.org/pytroll/pytroll-schedule.png?branch=master)](https://travis-ci.org/pytroll/pytroll-schedule)
-[![Coverage Status](https://coveralls.io/repos/github/pytroll/pytroll-schedule/badge.svg?branch=master)](https://coveralls.io/github/pytroll/pytroll-schedule?branch=master)
-[![Code Health](https://landscape.io/github/pytroll/pytroll-schedule/master/landscape.png)](https://landscape.io/github/pytroll/pytroll-schedule/master)
+[![Build Status](https://travis-ci.org/pytroll/pytroll-schedule.png?branch=main)](https://travis-ci.org/pytroll/pytroll-schedule)
+[![Coverage Status](https://coveralls.io/repos/github/pytroll/pytroll-schedule/badge.svg?branch=main)](https://coveralls.io/github/pytroll/pytroll-schedule?branch=main)
+[![Code Health](https://landscape.io/github/pytroll/pytroll-schedule/main/landscape.png)](https://landscape.io/github/pytroll/pytroll-schedule/main)
 [![PyPI version](https://badge.fury.io/py/pytroll-schedule.svg)](https://badge.fury.io/py/pytroll-schedule)
 
 


=====================================
RELEASING.md
=====================================
@@ -1,6 +1,6 @@
 # Releasing pytroll-schedule
 
-1. checkout master
+1. checkout main branch
 2. pull from repo
 3. run the unittests
 4. run `loghub` and update the `CHANGELOG.md` file:


=====================================
codecov.yml
=====================================


=====================================
generate_schedule_xmlpage.py
=====================================
@@ -1,7 +1,7 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 
-# Copyright (c) 2016, 2018 Adam.Dybbroe
+# Copyright (c) 2016, 2018, 2019 Adam.Dybbroe
 
 # Author(s):
 
@@ -26,14 +26,11 @@ schedule request xml files and then triggers the png and xml output generation.
 
 """
 
-import os
-from six.moves.configparser import RawConfigParser
 import logging
 import sys
-try:
-    from urlparse import urlparse
-except ImportError:
-    from urllib.parse import urlparse
+import os
+from six.moves.configparser import RawConfigParser
+from six.moves.urllib.parse import urlparse
 import posttroll.subscriber
 from posttroll.publisher import Publish
 import xml.etree.ElementTree as ET
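
The hunk above collapses the Python 2/3 `try`/`except` import fallback into a single `six.moves` import. On Python 3, `six.moves.urllib.parse.urlparse` is just an alias for the standard library's `urllib.parse.urlparse`, so the behaviour is unchanged; a minimal sketch (the file path here is hypothetical):

```python
from urllib.parse import urlparse  # what six.moves.urllib.parse resolves to on Python 3

# Schedule-request URIs can be split without the old try/except import dance:
url = urlparse("file:///data/pytroll/schedules/request.xml")
print(url.scheme, url.path)  # → file /data/pytroll/schedules/request.xml
```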


=====================================
setup.cfg
=====================================
@@ -1,5 +1,5 @@
 [bdist_rpm]
-requires=numpy pyresample pyorbital
+requires=numpy pyresample pyorbital pyyaml
 release=1
 
 # See the docstring in versioneer.py for instructions. Note that you must


=====================================
setup.py
=====================================
@@ -1,7 +1,7 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 
-# Copyright (c) 2014 - 2018 PyTroll Community
+# Copyright (c) 2014 - 2019 PyTroll Community
 
 # Author(s):
 
@@ -29,12 +29,8 @@ from setuptools import setup
 import sys
 import versioneer
 
-requires = ['numpy', 'pyresample', 'pyorbital']
-test_requires = ['satpy']
-
-if sys.version_info < (2, 7):
-    # multiprocessing is not in the standard library
-    requires.append('argparse')
+requires = ['numpy', 'pyresample', 'pyorbital', 'pyyaml']
+test_requires = []
 
 setup(name='pytroll-schedule',
       version=versioneer.get_version(),


=====================================
trollsched/__init__.py
=====================================
@@ -1,11 +1,12 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 
-# Copyright (c) 2014, 2018 PyTroll Community
+# Copyright (c) 2014 - 2019 PyTroll Community
 
 # Author(s):
 
 #   Martin Raspaud <martin.raspaud@smhi.se>
+#   Adam Dybbroe <adam.dybbroe@smhi.se>
 
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -38,8 +39,12 @@ NUMBER_OF_FOVS = {
     'avhrr': 2048,
     'mhs': 90,
     'amsua': 30,
+    'mwhs2': 98,
+    'atms': 96,
     'ascat': 42,
-    'viirs': 6400
+    'viirs': 6400,
+    'atms': 96,
+    'mwhs-2': 98
 }
 
 SATELLITE_NAMES = {'npp': 'Suomi NPP',
@@ -52,7 +57,8 @@ SATELLITE_NAMES = {'npp': 'Suomi NPP',
                    'metopb': 'Metop-B',
                    'metopa': 'Metop-A',
                    'noaa20': 'NOAA-20',
-                   'fengyun3d': 'FY-3D'
+                   'fengyun3d': 'FY-3D',
+                   'fengyun3c': 'FY-3C'
                    }
 
 INSTRUMENT = {'Suomi NPP': 'viirs',
@@ -65,4 +71,5 @@ INSTRUMENT = {'Suomi NPP': 'viirs',
               'Metop-A': 'avhrr',
               'Metop-B': 'avhrr',
               'Metop-C': 'avhrr',
-              'FY-3D': 'avhrr'}
+              'FY-3D': 'avhrr',
+              'FY-3C': 'avhrr'}
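
Note that the `NUMBER_OF_FOVS` hunk above ends up listing `'atms': 96` twice (and adds both `'mwhs2'` and `'mwhs-2'` spellings). A Python dict literal silently keeps only the last binding for a repeated key, so the duplicate is redundant but harmless; a quick check:

```python
# A dict literal keeps only the last binding for a repeated key, so the
# duplicated 'atms' entry in the hunk above is redundant but harmless
# (both map to 96).
fovs = {
    'mwhs2': 98,
    'atms': 96,
    'viirs': 6400,
    'atms': 96,   # duplicate key: overwrites the earlier entry  # noqa: F601
    'mwhs-2': 98,
}
print(len(fovs))  # → 4
```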


=====================================
trollsched/boundary.py
=====================================
@@ -37,7 +37,8 @@ logger = logging.getLogger(__name__)
 
 INSTRUMENT = {'avhrr/3': 'avhrr',
               'avhrr/2': 'avhrr',
-              'avhrr-3': 'avhrr'}
+              'avhrr-3': 'avhrr',
+              'mwhs-2': 'mwhs2'}
 
 
 class SwathBoundary(Boundary):
@@ -65,7 +66,7 @@ class SwathBoundary(Boundary):
         elif overpass.satellite == "noaa 16":
             scan_angle = 55.25
             instrument = "avhrr"
-        elif instrument == "mersi2":
+        elif instrument.startswith("mersi"):
             scan_angle = 55.4
             instrument = "avhrr"
         else:
@@ -78,10 +79,16 @@ class SwathBoundary(Boundary):
             sgeom = instrument_fun(scans_nb, scanpoints, scan_angle=scan_angle, frequency=100)
         elif instrument in ["ascat", ]:
             sgeom = instrument_fun(scans_nb, scanpoints)
+        elif instrument in ["amsua", 'mhs']:
+            sgeom = instrument_fun(scans_nb, scanpoints)
+        elif instrument in ["mwhs2", ]:
+            sgeom = instrument_fun(scans_nb, scanpoints)
         elif instrument in ["olci", ]:
             sgeom = instrument_fun(scans_nb, scanpoints)
         elif instrument == 'viirs':
             sgeom = instrument_fun(scans_nb, scanpoints, scan_step=scan_step)
+        elif instrument in ['mhs', 'atms', 'mwhs-2']:
+            sgeom = instrument_fun(scans_nb, scanpoints)
         else:
             logger.warning("Instrument not tested: %s", instrument)
             sgeom = instrument_fun(scans_nb)
@@ -99,7 +106,6 @@ class SwathBoundary(Boundary):
 
     def __init__(self, overpass, scan_step=50, frequency=200):
         # compute area covered by pass
-
         Boundary.__init__(self)
 
         self.overpass = overpass
@@ -115,34 +121,62 @@ class SwathBoundary(Boundary):
             sec_scan_duration = 1.779166667
             along_scan_reduce_factor = 1
         elif self.overpass.instrument.startswith("avhrr"):
-            sec_scan_duration = 1./6.
+            sec_scan_duration = 1. / 6.
             along_scan_reduce_factor = 0.1
         elif self.overpass.instrument == 'ascat':
             sec_scan_duration = 3.74747474747
             along_scan_reduce_factor = 1
             # Overwrite the scan step
             scan_step = 1
+        elif self.overpass.instrument == 'amsua':
+            sec_scan_duration = 8.
+            along_scan_reduce_factor = 1
+            # Overwrite the scan step
+            scan_step = 1
+        elif self.overpass.instrument == 'mhs':
+            sec_scan_duration = 8./3.
+            along_scan_reduce_factor = 1
+            # Overwrite the scan step
+            scan_step = 1
+        elif self.overpass.instrument == 'mwhs2':
+            sec_scan_duration = 8./3.
+            along_scan_reduce_factor = 1
+            # Overwrite the scan step
+            scan_step = 1
+        elif self.overpass.instrument == 'olci':
+            # 3 minutes of data is 4091 300meter lines:
+            sec_scan_duration = 0.04399902224395014
+            along_scan_reduce_factor = 1
+            # Overwrite the scan step
+            scan_step = 100
+        elif self.overpass.instrument == 'atms':
+            sec_scan_duration = 8/3.
+            along_scan_reduce_factor = 1
+            # Overwrite the scan step
+            scan_step = 1
+
         else:
             # Assume AVHRR!
             logmsg = ("Instrument scan duration not known. Setting it to AVHRR. Instrument: ")
             logger.info(logmsg + "%s", str(self.overpass.instrument))
-            sec_scan_duration = 1./6.
+            sec_scan_duration = 1. / 6.
             along_scan_reduce_factor = 0.1
 
         # From pass length in seconds and the seconds for one scan derive the number of scans in the swath:
-        scans_nb = scanlength_seconds/sec_scan_duration * along_scan_reduce_factor
+        scans_nb = scanlength_seconds / sec_scan_duration * along_scan_reduce_factor
         # Devide by the scan step to a reduced number of scans:
-        scans_nb = np.floor(scans_nb/scan_step)
+        scans_nb = np.floor(scans_nb / scan_step)
         scans_nb = int(max(scans_nb, 1))
 
         sides_lons, sides_lats = self.get_instrument_points(self.overpass,
                                                             overpass.risetime,
                                                             scans_nb,
-                                                            np.array([0, self.overpass.number_of_fovs-1]),
+                                                            np.array([0, self.overpass.number_of_fovs - 1]),
                                                             scan_step=scan_step)
 
         side_shape = sides_lons[::-1, 0].shape[0]
         nmod = 1
+
         if side_shape != scans_nb:
             nmod = side_shape // scans_nb
             logger.debug('Number of scan lines (%d) does not match number of scans (%d)',
@@ -182,6 +216,8 @@ class SwathBoundary(Boundary):
         self.top_lons = lons[0]
         self.top_lats = lats[0]
 
+        return
+
     def decimate(self, ratio):
         l = len(self.top_lons)
         start = (l % ratio) / 2
@@ -201,6 +237,8 @@ class SwathBoundary(Boundary):
         self.left_lons = self.left_lons[points]
         self.left_lats = self.left_lats[points]
 
+        return
+
     def contour(self):
         lons = np.concatenate((self.top_lons,
                                self.right_lons[1:-1],
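
The per-instrument branches added above all feed the same arithmetic in `SwathBoundary.__init__`: pass length divided by per-scan duration, scaled by a reduction factor, then divided by the scan step. A standalone sketch of that computation, using the new AMSU-A timing (8 s per scan, `scan_step` forced to 1; the 900 s pass length is an illustrative assumption):

```python
import math

# Sketch of the swath scan-count arithmetic in SwathBoundary.__init__:
# number of scans = pass length / scan duration, reduced along-scan,
# then decimated by the scan step, with a floor of 1 scan.
def number_of_scans(scanlength_seconds, sec_scan_duration,
                    along_scan_reduce_factor, scan_step):
    scans_nb = scanlength_seconds / sec_scan_duration * along_scan_reduce_factor
    scans_nb = math.floor(scans_nb / scan_step)  # np.floor in the original
    return int(max(scans_nb, 1))

# e.g. a hypothetical 900 s AMSU-A pass gives 112 scan positions:
print(number_of_scans(900, 8., 1, 1))  # → 112
```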


=====================================
trollsched/combine.py
=====================================
@@ -114,9 +114,10 @@ def add_graphs(graphs, passes, delay=timedelta(seconds=0)):
                             wl.append(0)
                         else:
                             wl.append(n[1] or grl[s].weight(pl[s].index(p[0]) + 1, pl[s].index(n[0]) + 1))
-                    except:
+                    except Exception:
                         logger.error(
-                            "Collecting weights: stat %d - parnode %s %s - newnode %s %s", s, parnode, p, newnode, n, exc_info=1)
+                            "Collecting weights: stat %d - parnode %s %s - newnode %s %s",
+                            s, parnode, p, newnode, n, exc_info=1)
                         raise
                 # Apply vertix-count to the sum of collected weights.
                 # vertix-count: number of vertices with reference to same
@@ -198,7 +199,7 @@ def collect_nodes(statnr, parnode, graph_set, newgraph, newpasses, passes_list,
         # current passes node.
         try:
             gn = g.neighbours(passes_list[statnr].index(p[0]) + 1)
-        except:
+        except Exception:
             print("len(passes_list)", len(passes_list), "   len(graph_set)",
                   len(graph_set), "   statnr", statnr, "   p", p)
             print("passes_list", passes_list)
@@ -281,7 +282,7 @@ def collect_nodes(statnr, parnode, graph_set, newgraph, newpasses, passes_list,
                         else:
                             print("uh-oh, something curious happened ...")
 
-                except:
+                except Exception:
                     print("\nCATCH\ngn:", gn, "-> n", n, " col:", col,
                           "-> cx", cx, "statnr", statnr, "statnr+i", statnr + 1)
                     print("len(passes_list -n -cx)", len(passes_list[statnr]), len(passes_list[statnr + 1]))
@@ -410,7 +411,8 @@ def main():
 #             print_matrix(graph[station].adj_matrix, ly=5)
 #             print_matrix(graph[station].weight_matrix, ly=5, lx=-1)
 
-#             allpasses[station] = get_passes_from_xml_file(os.path.join(opts.report, "acquisition-schedule-report." + station + ".xml"))
+#             allpasses[station] = get_passes_from_xml_file(os.path.join(opts.report,
+#                                                           "acquisition-schedule-report." + station + ".xml"))
 #             print len(allpasses[station]),allpasses[station]
 
 #             for v in graph[station].neighbours(1):
@@ -435,7 +437,7 @@ def main():
 
         combined_stations(opts, pattern, station_list, graph, allpasses, start_time, start, forward)
 
-    except:
+    except Exception:
         logger.exception("Something wrong happened!")
         raise
 


=====================================
trollsched/compare.py
=====================================
@@ -31,6 +31,7 @@ import glob
 
 logger = logging.getLogger(__name__)
 
+
 def xml_compare(x1_, x2_, reporter=None, skiptags=None):
     """Compare xml objects.
     """
@@ -87,6 +88,7 @@ def text_compare(t1_, t2_):
         return True
     return (t1_ or '').strip() == (t2_ or '').strip()
 
+
 def compare(file1, file2):
     """Compare two xml files, request and confirmation.
     """
@@ -139,7 +141,7 @@ def run():
                         " corresponding confirmation, from the given directory")
     parser.add_argument("-c", "--confirmation",
                         help="directory for the confirmation files")
-    
+
     opts = parser.parse_args()
 
     if opts.log:
@@ -160,7 +162,6 @@ def run():
     logging.getLogger('').setLevel(loglevel)
     logging.getLogger('').addHandler(handler)
 
-
     if opts.mail:
         mhandler = logging.handlers.SMTPHandler("localhost",
                                                 "satsateknik@smhi.se",
@@ -184,8 +185,6 @@ def run():
 
     #     notifier.loop()
 
-
-
     if opts.most_recent:
         logger.debug("looking for most recent file in " +
                      os.path.join(opts.most_recent, "*request*.xml"))
@@ -195,12 +194,12 @@ def run():
         reqdir, newfile = os.path.split(newest)
         confdir = opts.confirmation or reqdir
         confname = os.path.join(confdir,
-                                newfile[:-15] + "confirmation" +  newfile[-8:])
+                                newfile[:-15] + "confirmation" + newfile[-8:])
         logger.debug("against " + confname)
         try:
             compare(newest, confname)
         except IOError:
-            logger.exception("Something went wrong!") 
+            logger.exception("Something went wrong!")
 
 
 if __name__ == '__main__':


=====================================
trollsched/drawing.py
=====================================
@@ -1,11 +1,11 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 
-# Copyright (c) 2018 Adam.Dybbroe
+# Copyright (c) 2018 - 2020 Pytroll Community
 
 # Author(s):
 
-#   Adam.Dybbroe <a000680@c20671.ad.smhi.se>
+#   Adam.Dybbroe <adam.dybbroe@smhi.se>
 
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -40,6 +40,11 @@ except ImportError:
     logger.warning("Failed loading Cartopy, will try Basemap instead")
     BASEMAP_NOT_CARTOPY = True
 
+if not BASEMAP_NOT_CARTOPY:
+    import cartopy
+    cartopy.config['pre_existing_data_dir'] = os.environ.get(
+        "CARTOPY_PRE_EXISTING_DATA_DIR", cartopy.config['pre_existing_data_dir'])
+
 
 class MapperBasemap(object):
     """A class to generate nice plots with basemap.
@@ -143,9 +148,19 @@ def save_fig(pass_obj,
              overwrite=False,
              labels=None,
              extension=".png",
-             outline='-r'):
+             outline='-r',
+             plot_parameters=None,
+             plot_title=None,
+             poly_color=None):
     """Save the pass as a figure. Filename is automatically generated.
     """
+    poly = poly or []
+    poly_color = poly_color or []
+    if not isinstance(poly, (list, tuple)):
+        poly = [poly]
+    if not isinstance(poly_color, (list, tuple)):
+        poly_color = [poly_color]
+
     mpl.use('Agg')
     import matplotlib.pyplot as plt
     plt.clf()
@@ -156,30 +171,42 @@ def save_fig(pass_obj,
     if not os.path.exists(directory):
         logger.debug("Create plot dir " + directory)
         os.makedirs(directory)
-    filename = os.path.join(
-        directory,
-        (rise + pass_obj.satellite.name.replace(" ", "_") + fall + extension))
-
-    pass_obj.fig = filename
-    if not overwrite and os.path.exists(filename):
-        return filename
 
-    logger.debug("Filename = <%s>", filename)
-    with Mapper() as mapper:
+    filename = '{rise}_{satname}_{instrument}_{fall}{extension}'.format(rise=rise,
+                                                                        satname=pass_obj.satellite.name.replace(
+                                                                            " ", "_"),
+                                                                        instrument=pass_obj.instrument.replace(
+                                                                            "/", "-"),
+                                                                        fall=fall, extension=extension)
+    filepath = os.path.join(directory, filename)
+    pass_obj.fig = filepath
+    if not overwrite and os.path.exists(filepath):
+        return filepath
+
+    logger.debug("Filename = <%s>", filepath)
+    plot_parameters = plot_parameters or {}
+    with Mapper(**plot_parameters) as mapper:
         mapper.nightshade(pass_obj.uptime, alpha=0.2)
+        for i, polygon in enumerate(poly):
+            try:
+                col = poly_color[i]
+            except IndexError:
+                col = '-b'
+            draw(polygon, mapper, col)
         logger.debug("Draw: outline = <%s>", outline)
         draw(pass_obj.boundary.contour_poly, mapper, outline)
-        if poly is not None:
-            draw(poly, mapper, "-b")
 
     logger.debug("Title = %s", str(pass_obj))
-    plt.title(str(pass_obj))
+    if not plot_title:
+        plt.title(str(pass_obj))
+    else:
+        plt.title(plot_title)
     for label in labels or []:
         plt.figtext(*label[0], **label[1])
     logger.debug("Save plot...")
-    plt.savefig(filename)
+    plt.savefig(filepath)
     logger.debug("Return...")
-    return filename
+    return filepath
 
 
 def show(pass_obj,
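
The new `poly`/`poly_color` handling at the top of `save_fig` above wraps a bare polygon object or colour string in a list, so the same drawing loop serves one or many overlay polygons (colours past the end of `poly_color` fall back to `'-b'` via the `IndexError` handler). A minimal sketch of that normalization, pulled out into a hypothetical helper:

```python
# Sketch of the argument normalization now done inline in save_fig:
# a single polygon or colour is promoted to a one-element list so several
# outlines can be drawn in one call.
def normalize_poly_args(poly=None, poly_color=None):
    poly = poly or []
    poly_color = poly_color or []
    if not isinstance(poly, (list, tuple)):
        poly = [poly]
    if not isinstance(poly_color, (list, tuple)):
        poly_color = [poly_color]
    return poly, poly_color

print(normalize_poly_args("area_poly", "-g"))  # → (['area_poly'], ['-g'])
```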


=====================================
trollsched/graph.py
=====================================
@@ -24,6 +24,7 @@
 """
 import numpy as np
 
+
 class Graph(object):
 
     def __init__(self, n_vertices=None, adj_matrix=None):


=====================================
trollsched/helper_functions.py
=====================================
@@ -60,12 +60,12 @@ def sun_pos(dt=None):
     axial_tilt = 23.4
     ref_solstice = datetime(2016, 6, 21, 22, 22)
     days_per_year = 365.2425
-    seconds_per_day = 24*60*60.0
+    seconds_per_day = 24 * 60 * 60.0
 
-    days_since_ref = (dt - ref_solstice).total_seconds()/seconds_per_day
-    lat = axial_tilt*np.cos(2*np.pi*days_since_ref/days_per_year)
+    days_since_ref = (dt - ref_solstice).total_seconds() / seconds_per_day
+    lat = axial_tilt * np.cos(2 * np.pi * days_since_ref / days_per_year)
     sec_since_midnight = (dt - datetime(dt.year, dt.month, dt.day)).seconds
-    lng = -(sec_since_midnight/seconds_per_day - 0.5)*360
+    lng = -(sec_since_midnight / seconds_per_day - 0.5) * 360
     return lat, lng
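
The whitespace-only changes above leave `sun_pos` functionally identical; a standalone restatement (scalar `math` in place of numpy) makes the approximation easy to sanity-check: at the reference solstice the subsolar latitude equals the axial tilt.

```python
from datetime import datetime
from math import cos, pi

# Standalone restatement of sun_pos from the hunk above, for checking:
# subsolar latitude oscillates with the axial tilt over the year,
# longitude tracks the time of day.
def sun_pos(dt):
    axial_tilt = 23.4
    ref_solstice = datetime(2016, 6, 21, 22, 22)
    days_per_year = 365.2425
    seconds_per_day = 24 * 60 * 60.0

    days_since_ref = (dt - ref_solstice).total_seconds() / seconds_per_day
    lat = axial_tilt * cos(2 * pi * days_since_ref / days_per_year)
    sec_since_midnight = (dt - datetime(dt.year, dt.month, dt.day)).seconds
    lng = -(sec_since_midnight / seconds_per_day - 0.5) * 360
    return lat, lng

lat, _ = sun_pos(datetime(2016, 6, 21, 22, 22))
print(round(lat, 1))  # → 23.4
```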
 
 


=====================================
trollsched/satpass.py
=====================================
@@ -29,27 +29,25 @@ import logging
 import logging.handlers
 import operator
 import os
-import six
 import socket
-from functools import reduce as fctools_reduce
-try:
-    from urllib.parse import urlparse
-except ImportError:
-    from urlparse import urlparse
-
+import sys
 from datetime import datetime, timedelta
+from functools import reduce as fctools_reduce
 from tempfile import mkstemp
+
 import numpy as np
+from urllib.parse import urlparse
 
 from pyorbital import orbital, tlefile
 from pyresample.boundary import AreaDefBoundary
+from trollsched import MIN_PASS, NOAA20_NAME, NUMBER_OF_FOVS
 from trollsched.boundary import SwathBoundary
-from trollsched import (MIN_PASS, NOAA20_NAME, NUMBER_OF_FOVS)
 
 logger = logging.getLogger(__name__)
 
 VIIRS_PLATFORM_NAMES = ['SUOMI NPP', 'SNPP',
                         'NOAA-20', 'NOAA 20']
+MERSI_PLATFORM_NAMES = ['FENGYUN 3C', 'FENGYUN-3C', 'FY-3C']
 MERSI2_PLATFORM_NAMES = ['FENGYUN 3D', 'FENGYUN-3D', 'FY-3D',
                          'FENGYUN 3E', 'FENGYUN-3E', 'FY-3E']
 
@@ -147,7 +145,7 @@ class Pass(SimplePass):
 
         if isinstance(instrument, (list, set)):
             if 'avhrr' in instrument:
-                logger.warning("Instrument is a sequence Assume avhrr...")
+                logger.warning("Instrument is a sequence! Assume avhrr...")
                 instrument = 'avhrr'
             elif 'viirs' in instrument:
                 logger.warning("Instrument is a sequence! Assume viirs...")
@@ -155,6 +153,12 @@ class Pass(SimplePass):
             elif 'modis' in instrument:
                 logger.warning("Instrument is a sequence! Assume modis...")
                 instrument = 'modis'
+            elif 'mersi' in instrument:
+                logger.warning("Instrument is a sequence! Assume mersi...")
+                instrument = 'mersi'
+            elif 'mersi-2' in instrument:
+                logger.warning("Instrument is a sequence! Assume mersi-2...")
+                instrument = 'mersi-2'
             else:
                 raise TypeError("Instrument is a sequence! Don't know which one to choose!")
 
@@ -245,6 +249,107 @@ class Pass(SimplePass):
             return 0
         return inter.area() / area_boundary.area()
 
+    def generate_metno_xml(self, coords, root):
+        import xml.etree.ElementTree as ET
+
+        asimuth_at_max_elevation, max_elevation = self.orb.get_observer_look(self.uptime, *coords)
+        pass_direction = self.pass_direction().capitalize()[:1]
+        # anl = self.orb.get_lonlatalt(self.orb.get_last_an_time(self.risetime))[0] % 360
+        asimuth_at_aos, aos_elevation = self.orb.get_observer_look(self.risetime, *coords)
+        orbit = self.orb.get_orbit_number(self.risetime)
+        # aos_epoch=int((self.risetime-datetime(1970,1,1)).total_seconds())
+        sat_lon, sat_lat, alt = self.orb.get_lonlatalt(self.risetime)
+
+        ovpass = ET.SubElement(root, "pass")
+        ovpass.set("satellite", self.satellite.name)
+        ovpass.set("aos", self.risetime.strftime("%Y%m%d%H%M%S"))
+        ovpass.set("los", self.falltime.strftime("%Y%m%d%H%M%S"))
+        ovpass.set("orbit", "{:d}".format(orbit))
+        ovpass.set("max-elevation", "{:.3f}".format(max_elevation))
+        ovpass.set("asimuth-at-max-elevation", "{:.3f}".format(asimuth_at_max_elevation))
+        ovpass.set("asimuth-at-aos", "{:.3f}".format(asimuth_at_aos))
+        ovpass.set("pass-direction", pass_direction)
+        ovpass.set("satellite-lon-at-aos", "{:.3f}".format(sat_lon))
+        ovpass.set("satellite-lat-at-aos", "{:.3f}".format(sat_lat))
+        ovpass.set("tle-epoch", self.orb.orbit_elements.epoch.astype(datetime).strftime("%Y%m%d%H%M%S.%f"))
+        if self.fig:
+            ovpass.set("figure", self.fig)
+
+        return True
+
+    def print_meos(self, coords, line_no):
+        """
+         No. Date    Satellite  Orbit Max EL  AOS      Ovlp  LOS      Durtn  Az(AOS/MAX)
+        """
+
+        asimuth_at_max_elevation, max_elevation = self.orb.get_observer_look(self.uptime, *coords)
+        pass_direction = self.pass_direction().capitalize()[:1]
+        # anl = self.orb.get_lonlatalt(self.orb.get_last_an_time(self.risetime))[0] % 360
+        asimuth_at_aos, aos_elevation = self.orb.get_observer_look(self.risetime, *coords)
+        orbit = self.orb.get_orbit_number(self.risetime)
+        aos_epoch = int((self.risetime - datetime(1970, 1, 1)).total_seconds())
+        sat_lon, sat_lat, alt = self.orb.get_lonlatalt(self.risetime)
+
+        dur_secs = (self.falltime - self.risetime).seconds
+        dur_hours, dur_remainder = divmod(dur_secs, 3600)
+        dur_minutes, dur_seconds = divmod(dur_remainder, 60)
+        duration = "{:0>2}:{:0>2}".format(dur_minutes, dur_seconds)
+
+        satellite_meos_translation = {"NOAA 19": "NOAA_19",
+                                      "NOAA 18": "NOAA_18",
+                                      "NOAA 15": "NOAA_15",
+                                      "METOP-A": "M02",
+                                      "METOP-B": "M01",
+                                      "FENGYUN 3A": "FENGYUN-3A",
+                                      "FENGYUN 3B": "FENGYUN-3B",
+                                      "FENGYUN 3C": "FENGYUN-3C",
+                                      "SUOMI NPP": "NPP"}
+
+        import hashlib
+
+        pass_key = hashlib.md5(("{:s}|{:d}|{:d}|{:.3f}|{:.3f}".
+                                format(satellite_meos_translation.get(self.satellite.name.upper(),
+                                                                      self.satellite.name.upper()),
+                                       int(orbit),
+                                       aos_epoch,
+                                       sat_lon,
+                                       sat_lat)).encode('utf-8')).hexdigest()
+
+        line_list = [" {line_no:>2}",
+                     "{date}",
+                     "{satellite:<10}",
+                     "{orbit:>5}",
+                     "{elevation:>6.3f} ",
+                     "{risetime}",
+                     "{overlap:<5s}",
+                     "{falltime}",
+                     "{duration}",
+                     "{asimuth_at_aos:>5.1f}",
+                     "{asimuth_at_max:>5.1f}",
+                     "-- Undefined(Scheduling not done {aos_epoch} )",
+                     "{passkey}",
+                     "{pass_direction}"
+                     ]
+
+        line = " ".join(line_list).format(
+            # line_no=line_no,
+            line_no=1,
+            date=self.risetime.strftime("%Y%m%d"),
+            satellite=satellite_meos_translation.get(self.satellite.name.upper(),
+                                                     self.satellite.name.upper()),
+            orbit=orbit,
+            elevation=max_elevation,
+            risetime=self.risetime.strftime("%H:%M:%S"),
+            overlap="n/a",
+            falltime=self.falltime.strftime("%H:%M:%S"),
+            duration=duration,
+            asimuth_at_aos=asimuth_at_aos,
+            asimuth_at_max=asimuth_at_max_elevation,
+            aos_epoch=aos_epoch,
+            passkey=pass_key,
+            pass_direction=pass_direction)
+        return line
+
     def print_vcs(self, coords):
         """Should look like this::
 
@@ -327,13 +432,13 @@ def get_aqua_terra_dumpdata_from_ftp(sat, dump_url):
     """
 
     logger.info("Fetch %s dump info from internet", str(sat.name))
-    if isinstance(dump_url, six.text_type):
+    if isinstance(dump_url, str):
         url = urlparse(dump_url % sat.name)
     else:
         url = urlparse(HOST % sat.name)
     logger.debug("Connect to ftp server")
     try:
-        f = ftplib.FTP(url.netloc)
+        f = ftplib.FTP_TLS(url.netloc)
     except (socket.error, socket.gaierror) as e:
         logger.error('cannot reach to %s ' % HOST + str(e))
         f = None
@@ -350,6 +455,7 @@ def get_aqua_terra_dumpdata_from_ftp(sat, dump_url):
     if f is not None:
         data = []
         try:
+            f.prot_p()  # explicitly call for protected transfer
             f.dir(url.path, data.append)
         except socket.error as e:
             logger.error("Can't get any data: " + str(e))
@@ -377,6 +483,7 @@ def get_aqua_terra_dumpdata_from_ftp(sat, dump_url):
         lines = []
         if not os.path.exists(os.path.join("/tmp", filedates[date])):
             try:
+                f.prot_p()  # explicitly call for protected transfer
                 f.retrlines('RETR ' + os.path.join(url.path, filedates[date]),
                             lines.append)
             except ftplib.error_perm:
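[Editor's note: the switch from plain `ftplib.FTP` to `FTP_TLS` means the data connection must also be upgraded, which is what the added `prot_p()` calls do before each transfer. A minimal sketch of the pattern, with a placeholder host; error handling mirrors the function above:]

```python
import ftplib
import socket


def fetch_listing(host, path):
    """List a directory over explicit FTPS; return [] on connection failure."""
    try:
        ftp = ftplib.FTP_TLS(host)  # control connection, negotiates AUTH TLS
    except (socket.error, socket.gaierror):
        return []
    ftp.login()   # anonymous login
    ftp.prot_p()  # switch the data channel to protected (encrypted) mode
    data = []
    ftp.dir(path, data.append)  # each listing line is appended to data
    ftp.quit()
    return data
```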
@@ -422,7 +529,9 @@ def get_next_passes(satellites,
                     forward,
                     coords,
                     tle_file=None,
-                    aqua_terra_dumps=None):
+                    aqua_terra_dumps=None,
+                    min_pass=MIN_PASS,
+                    local_horizon=0):
     """Get the next passes for *satellites*, starting at *utctime*, for a
     duration of *forward* hours, with observer at *coords* ie lon (°E), lat
     (°N), altitude (km). Uses *tle_file* if provided, downloads from celestrack
@@ -448,24 +557,30 @@ def get_next_passes(satellites,
             sat = Satellite(sat, 0, 0)
 
         satorb = orbital.Orbital(sat.name, tle_file=tle_file)
-        passlist = satorb.get_next_passes(utctime, forward, *coords)
-
-        if sat.name == "metop-a":
-            # Take care of metop-a
+        passlist = satorb.get_next_passes(utctime,
+                                          forward,
+                                          horizon=local_horizon,
+                                          *coords
+                                          )
+
+        if sat.name.lower() == "metop-a":
+            # Take care of metop-a special case
             passes["metop-a"] = get_metopa_passes(sat, passlist, satorb)
-
-        elif sat.name in ["aqua", "terra"] and aqua_terra_dumps:
+        elif sat.name.lower() in ["aqua", "terra"] and aqua_terra_dumps:
             # Take care of aqua (dumps in svalbard and poker flat)
             # Get the Terra/Aqua passes and fill the passes dict:
             get_terra_aqua_passes(passes, utctime, forward, sat, passlist, satorb, aqua_terra_dumps)
-
         else:
             if sat.name.upper() in VIIRS_PLATFORM_NAMES:
                 instrument = "viirs"
             elif sat.name.lower().startswith("metop") or sat.name.lower().startswith("noaa"):
                 instrument = "avhrr"
+            elif sat.name.lower() in ["aqua", "terra"]:  # when aqua_terra_dumps=False
+                instrument = "modis"
+            elif sat.name.upper() in MERSI_PLATFORM_NAMES:
+                instrument = "mersi"
             elif sat.name.upper() in MERSI2_PLATFORM_NAMES:
-                instrument = "mersi2"
+                instrument = "mersi-2"
             else:
                 instrument = "unknown"
 

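[Editor's note: `print_meos` above derives a stable per-pass identifier by hashing the satellite name, orbit number, AOS epoch and sub-satellite point. That fingerprint logic can be isolated as follows; `pass_fingerprint` is an illustrative name, not part of the codebase:]

```python
import hashlib
from datetime import datetime


def pass_fingerprint(satellite, orbit, risetime, sat_lon, sat_lat):
    """MD5 hex digest identifying a pass, as used in the MEOS report lines."""
    aos_epoch = int((risetime - datetime(1970, 1, 1)).total_seconds())
    key = "{:s}|{:d}|{:d}|{:.3f}|{:.3f}".format(
        satellite.upper(), int(orbit), aos_epoch, sat_lon, sat_lat)
    return hashlib.md5(key.encode("utf-8")).hexdigest()
```

The hash is deterministic, so the same pass always maps to the same key, which is what makes the value usable in the expected-output strings of the test suite.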

=====================================
trollsched/schedule.py
=====================================
@@ -26,17 +26,18 @@
 import logging
 import logging.handlers
 import os
-try:
-    from urllib.parse import urlparse
-except ImportError:
-    from urlparse import urlparse
-
+from urllib.parse import urlparse
 from datetime import datetime, timedelta
 from pprint import pformat
 
 import numpy as np
 from pyorbital import astronomy
-from pyresample import utils as resample_utils
+try:
+    from pyresample import parse_area_file
+except ImportError:
+    # Older versions of pyresample:
+    from pyresample.utils import parse_area_file
+
 from trollsched import utils
 from trollsched.spherical import get_twilight_poly
 from trollsched.graph import Graph
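[Editor's note: the try/except import above is a common compatibility shim for APIs that moved between library versions. The same pattern can be generalized; the `import_compat` helper is illustrative and not part of trollsched:]

```python
import importlib


def import_compat(name, locations):
    """Return the first importable attribute *name* from a list of module paths."""
    for modpath in locations:
        try:
            module = importlib.import_module(modpath)
            return getattr(module, name)
        except (ImportError, AttributeError):
            continue
    raise ImportError("{} not found in any of {}".format(name, locations))
```

For pyresample this would be called as `import_compat("parse_area_file", ["pyresample", "pyresample.utils"])`, trying the newer top-level location first.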
@@ -55,7 +56,8 @@ class Station(object):
 
     """docstring for Station."""
 
-    def __init__(self, station_id, name, longitude, latitude, altitude, area, satellites, area_file=None):
+    def __init__(self, station_id, name, longitude, latitude, altitude, area, satellites, area_file=None,
+                 min_pass=None, local_horizon=0):
         super(Station, self).__init__()
         self.id = station_id
         self.name = name
@@ -67,9 +69,11 @@ class Station(object):
 
         if area_file is not None:
             try:
-                self.area = resample_utils.parse_area_file(area_file, area)[0]
+                self.area = parse_area_file(area_file, area)[0]
             except TypeError:
                 pass
+        self.min_pass = min_pass
+        self.local_horizon = local_horizon
 
     @property
     def coords(self):
@@ -100,7 +104,9 @@ class Station(object):
                                     self.coords, tle_file,
                                     aqua_terra_dumps=(sched.dump_url or True
                                                       if opts.no_aqua_terra_dump
-                                                      else None)
+                                                      else None),
+                                    min_pass=self.min_pass,
+                                    local_horizon=self.local_horizon
                                     )
         logger.info("Computation of next overpasses done")
 
@@ -118,7 +124,9 @@ class Station(object):
                 args=(allpasses,
                       self.area.poly,
                       build_filename(
-                          "dir_plots", pattern, pattern_args)
+                          "dir_plots", pattern, pattern_args),
+                      sched.plot_parameters,
+                      sched.plot_title
                       )
             )
             image_saver.start()
@@ -143,16 +151,22 @@ class Station(object):
             generate_sch_file(build_filename("file_sci", pattern,
                                              pattern_args), allpasses, self.coords)
 
+        if opts.meos:
+            generate_meos_file(build_filename("file_meos", pattern, pattern_args), allpasses,
+                               self.coords, start_time + timedelta(hours=sched.start), True)  # Ie report mode
+
+        if opts.plot:
+            logger.info("Waiting for images to be saved...")
+            image_saver.join()
+            logger.info("Done!")
+
+        if opts.metno_xml:
+            generate_metno_xml_file(build_filename("file_metno_xml", pattern, pattern_args), allpasses,
+                                    self.coords, start_time + timedelta(hours=sched.start),
+                                    start_time + timedelta(hours=sched.forward), self.id, sched.center_id, True)
+
         if opts.xml or opts.report:
             url = urlparse(opts.output_url or opts.output_dir)
-            if url.scheme not in ["file", ""]:
-                directory = "/tmp"
-            else:
-                directory = url.path
-            if opts.plot:
-                logger.info("Waiting for images to be saved...")
-                image_saver.join()
-                logger.info("Done!")
             if opts.xml or opts.report:
                 """Allways create xml-file in request-mode"""
                 pattern_args['mode'] = "request"
@@ -224,7 +238,7 @@ class Scheduler(object):
 
     """docstring for Scheduler."""
 
-    def __init__(self, stations, min_pass, forward, start, dump_url, patterns, center_id):
+    def __init__(self, stations, min_pass, forward, start, dump_url, patterns, center_id, plot_parameters, plot_title):
         super(Scheduler, self).__init__()
         self.stations = stations
         self.min_pass = min_pass
@@ -233,6 +247,8 @@ class Scheduler(object):
         self.dump_url = dump_url
         self.patterns = patterns
         self.center_id = center_id
+        self.plot_parameters = plot_parameters
+        self.plot_title = plot_title
         self.opts = None
 
 
@@ -484,6 +500,56 @@ def get_max(groups, fun):
     return groups[argmax(scores)]
 
 
+def generate_metno_xml_file(output_file, allpasses, coords, start, end, station_name, center_id, report_mode=False):
+    import xml.etree.ElementTree as ET
+
+    reqtime = datetime.utcnow()
+
+    with open(output_file, "w") as out:
+        out.write("<?xml version='1.0' encoding='utf-8'?>")
+
+        root = ET.Element("acquisition-schedule")
+        props = ET.SubElement(root, "properties")
+        proj = ET.SubElement(props, "project")
+        proj.text = "Pytroll"
+        typep = ET.SubElement(props, "type")
+        if report_mode:
+            typep.text = "report"
+        else:
+            typep.text = "request"
+        station = ET.SubElement(props, "station")
+        station.text = station_name
+        file_start = ET.SubElement(props, "file-start")
+        file_start.text = start.strftime("%Y-%m-%dT%H:%M:%S")
+        file_end = ET.SubElement(props, "file-end")
+        file_end.text = end.strftime("%Y-%m-%dT%H:%M:%S")
+        reqby = ET.SubElement(props, "requested-by")
+        reqby.text = center_id
+        reqon = ET.SubElement(props, "requested-on")
+        reqon.text = reqtime.strftime("%Y-%m-%dT%H:%M:%S")
+
+        for overpass in sorted(allpasses, key=lambda x: x.risetime):
+            if (overpass.rec or report_mode) and overpass.risetime > start:
+                overpass.generate_metno_xml(coords, root)
+
+        out.write(ET.tostring(root).decode("utf-8"))
+        out.close()
+    return output_file
+
+
+def generate_meos_file(output_file, allpasses, coords, start, report_mode=False):
+
+    with open(output_file, "w") as out:
+        out.write(" No. Date    Satellite  Orbit Max EL  AOS      Ovlp  LOS      Durtn  Az(AOS/MAX)\n")
+        line_no = 1
+        for overpass in sorted(allpasses, key=lambda x: x.risetime):
+            if (overpass.rec or report_mode) and overpass.risetime > start:
+                out.write(overpass.print_meos(coords, line_no) + "\n")
+                line_no += 1
+        out.close()
+    return output_file
+
+
 def generate_sch_file(output_file, overpasses, coords):
 
     with open(output_file, "w") as out:
@@ -500,7 +566,8 @@ def generate_sch_file(output_file, overpasses, coords):
         out.write("#\n#\n#Pass List\n#\n")
 
         out.write(
-            "#SCName          RevNum Risetime        Falltime        Elev Dura ANL   Rec Dir Man Ovl OvlSCName        OvlRev OvlRisetime     OrigRisetime    OrigFalltime    OrigDuration\n#\n")
+            "#SCName          RevNum Risetime        Falltime        Elev Dura ANL   Rec Dir Man Ovl OvlSCName        "
+            "OvlRev OvlRisetime     OrigRisetime    OrigFalltime    OrigDuration\n#\n")
 
         for overpass in sorted(overpasses):
             out.write(overpass.print_vcs(coords) + "\n")
@@ -561,7 +628,7 @@ def generate_xml_file(sched, start, end, xml_file, station, center_id, report_mo
         if report_mode:
             fp_.write("<?xml version='1.0' encoding='utf-8'?>"
                       "<?xml-stylesheet type='text/xsl' href='reqreader.xsl'?>")
-        fp_.write(str(ET.tostring(tree)))
+        fp_.write(ET.tostring(tree).decode("utf-8"))
     os.rename(tmp_filename, filename)
     return filename
 
@@ -572,12 +639,12 @@ def parse_datetime(strtime):
     return datetime.strptime(strtime, "%Y%m%d%H%M%S")
 
 
-def save_passes(allpasses, poly, output_dir):
+def save_passes(allpasses, poly, output_dir, plot_parameters=None, plot_title=None):
     """Save overpass plots to png and store in directory *output_dir*
     """
     from trollsched.drawing import save_fig
     for overpass in allpasses:
-        save_fig(overpass, poly=poly, directory=output_dir)
+        save_fig(overpass, poly=poly, directory=output_dir, plot_parameters=plot_parameters, plot_title=plot_title)
 
 
 def get_passes_from_xml_file(filename):
@@ -632,11 +699,7 @@ def combined_stations(scheduler, start_time, graph, allpasses):
         """Collect labels, each with one pass per station."""
         # TODO: is there a simpler way?
         clabels = []
-        from sys import version_info
-        if version_info < (2, 7):
-            npasses = dict((s, set()) for s in stats)
-        else:
-            npasses = {s: set() for s in stats}
+        npasses = {s: set() for s in stats}
         for npass in newpasses:
             cl = []
             for i, s in zip(range(len(stats)), stats):
@@ -668,9 +731,9 @@ def combined_stations(scheduler, start_time, graph, allpasses):
             passes[s] = list(ap)
             for p in passes[s]:
                 p.rec = False
-    except:
+    except Exception:
         logger.exception("Failed to reset 'rec' for s:%s  ap:%s  passes[s]:%s  p:%s",
-                         a, ap, passes[s], p)
+                         s, ap, passes[s], p)
         raise
 
     stats, schedule, (newgraph, newpasses) = get_combined_sched(graph, passes)
@@ -727,6 +790,24 @@ def combined_stations(scheduler, start_time, graph, allpasses):
                                         True)
             logger.info("Generated " + str(xmlfile))
 
+        if scheduler.opts.meos:
+            meosfile = generate_meos_file(build_filename("file_meos", scheduler.patterns, pattern_args),
+                                          passes[station_id],
+                                          # station_meta[station]['coords'],
+                                          [s.coords for s in scheduler.stations if s.id == station_id][0],
+                                          start_time + timedelta(hours=scheduler.start),
+                                          False)  # Ie only print schedule passes
+            logger.info("Generated " + str(meosfile))
+        if scheduler.opts.metno_xml:
+            metno_xmlfile = generate_metno_xml_file(build_filename("file_metno_xml", scheduler.patterns, pattern_args),
+                                                    passes[station_id],
+                                                    # station_meta[station]['coords'],
+                                                    [s.coords for s in scheduler.stations if s.id == station_id][0],
+                                                    start_time + timedelta(hours=scheduler.start),
+                                                    start_time + timedelta(hours=scheduler.forward),
+                                                    station_id, scheduler.center_id, False)
+            logger.info("Generated " + str(metno_xmlfile))
+
     logger.info("Finished coordinated schedules.")
 
 
@@ -761,8 +842,8 @@ def run():
     group_postim.add_argument("-s", "--start-time", type=parse_datetime,
                               help="start time of the schedule to compute")
     group_postim.add_argument("-d", "--delay", default=60, type=float,
-                              help="delay (in seconds) needed between two "
-                              + "consecutive passes (60 seconds by default)")
+                              help="delay (in seconds) needed between two " +
+                              "consecutive passes (60 seconds by default)")
     # argument group: special behaviour
     group_spec = parser.add_argument_group(title="special",
                                            description="(additional parameter changing behaviour)")
@@ -778,8 +859,8 @@ def run():
     group_outp.add_argument("-o", "--output-dir", default=None,
                             help="where to put generated files")
     group_outp.add_argument("-u", "--output-url", default=None,
-                            help="URL where to put generated schedule file(s)"
-                            + ", otherwise use output-dir")
+                            help="URL where to put generated schedule file(s)" +
+                            ", otherwise use output-dir")
     group_outp.add_argument("-x", "--xml", action="store_true",
                             help="generate an xml request file (schedule)"
                             )
@@ -791,6 +872,10 @@ def run():
                             help="generate plot images")
     group_outp.add_argument("-g", "--graph", action="store_true",
                             help="save graph info")
+    group_outp.add_argument("--meos", action="store_true",
+                            help="generate a MEOS schedule file")
+    group_outp.add_argument("--metno-xml", action="store_true",
+                            help="generate a METNO xml pass data file")
     opts = parser.parse_args()
 
     if opts.config:
@@ -805,7 +890,7 @@ def run():
         parser.error("Coordinates must be provided in the absence of "
                      "configuration file.")
 
-    if not (opts.xml or opts.scisys or opts.report):
+    if not (opts.xml or opts.scisys or opts.report or opts.metno_xml):
         parser.error("No output specified, use '--scisys' or '-x/--xml'")
 
     if opts.output_dir is None:
@@ -843,8 +928,6 @@ def run():
     logger = logging.getLogger("trollsched")
 
     tle_file = opts.tle
-    if opts.forward:
-        forward = opts.forward
     if opts.start_time:
         start_time = opts.start_time
     else:
@@ -917,6 +1000,6 @@ def run():
 if __name__ == '__main__':
     try:
         run()
-    except:
+    except Exception:
         logger.exception("Something wrong happened!")
         raise

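[Editor's note: the new `generate_metno_xml_file` builds an `acquisition-schedule` document with ElementTree. The header construction can be sketched in isolation like this; `build_schedule_header` is an illustrative name, and only a subset of the properties written by the real function is shown:]

```python
import xml.etree.ElementTree as ET


def build_schedule_header(station, center_id, start, end, report_mode=False):
    """Build the <properties> header used by generate_metno_xml_file."""
    root = ET.Element("acquisition-schedule")
    props = ET.SubElement(root, "properties")
    ET.SubElement(props, "project").text = "Pytroll"
    ET.SubElement(props, "type").text = "report" if report_mode else "request"
    ET.SubElement(props, "station").text = station
    ET.SubElement(props, "file-start").text = start.strftime("%Y-%m-%dT%H:%M:%S")
    ET.SubElement(props, "file-end").text = end.strftime("%Y-%m-%dT%H:%M:%S")
    ET.SubElement(props, "requested-by").text = center_id
    return root
```

Each `Pass.generate_metno_xml` call then appends one `<pass>` element to the returned root before the tree is serialized with `ET.tostring(root).decode("utf-8")`.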

=====================================
trollsched/tests/test_satpass.py
=====================================
@@ -1,7 +1,7 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 
-# Copyright (c) 2018 - 2019 PyTroll
+# Copyright (c) 2018 - 2021 Pytroll-schedule developers
 
 # Author(s):
 
@@ -29,7 +29,7 @@ from datetime import datetime, timedelta
 from trollsched.satpass import Pass
 from trollsched.boundary import SwathBoundary
 from pyorbital.orbital import Orbital
-
+from pyresample.geometry import AreaDefinition, create_area_def
 
 LONS1 = np.array([-122.29913729160562, -131.54385362589042, -155.788034272281,
                   143.1730880418349, 105.69172088208997, 93.03135571771092,
@@ -128,6 +128,11 @@ LATS3 = np.array([66.94713585, 67.07854554, 66.53108388, 65.27837805, 63.5022359
                   58.33858588, 57.71210872, 55.14964148, 55.72506407, 60.40889798,
                   61.99561474, 63.11425455, 63.67173255, 63.56939058], dtype='float64')
 
+AREA_DEF_EURON1 = AreaDefinition('euron1', 'Northern Europe - 1km',
+                                 '', {'proj': 'stere', 'ellps': 'WGS84',
+                                      'lat_0': 90.0, 'lon_0': 0.0, 'lat_ts': 60.0},
+                                 3072, 3072, (-1000000.0, -4500000.0, 2072000.0, -1428000.0))
+
 
 def assertNumpyArraysEqual(self, other):
     if self.shape != other.shape:
@@ -154,13 +159,14 @@ def get_n19_orbital():
     return Orbital('NOAA-19', line1=tle1, line2=tle2)
 
 
-def get_region(areaid):
-    try:
-        from satpy.resample import get_area_def
-    except ImportError:
-        from mpop.projector import get_area_def
+def get_mb_orbital():
+    """Return orbital for a given set of TLEs for MetOp-B.
 
-    return get_area_def(areaid)
+    From 2021-02-04
+    """
+    tle1 = "1 38771U 12049A   21034.58230818 -.00000012  00000-0  14602-4 0 9998"
+    tle2 = "2 38771  98.6992  96.5537 0002329  71.3979  35.1836 14.21496632434867"
+    return Orbital("Metop-B", line1=tle1, line2=tle2)
 
 
 class TestPass(unittest.TestCase):
@@ -175,6 +181,11 @@ class TestPass(unittest.TestCase):
         tstart = datetime(2018, 10, 16, 2, 48, 29)
         tend = datetime(2018, 10, 16, 3, 2, 38)
 
+        instruments = set(('viirs', 'avhrr', 'modis', 'mersi', 'mersi-2'))
+        for instrument in instruments:
+            overp = Pass('NOAA-20', tstart, tend, orb=self.n20orb, instrument=instrument)
+            self.assertEqual(overp.instrument, instrument)
+
         instruments = set(('viirs', 'avhrr', 'modis'))
         overp = Pass('NOAA-20', tstart, tend, orb=self.n20orb, instrument=instruments)
         self.assertEqual(overp.instrument, 'avhrr')
@@ -198,7 +209,24 @@ class TestSwathBoundary(unittest.TestCase):
         """Set up"""
         self.n20orb = get_n20_orbital()
         self.n19orb = get_n19_orbital()
-        self.euron1 = get_region('euron1')
+        self.mborb = get_mb_orbital()
+        self.euron1 = AREA_DEF_EURON1
+        self.antarctica = create_area_def(
+            "antarctic",
+            {'ellps': 'WGS84', 'lat_0': '-90', 'lat_ts': '-60',
+             'lon_0': '0', 'no_defs': 'None', 'proj': 'stere',
+             'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'},
+            width=1000, height=1000,
+            area_extent=(-4008875.4031, -4000855.294,
+                         4000855.9937, 4008874.7048))
+        self.arctica = create_area_def(
+            "arctic",
+            {'ellps': 'WGS84', 'lat_0': '90', 'lat_ts': '60',
+             'lon_0': '0', 'no_defs': 'None', 'proj': 'stere',
+             'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'},
+            width=1000, height=1000,
+            area_extent=(-4008875.4031, -4000855.294,
+                         4000855.9937, 4008874.7048))
 
     def test_swath_boundary(self):
 
@@ -313,9 +341,80 @@ class TestSwathBoundary(unittest.TestCase):
 
         mypass = Pass('FENGYUN 3D', tstart, tend, instrument='mersi2', tle1=tle1, tle2=tle2)
         cov = mypass.area_coverage(self.euron1)
+        self.assertAlmostEqual(cov, 0.786836, 5)
 
+        mypass = Pass('FENGYUN 3D', tstart, tend, instrument='mersi-2', tle1=tle1, tle2=tle2)
+        cov = mypass.area_coverage(self.euron1)
         self.assertAlmostEqual(cov, 0.786836, 5)
 
+    def test_arctic_is_not_antarctic(self):
+
+        tstart = datetime(2021, 2, 3, 16, 28, 3)
+        tend = datetime(2021, 2, 3, 16, 31, 3)
+
+        overp = Pass('Metop-B', tstart, tend, orb=self.mborb, instrument='avhrr')
+
+        cov_south = overp.area_coverage(self.antarctica)
+        cov_north = overp.area_coverage(self.arctica)
+
+        assert cov_north == 0
+        assert cov_south != 0
+
+    def tearDown(self):
+        """Clean up"""
+        pass
+
+
+class TestPassList(unittest.TestCase):
+
+    def setUp(self):
+        """Set up"""
+        pass
+
+    def test_meos_pass_list(self):
+        orig = ("  1 20190105 FENGYUN 3D  5907 52.943  01:01:45 n/a   01:17:15 15:30  18.6 107.4 -- "
+                "Undefined(Scheduling not done 1546650105 ) a3d0df0cd289244e2f39f613f229a5cc D")
+
+        tstart = datetime.strptime("2019-01-05T01:01:45", "%Y-%m-%dT%H:%M:%S")
+        tend = tstart + timedelta(seconds=60 * 15.5)
+
+        tle1 = '1 43010U 17072A   18363.54078832 -.00000045  00000-0 -79715-6 0  9999'
+        tle2 = '2 43010  98.6971 300.6571 0001567 143.5989 216.5282 14.19710974 58158'
+
+        mypass = Pass('FENGYUN 3D', tstart, tend, instrument='mersi2', tle1=tle1, tle2=tle2)
+        coords = (10.72, 59.942, 0.1)
+        meos_format_str = mypass.print_meos(coords, line_no=1)
+        self.assertEqual(meos_format_str, orig)
+
+        mypass = Pass('FENGYUN 3D', tstart, tend, instrument='mersi-2', tle1=tle1, tle2=tle2)
+        coords = (10.72, 59.942, 0.1)
+        meos_format_str = mypass.print_meos(coords, line_no=1)
+        self.assertEqual(meos_format_str, orig)
+
+    def test_generate_metno_xml(self):
+        import xml.etree.ElementTree as ET
+        root = ET.Element("acquisition-schedule")
+
+        orig = ('<acquisition-schedule><pass satellite="FENGYUN 3D" aos="20190105010145" los="20190105011715" '
+                'orbit="5907" max-elevation="52.943" asimuth-at-max-elevation="107.385" asimuth-at-aos="18.555" '
+                'pass-direction="D" satellite-lon-at-aos="76.204" satellite-lat-at-aos="80.739" '
+                'tle-epoch="20181229125844.110848" /></acquisition-schedule>')
+
+        tstart = datetime.strptime("2019-01-05T01:01:45", "%Y-%m-%dT%H:%M:%S")
+        tend = tstart + timedelta(seconds=60 * 15.5)
+
+        tle1 = '1 43010U 17072A   18363.54078832 -.00000045  00000-0 -79715-6 0  9999'
+        tle2 = '2 43010  98.6971 300.6571 0001567 143.5989 216.5282 14.19710974 58158'
+
+        mypass = Pass('FENGYUN 3D', tstart, tend, instrument='mersi2', tle1=tle1, tle2=tle2)
+
+        coords = (10.72, 59.942, 0.1)
+        mypass.generate_metno_xml(coords, root)
+
+        # Attribute order in the serialized XML can vary between Python versions, so convert the strings to sets and compare them
+        res = set(ET.tostring(root).decode("utf-8").split())
+        self.assertEqual(res, set(orig.split()))
+
     def tearDown(self):
         """Clean up"""
         pass
@@ -328,5 +427,6 @@ def suite():
     mysuite = unittest.TestSuite()
     mysuite.addTest(loader.loadTestsFromTestCase(TestSwathBoundary))
     mysuite.addTest(loader.loadTestsFromTestCase(TestPass))
+    mysuite.addTest(loader.loadTestsFromTestCase(TestPassList))
 
     return mysuite

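[Editor's note: the set-based comparison in `test_generate_metno_xml` works because whitespace-split tokens are insensitive to attribute order, which ElementTree does not guarantee to be stable across Python versions. A compact illustration of the trick:]

```python
import xml.etree.ElementTree as ET


def xml_tokens(xml_string):
    """Whitespace-split token set, for order-insensitive XML comparison."""
    return set(xml_string.split())


# The same attributes in a different insertion order may serialize differently
# (Python 3.8+ preserves insertion order), but the token sets are equal.
a = ET.tostring(ET.Element("pass", {"satellite": "NOAA-20", "orbit": "5907"}))
b = ET.tostring(ET.Element("pass", {"orbit": "5907", "satellite": "NOAA-20"}))
```

The trade-off is that token sets also ignore element order and duplicates, which is acceptable here since the expected document contains a single `<pass>` element.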

=====================================
trollsched/tests/test_schedule.py
=====================================
@@ -35,15 +35,9 @@ from trollsched.satpass import get_aqua_terra_dumps
 from trollsched.satpass import get_metopa_passes
 
 import sys
-if sys.version_info < (2, 7):
-    import unittest2 as unittest
-else:
-    import unittest
+import unittest
 
-try:
-    from unittest.mock import patch
-except ImportError:
-    from mock import patch
+from unittest.mock import patch
 
 
 # class TestPass(unittest.TestCase):
@@ -302,26 +296,20 @@ class TestAll(unittest.TestCase):
 
             self.assertEqual(len(allpasses), 2)
 
-            n20pass1 = allpasses.pop()
-
             rt1 = datetime(2018, 11, 28, 10, 53, 42, 79483)
             ft1 = datetime(2018, 11, 28, 11, 9, 6, 916787)
             rt2 = datetime(2018, 11, 28, 12, 34, 44, 667963)
             ft2 = datetime(2018, 11, 28, 12, 49, 25, 134067)
 
-            dt_ = n20pass1.risetime - rt1
-            self.assertAlmostEqual(dt_.seconds, 0)
-
-            dt_ = n20pass1.falltime - ft1
-            self.assertAlmostEqual(dt_.seconds, 0)
+            rise_times = [p.risetime for p in allpasses]
+            fall_times = [p.falltime for p in allpasses]
 
-            n20pass2 = allpasses.pop()
+            assert rt1 in rise_times
+            assert rt2 in rise_times
+            assert ft1 in fall_times
+            assert ft2 in fall_times
 
-            dt_ = n20pass2.risetime - rt2
-            self.assertAlmostEqual(dt_.seconds, 0)
-
-            dt_ = n20pass2.falltime - ft2
-            self.assertAlmostEqual(dt_.seconds, 0)
+            assert all([p.instrument == 'viirs' for p in allpasses])
 
     @patch('os.path.exists')
     @patch('trollsched.satpass.get_aqua_terra_dumpdata_from_ftp')
@@ -364,6 +352,8 @@ class TestAll(unittest.TestCase):
 
                 self.assertAlmostEqual(dtmin.seconds, 0)
 
+                self.assertEqual(mypass.instrument, 'modis')
+
     @patch('trollsched.satpass.get_aqua_terra_dumpdata_from_ftp')
     def test_get_aqua_terra_dumps(self, dumps_from_ftp):
         dumps_from_ftp.return_value = self.dumpdata_terra
@@ -405,9 +395,9 @@ class TestAll(unittest.TestCase):
 
             self.assertEqual(len(metopa_passes), 2)
             self.assertEqual(metopa_passes[0].pass_direction(), 'descending')
-            self.assertEqual(metopa_passes[0].seconds(), 462.466119)
+            self.assertAlmostEqual(metopa_passes[0].seconds(), 487.512589, 5)
             self.assertEqual((metopa_passes[0].uptime - datetime(2018, 12, 4, 9, 17, 48, 530484)).seconds, 0)
-            self.assertEqual((metopa_passes[0].risetime - datetime(2018, 12, 4, 9, 17, 46, 691075)).seconds, 0)
+            self.assertEqual((metopa_passes[0].risetime - datetime(2018, 12, 4, 9, 17, 21, 644605)).seconds, 0)
 
     def tearDown(self):
         """Clean up"""


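[Editor's note] The rewritten test above replaces `allpasses.pop()` with membership checks because `allpasses` is a set, whose pop order is undefined. A minimal standalone sketch of that order-independent assertion style (the `FakePass` class is a hypothetical stand-in, not the real `trollsched.satpass.Pass`):

```python
from datetime import datetime

class FakePass:
    """Hypothetical minimal stand-in for a satellite pass record."""
    def __init__(self, risetime, falltime):
        self.risetime = risetime
        self.falltime = falltime

rt1 = datetime(2018, 11, 28, 10, 53, 42)
ft1 = datetime(2018, 11, 28, 11, 9, 6)
rt2 = datetime(2018, 11, 28, 12, 34, 44)
ft2 = datetime(2018, 11, 28, 12, 49, 25)

# Popping from a set yields elements in an arbitrary order, so the old
# pop()-based assertions were fragile.  Collecting the times first and
# asserting membership does not depend on iteration order:
allpasses = {FakePass(rt1, ft1), FakePass(rt2, ft2)}
rise_times = [p.risetime for p in allpasses]
fall_times = [p.falltime for p in allpasses]

assert rt1 in rise_times and rt2 in rise_times
assert ft1 in fall_times and ft2 in fall_times
```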
=====================================
trollsched/tests/test_spherical.py
=====================================
@@ -184,8 +184,8 @@ class TestArc(unittest.TestCase):
                    SCoordinate(np.deg2rad(10), 0))
         lon, lat = arc1.intersection(arc2)
 
-        self.assertTrue(np.allclose(np.rad2deg(lon), 5))
-        self.assertEquals(np.rad2deg(lat), 5.0575148968282093)
+        np.testing.assert_allclose(np.rad2deg(lon), 5)
+        np.testing.assert_allclose(np.rad2deg(lat), 5.0575148968282093)
 
         arc1 = Arc(SCoordinate(0, 0),
                    SCoordinate(np.deg2rad(10), np.deg2rad(10)))


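[Editor's note] The test_spherical.py hunk swaps exact equality (`assertEquals`) for `np.testing.assert_allclose`, which compares floats within a relative tolerance (default `rtol=1e-7`) and so tolerates round-off noise. A quick illustration of the difference:

```python
import numpy as np

# Round-tripping through deg2rad/rad2deg is not guaranteed bit-exact,
# which is exactly why the diff moves to assert_allclose.
lat = np.deg2rad(5.0575148968282093)

np.testing.assert_allclose(np.rad2deg(lat), 5.0575148968282093)

# A perturbation far below the default relative tolerance still passes:
np.testing.assert_allclose(np.rad2deg(lat), 5.0575148968282093 + 1e-10)
```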
=====================================
trollsched/utils.py
=====================================
@@ -25,13 +25,10 @@
 
 import yaml
 import logging
-from collections import Mapping
-from six.moves.configparser import ConfigParser
+from collections.abc import Mapping
+from configparser import ConfigParser
 
-try:
-    from trollsched import schedule
-except ImportError:
-    import schedule
+from trollsched import schedule
 
 
 logger = logging.getLogger("trollsched")
@@ -44,7 +41,7 @@ def read_yaml_file(file_name):
     conf_dict = {}
     for file_obj in file_name:
         with open(file_obj) as fp:
-            tmp_dict = yaml.load(fp)
+            tmp_dict = yaml.safe_load(fp)
         conf_dict = recursive_dict_update(conf_dict, tmp_dict)
     return conf_dict
 
@@ -85,10 +82,10 @@ def read_config_cfg(filename):
         for k, v in cfg.items(section):
             try:
                 kv_dict[k] = int(v)
-            except:
+            except Exception:
                 try:
                     kv_dict[k] = float(v)
-                except:
+                except Exception:
                     kv_dict[k] = v
         return kv_dict
 
@@ -145,6 +142,9 @@ def read_config_yaml(filename):
         pattern[k] = v
 
     sched_params = cfg['default']
+    plot_parameters = sched_params.get('plot_parameters', {})
+    plot_title = sched_params.get('plot_title', None)
+
     scheduler = schedule.Scheduler(stations=[stations[st_id]
                                              for st_id in sched_params['station']],
                                    min_pass=sched_params.get('min_pass', 4),
@@ -152,6 +152,8 @@ def read_config_yaml(filename):
                                    start=sched_params['start'],
                                    dump_url=sched_params.get('dump_url'),
                                    patterns=pattern,
-                                   center_id=sched_params.get('center_id', 'unknown'))
+                                   center_id=sched_params.get('center_id', 'unknown'),
+                                   plot_parameters=plot_parameters,
+                                   plot_title=plot_title)
 
     return scheduler


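[Editor's note] The utils.py hunks modernize two things: the `Mapping` ABC now comes from `collections.abc` (the alias in the plain `collections` namespace was removed in Python 3.10), and `yaml.load` becomes `yaml.safe_load`, which refuses to construct arbitrary Python objects from untrusted config files. A sketch of the recursive merge that `read_yaml_file` relies on, under the assumption that `recursive_dict_update` behaves roughly like this (the real function in `trollsched.utils` may differ in detail):

```python
from collections.abc import Mapping

def recursive_dict_update(d, u):
    """Merge mapping u into d, descending into nested mappings
    instead of overwriting them wholesale."""
    for k, v in u.items():
        if isinstance(v, Mapping):
            d[k] = recursive_dict_update(d.get(k, {}), v)
        else:
            d[k] = v
    return d

# Later config files extend, rather than replace, earlier sections:
conf = {}
for tmp in ({'default': {'min_pass': 4}},
            {'default': {'center_id': 'smhi'}}):
    conf = recursive_dict_update(conf, tmp)

assert conf == {'default': {'min_pass': 4, 'center_id': 'smhi'}}
```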
=====================================
trollsched/version.py
=====================================
@@ -23,9 +23,9 @@ def get_keywords():
     # setup.py/versioneer.py will grep for the variable names, so they must
     # each be defined on a line of their own. _version.py will just call
     # get_keywords().
-    git_refnames = " (HEAD -> master, tag: v0.5.2)"
-    git_full = "9b594a35c74a12c895a7957565929b994f15f8fd"
-    git_date = "2019-03-19 10:53:57 +0100"
+    git_refnames = " (HEAD -> main, tag: v0.6.0)"
+    git_full = "5f2cb59a0c99ad27643e2615cbf6fd4977e6c3c0"
+    git_date = "2021-12-09 16:05:37 +0200"
     keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
     return keywords
 



View it on GitLab: https://salsa.debian.org/debian-gis-team/pytroll-schedule/-/commit/94565e5053073e50acd3b730e0d284ce021f10bb




