From noreply at release.debian.org Sun Sep 1 05:39:26 2019
From: noreply at release.debian.org (Debian testing watch)
Date: Sun, 01 Sep 2019 04:39:26 +0000
Subject: rasterio 1.0.26-1 MIGRATED to testing
Message-ID: 

FYI: The status of the rasterio source package in Debian's testing distribution has changed.

Previous version: 1.0.25-1
Current version: 1.0.26-1

--
This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day.
See https://release.debian.org/testing-watch/ for more information.

From gitlab at salsa.debian.org Sun Sep 1 06:34:36 2019
From: gitlab at salsa.debian.org (Bas Couwenberg)
Date: Sun, 01 Sep 2019 05:34:36 +0000
Subject: [Git][debian-gis-team/python-pyproj][pristine-tar] pristine-tar data for python-pyproj_2.3.1+ds.orig.tar.xz
Message-ID: <5d6b586c35ad6_577b2ade5d825508191838@godard.mail>

Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pyproj

Commits:
94379184 by Bas Couwenberg at 2019-09-01T05:23:55Z
pristine-tar data for python-pyproj_2.3.1+ds.orig.tar.xz

- - - - -

2 changed files:

- + python-pyproj_2.3.1+ds.orig.tar.xz.delta
- + python-pyproj_2.3.1+ds.orig.tar.xz.id

Changes:

=====================================
python-pyproj_2.3.1+ds.orig.tar.xz.delta
=====================================
Binary files /dev/null and b/python-pyproj_2.3.1+ds.orig.tar.xz.delta differ

=====================================
python-pyproj_2.3.1+ds.orig.tar.xz.id
=====================================
@@ -0,0 +1 @@
+fa502c64a330f4089875c74527a00ba24edea2c3

View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/94379184c66d522ba080c7063bc56e9db558b3f2

--
View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/94379184c66d522ba080c7063bc56e9db558b3f2
You're receiving this email because of your account on salsa.debian.org.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From gitlab at salsa.debian.org Sun Sep 1 06:34:44 2019
From: gitlab at salsa.debian.org (Bas Couwenberg)
Date: Sun, 01 Sep 2019 05:34:44 +0000
Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag debian/2.3.1+ds-1_exp1
Message-ID: <5d6b5874d317e_577b2ade5d5c1a8c1920d4@godard.mail>

Bas Couwenberg pushed new tag debian/2.3.1+ds-1_exp1 at Debian GIS Project / python-pyproj

--
View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/debian/2.3.1+ds-1_exp1
You're receiving this email because of your account on salsa.debian.org.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From gitlab at salsa.debian.org Sun Sep 1 06:34:53 2019
From: gitlab at salsa.debian.org (Bas Couwenberg)
Date: Sun, 01 Sep 2019 05:34:53 +0000
Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag upstream/2.3.1+ds
Message-ID: <5d6b587dd3ab6_577b2ade5d7ee8f0192430@godard.mail>

Bas Couwenberg pushed new tag upstream/2.3.1+ds at Debian GIS Project / python-pyproj

--
View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/upstream/2.3.1+ds
You're receiving this email because of your account on salsa.debian.org.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: From gitlab at salsa.debian.org Sun Sep 1 06:34:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 05:34:53 +0000 Subject: [Git][debian-gis-team/python-pyproj][master] 5 commits: New upstream version 2.3.1+ds Message-ID: <5d6b587d354bd_577b2ade5d7673c81922cd@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pyproj Commits: ffd25ee3 by Bas Couwenberg at 2019-09-01T05:23:54Z New upstream version 2.3.1+ds - - - - - 138cba3a by Bas Couwenberg at 2019-09-01T05:23:55Z Update upstream source from tag 'upstream/2.3.1+ds' Update to upstream version '2.3.1+ds' with Debian dir 4ec3340a3f6b8f683c045287c6389cc648e45c61 - - - - - da4db387 by Bas Couwenberg at 2019-09-01T05:24:07Z New upstream release. - - - - - ce36da04 by Bas Couwenberg at 2019-09-01T05:27:18Z Drop patches. - - - - - c704cbcd by Bas Couwenberg at 2019-09-01T05:27:37Z Set distribution to experimental. - - - - - 18 changed files: - debian/changelog - − debian/patches/0001-reduce-precision-constrants-on-geodesic-tests-405.patch - − debian/patches/0001-use-mock-for-changing-os.environ-and-sys.prefix-in-d.patch - − debian/patches/series - pyproj/__init__.py - pyproj/_crs.pyx - pyproj/_datadir.pxd - pyproj/_datadir.pyx - pyproj/_proj.pxd - pyproj/_proj.pyx - pyproj/_transformer.pyx - pyproj/datadir.py - pyproj/geod.py - test/test_datadir.py - test/test_doctest_wrapper.py - test/test_geod.py - test/test_proj.py - test/test_transformer.py Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +python-pyproj (2.3.1+ds-1~exp1) experimental; urgency=medium + + * New upstream release. + * Drop patches. + + -- Bas Couwenberg Sun, 01 Sep 2019 07:27:23 +0200 + python-pyproj (2.3.0+ds-1) unstable; urgency=medium * Move from experimental to unstable. ===================================== debian/patches/0001-reduce-precision-constrants-on-geodesic-tests-405.patch deleted ===================================== @@ -1,172 +0,0 @@ -Description: reduce precision constrants on geodesic tests -Author: "Alan D. Snow" -Origin: https://github.com/pyproj4/pyproj/commit/2fe0630d399f6647215031b2e07c03448f61947a -Bug: https://github.com/pyproj4/pyproj/pull/405 - ---- a/pyproj/geod.py -+++ b/pyproj/geod.py -@@ -396,8 +396,8 @@ class Geod(_Geod): - >>> lons = [-74, -102, -102, -131, -163, 163, 172, 140, 113, - ... 
88, 59, 25, -4, -14, -33, -46, -61] - >>> poly_area, poly_perimeter = geod.polygon_area_perimeter(lons, lats) -- >>> "{:.3f} {:.3f}".format(poly_area, poly_perimeter) -- '13376856682207.406 14710425.407' -+ >>> "{:.1f} {:.1f}".format(poly_area, poly_perimeter) -+ '13376856682207.4 14710425.4' - - - Parameters ---- a/test/test_geod.py -+++ b/test/test_geod.py -@@ -183,7 +183,7 @@ def test_geometry_length__linestring(): - assert_almost_equal( - geod.geometry_length(LineString([Point(1, 2), Point(3, 4)])), - 313588.39721259556, -- decimal=3, -+ decimal=2, - ) - - -@@ -201,7 +201,7 @@ def test_geometry_length__linestring__ra - radians=True, - ), - 313588.39721259556, -- decimal=3, -+ decimal=2, - ) - - -@@ -213,7 +213,7 @@ def test_geometry_length__linearring(): - LinearRing(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) - ), - 1072185.2103813463, -- decimal=3, -+ decimal=2, - ) - - -@@ -225,7 +225,7 @@ def test_geometry_length__polygon(): - Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) - ), - 1072185.2103813463, -- decimal=3, -+ decimal=2, - ) - - -@@ -246,7 +246,7 @@ def test_geometry_length__polygon__radia - radians=True, - ), - 1072185.2103813463, -- decimal=3, -+ decimal=2, - ) - - -@@ -257,7 +257,7 @@ def test_geometry_length__multipolygon() - assert_almost_equal( - geod.geometry_length(MultiPolygon([polygon, polygon])), - 2 * 1072185.2103813463, -- decimal=3, -+ decimal=2, - ) - - -@@ -276,7 +276,7 @@ def test_geometry_length__multipolygon__ - assert_almost_equal( - geod.geometry_length(MultiPolygon([polygon, polygon]), radians=True), - 2 * 1072185.2103813463, -- decimal=3, -+ decimal=2, - ) - - -@@ -287,7 +287,7 @@ def test_geometry_length__multilinestrin - assert_almost_equal( - geod.geometry_length(MultiLineString([line_string, line_string])), - 1254353.5888503822, -- decimal=3, -+ decimal=2, - ) - - -@@ -311,7 +311,7 @@ def test_geometry_area_perimeter__linest - assert_almost_equal( - geod.geometry_area_perimeter(LineString([Point(1, 2), Point(3, 4)])), - (0.0, 627176.7944251911), -- decimal=3, -+ decimal=2, - ) - - -@@ -329,7 +329,7 @@ def test_geometry_area_perimeter__linest - radians=True, - ), - (0.0, 627176.7944251911), -- decimal=3, -+ decimal=2, - ) - - -@@ -341,7 +341,7 @@ def test_geometry_area_perimeter__linear - LinearRing(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) - ), - (-49187690467.58623, 1072185.2103813463), -- decimal=3, -+ decimal=2, - ) - - -@@ -353,7 +353,7 @@ def test_geometry_area_perimeter__polygo - Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) - ), - (-49187690467.58623, 1072185.2103813463), -- decimal=3, -+ decimal=2, - ) - - -@@ -374,7 +374,7 @@ def test_geometry_area_perimeter__polygo - radians=True, - ), - (-49187690467.58623, 1072185.2103813463), -- decimal=3, -+ decimal=2, - ) - - -@@ -389,6 +389,7 @@ def test_geometry_area_perimeter__polygo - ) - ), - (-944373881400.3394, 3979008.0359657984), -+ decimal=2, - ) - - -@@ -399,7 +400,7 @@ def test_geometry_area_perimeter__multip - assert_almost_equal( - geod.geometry_area_perimeter(MultiPolygon([polygon, polygon])), - (-98375380935.17245, 2144370.4207626926), -- decimal=3, -+ decimal=2, - ) - - -@@ -418,7 +419,7 @@ def test_geometry_area_perimeter__multip - assert_almost_equal( - geod.geometry_area_perimeter(MultiPolygon([polygon, polygon]), radians=True), - (-98375380935.17245, 2144370.4207626926), -- decimal=3, -+ decimal=2, - ) - - -@@ -429,7 +430,7 @@ def test_geometry_area_perimeter__multil - assert_almost_equal( - 
geod.geometry_area_perimeter(MultiLineString([line_string, line_string])), - (-98375380935.17245, 2144370.4207626926), -- decimal=3, -+ decimal=2, - ) - - ===================================== debian/patches/0001-use-mock-for-changing-os.environ-and-sys.prefix-in-d.patch deleted ===================================== @@ -1,197 +0,0 @@ -Description: use mock for changing os.environ and sys.prefix in data directory tests -Author: "Alan D. Snow" -Origin: https://github.com/pyproj4/pyproj/commit/288c149f5306e8b62ee2652bfe0ac5efe555f681 -Bug: https://github.com/pyproj4/pyproj/pull/404 - ---- a/test/test_datadir.py -+++ b/test/test_datadir.py -@@ -27,16 +27,9 @@ def proj_env(): - """ - Ensure environment variable the same at the end of the test. - """ -- proj_lib = os.environ.get("PROJ_LIB") - try: - yield - finally: -- if proj_lib is not None: -- # add it back if it used to be there -- os.environ["PROJ_LIB"] = proj_lib -- else: -- # remove it if it wasn't there previously -- os.environ.pop("PROJ_LIB", None) - # make sure the data dir is cleared - set_data_dir(None) - -@@ -53,72 +46,101 @@ def temporary_directory(): - shutil.rmtree(temp_dir) - - -- at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") -+_INVALID_PATH = "/invalid/path/to/nowhere" -+ -+ -+def setup_os_mock(os_mock, abspath_return=_INVALID_PATH, proj_dir=None): -+ os_mock.path.abspath.return_value = abspath_return -+ os_mock.path.join = os.path.join -+ os_mock.path.dirname = os.path.dirname -+ os_mock.path.exists = os.path.exists -+ os_mock.pathsep = os.pathsep -+ if proj_dir is None: -+ os_mock.environ = {} -+ else: -+ os_mock.environ = {"PROJ_LIB": proj_dir} -+ -+ - def test_get_data_dir__missing(): - with proj_env(), pytest.raises(DataDirError), patch( -- "pyproj.datadir.os.path.abspath", return_value="INVALID" -- ), patch("pyproj.datadir.find_executable", return_value=None): -+ "pyproj.datadir.find_executable", return_value=None -+ ), patch("pyproj.datadir.os") as os_mock, patch("pyproj.datadir.sys") as sys_mock: -+ sys_mock.prefix = _INVALID_PATH -+ setup_os_mock(os_mock) - unset_data_dir() -- os.environ.pop("PROJ_LIB", None) - assert get_data_dir() is None - - - def test_get_data_dir__from_user(): -- with proj_env(), temporary_directory() as tmpdir, temporary_directory() as tmpdir_env: # noqa: E501 -+ with proj_env(), temporary_directory() as tmpdir, patch( -+ "pyproj.datadir.os" -+ ) as os_mock, patch( -+ "pyproj.datadir.sys" -+ ) as sys_mock, temporary_directory() as tmpdir_env: # noqa: E501 -+ setup_os_mock( -+ os_mock, -+ abspath_return=os.path.join(tmpdir, "randomfilename.py"), -+ proj_dir=tmpdir_env, -+ ) -+ sys_mock.prefix = tmpdir_env - create_projdb(tmpdir) -- os.environ["PROJ_LIB"] = tmpdir_env - create_projdb(tmpdir_env) - set_data_dir(tmpdir) - internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") - os.makedirs(internal_proj_dir) - create_projdb(internal_proj_dir) -- with patch("pyproj.datadir.os.path.abspath") as abspath_mock: -- abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") -- assert get_data_dir() == tmpdir -+ assert get_data_dir() == tmpdir - - - def test_get_data_dir__internal(): -- with proj_env(), temporary_directory() as tmpdir: -+ with proj_env(), temporary_directory() as tmpdir, patch( -+ "pyproj.datadir.os" -+ ) as os_mock, temporary_directory() as tmpdir_fake, patch( -+ "pyproj.datadir.sys" -+ ) as sys_mock: -+ setup_os_mock( -+ os_mock, -+ abspath_return=os.path.join(tmpdir, "randomfilename.py"), -+ proj_dir=tmpdir_fake, -+ ) 
-+ sys_mock.prefix = tmpdir_fake - unset_data_dir() -- os.environ["PROJ_LIB"] = tmpdir - create_projdb(tmpdir) -+ create_projdb(tmpdir_fake) - internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") - os.makedirs(internal_proj_dir) - create_projdb(internal_proj_dir) -- with patch("pyproj.datadir.os.path.abspath") as abspath_mock: -- abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") -- assert get_data_dir() == internal_proj_dir -+ assert get_data_dir() == internal_proj_dir - - -- at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") - def test_get_data_dir__from_env_var(): - with proj_env(), temporary_directory() as tmpdir, patch( -- "pyproj.datadir.os.path.abspath", return_value="INVALID" -- ): -+ "pyproj.datadir.os" -+ ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: -+ setup_os_mock(os_mock, proj_dir=tmpdir) -+ sys_mock.prefix = _INVALID_PATH - unset_data_dir() -- os.environ["PROJ_LIB"] = tmpdir - create_projdb(tmpdir) - assert get_data_dir() == tmpdir - - -- at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") - def test_get_data_dir__from_env_var__multiple(): - with proj_env(), temporary_directory() as tmpdir, patch( -- "pyproj.datadir.os.path.abspath", return_value="INVALID" -- ): -+ "pyproj.datadir.os" -+ ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: -+ setup_os_mock(os_mock, proj_dir=os.pathsep.join([tmpdir, tmpdir, tmpdir])) -+ sys_mock.prefix = _INVALID_PATH - unset_data_dir() -- os.environ["PROJ_LIB"] = os.pathsep.join([tmpdir, tmpdir, tmpdir]) - create_projdb(tmpdir) - assert get_data_dir() == os.pathsep.join([tmpdir, tmpdir, tmpdir]) - - -- at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") - def test_get_data_dir__from_prefix(): - with proj_env(), temporary_directory() as tmpdir, patch( -- "pyproj.datadir.os.path.abspath", return_value="INVALID" -- ), patch("pyproj.datadir.sys") as sys_mock: -+ "pyproj.datadir.os" -+ ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: -+ setup_os_mock(os_mock) - unset_data_dir() -- os.environ.pop("PROJ_LIB", None) - sys_mock.prefix = tmpdir - proj_dir = os.path.join(tmpdir, "share", "proj") - os.makedirs(proj_dir) -@@ -126,13 +148,15 @@ def test_get_data_dir__from_prefix(): - assert get_data_dir() == proj_dir - - -- at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") - def test_get_data_dir__from_path(): - with proj_env(), temporary_directory() as tmpdir, patch( -- "pyproj.datadir.os.path.abspath", return_value="INVALID" -- ), patch("pyproj.datadir.find_executable") as find_exe: -+ "pyproj.datadir.os" -+ ) as os_mock, patch("pyproj.datadir.sys") as sys_mock, patch( -+ "pyproj.datadir.find_executable" -+ ) as find_exe: -+ setup_os_mock(os_mock) -+ sys_mock.prefix = _INVALID_PATH - unset_data_dir() -- os.environ.pop("PROJ_LIB", None) - find_exe.return_value = os.path.join(tmpdir, "bin", "proj") - proj_dir = os.path.join(tmpdir, "share", "proj") - os.makedirs(proj_dir) -@@ -141,18 +165,18 @@ def test_get_data_dir__from_path(): - - - def test_append_data_dir__internal(): -- with proj_env(), temporary_directory() as tmpdir: -+ with proj_env(), temporary_directory() as tmpdir, patch( -+ "pyproj.datadir.os" -+ ) as os_mock: -+ setup_os_mock(os_mock, os.path.join(tmpdir, "randomfilename.py")) - unset_data_dir() -- os.environ["PROJ_LIB"] = tmpdir - create_projdb(tmpdir) - internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") - 
os.makedirs(internal_proj_dir) - create_projdb(internal_proj_dir) - extra_datadir = str(os.path.join(tmpdir, "extra_datumgrids")) -- with patch("pyproj.datadir.os.path.abspath") as abspath_mock: -- abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") -- append_data_dir(extra_datadir) -- assert get_data_dir() == os.pathsep.join([internal_proj_dir, extra_datadir]) -+ append_data_dir(extra_datadir) -+ assert get_data_dir() == os.pathsep.join([internal_proj_dir, extra_datadir]) - - - def test_creating_multiple_crs_without_file_limit(): ===================================== debian/patches/series deleted ===================================== @@ -1,2 +0,0 @@ -0001-use-mock-for-changing-os.environ-and-sys.prefix-in-d.patch -0001-reduce-precision-constrants-on-geodesic-tests-405.patch ===================================== pyproj/__init__.py ===================================== @@ -47,7 +47,7 @@ CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. """ -__version__ = "2.3.0" +__version__ = "2.3.1" __all__ = [ "Proj", "Geod", @@ -66,6 +66,7 @@ __all__ = [ ] import sys +from pyproj._datadir import PYPROJ_CONTEXT from pyproj._list import ( # noqa: F401 get_angular_units_map, get_ellps_map, @@ -80,6 +81,8 @@ from pyproj.geod import Geod, geodesic_version_str, pj_ellps # noqa: F401 from pyproj.proj import Proj, pj_list, proj_version_str # noqa: F401 from pyproj.transformer import Transformer, itransform, transform # noqa: F401 +PYPROJ_CONTEXT.set_search_paths() + def test(**kwargs): """run the examples in the docstrings using the doctest module""" ===================================== pyproj/_crs.pyx ===================================== @@ -91,7 +91,9 @@ cdef _to_wkt(PJ* projobj, version=WktVersion.WKT2_2018, pretty=False): PROJ_CONTEXT.context, projobj, wkt_out_type, - options_wkt) + options_wkt, + ) + CRSError.clear() return cstrdecode(proj_string) @@ -122,14 +124,15 @@ cdef _to_proj4(PJ* projobj, version): PROJ_CONTEXT.context, projobj, proj_out_type, - NULL) + NULL, + ) + CRSError.clear() return cstrdecode(proj_string) cdef PJ* _from_authority( auth_name, code, PJ_CATEGORY category, int use_proj_alternative_grid_names=False ): - CRSError.clear() b_auth_name = cstrencode(auth_name) cdef char *c_auth_name = b_auth_name b_code = cstrencode(str(code)) @@ -145,7 +148,6 @@ cdef PJ* _from_authority( cdef PJ* _from_string(proj_string, expected_types): - CRSError.clear() cdef PJ* base_pj = proj_create( PROJ_CONTEXT.context, cstrencode(proj_string) @@ -404,6 +406,7 @@ cdef class CoordinateSystem(Base): coord_system.name = _COORD_SYSTEM_TYPE_MAP[cs_type] except KeyError: raise CRSError("Not a coordinate system.") + CRSError.clear() return coord_system @property @@ -458,19 +461,18 @@ cdef class Ellipsoid(Base): cdef Ellipsoid ellips = Ellipsoid() ellips.projobj = ellipsoid_pj cdef int is_semi_minor_computed = 0 - try: - proj_ellipsoid_get_parameters( - PROJ_CONTEXT.context, - ellips.projobj, - &ellips._semi_major_metre, - &ellips._semi_minor_metre, - &is_semi_minor_computed, - &ellips._inv_flattening) - ellips.ellipsoid_loaded = True - ellips.is_semi_minor_computed = is_semi_minor_computed == 1 - except Exception: - pass + proj_ellipsoid_get_parameters( + PROJ_CONTEXT.context, + ellips.projobj, + &ellips._semi_major_metre, + &ellips._semi_minor_metre, + &is_semi_minor_computed, + &ellips._inv_flattening, + ) 
+ ellips.ellipsoid_loaded = True + ellips.is_semi_minor_computed = is_semi_minor_computed == 1 ellips._set_name() + CRSError.clear() return ellips @staticmethod @@ -498,6 +500,7 @@ cdef class Ellipsoid(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return Ellipsoid.create(ellipsoid_pj) @staticmethod @@ -546,7 +549,7 @@ cdef class Ellipsoid(Base): pystrdecode(ellipsoid_string) ) ) - + CRSError.clear() return Ellipsoid.create(ellipsoid_pj) @property @@ -620,6 +623,7 @@ cdef class PrimeMeridian(Base): ) prime_meridian.unit_name = decode_or_undefined(unit_name) prime_meridian._set_name() + CRSError.clear() return prime_meridian @staticmethod @@ -647,6 +651,7 @@ cdef class PrimeMeridian(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return PrimeMeridian.create(prime_meridian_pj) @staticmethod @@ -665,7 +670,6 @@ cdef class PrimeMeridian(Base): """ return PrimeMeridian.from_authority("EPSG", code) - @staticmethod def from_string(prime_meridian_string): """ @@ -696,7 +700,7 @@ cdef class PrimeMeridian(Base): pystrdecode(prime_meridian_string) ) ) - + CRSError.clear() return PrimeMeridian.create(prime_meridian_pj) @@ -746,6 +750,7 @@ cdef class Datum(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return Datum.create(datum_pj) @staticmethod @@ -801,7 +806,7 @@ cdef class Datum(Base): pystrdecode(datum_string) ) ) - + CRSError.clear() return Datum.create(datum_pj) @property @@ -817,6 +822,7 @@ cdef class Datum(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if ellipsoid_pj == NULL: self._ellipsoid = False return None @@ -836,6 +842,7 @@ cdef class Datum(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if prime_meridian_pj == NULL: self._prime_meridian = False return None @@ -999,6 +1006,7 @@ cdef class Grid: grid.direct_download = direct_download == 1 grid.open_license = open_license == 1 grid.available = available == 1 + CRSError.clear() return grid def __str__(self): @@ -1085,7 +1093,7 @@ cdef class CoordinateOperation(Base): PROJ_CONTEXT.context, coord_operation.projobj ) == 1 - + CRSError.clear() return coord_operation @staticmethod @@ -1116,6 +1124,7 @@ cdef class CoordinateOperation(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return CoordinateOperation.create(coord_operation_pj) @staticmethod @@ -1170,7 +1179,7 @@ cdef class CoordinateOperation(Base): pystrdecode(coordinate_operation_string) ) ) - + CRSError.clear() return CoordinateOperation.create(coord_operation_pj) @property @@ -1195,6 +1204,7 @@ cdef class CoordinateOperation(Base): param_idx ) ) + CRSError.clear() return self._params @property @@ -1219,6 +1229,7 @@ cdef class CoordinateOperation(Base): grid_idx ) ) + CRSError.clear() return self._grids @property @@ -1331,6 +1342,7 @@ cdef class _CRS(Base): self._type = proj_get_type(self.projobj) self.type_name = _CRS_TYPE_MAP[self._type] self._set_name() + CRSError.clear() @property def axis_info(self): @@ -1366,6 +1378,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj ) + CRSError.clear() if ellipsoid_pj == NULL: self._ellipsoid = False return None @@ -1385,6 +1398,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if prime_meridian_pj == NULL: self._prime_meridian = False return None @@ -1406,6 +1420,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj, ) + 
CRSError.clear() if datum_pj == NULL: self._datum = False return None @@ -1426,6 +1441,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj ) + CRSError.clear() if coord_system_pj == NULL: self._coordinate_system = False return None @@ -1447,6 +1463,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj ) + CRSError.clear() if coord_pj == NULL: self._coordinate_operation = False return None @@ -1465,6 +1482,7 @@ cdef class _CRS(Base): return None if self._source_crs is False else self._source_crs cdef PJ * projobj projobj = proj_get_source_crs(PROJ_CONTEXT.context, self.projobj) + CRSError.clear() if projobj == NULL: self._source_crs = False return None @@ -1485,6 +1503,7 @@ cdef class _CRS(Base): return None if self._target_crs is False else self._target_crs cdef PJ * projobj projobj = proj_get_target_crs(PROJ_CONTEXT.context, self.projobj) + CRSError.clear() if projobj == NULL: self._target_crs = False return None @@ -1519,7 +1538,7 @@ cdef class _CRS(Base): proj_destroy(projobj) # deallocate temp proj iii += 1 projobj = proj_crs_get_sub_crs(PROJ_CONTEXT.context, self.projobj, iii) - + CRSError.clear() return self._sub_crs_list @property @@ -1533,6 +1552,7 @@ cdef class _CRS(Base): return self._geodetic_crs if self. _geodetic_crs is not False else None cdef PJ * projobj projobj = proj_crs_get_geodetic_crs(PROJ_CONTEXT.context, self.projobj) + CRSError.clear() if projobj == NULL: self._geodetic_crs = False return None @@ -1604,7 +1624,7 @@ cdef class _CRS(Base): if auth_info is not None and auth_info[0].upper() == "EPSG": return int(auth_info[1]) return None - + def to_authority(self, auth_name=None, min_confidence=70): """ Return the authority name and code best matching the CRS @@ -1667,6 +1687,7 @@ cdef class _CRS(Base): finally: if out_confidence_list != NULL: proj_int_list_destroy(out_confidence_list) + CRSError.clear() # check to make sure that the projection found is valid if proj_list == NULL or num_proj_objects <= 0 or out_confidence < min_confidence: @@ -1680,6 +1701,7 @@ cdef class _CRS(Base): proj = proj_list_get(PROJ_CONTEXT.context, proj_list, 0) finally: proj_list_destroy(proj_list) + CRSError.clear() if proj == NULL: return None @@ -1693,6 +1715,7 @@ cdef class _CRS(Base): return pystrdecode(out_auth_name), pystrdecode(code) finally: proj_destroy(proj) + CRSError.clear() return None @@ -1728,7 +1751,6 @@ cdef class _CRS(Base): is_property = self._type in property_types return is_property - @property def is_geographic(self): """ ===================================== pyproj/_datadir.pxd ===================================== @@ -4,3 +4,4 @@ cdef ContextManager PROJ_CONTEXT cdef class ContextManager: cdef PJ_CONTEXT *context + cdef object _set_search_paths \ No newline at end of file ===================================== pyproj/_datadir.pyx ===================================== @@ -13,7 +13,6 @@ cdef void pyproj_log_function(void *user_data, int level, const char *error_msg) if level == PJ_LOG_ERROR: ProjError.internal_proj_error = pystrdecode(error_msg) - cdef class ContextManager: def __cinit__(self): self.context = NULL @@ -25,24 +24,28 @@ cdef class ContextManager: def __init__(self): self.context = proj_context_create() - self.set_search_paths() + self._set_search_paths = False proj_context_use_proj4_init_rules(self.context, 1) proj_log_func(self.context, NULL, pyproj_log_function) - def set_search_paths(self): + def set_search_paths(self, reset=False): """ This method sets the search paths based on pyproj.datadir.get_data_dir() """ + if 
self._set_search_paths and not reset: + return data_dir_list = get_data_dir().split(os.pathsep) cdef char **c_data_dir = malloc(len(data_dir_list) * sizeof(char*)) try: for iii in range(len(data_dir_list)): b_data_dir = cstrencode(data_dir_list[iii]) c_data_dir[iii] = b_data_dir + proj_context_set_search_paths(NULL, len(data_dir_list), c_data_dir) proj_context_set_search_paths(self.context, len(data_dir_list), c_data_dir) finally: free(c_data_dir) + self._set_search_paths = True cdef ContextManager PROJ_CONTEXT = ContextManager() ===================================== pyproj/_proj.pxd ===================================== @@ -1,6 +1,6 @@ include "proj.pxi" cdef class Proj: - cdef PJ * projpj - cdef PJ_PROJ_INFO projpj_info + cdef PJ * projobj + cdef PJ_PROJ_INFO projobj_info cdef readonly srs ===================================== pyproj/_proj.pyx ===================================== @@ -18,30 +18,31 @@ proj_version_str = "{0}.{1}.{2}".format( cdef class Proj: def __cinit__(self): - self.projpj = NULL + self.projobj = NULL def __init__(self, const char *projstring): self.srs = pystrdecode(projstring) # initialize projection - self.projpj = proj_create(PROJ_CONTEXT.context, projstring) - if self.projpj is NULL: + self.projobj = proj_create(PROJ_CONTEXT.context, projstring) + if self.projobj is NULL: raise ProjError("Invalid projection {}.".format(projstring)) - self.projpj_info = proj_pj_info(self.projpj) + self.projobj_info = proj_pj_info(self.projobj) + ProjError.clear() def __dealloc__(self): """destroy projection definition""" - if self.projpj is not NULL: - proj_destroy(self.projpj) - self.projpj = NULL + if self.projobj is not NULL: + proj_destroy(self.projobj) + self.projobj = NULL @property def definition(self): - return self.projpj_info.definition + return self.projobj_info.definition @property def has_inverse(self): """Returns true if this projection has an inverse""" - return self.projpj_info.has_inverse == 1 + return self.projobj_info.has_inverse == 1 def __reduce__(self): """special method that allows pyproj.Proj instance to be pickled""" @@ -63,18 +64,19 @@ cdef class Proj: cdef double *latsdata cdef void *londata cdef void *latdata - cdef int err + cdef int errno # if buffer api is supported, get pointer to data buffers. if PyObject_AsWriteBuffer(lons, &londata, &buflenx) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") if PyObject_AsWriteBuffer(lats, &latdata, &bufleny) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") # process data in buffer if buflenx != bufleny: raise ProjError("Buffer lengths not the same") ndim = buflenx//_DOUBLESIZE lonsdata = londata latsdata = latdata + proj_errno_reset(self.projobj) for iii in range(ndim): # if inputs are nan's, return big number. 
if lonsdata[iii] != lonsdata[iii] or latsdata[iii] != latsdata[iii]: @@ -82,17 +84,19 @@ cdef class Proj: if errcheck: raise ProjError("projection_undefined") continue - if proj_angular_input(self.projpj, PJ_FWD): + if proj_angular_input(self.projobj, PJ_FWD): projlonlatin.uv.u = _DG2RAD * lonsdata[iii] projlonlatin.uv.v = _DG2RAD * latsdata[iii] else: projlonlatin.uv.u = lonsdata[iii] projlonlatin.uv.v = latsdata[iii] - projxyout = proj_trans(self.projpj, PJ_FWD, projlonlatin) - if errcheck: - err = proj_errno(self.projpj) - if err != 0: - raise ProjError(pystrdecode(proj_errno_string(err))) + projxyout = proj_trans(self.projobj, PJ_FWD, projlonlatin) + errno = proj_errno(self.projobj) + if errcheck and errno: + raise ProjError("proj error: {}".format( + pystrdecode(proj_errno_string(errno)))) + elif errcheck and ProjError.internal_proj_error is not None: + raise ProjError("proj error") # since HUGE_VAL can be 'inf', # change it to a real (but very large) number. # also check for NaNs. @@ -104,12 +108,13 @@ cdef class Proj: raise ProjError("projection_undefined") lonsdata[iii] = 1.e30 latsdata[iii] = 1.e30 - elif proj_angular_output(self.projpj, PJ_FWD): + elif proj_angular_output(self.projobj, PJ_FWD): lonsdata[iii] = _RAD2DG * projxyout.xy.x latsdata[iii] = _RAD2DG * projxyout.xy.y else: lonsdata[iii] = projxyout.xy.x latsdata[iii] = projxyout.xy.y + ProjError.clear() @cython.boundscheck(False) @@ -131,13 +136,13 @@ cdef class Proj: cdef void *ydata cdef double *xdatab cdef double *ydatab - cdef int err + cdef int errno # if buffer api is supported, get pointer to data buffers. if PyObject_AsWriteBuffer(x, &xdata, &buflenx) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") if PyObject_AsWriteBuffer(y, &ydata, &bufleny) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") # process data in buffer # (for numpy/regular python arrays). if buflenx != bufleny: @@ -145,6 +150,9 @@ cdef class Proj: ndim = buflenx//_DOUBLESIZE xdatab = xdata ydatab = ydata + # reset errors potentially left over + proj_errno_reset(self.projobj) + for iii in range(ndim): # if inputs are nan's, return big number. if xdatab[iii] != xdatab[iii] or ydatab[iii] != ydatab[iii]: @@ -152,17 +160,19 @@ cdef class Proj: if errcheck: raise ProjError("projection_undefined") continue - if proj_angular_input(self.projpj, PJ_INV): + if proj_angular_input(self.projobj, PJ_INV): projxyin.uv.u = _DG2RAD * xdatab[iii] projxyin.uv.v = _DG2RAD * ydatab[iii] else: projxyin.uv.u = xdatab[iii] projxyin.uv.v = ydatab[iii] - projlonlatout = proj_trans(self.projpj, PJ_INV, projxyin) - if errcheck: - err = proj_errno(self.projpj) - if err != 0: - raise ProjError(pystrdecode(proj_errno_string(err))) + projlonlatout = proj_trans(self.projobj, PJ_INV, projxyin) + errno = proj_errno(self.projobj) + if errcheck and errno: + raise ProjError("proj error: {}".format( + pystrdecode(proj_errno_string(errno)))) + elif errcheck and ProjError.internal_proj_error is not None: + raise ProjError("proj error") # since HUGE_VAL can be 'inf', # change it to a real (but very large) number. # also check for NaNs. 
@@ -174,23 +184,25 @@ cdef class Proj: raise ProjError("projection_undefined") xdatab[iii] = 1.e30 ydatab[iii] = 1.e30 - elif proj_angular_output(self.projpj, PJ_INV): + elif proj_angular_output(self.projobj, PJ_INV): xdatab[iii] = _RAD2DG * projlonlatout.uv.u ydatab[iii] = _RAD2DG * projlonlatout.uv.v else: xdatab[iii] = projlonlatout.uv.u ydatab[iii] = projlonlatout.uv.v + ProjError.clear() + def __repr__(self): return "Proj('{srs}', preserve_units=True)".format(srs=self.srs) def _is_exact_same(self, Proj other): return proj_is_equivalent_to( - self.projpj, other.projpj, PJ_COMP_STRICT) == 1 + self.projobj, other.projobj, PJ_COMP_STRICT) == 1 def _is_equivalent(self, Proj other): return proj_is_equivalent_to( - self.projpj, other.projpj, PJ_COMP_EQUIVALENT) == 1 + self.projobj, other.projobj, PJ_COMP_EQUIVALENT) == 1 def __eq__(self, other): if not isinstance(other, Proj): ===================================== pyproj/_transformer.pyx ===================================== @@ -67,7 +67,6 @@ def transformer_list_from_crs( cdef int is_instantiable = 0 cdef CoordinateOperation coordinate_operation cdef double west_lon_degree, south_lat_degree, east_lon_degree, north_lat_degree - operations = [] try: operation_factory_context = proj_create_operation_factory_context( @@ -142,6 +141,8 @@ def transformer_list_from_crs( if pj_operations != NULL: proj_list_destroy(pj_operations) pj_operations = NULL + ProjError.clear() + return operations @@ -183,10 +184,10 @@ cdef class _Transformer(Base): def _initialize_from_projobj(self): self.proj_info = proj_pj_info(self.projobj) if self.proj_info.id == NULL: - ProjError.clear() raise ProjError("Input is not a transformation.") cdef PJ_TYPE transformer_type = proj_get_type(self.projobj) self.type_name = _TRANSFORMER_TYPE_MAP[transformer_type] + ProjError.clear() @property def id(self): @@ -228,7 +229,6 @@ cdef class _Transformer(Base): always_xy=False, area_of_interest=None ): - ProjError.clear() cdef PJ_AREA *pj_area_of_interest = NULL cdef double west_lon_degree, south_lat_degree, east_lon_degree, north_lat_degree if area_of_interest is not None: @@ -299,7 +299,6 @@ cdef class _Transformer(Base): @staticmethod def from_pipeline(const char *proj_pipeline): - ProjError.clear() cdef _Transformer transformer = _Transformer() # initialize projection transformer.projobj = proj_create(PROJ_CONTEXT.context, proj_pipeline) @@ -329,17 +328,15 @@ cdef class _Transformer(Base): cdef Py_ssize_t buflenx, bufleny, buflenz, buflent, npts, iii cdef int err if PyObject_AsWriteBuffer(inx, &xdata, &buflenx) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") if PyObject_AsWriteBuffer(iny, &ydata, &bufleny) <> 0: - raise ProjError - if inz is not None: - if PyObject_AsWriteBuffer(inz, &zdata, &buflenz) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") + if inz is not None and PyObject_AsWriteBuffer(inz, &zdata, &buflenz) <> 0: + raise ProjError("object does not provide the python buffer writeable interface") else: buflenz = bufleny - if intime is not None: - if PyObject_AsWriteBuffer(intime, &tdata, &buflent) <> 0: - raise ProjError + if intime is not None and PyObject_AsWriteBuffer(intime, &tdata, &buflent) <> 0: + raise ProjError("object does not provide the python buffer writeable interface") else: buflent = bufleny @@ -371,7 +368,6 @@ cdef class _Transformer(Base): xx[iii] = xx[iii]*_RAD2DG yy[iii] = yy[iii]*_RAD2DG - ProjError.clear() 
proj_errno_reset(self.projobj) proj_trans_generic( self.projobj, @@ -401,6 +397,7 @@ cdef class _Transformer(Base): for iii in range(npts): xx[iii] = xx[iii]*_DG2RAD yy[iii] = yy[iii]*_DG2RAD + ProjError.clear() @cython.boundscheck(False) @cython.wraparound(False) @@ -421,7 +418,6 @@ cdef class _Transformer(Base): double *z double *tt Py_ssize_t buflen, npts, iii, jjj - int err if stride < 2: raise ProjError("coordinates must contain at least 2 values") @@ -467,7 +463,6 @@ cdef class _Transformer(Base): else: tt = NULL - ProjError.clear() proj_errno_reset(self.projobj) proj_trans_generic ( self.projobj, @@ -500,3 +495,5 @@ cdef class _Transformer(Base): jjj = stride * iii coords[jjj] *= _DG2RAD coords[jjj + 1] *= _DG2RAD + + ProjError.clear() ===================================== pyproj/datadir.py ===================================== @@ -24,7 +24,7 @@ def set_data_dir(proj_data_dir): # reset search paths from pyproj._datadir import PYPROJ_CONTEXT - PYPROJ_CONTEXT.set_search_paths() + PYPROJ_CONTEXT.set_search_paths(reset=True) def append_data_dir(proj_data_dir): ===================================== pyproj/geod.py ===================================== @@ -396,8 +396,8 @@ class Geod(_Geod): >>> lons = [-74, -102, -102, -131, -163, 163, 172, 140, 113, ... 88, 59, 25, -4, -14, -33, -46, -61] >>> poly_area, poly_perimeter = geod.polygon_area_perimeter(lons, lats) - >>> "{:.3f} {:.3f}".format(poly_area, poly_perimeter) - '13376856682207.406 14710425.407' + >>> "{:.1f} {:.1f}".format(poly_area, poly_perimeter) + '13376856682207.4 14710425.4' Parameters ===================================== test/test_datadir.py ===================================== @@ -1,13 +1,13 @@ import os import shutil import tempfile -import unittest from contextlib import contextmanager import pytest from mock import patch from pyproj import CRS +from pyproj._datadir import ContextManager from pyproj.datadir import DataDirError, append_data_dir, get_data_dir, set_data_dir @@ -27,16 +27,9 @@ def proj_env(): """ Ensure environment variable the same at the end of the test. 
""" - proj_lib = os.environ.get("PROJ_LIB") try: yield finally: - if proj_lib is not None: - # add it back if it used to be there - os.environ["PROJ_LIB"] = proj_lib - else: - # remove it if it wasn't there previously - os.environ.pop("PROJ_LIB", None) # make sure the data dir is cleared set_data_dir(None) @@ -53,72 +46,108 @@ def temporary_directory(): shutil.rmtree(temp_dir) - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") +_INVALID_PATH = "/invalid/path/to/nowhere" + + +def setup_os_mock(os_mock, abspath_return=_INVALID_PATH, proj_dir=None): + os_mock.path.abspath.return_value = abspath_return + os_mock.path.join = os.path.join + os_mock.path.dirname = os.path.dirname + os_mock.path.exists = os.path.exists + os_mock.pathsep = os.pathsep + if proj_dir is None: + os_mock.environ = {} + else: + os_mock.environ = {"PROJ_LIB": proj_dir} + + def test_get_data_dir__missing(): with proj_env(), pytest.raises(DataDirError), patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ), patch("pyproj.datadir.find_executable", return_value=None): + "pyproj.datadir.find_executable", return_value=None + ), patch("pyproj.datadir.os") as os_mock, patch("pyproj.datadir.sys") as sys_mock: + sys_mock.prefix = _INVALID_PATH + setup_os_mock(os_mock) unset_data_dir() - os.environ.pop("PROJ_LIB", None) assert get_data_dir() is None +def test_condext_manager_datadir_missing(): + with proj_env(), pytest.raises(DataDirError), patch( + "pyproj._datadir.get_data_dir", side_effect=DataDirError("test") + ): + ContextManager().set_search_paths() + + def test_get_data_dir__from_user(): - with proj_env(), temporary_directory() as tmpdir, temporary_directory() as tmpdir_env: # noqa: E501 + with proj_env(), temporary_directory() as tmpdir, patch( + "pyproj.datadir.os" + ) as os_mock, patch( + "pyproj.datadir.sys" + ) as sys_mock, temporary_directory() as tmpdir_env: # noqa: E501 + setup_os_mock( + os_mock, + abspath_return=os.path.join(tmpdir, "randomfilename.py"), + proj_dir=tmpdir_env, + ) + sys_mock.prefix = tmpdir_env create_projdb(tmpdir) - os.environ["PROJ_LIB"] = tmpdir_env create_projdb(tmpdir_env) set_data_dir(tmpdir) internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") os.makedirs(internal_proj_dir) create_projdb(internal_proj_dir) - with patch("pyproj.datadir.os.path.abspath") as abspath_mock: - abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") - assert get_data_dir() == tmpdir + assert get_data_dir() == tmpdir def test_get_data_dir__internal(): - with proj_env(), temporary_directory() as tmpdir: + with proj_env(), temporary_directory() as tmpdir, patch( + "pyproj.datadir.os" + ) as os_mock, temporary_directory() as tmpdir_fake, patch( + "pyproj.datadir.sys" + ) as sys_mock: + setup_os_mock( + os_mock, + abspath_return=os.path.join(tmpdir, "randomfilename.py"), + proj_dir=tmpdir_fake, + ) + sys_mock.prefix = tmpdir_fake unset_data_dir() - os.environ["PROJ_LIB"] = tmpdir create_projdb(tmpdir) + create_projdb(tmpdir_fake) internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") os.makedirs(internal_proj_dir) create_projdb(internal_proj_dir) - with patch("pyproj.datadir.os.path.abspath") as abspath_mock: - abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") - assert get_data_dir() == internal_proj_dir + assert get_data_dir() == internal_proj_dir - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_env_var(): with proj_env(), 
temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ): + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: + setup_os_mock(os_mock, proj_dir=tmpdir) + sys_mock.prefix = _INVALID_PATH unset_data_dir() - os.environ["PROJ_LIB"] = tmpdir create_projdb(tmpdir) assert get_data_dir() == tmpdir - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_env_var__multiple(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ): + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: + setup_os_mock(os_mock, proj_dir=os.pathsep.join([tmpdir, tmpdir, tmpdir])) + sys_mock.prefix = _INVALID_PATH unset_data_dir() - os.environ["PROJ_LIB"] = os.pathsep.join([tmpdir, tmpdir, tmpdir]) create_projdb(tmpdir) assert get_data_dir() == os.pathsep.join([tmpdir, tmpdir, tmpdir]) - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_prefix(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ), patch("pyproj.datadir.sys") as sys_mock: + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: + setup_os_mock(os_mock) unset_data_dir() - os.environ.pop("PROJ_LIB", None) sys_mock.prefix = tmpdir proj_dir = os.path.join(tmpdir, "share", "proj") os.makedirs(proj_dir) @@ -126,13 +155,15 @@ def test_get_data_dir__from_prefix(): assert get_data_dir() == proj_dir - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_path(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ), patch("pyproj.datadir.find_executable") as find_exe: + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock, patch( + "pyproj.datadir.find_executable" + ) as find_exe: + setup_os_mock(os_mock) + sys_mock.prefix = _INVALID_PATH unset_data_dir() - os.environ.pop("PROJ_LIB", None) find_exe.return_value = os.path.join(tmpdir, "bin", "proj") proj_dir = os.path.join(tmpdir, "share", "proj") os.makedirs(proj_dir) @@ -141,18 +172,18 @@ def test_get_data_dir__from_path(): def test_append_data_dir__internal(): - with proj_env(), temporary_directory() as tmpdir: + with proj_env(), temporary_directory() as tmpdir, patch( + "pyproj.datadir.os" + ) as os_mock: + setup_os_mock(os_mock, os.path.join(tmpdir, "randomfilename.py")) unset_data_dir() - os.environ["PROJ_LIB"] = tmpdir create_projdb(tmpdir) internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") os.makedirs(internal_proj_dir) create_projdb(internal_proj_dir) extra_datadir = str(os.path.join(tmpdir, "extra_datumgrids")) - with patch("pyproj.datadir.os.path.abspath") as abspath_mock: - abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") - append_data_dir(extra_datadir) - assert get_data_dir() == os.pathsep.join([internal_proj_dir, extra_datadir]) + append_data_dir(extra_datadir) + assert get_data_dir() == os.pathsep.join([internal_proj_dir, extra_datadir]) def test_creating_multiple_crs_without_file_limit(): ===================================== test/test_doctest_wrapper.py ===================================== @@ -3,6 +3,7 @@ This is a wrapper for the doctests in lib/pyproj/__init__.py so that pytest can conveniently run all the tests in a single 
command line. """ import os +import platform import pyproj @@ -15,7 +16,7 @@ def test_doctests(): try: import shapely # noqa except ImportError: - if os.name == "nt": + if os.name == "nt" or platform.uname()[4] != "x86_64": expected_failure_count = 6 # if the below line fails, doctests have failed ===================================== test/test_geod.py ===================================== @@ -1,6 +1,7 @@ import math import os import pickle +import platform import shutil import tempfile from contextlib import contextmanager @@ -26,8 +27,9 @@ except ImportError: SHAPELY_LOADED = False -skip_shapely_windows = pytest.mark.skipif( - not SHAPELY_LOADED and os.name == "nt", reason="Missing shapely wheels for Windows." +skip_shapely = pytest.mark.skipif( + not SHAPELY_LOADED and (os.name == "nt" or platform.uname()[4] != "x86_64"), + reason="Missing shapely wheels for Windows.", ) @@ -171,23 +173,23 @@ def test_polygon_area_perimeter__single_point(): assert perimeter == 0 - at skip_shapely_windows + at skip_shapely def test_geometry_length__point(): geod = Geod(ellps="WGS84") assert geod.geometry_length(Point(1, 2)) == 0 - at skip_shapely_windows + at skip_shapely def test_geometry_length__linestring(): geod = Geod(ellps="WGS84") assert_almost_equal( geod.geometry_length(LineString([Point(1, 2), Point(3, 4)])), 313588.39721259556, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__linestring__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -201,11 +203,11 @@ def test_geometry_length__linestring__radians(): radians=True, ), 313588.39721259556, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__linearring(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -213,11 +215,11 @@ def test_geometry_length__linearring(): LinearRing(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__polygon(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -225,11 +227,11 @@ def test_geometry_length__polygon(): Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__polygon__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -246,22 +248,22 @@ def test_geometry_length__polygon__radians(): radians=True, ), 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multipolygon(): geod = Geod(ellps="WGS84") polygon = Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) assert_almost_equal( geod.geometry_length(MultiPolygon([polygon, polygon])), 2 * 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multipolygon__radians(): geod = Geod(ellps="WGS84") polygon = Polygon( @@ -276,22 +278,22 @@ def test_geometry_length__multipolygon__radians(): assert_almost_equal( geod.geometry_length(MultiPolygon([polygon, polygon]), radians=True), 2 * 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multilinestring(): geod = Geod(ellps="WGS84") line_string = LineString([Point(1, 2), Point(3, 4), Point(5, 2)]) assert_almost_equal( geod.geometry_length(MultiLineString([line_string, line_string])), 1254353.5888503822, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at 
skip_shapely def test_geometry_length__multipoint(): geod = Geod(ellps="WGS84") assert ( @@ -299,23 +301,23 @@ def test_geometry_length__multipoint(): ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__point(): geod = Geod(ellps="WGS84") assert geod.geometry_area_perimeter(Point(1, 2)) == (0, 0) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__linestring(): geod = Geod(ellps="WGS84") assert_almost_equal( geod.geometry_area_perimeter(LineString([Point(1, 2), Point(3, 4)])), (0.0, 627176.7944251911), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__linestring__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -329,11 +331,11 @@ def test_geometry_area_perimeter__linestring__radians(): radians=True, ), (0.0, 627176.7944251911), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__linearring(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -341,11 +343,11 @@ def test_geometry_area_perimeter__linearring(): LinearRing(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), (-49187690467.58623, 1072185.2103813463), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__polygon(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -353,11 +355,11 @@ def test_geometry_area_perimeter__polygon(): Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), (-49187690467.58623, 1072185.2103813463), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__polygon__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -374,11 +376,11 @@ def test_geometry_area_perimeter__polygon__radians(): radians=True, ), (-49187690467.58623, 1072185.2103813463), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__polygon__holes(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -389,21 +391,22 @@ def test_geometry_area_perimeter__polygon__holes(): ) ), (-944373881400.3394, 3979008.0359657984), + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multipolygon(): geod = Geod(ellps="WGS84") polygon = Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) assert_almost_equal( geod.geometry_area_perimeter(MultiPolygon([polygon, polygon])), (-98375380935.17245, 2144370.4207626926), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multipolygon__radians(): geod = Geod(ellps="WGS84") polygon = Polygon( @@ -418,22 +421,22 @@ def test_geometry_area_perimeter__multipolygon__radians(): assert_almost_equal( geod.geometry_area_perimeter(MultiPolygon([polygon, polygon]), radians=True), (-98375380935.17245, 2144370.4207626926), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multilinestring(): geod = Geod(ellps="WGS84") line_string = LineString([Point(1, 2), Point(3, 4), Point(5, 2)]) assert_almost_equal( geod.geometry_area_perimeter(MultiLineString([line_string, line_string])), (-98375380935.17245, 2144370.4207626926), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multipoint(): geod = Geod(ellps="WGS84") assert geod.geometry_area_perimeter( ===================================== test/test_proj.py ===================================== @@ -403,5 +403,13 @@ def 
test_is_exact_same_different_type(): assert not Proj("epsg:4326").is_exact_same(None) +def test_reset_errno(): + proj = Proj( + {"proj": "laea", "lat_0": -90, "lon_0": 0, "a": 6371228.0, "units": "m"} + ) + assert not proj.crs.is_geographic + assert proj(0, 0, inverse=True, errcheck=True) == (0.0, -90.0) + + if __name__ == "__main__": unittest.main() ===================================== test/test_transformer.py ===================================== @@ -1,3 +1,5 @@ +from pkg_resources import parse_version + import numpy as np import pytest from numpy.testing import assert_almost_equal @@ -93,7 +95,9 @@ def test_equivalent_proj(): def test_equivalent_proj__disabled(): transformer = Transformer.from_proj(3857, pyproj.Proj(3857).crs.to_proj4()) assert not transformer._transformer.skip_equivalent - assert not transformer._transformer.projections_equivalent + assert transformer._transformer.projections_equivalent == ( + parse_version(pyproj.proj_version_str) >= parse_version("6.2.0") + ) assert not transformer._transformer.projections_exact_same View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/compare/0e71301c5ee75eb472caa300d4f5cac5fc667787...c704cbcdfbc95a8564aefa616a088027a636088c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/compare/0e71301c5ee75eb472caa300d4f5cac5fc667787...c704cbcdfbc95a8564aefa616a088027a636088c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 06:35:03 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 05:35:03 +0000 Subject: [Git][debian-gis-team/python-pyproj][upstream] New upstream version 2.3.1+ds Message-ID: <5d6b5887a3528_577b2ade5d84e91c1925c2@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pyproj Commits: ffd25ee3 by Bas Couwenberg at 2019-09-01T05:23:54Z New upstream version 2.3.1+ds - - - - - 14 changed files: - pyproj/__init__.py - pyproj/_crs.pyx - pyproj/_datadir.pxd - pyproj/_datadir.pyx - pyproj/_proj.pxd - pyproj/_proj.pyx - pyproj/_transformer.pyx - pyproj/datadir.py - pyproj/geod.py - test/test_datadir.py - test/test_doctest_wrapper.py - test/test_geod.py - test/test_proj.py - test/test_transformer.py Changes: ===================================== pyproj/__init__.py ===================================== @@ -47,7 +47,7 @@ CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
""" -__version__ = "2.3.0" +__version__ = "2.3.1" __all__ = [ "Proj", "Geod", @@ -66,6 +66,7 @@ __all__ = [ ] import sys +from pyproj._datadir import PYPROJ_CONTEXT from pyproj._list import ( # noqa: F401 get_angular_units_map, get_ellps_map, @@ -80,6 +81,8 @@ from pyproj.geod import Geod, geodesic_version_str, pj_ellps # noqa: F401 from pyproj.proj import Proj, pj_list, proj_version_str # noqa: F401 from pyproj.transformer import Transformer, itransform, transform # noqa: F401 +PYPROJ_CONTEXT.set_search_paths() + def test(**kwargs): """run the examples in the docstrings using the doctest module""" ===================================== pyproj/_crs.pyx ===================================== @@ -91,7 +91,9 @@ cdef _to_wkt(PJ* projobj, version=WktVersion.WKT2_2018, pretty=False): PROJ_CONTEXT.context, projobj, wkt_out_type, - options_wkt) + options_wkt, + ) + CRSError.clear() return cstrdecode(proj_string) @@ -122,14 +124,15 @@ cdef _to_proj4(PJ* projobj, version): PROJ_CONTEXT.context, projobj, proj_out_type, - NULL) + NULL, + ) + CRSError.clear() return cstrdecode(proj_string) cdef PJ* _from_authority( auth_name, code, PJ_CATEGORY category, int use_proj_alternative_grid_names=False ): - CRSError.clear() b_auth_name = cstrencode(auth_name) cdef char *c_auth_name = b_auth_name b_code = cstrencode(str(code)) @@ -145,7 +148,6 @@ cdef PJ* _from_authority( cdef PJ* _from_string(proj_string, expected_types): - CRSError.clear() cdef PJ* base_pj = proj_create( PROJ_CONTEXT.context, cstrencode(proj_string) @@ -404,6 +406,7 @@ cdef class CoordinateSystem(Base): coord_system.name = _COORD_SYSTEM_TYPE_MAP[cs_type] except KeyError: raise CRSError("Not a coordinate system.") + CRSError.clear() return coord_system @property @@ -458,19 +461,18 @@ cdef class Ellipsoid(Base): cdef Ellipsoid ellips = Ellipsoid() ellips.projobj = ellipsoid_pj cdef int is_semi_minor_computed = 0 - try: - proj_ellipsoid_get_parameters( - PROJ_CONTEXT.context, - ellips.projobj, - &ellips._semi_major_metre, - &ellips._semi_minor_metre, - &is_semi_minor_computed, - &ellips._inv_flattening) - ellips.ellipsoid_loaded = True - ellips.is_semi_minor_computed = is_semi_minor_computed == 1 - except Exception: - pass + proj_ellipsoid_get_parameters( + PROJ_CONTEXT.context, + ellips.projobj, + &ellips._semi_major_metre, + &ellips._semi_minor_metre, + &is_semi_minor_computed, + &ellips._inv_flattening, + ) + ellips.ellipsoid_loaded = True + ellips.is_semi_minor_computed = is_semi_minor_computed == 1 ellips._set_name() + CRSError.clear() return ellips @staticmethod @@ -498,6 +500,7 @@ cdef class Ellipsoid(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return Ellipsoid.create(ellipsoid_pj) @staticmethod @@ -546,7 +549,7 @@ cdef class Ellipsoid(Base): pystrdecode(ellipsoid_string) ) ) - + CRSError.clear() return Ellipsoid.create(ellipsoid_pj) @property @@ -620,6 +623,7 @@ cdef class PrimeMeridian(Base): ) prime_meridian.unit_name = decode_or_undefined(unit_name) prime_meridian._set_name() + CRSError.clear() return prime_meridian @staticmethod @@ -647,6 +651,7 @@ cdef class PrimeMeridian(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return PrimeMeridian.create(prime_meridian_pj) @staticmethod @@ -665,7 +670,6 @@ cdef class PrimeMeridian(Base): """ return PrimeMeridian.from_authority("EPSG", code) - @staticmethod def from_string(prime_meridian_string): """ @@ -696,7 +700,7 @@ cdef class PrimeMeridian(Base): 
pystrdecode(prime_meridian_string) ) ) - + CRSError.clear() return PrimeMeridian.create(prime_meridian_pj) @@ -746,6 +750,7 @@ cdef class Datum(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return Datum.create(datum_pj) @staticmethod @@ -801,7 +806,7 @@ cdef class Datum(Base): pystrdecode(datum_string) ) ) - + CRSError.clear() return Datum.create(datum_pj) @property @@ -817,6 +822,7 @@ cdef class Datum(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if ellipsoid_pj == NULL: self._ellipsoid = False return None @@ -836,6 +842,7 @@ cdef class Datum(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if prime_meridian_pj == NULL: self._prime_meridian = False return None @@ -999,6 +1006,7 @@ cdef class Grid: grid.direct_download = direct_download == 1 grid.open_license = open_license == 1 grid.available = available == 1 + CRSError.clear() return grid def __str__(self): @@ -1085,7 +1093,7 @@ cdef class CoordinateOperation(Base): PROJ_CONTEXT.context, coord_operation.projobj ) == 1 - + CRSError.clear() return coord_operation @staticmethod @@ -1116,6 +1124,7 @@ cdef class CoordinateOperation(Base): raise CRSError( "Invalid authority or code ({0}, {1})".format(auth_name, code) ) + CRSError.clear() return CoordinateOperation.create(coord_operation_pj) @staticmethod @@ -1170,7 +1179,7 @@ cdef class CoordinateOperation(Base): pystrdecode(coordinate_operation_string) ) ) - + CRSError.clear() return CoordinateOperation.create(coord_operation_pj) @property @@ -1195,6 +1204,7 @@ cdef class CoordinateOperation(Base): param_idx ) ) + CRSError.clear() return self._params @property @@ -1219,6 +1229,7 @@ cdef class CoordinateOperation(Base): grid_idx ) ) + CRSError.clear() return self._grids @property @@ -1331,6 +1342,7 @@ cdef class _CRS(Base): self._type = proj_get_type(self.projobj) self.type_name = _CRS_TYPE_MAP[self._type] self._set_name() + CRSError.clear() @property def axis_info(self): @@ -1366,6 +1378,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj ) + CRSError.clear() if ellipsoid_pj == NULL: self._ellipsoid = False return None @@ -1385,6 +1398,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if prime_meridian_pj == NULL: self._prime_meridian = False return None @@ -1406,6 +1420,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj, ) + CRSError.clear() if datum_pj == NULL: self._datum = False return None @@ -1426,6 +1441,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj ) + CRSError.clear() if coord_system_pj == NULL: self._coordinate_system = False return None @@ -1447,6 +1463,7 @@ cdef class _CRS(Base): PROJ_CONTEXT.context, self.projobj ) + CRSError.clear() if coord_pj == NULL: self._coordinate_operation = False return None @@ -1465,6 +1482,7 @@ cdef class _CRS(Base): return None if self._source_crs is False else self._source_crs cdef PJ * projobj projobj = proj_get_source_crs(PROJ_CONTEXT.context, self.projobj) + CRSError.clear() if projobj == NULL: self._source_crs = False return None @@ -1485,6 +1503,7 @@ cdef class _CRS(Base): return None if self._target_crs is False else self._target_crs cdef PJ * projobj projobj = proj_get_target_crs(PROJ_CONTEXT.context, self.projobj) + CRSError.clear() if projobj == NULL: self._target_crs = False return None @@ -1519,7 +1538,7 @@ cdef class _CRS(Base): proj_destroy(projobj) # deallocate temp proj iii += 1 projobj = proj_crs_get_sub_crs(PROJ_CONTEXT.context, self.projobj, iii) - + CRSError.clear() 
return self._sub_crs_list @property @@ -1533,6 +1552,7 @@ cdef class _CRS(Base): return self._geodetic_crs if self. _geodetic_crs is not False else None cdef PJ * projobj projobj = proj_crs_get_geodetic_crs(PROJ_CONTEXT.context, self.projobj) + CRSError.clear() if projobj == NULL: self._geodetic_crs = False return None @@ -1604,7 +1624,7 @@ cdef class _CRS(Base): if auth_info is not None and auth_info[0].upper() == "EPSG": return int(auth_info[1]) return None - + def to_authority(self, auth_name=None, min_confidence=70): """ Return the authority name and code best matching the CRS @@ -1667,6 +1687,7 @@ cdef class _CRS(Base): finally: if out_confidence_list != NULL: proj_int_list_destroy(out_confidence_list) + CRSError.clear() # check to make sure that the projection found is valid if proj_list == NULL or num_proj_objects <= 0 or out_confidence < min_confidence: @@ -1680,6 +1701,7 @@ cdef class _CRS(Base): proj = proj_list_get(PROJ_CONTEXT.context, proj_list, 0) finally: proj_list_destroy(proj_list) + CRSError.clear() if proj == NULL: return None @@ -1693,6 +1715,7 @@ cdef class _CRS(Base): return pystrdecode(out_auth_name), pystrdecode(code) finally: proj_destroy(proj) + CRSError.clear() return None @@ -1728,7 +1751,6 @@ cdef class _CRS(Base): is_property = self._type in property_types return is_property - @property def is_geographic(self): """ ===================================== pyproj/_datadir.pxd ===================================== @@ -4,3 +4,4 @@ cdef ContextManager PROJ_CONTEXT cdef class ContextManager: cdef PJ_CONTEXT *context + cdef object _set_search_paths \ No newline at end of file ===================================== pyproj/_datadir.pyx ===================================== @@ -13,7 +13,6 @@ cdef void pyproj_log_function(void *user_data, int level, const char *error_msg) if level == PJ_LOG_ERROR: ProjError.internal_proj_error = pystrdecode(error_msg) - cdef class ContextManager: def __cinit__(self): self.context = NULL @@ -25,24 +24,28 @@ cdef class ContextManager: def __init__(self): self.context = proj_context_create() - self.set_search_paths() + self._set_search_paths = False proj_context_use_proj4_init_rules(self.context, 1) proj_log_func(self.context, NULL, pyproj_log_function) - def set_search_paths(self): + def set_search_paths(self, reset=False): """ This method sets the search paths based on pyproj.datadir.get_data_dir() """ + if self._set_search_paths and not reset: + return data_dir_list = get_data_dir().split(os.pathsep) cdef char **c_data_dir = malloc(len(data_dir_list) * sizeof(char*)) try: for iii in range(len(data_dir_list)): b_data_dir = cstrencode(data_dir_list[iii]) c_data_dir[iii] = b_data_dir + proj_context_set_search_paths(NULL, len(data_dir_list), c_data_dir) proj_context_set_search_paths(self.context, len(data_dir_list), c_data_dir) finally: free(c_data_dir) + self._set_search_paths = True cdef ContextManager PROJ_CONTEXT = ContextManager() ===================================== pyproj/_proj.pxd ===================================== @@ -1,6 +1,6 @@ include "proj.pxi" cdef class Proj: - cdef PJ * projpj - cdef PJ_PROJ_INFO projpj_info + cdef PJ * projobj + cdef PJ_PROJ_INFO projobj_info cdef readonly srs ===================================== pyproj/_proj.pyx ===================================== @@ -18,30 +18,31 @@ proj_version_str = "{0}.{1}.{2}".format( cdef class Proj: def __cinit__(self): - self.projpj = NULL + self.projobj = NULL def __init__(self, const char *projstring): self.srs = pystrdecode(projstring) # initialize projection - self.projpj = 
proj_create(PROJ_CONTEXT.context, projstring) - if self.projpj is NULL: + self.projobj = proj_create(PROJ_CONTEXT.context, projstring) + if self.projobj is NULL: raise ProjError("Invalid projection {}.".format(projstring)) - self.projpj_info = proj_pj_info(self.projpj) + self.projobj_info = proj_pj_info(self.projobj) + ProjError.clear() def __dealloc__(self): """destroy projection definition""" - if self.projpj is not NULL: - proj_destroy(self.projpj) - self.projpj = NULL + if self.projobj is not NULL: + proj_destroy(self.projobj) + self.projobj = NULL @property def definition(self): - return self.projpj_info.definition + return self.projobj_info.definition @property def has_inverse(self): """Returns true if this projection has an inverse""" - return self.projpj_info.has_inverse == 1 + return self.projobj_info.has_inverse == 1 def __reduce__(self): """special method that allows pyproj.Proj instance to be pickled""" @@ -63,18 +64,19 @@ cdef class Proj: cdef double *latsdata cdef void *londata cdef void *latdata - cdef int err + cdef int errno # if buffer api is supported, get pointer to data buffers. if PyObject_AsWriteBuffer(lons, &londata, &buflenx) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") if PyObject_AsWriteBuffer(lats, &latdata, &bufleny) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") # process data in buffer if buflenx != bufleny: raise ProjError("Buffer lengths not the same") ndim = buflenx//_DOUBLESIZE lonsdata = londata latsdata = latdata + proj_errno_reset(self.projobj) for iii in range(ndim): # if inputs are nan's, return big number. if lonsdata[iii] != lonsdata[iii] or latsdata[iii] != latsdata[iii]: @@ -82,17 +84,19 @@ cdef class Proj: if errcheck: raise ProjError("projection_undefined") continue - if proj_angular_input(self.projpj, PJ_FWD): + if proj_angular_input(self.projobj, PJ_FWD): projlonlatin.uv.u = _DG2RAD * lonsdata[iii] projlonlatin.uv.v = _DG2RAD * latsdata[iii] else: projlonlatin.uv.u = lonsdata[iii] projlonlatin.uv.v = latsdata[iii] - projxyout = proj_trans(self.projpj, PJ_FWD, projlonlatin) - if errcheck: - err = proj_errno(self.projpj) - if err != 0: - raise ProjError(pystrdecode(proj_errno_string(err))) + projxyout = proj_trans(self.projobj, PJ_FWD, projlonlatin) + errno = proj_errno(self.projobj) + if errcheck and errno: + raise ProjError("proj error: {}".format( + pystrdecode(proj_errno_string(errno)))) + elif errcheck and ProjError.internal_proj_error is not None: + raise ProjError("proj error") # since HUGE_VAL can be 'inf', # change it to a real (but very large) number. # also check for NaNs. @@ -104,12 +108,13 @@ cdef class Proj: raise ProjError("projection_undefined") lonsdata[iii] = 1.e30 latsdata[iii] = 1.e30 - elif proj_angular_output(self.projpj, PJ_FWD): + elif proj_angular_output(self.projobj, PJ_FWD): lonsdata[iii] = _RAD2DG * projxyout.xy.x latsdata[iii] = _RAD2DG * projxyout.xy.y else: lonsdata[iii] = projxyout.xy.x latsdata[iii] = projxyout.xy.y + ProjError.clear() @cython.boundscheck(False) @@ -131,13 +136,13 @@ cdef class Proj: cdef void *ydata cdef double *xdatab cdef double *ydatab - cdef int err + cdef int errno # if buffer api is supported, get pointer to data buffers. 
if PyObject_AsWriteBuffer(x, &xdata, &buflenx) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") if PyObject_AsWriteBuffer(y, &ydata, &bufleny) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") # process data in buffer # (for numpy/regular python arrays). if buflenx != bufleny: @@ -145,6 +150,9 @@ cdef class Proj: ndim = buflenx//_DOUBLESIZE xdatab = xdata ydatab = ydata + # reset errors potentially left over + proj_errno_reset(self.projobj) + for iii in range(ndim): # if inputs are nan's, return big number. if xdatab[iii] != xdatab[iii] or ydatab[iii] != ydatab[iii]: @@ -152,17 +160,19 @@ cdef class Proj: if errcheck: raise ProjError("projection_undefined") continue - if proj_angular_input(self.projpj, PJ_INV): + if proj_angular_input(self.projobj, PJ_INV): projxyin.uv.u = _DG2RAD * xdatab[iii] projxyin.uv.v = _DG2RAD * ydatab[iii] else: projxyin.uv.u = xdatab[iii] projxyin.uv.v = ydatab[iii] - projlonlatout = proj_trans(self.projpj, PJ_INV, projxyin) - if errcheck: - err = proj_errno(self.projpj) - if err != 0: - raise ProjError(pystrdecode(proj_errno_string(err))) + projlonlatout = proj_trans(self.projobj, PJ_INV, projxyin) + errno = proj_errno(self.projobj) + if errcheck and errno: + raise ProjError("proj error: {}".format( + pystrdecode(proj_errno_string(errno)))) + elif errcheck and ProjError.internal_proj_error is not None: + raise ProjError("proj error") # since HUGE_VAL can be 'inf', # change it to a real (but very large) number. # also check for NaNs. @@ -174,23 +184,25 @@ cdef class Proj: raise ProjError("projection_undefined") xdatab[iii] = 1.e30 ydatab[iii] = 1.e30 - elif proj_angular_output(self.projpj, PJ_INV): + elif proj_angular_output(self.projobj, PJ_INV): xdatab[iii] = _RAD2DG * projlonlatout.uv.u ydatab[iii] = _RAD2DG * projlonlatout.uv.v else: xdatab[iii] = projlonlatout.uv.u ydatab[iii] = projlonlatout.uv.v + ProjError.clear() + def __repr__(self): return "Proj('{srs}', preserve_units=True)".format(srs=self.srs) def _is_exact_same(self, Proj other): return proj_is_equivalent_to( - self.projpj, other.projpj, PJ_COMP_STRICT) == 1 + self.projobj, other.projobj, PJ_COMP_STRICT) == 1 def _is_equivalent(self, Proj other): return proj_is_equivalent_to( - self.projpj, other.projpj, PJ_COMP_EQUIVALENT) == 1 + self.projobj, other.projobj, PJ_COMP_EQUIVALENT) == 1 def __eq__(self, other): if not isinstance(other, Proj): ===================================== pyproj/_transformer.pyx ===================================== @@ -67,7 +67,6 @@ def transformer_list_from_crs( cdef int is_instantiable = 0 cdef CoordinateOperation coordinate_operation cdef double west_lon_degree, south_lat_degree, east_lon_degree, north_lat_degree - operations = [] try: operation_factory_context = proj_create_operation_factory_context( @@ -142,6 +141,8 @@ def transformer_list_from_crs( if pj_operations != NULL: proj_list_destroy(pj_operations) pj_operations = NULL + ProjError.clear() + return operations @@ -183,10 +184,10 @@ cdef class _Transformer(Base): def _initialize_from_projobj(self): self.proj_info = proj_pj_info(self.projobj) if self.proj_info.id == NULL: - ProjError.clear() raise ProjError("Input is not a transformation.") cdef PJ_TYPE transformer_type = proj_get_type(self.projobj) self.type_name = _TRANSFORMER_TYPE_MAP[transformer_type] + ProjError.clear() @property def id(self): @@ -228,7 +229,6 @@ cdef class _Transformer(Base): always_xy=False, area_of_interest=None ): - 
ProjError.clear() cdef PJ_AREA *pj_area_of_interest = NULL cdef double west_lon_degree, south_lat_degree, east_lon_degree, north_lat_degree if area_of_interest is not None: @@ -299,7 +299,6 @@ cdef class _Transformer(Base): @staticmethod def from_pipeline(const char *proj_pipeline): - ProjError.clear() cdef _Transformer transformer = _Transformer() # initialize projection transformer.projobj = proj_create(PROJ_CONTEXT.context, proj_pipeline) @@ -329,17 +328,15 @@ cdef class _Transformer(Base): cdef Py_ssize_t buflenx, bufleny, buflenz, buflent, npts, iii cdef int err if PyObject_AsWriteBuffer(inx, &xdata, &buflenx) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") if PyObject_AsWriteBuffer(iny, &ydata, &bufleny) <> 0: - raise ProjError - if inz is not None: - if PyObject_AsWriteBuffer(inz, &zdata, &buflenz) <> 0: - raise ProjError + raise ProjError("object does not provide the python buffer writeable interface") + if inz is not None and PyObject_AsWriteBuffer(inz, &zdata, &buflenz) <> 0: + raise ProjError("object does not provide the python buffer writeable interface") else: buflenz = bufleny - if intime is not None: - if PyObject_AsWriteBuffer(intime, &tdata, &buflent) <> 0: - raise ProjError + if intime is not None and PyObject_AsWriteBuffer(intime, &tdata, &buflent) <> 0: + raise ProjError("object does not provide the python buffer writeable interface") else: buflent = bufleny @@ -371,7 +368,6 @@ cdef class _Transformer(Base): xx[iii] = xx[iii]*_RAD2DG yy[iii] = yy[iii]*_RAD2DG - ProjError.clear() proj_errno_reset(self.projobj) proj_trans_generic( self.projobj, @@ -401,6 +397,7 @@ cdef class _Transformer(Base): for iii in range(npts): xx[iii] = xx[iii]*_DG2RAD yy[iii] = yy[iii]*_DG2RAD + ProjError.clear() @cython.boundscheck(False) @cython.wraparound(False) @@ -421,7 +418,6 @@ cdef class _Transformer(Base): double *z double *tt Py_ssize_t buflen, npts, iii, jjj - int err if stride < 2: raise ProjError("coordinates must contain at least 2 values") @@ -467,7 +463,6 @@ cdef class _Transformer(Base): else: tt = NULL - ProjError.clear() proj_errno_reset(self.projobj) proj_trans_generic ( self.projobj, @@ -500,3 +495,5 @@ cdef class _Transformer(Base): jjj = stride * iii coords[jjj] *= _DG2RAD coords[jjj + 1] *= _DG2RAD + + ProjError.clear() ===================================== pyproj/datadir.py ===================================== @@ -24,7 +24,7 @@ def set_data_dir(proj_data_dir): # reset search paths from pyproj._datadir import PYPROJ_CONTEXT - PYPROJ_CONTEXT.set_search_paths() + PYPROJ_CONTEXT.set_search_paths(reset=True) def append_data_dir(proj_data_dir): ===================================== pyproj/geod.py ===================================== @@ -396,8 +396,8 @@ class Geod(_Geod): >>> lons = [-74, -102, -102, -131, -163, 163, 172, 140, 113, ... 
88, 59, 25, -4, -14, -33, -46, -61] >>> poly_area, poly_perimeter = geod.polygon_area_perimeter(lons, lats) - >>> "{:.3f} {:.3f}".format(poly_area, poly_perimeter) - '13376856682207.406 14710425.407' + >>> "{:.1f} {:.1f}".format(poly_area, poly_perimeter) + '13376856682207.4 14710425.4' Parameters ===================================== test/test_datadir.py ===================================== @@ -1,13 +1,13 @@ import os import shutil import tempfile -import unittest from contextlib import contextmanager import pytest from mock import patch from pyproj import CRS +from pyproj._datadir import ContextManager from pyproj.datadir import DataDirError, append_data_dir, get_data_dir, set_data_dir @@ -27,16 +27,9 @@ def proj_env(): """ Ensure environment variable the same at the end of the test. """ - proj_lib = os.environ.get("PROJ_LIB") try: yield finally: - if proj_lib is not None: - # add it back if it used to be there - os.environ["PROJ_LIB"] = proj_lib - else: - # remove it if it wasn't there previously - os.environ.pop("PROJ_LIB", None) # make sure the data dir is cleared set_data_dir(None) @@ -53,72 +46,108 @@ def temporary_directory(): shutil.rmtree(temp_dir) - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") +_INVALID_PATH = "/invalid/path/to/nowhere" + + +def setup_os_mock(os_mock, abspath_return=_INVALID_PATH, proj_dir=None): + os_mock.path.abspath.return_value = abspath_return + os_mock.path.join = os.path.join + os_mock.path.dirname = os.path.dirname + os_mock.path.exists = os.path.exists + os_mock.pathsep = os.pathsep + if proj_dir is None: + os_mock.environ = {} + else: + os_mock.environ = {"PROJ_LIB": proj_dir} + + def test_get_data_dir__missing(): with proj_env(), pytest.raises(DataDirError), patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ), patch("pyproj.datadir.find_executable", return_value=None): + "pyproj.datadir.find_executable", return_value=None + ), patch("pyproj.datadir.os") as os_mock, patch("pyproj.datadir.sys") as sys_mock: + sys_mock.prefix = _INVALID_PATH + setup_os_mock(os_mock) unset_data_dir() - os.environ.pop("PROJ_LIB", None) assert get_data_dir() is None +def test_condext_manager_datadir_missing(): + with proj_env(), pytest.raises(DataDirError), patch( + "pyproj._datadir.get_data_dir", side_effect=DataDirError("test") + ): + ContextManager().set_search_paths() + + def test_get_data_dir__from_user(): - with proj_env(), temporary_directory() as tmpdir, temporary_directory() as tmpdir_env: # noqa: E501 + with proj_env(), temporary_directory() as tmpdir, patch( + "pyproj.datadir.os" + ) as os_mock, patch( + "pyproj.datadir.sys" + ) as sys_mock, temporary_directory() as tmpdir_env: # noqa: E501 + setup_os_mock( + os_mock, + abspath_return=os.path.join(tmpdir, "randomfilename.py"), + proj_dir=tmpdir_env, + ) + sys_mock.prefix = tmpdir_env create_projdb(tmpdir) - os.environ["PROJ_LIB"] = tmpdir_env create_projdb(tmpdir_env) set_data_dir(tmpdir) internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") os.makedirs(internal_proj_dir) create_projdb(internal_proj_dir) - with patch("pyproj.datadir.os.path.abspath") as abspath_mock: - abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") - assert get_data_dir() == tmpdir + assert get_data_dir() == tmpdir def test_get_data_dir__internal(): - with proj_env(), temporary_directory() as tmpdir: + with proj_env(), temporary_directory() as tmpdir, patch( + "pyproj.datadir.os" + ) as os_mock, temporary_directory() as tmpdir_fake, patch( + 
"pyproj.datadir.sys" + ) as sys_mock: + setup_os_mock( + os_mock, + abspath_return=os.path.join(tmpdir, "randomfilename.py"), + proj_dir=tmpdir_fake, + ) + sys_mock.prefix = tmpdir_fake unset_data_dir() - os.environ["PROJ_LIB"] = tmpdir create_projdb(tmpdir) + create_projdb(tmpdir_fake) internal_proj_dir = os.path.join(tmpdir, "proj_dir", "share", "proj") os.makedirs(internal_proj_dir) create_projdb(internal_proj_dir) - with patch("pyproj.datadir.os.path.abspath") as abspath_mock: - abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") - assert get_data_dir() == internal_proj_dir + assert get_data_dir() == internal_proj_dir - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_env_var(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ): + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: + setup_os_mock(os_mock, proj_dir=tmpdir) + sys_mock.prefix = _INVALID_PATH unset_data_dir() - os.environ["PROJ_LIB"] = tmpdir create_projdb(tmpdir) assert get_data_dir() == tmpdir - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_env_var__multiple(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ): + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: + setup_os_mock(os_mock, proj_dir=os.pathsep.join([tmpdir, tmpdir, tmpdir])) + sys_mock.prefix = _INVALID_PATH unset_data_dir() - os.environ["PROJ_LIB"] = os.pathsep.join([tmpdir, tmpdir, tmpdir]) create_projdb(tmpdir) assert get_data_dir() == os.pathsep.join([tmpdir, tmpdir, tmpdir]) - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_prefix(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ), patch("pyproj.datadir.sys") as sys_mock: + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock: + setup_os_mock(os_mock) unset_data_dir() - os.environ.pop("PROJ_LIB", None) sys_mock.prefix = tmpdir proj_dir = os.path.join(tmpdir, "share", "proj") os.makedirs(proj_dir) @@ -126,13 +155,15 @@ def test_get_data_dir__from_prefix(): assert get_data_dir() == proj_dir - at unittest.skipIf(os.name == "nt", reason="Cannot modify Windows environment variables.") def test_get_data_dir__from_path(): with proj_env(), temporary_directory() as tmpdir, patch( - "pyproj.datadir.os.path.abspath", return_value="INVALID" - ), patch("pyproj.datadir.find_executable") as find_exe: + "pyproj.datadir.os" + ) as os_mock, patch("pyproj.datadir.sys") as sys_mock, patch( + "pyproj.datadir.find_executable" + ) as find_exe: + setup_os_mock(os_mock) + sys_mock.prefix = _INVALID_PATH unset_data_dir() - os.environ.pop("PROJ_LIB", None) find_exe.return_value = os.path.join(tmpdir, "bin", "proj") proj_dir = os.path.join(tmpdir, "share", "proj") os.makedirs(proj_dir) @@ -141,18 +172,18 @@ def test_get_data_dir__from_path(): def test_append_data_dir__internal(): - with proj_env(), temporary_directory() as tmpdir: + with proj_env(), temporary_directory() as tmpdir, patch( + "pyproj.datadir.os" + ) as os_mock: + setup_os_mock(os_mock, os.path.join(tmpdir, "randomfilename.py")) unset_data_dir() - os.environ["PROJ_LIB"] = tmpdir create_projdb(tmpdir) internal_proj_dir = os.path.join(tmpdir, "proj_dir", 
"share", "proj") os.makedirs(internal_proj_dir) create_projdb(internal_proj_dir) extra_datadir = str(os.path.join(tmpdir, "extra_datumgrids")) - with patch("pyproj.datadir.os.path.abspath") as abspath_mock: - abspath_mock.return_value = os.path.join(tmpdir, "randomfilename.py") - append_data_dir(extra_datadir) - assert get_data_dir() == os.pathsep.join([internal_proj_dir, extra_datadir]) + append_data_dir(extra_datadir) + assert get_data_dir() == os.pathsep.join([internal_proj_dir, extra_datadir]) def test_creating_multiple_crs_without_file_limit(): ===================================== test/test_doctest_wrapper.py ===================================== @@ -3,6 +3,7 @@ This is a wrapper for the doctests in lib/pyproj/__init__.py so that pytest can conveniently run all the tests in a single command line. """ import os +import platform import pyproj @@ -15,7 +16,7 @@ def test_doctests(): try: import shapely # noqa except ImportError: - if os.name == "nt": + if os.name == "nt" or platform.uname()[4] != "x86_64": expected_failure_count = 6 # if the below line fails, doctests have failed ===================================== test/test_geod.py ===================================== @@ -1,6 +1,7 @@ import math import os import pickle +import platform import shutil import tempfile from contextlib import contextmanager @@ -26,8 +27,9 @@ except ImportError: SHAPELY_LOADED = False -skip_shapely_windows = pytest.mark.skipif( - not SHAPELY_LOADED and os.name == "nt", reason="Missing shapely wheels for Windows." +skip_shapely = pytest.mark.skipif( + not SHAPELY_LOADED and (os.name == "nt" or platform.uname()[4] != "x86_64"), + reason="Missing shapely wheels for Windows.", ) @@ -171,23 +173,23 @@ def test_polygon_area_perimeter__single_point(): assert perimeter == 0 - at skip_shapely_windows + at skip_shapely def test_geometry_length__point(): geod = Geod(ellps="WGS84") assert geod.geometry_length(Point(1, 2)) == 0 - at skip_shapely_windows + at skip_shapely def test_geometry_length__linestring(): geod = Geod(ellps="WGS84") assert_almost_equal( geod.geometry_length(LineString([Point(1, 2), Point(3, 4)])), 313588.39721259556, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__linestring__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -201,11 +203,11 @@ def test_geometry_length__linestring__radians(): radians=True, ), 313588.39721259556, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__linearring(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -213,11 +215,11 @@ def test_geometry_length__linearring(): LinearRing(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__polygon(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -225,11 +227,11 @@ def test_geometry_length__polygon(): Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__polygon__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -246,22 +248,22 @@ def test_geometry_length__polygon__radians(): radians=True, ), 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multipolygon(): geod = Geod(ellps="WGS84") polygon = Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) assert_almost_equal( 
geod.geometry_length(MultiPolygon([polygon, polygon])), 2 * 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multipolygon__radians(): geod = Geod(ellps="WGS84") polygon = Polygon( @@ -276,22 +278,22 @@ def test_geometry_length__multipolygon__radians(): assert_almost_equal( geod.geometry_length(MultiPolygon([polygon, polygon]), radians=True), 2 * 1072185.2103813463, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multilinestring(): geod = Geod(ellps="WGS84") line_string = LineString([Point(1, 2), Point(3, 4), Point(5, 2)]) assert_almost_equal( geod.geometry_length(MultiLineString([line_string, line_string])), 1254353.5888503822, - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_length__multipoint(): geod = Geod(ellps="WGS84") assert ( @@ -299,23 +301,23 @@ def test_geometry_length__multipoint(): ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__point(): geod = Geod(ellps="WGS84") assert geod.geometry_area_perimeter(Point(1, 2)) == (0, 0) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__linestring(): geod = Geod(ellps="WGS84") assert_almost_equal( geod.geometry_area_perimeter(LineString([Point(1, 2), Point(3, 4)])), (0.0, 627176.7944251911), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__linestring__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -329,11 +331,11 @@ def test_geometry_area_perimeter__linestring__radians(): radians=True, ), (0.0, 627176.7944251911), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__linearring(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -341,11 +343,11 @@ def test_geometry_area_perimeter__linearring(): LinearRing(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), (-49187690467.58623, 1072185.2103813463), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__polygon(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -353,11 +355,11 @@ def test_geometry_area_perimeter__polygon(): Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) ), (-49187690467.58623, 1072185.2103813463), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__polygon__radians(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -374,11 +376,11 @@ def test_geometry_area_perimeter__polygon__radians(): radians=True, ), (-49187690467.58623, 1072185.2103813463), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__polygon__holes(): geod = Geod(ellps="WGS84") assert_almost_equal( @@ -389,21 +391,22 @@ def test_geometry_area_perimeter__polygon__holes(): ) ), (-944373881400.3394, 3979008.0359657984), + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multipolygon(): geod = Geod(ellps="WGS84") polygon = Polygon(LineString([Point(1, 2), Point(3, 4), Point(5, 2)])) assert_almost_equal( geod.geometry_area_perimeter(MultiPolygon([polygon, polygon])), (-98375380935.17245, 2144370.4207626926), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multipolygon__radians(): geod = Geod(ellps="WGS84") polygon = Polygon( @@ -418,22 +421,22 @@ def test_geometry_area_perimeter__multipolygon__radians(): 
assert_almost_equal( geod.geometry_area_perimeter(MultiPolygon([polygon, polygon]), radians=True), (-98375380935.17245, 2144370.4207626926), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multilinestring(): geod = Geod(ellps="WGS84") line_string = LineString([Point(1, 2), Point(3, 4), Point(5, 2)]) assert_almost_equal( geod.geometry_area_perimeter(MultiLineString([line_string, line_string])), (-98375380935.17245, 2144370.4207626926), - decimal=3, + decimal=2, ) - at skip_shapely_windows + at skip_shapely def test_geometry_area_perimeter__multipoint(): geod = Geod(ellps="WGS84") assert geod.geometry_area_perimeter( ===================================== test/test_proj.py ===================================== @@ -403,5 +403,13 @@ def test_is_exact_same_different_type(): assert not Proj("epsg:4326").is_exact_same(None) +def test_reset_errno(): + proj = Proj( + {"proj": "laea", "lat_0": -90, "lon_0": 0, "a": 6371228.0, "units": "m"} + ) + assert not proj.crs.is_geographic + assert proj(0, 0, inverse=True, errcheck=True) == (0.0, -90.0) + + if __name__ == "__main__": unittest.main() ===================================== test/test_transformer.py ===================================== @@ -1,3 +1,5 @@ +from pkg_resources import parse_version + import numpy as np import pytest from numpy.testing import assert_almost_equal @@ -93,7 +95,9 @@ def test_equivalent_proj(): def test_equivalent_proj__disabled(): transformer = Transformer.from_proj(3857, pyproj.Proj(3857).crs.to_proj4()) assert not transformer._transformer.skip_equivalent - assert not transformer._transformer.projections_equivalent + assert transformer._transformer.projections_equivalent == ( + parse_version(pyproj.proj_version_str) >= parse_version("6.2.0") + ) assert not transformer._transformer.projections_exact_same View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/ffd25ee3d30ed1e39985b720c70e2f78a74fca8d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/ffd25ee3d30ed1e39985b720c70e2f78a74fca8d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 1 06:40:46 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 05:40:46 +0000 Subject: Processing of python-pyproj_2.3.1+ds-1~exp1_source.changes Message-ID: python-pyproj_2.3.1+ds-1~exp1_source.changes uploaded successfully to localhost along with the files: python-pyproj_2.3.1+ds-1~exp1.dsc python-pyproj_2.3.1+ds.orig.tar.xz python-pyproj_2.3.1+ds-1~exp1.debian.tar.xz python-pyproj_2.3.1+ds-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 1 06:51:02 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 05:51:02 +0000 Subject: python-pyproj_2.3.1+ds-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 01 Sep 2019 07:27:23 +0200 Source: python-pyproj Architecture: source Version: 2.3.1+ds-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pyproj (2.3.1+ds-1~exp1) experimental; urgency=medium . * New upstream release. * Drop patches. 
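For readers following the test_transformer.py change quoted in the commits above: the expected value of projections_equivalent now depends on the PROJ release pyproj is linked against. A minimal sketch of that version gate, using only pkg_resources.parse_version (the version string below is an assumed example value, not taken from this upload; at runtime pyproj exposes the linked release as pyproj.proj_version_str):

    from pkg_resources import parse_version

    # Assumed example value standing in for pyproj.proj_version_str.
    proj_version_str = "6.2.0"

    # Per the updated test, the two projection definitions are expected to be
    # reported as equivalent only with PROJ 6.2.0 or newer, so the expected
    # value of the assertion flips on older releases.
    expect_equivalent = parse_version(proj_version_str) >= parse_version("6.2.0")
    print(expect_equivalent)  # True for 6.2.0 and newer, False otherwise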
Checksums-Sha1: eaef7738f8c2211e5deb6b4220240fe3f93cdf85 2221 python-pyproj_2.3.1+ds-1~exp1.dsc 76c9eaf6abb46ac08c5328bf7e20094148638686 78052 python-pyproj_2.3.1+ds.orig.tar.xz 1e1508cb35761847c02657c90d45ef11ccb0ca03 6056 python-pyproj_2.3.1+ds-1~exp1.debian.tar.xz 53d5d700845706be5a2c89cdb2a6166c824ebc5d 8673 python-pyproj_2.3.1+ds-1~exp1_amd64.buildinfo Checksums-Sha256: aa8b640fa5f3fe3ad8a624977f22c610c9469d4c589cf7646a5fcd5041a12bee 2221 python-pyproj_2.3.1+ds-1~exp1.dsc 12ec5b3e820275d71643cc2cd746db397a4b8b6044adc2adb0e2a957265f751a 78052 python-pyproj_2.3.1+ds.orig.tar.xz 959f53f3a8976851c6510979d4573b1ad9f9928b73a621c7fafeae145f210ce6 6056 python-pyproj_2.3.1+ds-1~exp1.debian.tar.xz 915ea60a73d0ae7f2e568f615f73a85d43e407fc2b1f4c111f9ae37984ea64eb 8673 python-pyproj_2.3.1+ds-1~exp1_amd64.buildinfo Files: ccdd345acda3eb282fadd08bfd158ab1 2221 python optional python-pyproj_2.3.1+ds-1~exp1.dsc 81ae518c7b69e84cdfab6df7e3699a37 78052 python optional python-pyproj_2.3.1+ds.orig.tar.xz d16e892caeb61cdaaf1c7ff2d848278e 6056 python optional python-pyproj_2.3.1+ds-1~exp1.debian.tar.xz a967087cb8e60b193f1217be62a706e3 8673 python optional python-pyproj_2.3.1+ds-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1rWEQACgkQZ1DxCuiN SvHAYhAAipu3zHlY7HvUM7jxKv+KGPW6D7xR2hR9Kacz73LmDKk6DKAMvGjD1WGF dylcmZ52n6zoqQQqVbfUYdml8yNDZbw9u9T/twt5IxPqNCmM1ZIbSLZej7KYc88k cV8KBIsajyJZhCa/ffX1NOj+Uh2oasPFU+hD+pXLfpFojK0DtzEsW5GUOsXv1Fyh 4QY3mX4xoLBaDg635QpIYY2nztuLyeKY+ELlTc9wWIrIDK+Vw9fdMJ3WKesATACk 3J4P0eqOMNdyUlItyTg59yaKRJ8o2X4m3uP+fTVRkgHjUCiKBIGWcRQIs56xhWSM DuvmbE3Z8gepykuLLAP7gaE8vwr6HUeIZJbFysSRI8pueVcmqt0Rcxtt2YI/S6rg 1kdG7iTDtWWX7wK2vH8FruHKaJor8iHCBZyDk2tAQ2o+NYF9GJEHKoWf/YOyyf2l KZ+nDookixUMZ8u0KQmZh8KoQ+HB6ZJrFRxDsETyq2PoAxfM8OZlXrAfBJ6YCD0H pXf0ObE7QoVoF4vvbFSEI+21pVg7ngND7Lm33s64HLyUS1q8DGQJGl2uk+LBSOlD e7esYzyfRvQZ4pCNHW7Qjkq+JLrdpoD5CQs8MmVkVaUeVW42EWLheO4qzDeXAG6o QDZSZHyaxFWojw5dQaYXtcgDDJDwtEwo0kKfC1FP41JPei2iM2Q= =jePt -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sun Sep 1 07:35:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:35:09 +0000 Subject: [Git][debian-gis-team/pycsw][master] 7 commits: New upstream version 2.4.1+dfsg Message-ID: <5d6b669dc8290_577b2ade5d82550819576d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pycsw Commits: ed0a670a by Bas Couwenberg at 2019-09-01T05:35:51Z New upstream version 2.4.1+dfsg - - - - - 0ee00b29 by Bas Couwenberg at 2019-09-01T05:35:56Z Update upstream source from tag 'upstream/2.4.1+dfsg' Update to upstream version '2.4.1+dfsg' with Debian dir 9cef66a7c682ada7045f69eb6fbb0dca56eb3bdf - - - - - 15745189 by Bas Couwenberg at 2019-09-01T05:36:15Z New upstream release. - - - - - f229d81b by Bas Couwenberg at 2019-09-01T05:38:28Z Refresh patches. - - - - - 2f391a26 by Bas Couwenberg at 2019-09-01T06:03:21Z Drop removal of empty directories removed upstream. - - - - - 09ada130 by Bas Couwenberg at 2019-09-01T06:27:45Z Add patch to not require exact versions of dependencies. - - - - - b4045d03 by Bas Couwenberg at 2019-09-01T06:27:45Z Set distribution to unstable. 
- - - - - 30 changed files: - VERSION.txt - debian/changelog - debian/patches/0002-Remove-externally-linked-files.patch - debian/patches/series - + debian/patches/version-requirements.patch - debian/rules - pycsw/core/metadata.py - requirements-standalone.txt - requirements.txt - setup.py - tests/functionaltests/suites/apiso-inspire/default.cfg - tests/functionaltests/suites/apiso-inspire/expected/get_GetCapabilities-lang.xml - tests/functionaltests/suites/apiso-inspire/expected/get_GetCapabilities.xml - tests/functionaltests/suites/apiso/default.cfg - tests/functionaltests/suites/apiso/expected/post_DescribeRecord.xml - tests/functionaltests/suites/apiso/expected/post_GetCapabilities.xml - tests/functionaltests/suites/apiso/expected/post_GetRecordById-brief.xml - tests/functionaltests/suites/apiso/expected/post_GetRecordById-full.xml - tests/functionaltests/suites/apiso/expected/post_GetRecordById-srv-brief.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-all-csw-output.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-all.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-cql-title.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-elementname.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-and-nested-spatial-or-dateline.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-anytext.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-bbox-csw-output.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-bbox.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-servicetype.xml - tests/functionaltests/suites/atom/default.cfg - tests/functionaltests/suites/atom/expected/get_opensearch-description.xml The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/compare/c3e756c6d887d230bef314c396f0e07e5c5d5fb9...b4045d0393e4d1c25ba6f4a7d58f37a84e53659e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/compare/c3e756c6d887d230bef314c396f0e07e5c5d5fb9...b4045d0393e4d1c25ba6f4a7d58f37a84e53659e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 1 07:35:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:35:11 +0000 Subject: [Git][debian-gis-team/pycsw][pristine-tar] pristine-tar data for pycsw_2.4.1+dfsg.orig.tar.xz Message-ID: <5d6b669faee7_577b2ade5d825508195938@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / pycsw Commits: d94be770 by Bas Couwenberg at 2019-09-01T05:35:55Z pristine-tar data for pycsw_2.4.1+dfsg.orig.tar.xz - - - - - 2 changed files: - + pycsw_2.4.1+dfsg.orig.tar.xz.delta - + pycsw_2.4.1+dfsg.orig.tar.xz.id Changes: ===================================== pycsw_2.4.1+dfsg.orig.tar.xz.delta ===================================== Binary files /dev/null and b/pycsw_2.4.1+dfsg.orig.tar.xz.delta differ ===================================== pycsw_2.4.1+dfsg.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +6c47488e87b85606687ef285504562ce01d37bcd View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/commit/d94be770669bd63fc8a94f7f557dd7c266e5d610 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/commit/d94be770669bd63fc8a94f7f557dd7c266e5d610 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 07:35:12 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:35:12 +0000 Subject: [Git][debian-gis-team/pycsw][upstream] New upstream version 2.4.1+dfsg Message-ID: <5d6b66a02a854_577b2ade5d84e91c196130@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / pycsw Commits: ed0a670a by Bas Couwenberg at 2019-09-01T05:35:51Z New upstream version 2.4.1+dfsg - - - - - 30 changed files: - VERSION.txt - pycsw/core/metadata.py - requirements-standalone.txt - requirements.txt - setup.py - tests/functionaltests/suites/apiso-inspire/default.cfg - tests/functionaltests/suites/apiso-inspire/expected/get_GetCapabilities-lang.xml - tests/functionaltests/suites/apiso-inspire/expected/get_GetCapabilities.xml - tests/functionaltests/suites/apiso/default.cfg - tests/functionaltests/suites/apiso/expected/post_DescribeRecord.xml - tests/functionaltests/suites/apiso/expected/post_GetCapabilities.xml - tests/functionaltests/suites/apiso/expected/post_GetRecordById-brief.xml - tests/functionaltests/suites/apiso/expected/post_GetRecordById-full.xml - tests/functionaltests/suites/apiso/expected/post_GetRecordById-srv-brief.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-all-csw-output.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-all.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-cql-title.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-elementname.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-and-nested-spatial-or-dateline.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-anytext.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-bbox-csw-output.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-bbox.xml - tests/functionaltests/suites/apiso/expected/post_GetRecords-filter-servicetype.xml - tests/functionaltests/suites/atom/default.cfg - tests/functionaltests/suites/atom/expected/get_opensearch-description.xml - tests/functionaltests/suites/atom/expected/get_opensearch-ogc-bbox-and-time.xml - 
tests/functionaltests/suites/atom/expected/get_opensearch-ogc-bbox.xml - tests/functionaltests/suites/atom/expected/get_opensearch-ogc-count-and-page1.xml - tests/functionaltests/suites/atom/expected/get_opensearch-ogc-count-and-page2.xml - tests/functionaltests/suites/atom/expected/get_opensearch-ogc-q-and-bbox.xml The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/commit/ed0a670af7b6411314553253f936ddc6d28c3651 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/commit/ed0a670af7b6411314553253f936ddc6d28c3651 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 07:35:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:35:14 +0000 Subject: [Git][debian-gis-team/pycsw] Pushed new tag debian/2.4.1+dfsg-1 Message-ID: <5d6b66a2bf659_577b2ade5d7673c8196348@godard.mail> Bas Couwenberg pushed new tag debian/2.4.1+dfsg-1 at Debian GIS Project / pycsw -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/tree/debian/2.4.1+dfsg-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 07:35:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:35:15 +0000 Subject: [Git][debian-gis-team/pycsw] Pushed new tag upstream/2.4.1+dfsg Message-ID: <5d6b66a391dd4_577b2ade5d84e91c19658@godard.mail> Bas Couwenberg pushed new tag upstream/2.4.1+dfsg at Debian GIS Project / pycsw -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/tree/upstream/2.4.1+dfsg You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 1 07:41:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 06:41:03 +0000 Subject: Processing of pycsw_2.4.1+dfsg-1_amd64.changes Message-ID: pycsw_2.4.1+dfsg-1_amd64.changes uploaded successfully to localhost along with the files: pycsw_2.4.1+dfsg-1.dsc pycsw_2.4.1+dfsg.orig.tar.xz pycsw_2.4.1+dfsg-1.debian.tar.xz pycsw-doc_2.4.1+dfsg-1_all.deb pycsw-wsgi_2.4.1+dfsg-1_all.deb pycsw_2.4.1+dfsg-1_all.deb pycsw_2.4.1+dfsg-1_amd64.buildinfo python3-pycsw_2.4.1+dfsg-1_all.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 1 07:50:00 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 06:50:00 +0000 Subject: pycsw_2.4.1+dfsg-1_amd64.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 01 Sep 2019 07:38:37 +0200 Source: pycsw Binary: pycsw pycsw-doc pycsw-wsgi python3-pycsw Architecture: source all Version: 2.4.1+dfsg-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: pycsw - OGC compliant metadata (Catalogue Service for the Web) server pycsw-doc - OGC compliant metadata (Catalogue Service for the Web) server - d pycsw-wsgi - WSGI Apache CSW service based on pycsw python3-pycsw - OGC compliant metadata (Catalogue Service for the Web) server - P Changes: pycsw (2.4.1+dfsg-1) unstable; urgency=medium . * Team upload. * New upstream release. 
* Refresh patches. * Drop removal of empty directories removed upstream. * Add patch to not require exact versions of dependencies. Checksums-Sha1: f9de68433cc5bfc1b9dd863c2bc319b9166398de 2471 pycsw_2.4.1+dfsg-1.dsc 6edacb04b64de46dee490eadc344e0ca876c809c 1645384 pycsw_2.4.1+dfsg.orig.tar.xz 31b9a06c8e181e89ade97593d382c61e9d99642f 16892 pycsw_2.4.1+dfsg-1.debian.tar.xz 29b2a10e1bf1007616f941fe6283d7a2c54f2ac0 1308636 pycsw-doc_2.4.1+dfsg-1_all.deb 92373e427d656900145b4b2cc88ce7a7ab12d8e7 125708 pycsw-wsgi_2.4.1+dfsg-1_all.deb e84adeb5321a1c5060d1e9f4b12278a7ecb5082f 20852 pycsw_2.4.1+dfsg-1_all.deb bf230a10ba9ffbda5181cf3089a5a647a0cd6003 11390 pycsw_2.4.1+dfsg-1_amd64.buildinfo 79f78d015032fcdf881519a532d88c93efd91006 245524 python3-pycsw_2.4.1+dfsg-1_all.deb Checksums-Sha256: 43aa500fbb943b0632861327575c8e5237f190bca821edb083af66a9d6dcaace 2471 pycsw_2.4.1+dfsg-1.dsc 78599eaea0e9e950462d95bc6d9c9bf3f66d2cce70313b936dfef8487b033636 1645384 pycsw_2.4.1+dfsg.orig.tar.xz 5f7715feac2eaba3644092d88b873e7464285874473b9c96d99384c7a5b6aa4c 16892 pycsw_2.4.1+dfsg-1.debian.tar.xz e1e7703d303606e70d934a4927eddb13d7ff198f0229b9ca12176bff25d44d20 1308636 pycsw-doc_2.4.1+dfsg-1_all.deb 2bcb1a4279849f1427e3739d140d4bcbd6f9fea6e6bc506f82784975600aff3c 125708 pycsw-wsgi_2.4.1+dfsg-1_all.deb 191ad3a9633dc3f8a44d3cab5f425a0d304a168ef7702a9325f73aeefd53fe92 20852 pycsw_2.4.1+dfsg-1_all.deb 39b69929d64958385e05449295b9fa0947419dc031672b3e00bde2a87230b873 11390 pycsw_2.4.1+dfsg-1_amd64.buildinfo 09e388dce365d749b6d4d9fb43ac9a58c72bca976ccb307f0e858e20e8d516b0 245524 python3-pycsw_2.4.1+dfsg-1_all.deb Files: c7c8c431fd5d01ac154de535e91dc199 2471 non-free/python optional pycsw_2.4.1+dfsg-1.dsc 57254c68cf2c598a8de3e6ff681245b7 1645384 non-free/python optional pycsw_2.4.1+dfsg.orig.tar.xz 6cba98efa5e5899e1a393150c1c2d139 16892 non-free/python optional pycsw_2.4.1+dfsg-1.debian.tar.xz 70a763c35fba313bc19dcfc232257575 1308636 non-free/doc optional pycsw-doc_2.4.1+dfsg-1_all.deb 5cab80f73d4c9bdb50835ea92623579e 125708 non-free/web optional pycsw-wsgi_2.4.1+dfsg-1_all.deb 3d76e950b75ade4431f8a485c22e01d1 20852 non-free/web optional pycsw_2.4.1+dfsg-1_all.deb d7db4fa91b990017acb5228fb9824802 11390 non-free/python optional pycsw_2.4.1+dfsg-1_amd64.buildinfo ca9442b40b7890f1f566b2232a03e0fb 245524 non-free/python optional python3-pycsw_2.4.1+dfsg-1_all.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1rZoUACgkQZ1DxCuiN SvF2Mg//YIq7cAAUsWL08mv7sbROBUAm3r/KIB3N92X7mFsBzIggy7TM/9zI7h4V AEzwdn3a2jiONW4E1sdDRcN6sN10tCHxHbCtwkCwCgHh98KfF1D/EQMsKp18Qbhg ST08UnsV2HpfLOp0ZoKK02oPP6cXdiW9m2a+sJe0gebitm5lCRB1pHx0TKetYLRM 96UGE9ewKLtOhL27C8K48A9SFnFI+y9h8Zgkbbhyi6kqhvMtk6v85I7J+K/EBDKX pMmr46CR8VJAmlFSOcRBsJbSMWYQKBp0hTQLwDB7qLp2hliHjW3TaC3xXJmfxHvB CDDIPYXuvsoS70veVI2AfyMfDLWr32Cp3vt0CNWutnVqzXDiEKYgPSu9bnAdSavd RG9vnq5HRaWfJ23hmJWQ9oVQBqXfC2p4oJzsGFtyM9+Kbmgp0og6iD+2W7xAJigL 1Ny7rxl+JQJl/Q7cNZmBMLwHVimEyArSMAGBywxomHb6E0rps3/5Q7/sLuh847IX Yjz8cbJHKCaqQ1v8hP5Y03nB/B0wL2UffgrkDKsnKaSJsYkUoLZkKs8qKgk/eofh wgZemqEouDwvSbTaeDKU+20j8v37JlmHonNHCBN7QKkwypSb8ArhffCPYGsg2vcE /XjZx7Uv1cPsYEorGKEUCmHPGhhdzlVQCchQQkLJPapURxpUhnk= =/pHe -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From owner at bugs.debian.org Sun Sep 1 07:51:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 01 Sep 2019 06:51:03 +0000 Subject: Processed: reassign 939022 References: <8a268c2f-06da-5531-ca54-13583a24f3d9@tiscali.it> Message-ID: Processing commands for control at bugs.debian.org: > reassign 939022 pyresample Bug #939022 [basemap] pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Bug reassigned from package 'basemap' to 'pyresample'. Ignoring request to alter found versions of bug #939022 to the same values previously set Ignoring request to alter fixed versions of bug #939022 to the same values previously set > thanks Stopping processing here. Please contact me if you need assistance. -- 939022: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939022 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Sun Sep 1 07:51:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:51:14 +0000 Subject: [Git][debian-gis-team/mkgmap][master] 4 commits: New upstream version 0.0.0+svn4289 Message-ID: <5d6b6a6266ebe_577b2ade5d825508197290@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mkgmap Commits: 1eb55fcc by Bas Couwenberg at 2019-09-01T06:43:10Z New upstream version 0.0.0+svn4289 - - - - - 94f8270a by Bas Couwenberg at 2019-09-01T06:43:26Z Update upstream source from tag 'upstream/0.0.0+svn4289' Update to upstream version '0.0.0+svn4289' with Debian dir 8c91cbeaafba59f5910f8a951116a594885992ea - - - - - 5cc7726b by Bas Couwenberg at 2019-09-01T06:43:40Z New upstream SVN snapshot. - - - - - b3e4c5ff by Bas Couwenberg at 2019-09-01T06:44:18Z Set distribution to unstable. - - - - - 4 changed files: - debian/changelog - resources/mkgmap-version.properties - src/uk/me/parabola/mkgmap/reader/osm/RestrictionRelation.java - src/uk/me/parabola/mkgmap/reader/polish/PolishMapDataSource.java Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mkgmap (0.0.0+svn4289-1) unstable; urgency=medium + + * New upstream SVN snapshot. + + -- Bas Couwenberg Sun, 01 Sep 2019 08:44:09 +0200 + mkgmap (0.0.0+svn4287-1) unstable; urgency=medium * New upstream SVN snapshot. 
===================================== resources/mkgmap-version.properties ===================================== @@ -1,2 +1,2 @@ -svn.version: 4287 -build.timestamp: 2019-06-05T14:49:17+0100 +svn.version: 4289 +build.timestamp: 2019-08-23T19:00:49+0100 ===================================== src/uk/me/parabola/mkgmap/reader/osm/RestrictionRelation.java ===================================== @@ -516,7 +516,7 @@ public class RestrictionRelation extends Relation { for (Coord v: viaPoints){ CoordNode vn = nodeIdMap.get(v); if (vn == null){ - log.error(messagePrefix,"via node is not a routing node"); + log.warn(messagePrefix,"via node is not a routing node, restriction relation is ignored"); return; } viaNodes.add(vn); ===================================== src/uk/me/parabola/mkgmap/reader/polish/PolishMapDataSource.java ===================================== @@ -508,11 +508,11 @@ public class PolishMapDataSource extends MapperBasedMapDataSource implements Loa } else nextPos = 10; city = strings[nextPos]; - if ("-1".equals(city)){ + if (!"-1".equals(city)){ region = strings[nextPos + 1]; country = strings[nextPos + 2]; nums.setCityInfo(Numbers.RIGHT, createCityInfo(city, region, country)); - } + } } return nums; } View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/compare/4ca9e93f055157f9fd52397ae72a85bb10c7499d...b3e4c5ff903750612f6cf4014609d88c76d2161f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/compare/4ca9e93f055157f9fd52397ae72a85bb10c7499d...b3e4c5ff903750612f6cf4014609d88c76d2161f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 07:51:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:51:15 +0000 Subject: [Git][debian-gis-team/mkgmap][pristine-tar] pristine-tar data for mkgmap_0.0.0+svn4289.orig.tar.gz Message-ID: <5d6b6a6346925_577b2ade5d84e91c197434@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / mkgmap Commits: 0d921ac6 by Bas Couwenberg at 2019-09-01T06:43:26Z pristine-tar data for mkgmap_0.0.0+svn4289.orig.tar.gz - - - - - 2 changed files: - + mkgmap_0.0.0+svn4289.orig.tar.gz.delta - + mkgmap_0.0.0+svn4289.orig.tar.gz.id Changes: ===================================== mkgmap_0.0.0+svn4289.orig.tar.gz.delta ===================================== Binary files /dev/null and b/mkgmap_0.0.0+svn4289.orig.tar.gz.delta differ ===================================== mkgmap_0.0.0+svn4289.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +5202d5b2403098f6484785470b9fb09c3da61d21 View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/commit/0d921ac6f7821cc66165a3efb36cff93c4981bb4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/commit/0d921ac6f7821cc66165a3efb36cff93c4981bb4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 1 07:51:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:51:17 +0000 Subject: [Git][debian-gis-team/mkgmap][upstream] New upstream version 0.0.0+svn4289 Message-ID: <5d6b6a651a1c0_577b2ade5d7673c81976f9@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / mkgmap Commits: 1eb55fcc by Bas Couwenberg at 2019-09-01T06:43:10Z New upstream version 0.0.0+svn4289 - - - - - 3 changed files: - resources/mkgmap-version.properties - src/uk/me/parabola/mkgmap/reader/osm/RestrictionRelation.java - src/uk/me/parabola/mkgmap/reader/polish/PolishMapDataSource.java Changes: ===================================== resources/mkgmap-version.properties ===================================== @@ -1,2 +1,2 @@ -svn.version: 4287 -build.timestamp: 2019-06-05T14:49:17+0100 +svn.version: 4289 +build.timestamp: 2019-08-23T19:00:49+0100 ===================================== src/uk/me/parabola/mkgmap/reader/osm/RestrictionRelation.java ===================================== @@ -516,7 +516,7 @@ public class RestrictionRelation extends Relation { for (Coord v: viaPoints){ CoordNode vn = nodeIdMap.get(v); if (vn == null){ - log.error(messagePrefix,"via node is not a routing node"); + log.warn(messagePrefix,"via node is not a routing node, restriction relation is ignored"); return; } viaNodes.add(vn); ===================================== src/uk/me/parabola/mkgmap/reader/polish/PolishMapDataSource.java ===================================== @@ -508,11 +508,11 @@ public class PolishMapDataSource extends MapperBasedMapDataSource implements Loa } else nextPos = 10; city = strings[nextPos]; - if ("-1".equals(city)){ + if (!"-1".equals(city)){ region = strings[nextPos + 1]; country = strings[nextPos + 2]; nums.setCityInfo(Numbers.RIGHT, createCityInfo(city, region, country)); - } + } } return nums; } View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/commit/1eb55fcc7d296e1248b2c55423d6bef8970c4364 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/commit/1eb55fcc7d296e1248b2c55423d6bef8970c4364 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 07:51:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:51:27 +0000 Subject: [Git][debian-gis-team/mkgmap] Pushed new tag debian/0.0.0+svn4289-1 Message-ID: <5d6b6a6f5fec7_577b2ade5d84e91c19784d@godard.mail> Bas Couwenberg pushed new tag debian/0.0.0+svn4289-1 at Debian GIS Project / mkgmap -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/tree/debian/0.0.0+svn4289-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 07:51:28 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 06:51:28 +0000 Subject: [Git][debian-gis-team/mkgmap] Pushed new tag upstream/0.0.0+svn4289 Message-ID: <5d6b6a7014ae6_577b2ade5d7ee8f019803@godard.mail> Bas Couwenberg pushed new tag upstream/0.0.0+svn4289 at Debian GIS Project / mkgmap -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/tree/upstream/0.0.0+svn4289 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sun Sep 1 08:01:09 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 07:01:09 +0000 Subject: Processing of mkgmap_0.0.0+svn4289-1_source.changes Message-ID: mkgmap_0.0.0+svn4289-1_source.changes uploaded successfully to localhost along with the files: mkgmap_0.0.0+svn4289-1.dsc mkgmap_0.0.0+svn4289.orig.tar.gz mkgmap_0.0.0+svn4289-1.debian.tar.xz mkgmap_0.0.0+svn4289-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From owner at bugs.debian.org Sun Sep 1 08:03:09 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 01 Sep 2019 07:03:09 +0000 Subject: Processed: found 939022 in pyresample/1.12.3-5 References: <1567321290-727-bts-sebastic@debian.org> Message-ID: Processing commands for control at bugs.debian.org: > found 939022 pyresample/1.12.3-5 Bug #939022 [pyresample] pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Marked as found in versions pyresample/1.12.3-5. > thanks Stopping processing here. Please contact me if you need assistance. -- 939022: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939022 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From ftpmaster at ftp-master.debian.org Sun Sep 1 08:19:50 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 07:19:50 +0000 Subject: mkgmap_0.0.0+svn4289-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 01 Sep 2019 08:44:09 +0200 Source: mkgmap Architecture: source Version: 0.0.0+svn4289-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: mkgmap (0.0.0+svn4289-1) unstable; urgency=medium . * New upstream SVN snapshot. 
Checksums-Sha1: b56cbd3891e1b45c876ff45e5e2c0245112a10f2 2181 mkgmap_0.0.0+svn4289-1.dsc ab3630e8f0d88b4c822de527bd3170795e9d5eb7 1774566 mkgmap_0.0.0+svn4289.orig.tar.gz ca867d77042b8ea4ed1defb3cc6e684143eb74f7 7276 mkgmap_0.0.0+svn4289-1.debian.tar.xz a3970235ce1313d1164215aa8e241754e6d5bffc 10422 mkgmap_0.0.0+svn4289-1_amd64.buildinfo Checksums-Sha256: 2cbb8904cef8fbda2556c50214c943bae805cdb30458d78a62c02c7da7981866 2181 mkgmap_0.0.0+svn4289-1.dsc 965b3d5a57c37087c35741865a78c2512b68dfa3446e79cef86e0a2f41e6853e 1774566 mkgmap_0.0.0+svn4289.orig.tar.gz 541f12a41eeb2cad8690441dce3960a8152cb3956e2d82fb33b7f6d8ceeae88a 7276 mkgmap_0.0.0+svn4289-1.debian.tar.xz b6d07e7ccd5166a1eb44422bf8a3456d4ffd7c44cf7108bf02b7d371baef4fdb 10422 mkgmap_0.0.0+svn4289-1_amd64.buildinfo Files: cd1bbac945856c38db59610ab05238b9 2181 utils optional mkgmap_0.0.0+svn4289-1.dsc 13b1ce35bb2323b069edb4d1d190c9f3 1774566 utils optional mkgmap_0.0.0+svn4289.orig.tar.gz 11e0bf87d26c88152405e1171e2ded1e 7276 utils optional mkgmap_0.0.0+svn4289-1.debian.tar.xz f1c5b5099955fda2ce4558722dad8ff3 10422 utils optional mkgmap_0.0.0+svn4289-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1rakcACgkQZ1DxCuiN SvER2Q//ZdL2iRQE6umhHIW3zKBBMzgwjuleUy+znhyJsKqJCDHzRmrcToX/zaTS DcvSFomtl4CPqzC5fdpKb0D10QyaFCrhTXIXhhXqJw4rgTK/MxHFjOu2D28Ac3ca pMgfPagrn9GQktyDdt/va2KNU+ACNDCz7wV1Lhv+ydg3473EP/zXRy+DJHXYYqn+ 61Xjiig+FVJYjZJuTHiRazK3YWfUWSLAP4aCZfIfr9qTplDmNQzsU9QOdLwXremR 7N5QbmKSpkelf/jS7Ll3ORAMTcDLMA/fl9rY673BPnhoal7GxuR0yr94DXfUdGGP jUxgf7/URk/bvPGL4Y9aEBFp1jT6xDLVTnmekyIM+HyIMHy6btq/ctT+39FUSZo4 zd1X63es2sf/gPXn9wTgbtY/CV6m0d9Usl1k9v9sEmlHUAP1d3FS+scM4gvYV6+/ sGwY8axKAQFZYZDws16BeaNn/ypmp7sli0jMUorYcUxX6P5i8fPhVGCpAyw1AgE2 iKTpFjyUKNi9z4PBOu+4DbyOxX70EcFmoxUd+NgQjSaroC1EkI1kj4NN8rFqeWEj H2GCuDkyhbLhMCMjs4AwdltZCfyojPrnMywGx6zX0iRJJjAzEMFnEAe3LecqHFoo NXkXACDq5keM9SESvQStgMkHFT6f+UhcWP3x1CCcytsWdM4H5bY= =aRfw -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sun Sep 1 08:55:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 07:55:24 +0000 Subject: [Git][debian-gis-team/proj][pristine-tar] pristine-tar data for proj_6.2.0.orig.tar.gz Message-ID: <5d6b796c340d0_577b2ade5d84e91c201775@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / proj Commits: 4f4408eb by Bas Couwenberg at 2019-09-01T07:35:14Z pristine-tar data for proj_6.2.0.orig.tar.gz - - - - - 2 changed files: - + proj_6.2.0.orig.tar.gz.delta - + proj_6.2.0.orig.tar.gz.id Changes: ===================================== proj_6.2.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/proj_6.2.0.orig.tar.gz.delta differ ===================================== proj_6.2.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +8dd0a8d2e9fc568745e10238784f4ea9f64e9b78 View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/commit/4f4408eb5487d0328b970ab3a3a1d7286b5349ca -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/commit/4f4408eb5487d0328b970ab3a3a1d7286b5349ca You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 1 08:55:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 07:55:26 +0000 Subject: [Git][debian-gis-team/proj][upstream] New upstream version 6.2.0 Message-ID: <5d6b796e66e64_577b2ade5d7673c8201986@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / proj Commits: f7998464 by Bas Couwenberg at 2019-09-01T07:34:54Z New upstream version 6.2.0 - - - - - 0 changed files: Changes: View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/commit/f7998464300baae87156cad2fbe91b1ff6676939 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/commit/f7998464300baae87156cad2fbe91b1ff6676939 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 08:55:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 07:55:23 +0000 Subject: [Git][debian-gis-team/proj][experimental] 6 commits: New upstream version 6.2.0 Message-ID: <5d6b796be093c_577b2ade5d7ee8f020151c@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / proj Commits: f7998464 by Bas Couwenberg at 2019-09-01T07:34:54Z New upstream version 6.2.0 - - - - - be21fd98 by Bas Couwenberg at 2019-09-01T07:35:14Z Update upstream source from tag 'upstream/6.2.0' Update to upstream version '6.2.0' with Debian dir 4fbddd0ef622549979b2cb4ead687e3fd15e6434 - - - - - 27978788 by Bas Couwenberg at 2019-09-01T07:35:43Z New upstream release. - - - - - b4b1b7a0 by Bas Couwenberg at 2019-09-01T07:37:23Z Update symbols for other architectures. - - - - - 2b51696f by Bas Couwenberg at 2019-09-01T07:38:13Z Strip pre-releases from symbols version. - - - - - dba97801 by Bas Couwenberg at 2019-09-01T07:38:37Z Set distribution to experimental. - - - - - 2 changed files: - debian/changelog - debian/libproj15.symbols Changes: ===================================== debian/changelog ===================================== @@ -1,8 +1,11 @@ -proj (6.2.0~rc1-1~exp2) UNRELEASED; urgency=medium +proj (6.2.0-1~exp1) experimental; urgency=medium + * New upstream release. * Don't remove data/null on clean, included upstream too. + * Update symbols for other architectures. + * Strip pre-releases from symbols version. 
- -- Bas Couwenberg Tue, 27 Aug 2019 07:31:58 +0200 + -- Bas Couwenberg Sun, 01 Sep 2019 09:38:16 +0200 proj (6.2.0~rc1-1~exp1) experimental; urgency=medium ===================================== debian/libproj15.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 6.2.0~rc1 amd64 +# SymbolsHelper-Confirmed: 6.2.0~rc1 alpha amd64 arm64 armhf hppa hurd-i386 i386 ia64 kfreebsd-i386 m68k powerpc ppc64 ppc64el riscv64 s390x sh4 sparc64 x32 libproj.so.15 #PACKAGE# #MINVER# * Build-Depends-Package: libproj-dev _Z10pj_ell_setP9projCtx_tP8ARG_listPdS3_ at Base 6.0.0 @@ -134,16 +134,16 @@ libproj.so.15 #PACKAGE# #MINVER# _ZN5osgeo4proj2io12WKTFormatter9setStrictEb at Base 6.0.0 _ZN5osgeo4proj2io12WKTFormatterD1Ev at Base 6.0.0 _ZN5osgeo4proj2io12WKTFormatterD2Ev at Base 6.0.0 - _ZN5osgeo4proj2io13JSONFormatter12setMultiLineEb at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter13ObjectContextC1ERS2_PKcb at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter13ObjectContextC2ERS2_PKcb at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter13ObjectContextD1Ev at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter13ObjectContextD2Ev at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter19setIndentationWidthEi at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter6createESt10shared_ptrINS1_15DatabaseContextEE at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatter9setSchemaERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatterD1Ev at Base 6.2.0~rc1 - _ZN5osgeo4proj2io13JSONFormatterD2Ev at Base 6.2.0~rc1 + _ZN5osgeo4proj2io13JSONFormatter12setMultiLineEb at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter13ObjectContextC1ERS2_PKcb at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter13ObjectContextC2ERS2_PKcb at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter13ObjectContextD1Ev at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter13ObjectContextD2Ev at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter19setIndentationWidthEi at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter6createESt10shared_ptrINS1_15DatabaseContextEE at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatter9setSchemaERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatterD1Ev at Base 6.2.0 + _ZN5osgeo4proj2io13JSONFormatterD2Ev at Base 6.2.0 _ZN5osgeo4proj2io14IWKTExportableD0Ev at Base 6.0.0 _ZN5osgeo4proj2io14IWKTExportableD1Ev at Base 6.0.0 _ZN5osgeo4proj2io14IWKTExportableD2Ev at Base 6.0.0 @@ -151,9 +151,9 @@ libproj.so.15 #PACKAGE# #MINVER# _ZN5osgeo4proj2io15DatabaseContext6createERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERKSt6vectorIS8_SaIS8_EEP9projCtx_t at Base 6.0.0 _ZN5osgeo4proj2io15DatabaseContextD1Ev at Base 6.0.0 _ZN5osgeo4proj2io15DatabaseContextD2Ev at Base 6.0.0 - _ZN5osgeo4proj2io15IJSONExportableD0Ev at Base 6.2.0~rc1 - _ZN5osgeo4proj2io15IJSONExportableD1Ev at Base 6.2.0~rc1 - _ZN5osgeo4proj2io15IJSONExportableD2Ev at Base 6.2.0~rc1 + _ZN5osgeo4proj2io15IJSONExportableD0Ev at Base 6.2.0 + _ZN5osgeo4proj2io15IJSONExportableD1Ev at Base 6.2.0 + _ZN5osgeo4proj2io15IJSONExportableD2Ev at Base 6.2.0 _ZN5osgeo4proj2io16AuthorityFactory6createERKN7dropbox6oxygen2nnISt10shared_ptrINS1_15DatabaseContextEEEERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZN5osgeo4proj2io16AuthorityFactory7CRSInfoC1Ev at Base 6.1.0~rc1 _ZN5osgeo4proj2io16AuthorityFactory7CRSInfoC2Ev at Base 6.1.0~rc1 @@ -219,8 +219,8 @@ libproj.so.15 #PACKAGE# #MINVER# _ZN5osgeo4proj2io28NoSuchAuthorityCodeExceptionD0Ev at Base 
6.0.0 _ZN5osgeo4proj2io28NoSuchAuthorityCodeExceptionD1Ev at Base 6.0.0 _ZN5osgeo4proj2io28NoSuchAuthorityCodeExceptionD2Ev at Base 6.0.0 - (arch=!amd64 !arm64 !ia64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !riscv64 !s390x !sparc64)_ZN5osgeo4proj2io7WKTNode10createFromERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEj at Base 6.0.0 - (arch=amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 ppc64el riscv64 s390x sparc64)_ZN5osgeo4proj2io7WKTNode10createFromERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEm at Base 6.0.0 + (arch=!alpha !amd64 !arm64 !ia64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !riscv64 !s390x !sparc64)_ZN5osgeo4proj2io7WKTNode10createFromERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEj at Base 6.0.0 + (arch=alpha amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 ppc64el riscv64 s390x sparc64)_ZN5osgeo4proj2io7WKTNode10createFromERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEm at Base 6.2.0 _ZN5osgeo4proj2io7WKTNode8addChildEON7dropbox6oxygen2nnISt10unique_ptrIS2_St14default_deleteIS2_EEEE at Base 6.0.0 _ZN5osgeo4proj2io7WKTNodeC1ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZN5osgeo4proj2io7WKTNodeC2ERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 @@ -781,7 +781,12 @@ libproj.so.15 #PACKAGE# #MINVER# _ZN5osgeo4proj9operation26CoordinateOperationFactoryD0Ev at Base 6.0.0 _ZN5osgeo4proj9operation26CoordinateOperationFactoryD1Ev at Base 6.0.0 _ZN5osgeo4proj9operation26CoordinateOperationFactoryD2Ev at Base 6.0.0 - (optional=templinst)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_mPKS8_P13__va_list_tagEmSB_z at Base 6.2.0~rc1 + (optional=templinst|arch=powerpc x32)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_jPKS8_P13__va_list_tagEjSB_z at Base 6.2.0 + (optional=templinst)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_mPKS8_P13__va_list_tagEmSB_z at Base 6.2.0 + (optional=templinst|arch=alpha sh4|subst)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_{size_t}PKS8_13__va_list_tagE{size_t}SB_z at Base 6.2.0 + (optional=templinst|arch=hurd-i386 i386 kfreebsd-i386 ppc64 ppc64el|subst)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_{size_t}PKS8_PcE{size_t}SB_z at Base 6.2.0 + (optional=templinst|arch=hppa ia64 m68k riscv64 sparc64|subst)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_{size_t}PKS8_PvE{size_t}SB_z at Base 6.2.0 + (optional=templinst|arch=arm64 armhf|subst)_ZN9__gnu_cxx12__to_xstringINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEcEET_PFiPT0_{size_t}PKS8_St9__va_listE{size_t}SB_z at Base 6.2.0 (arch=armel riscv64)_ZN9__gnu_cxx24__concurrence_lock_errorD0Ev at Base 6.0.0 (arch=armel riscv64)_ZN9__gnu_cxx24__concurrence_lock_errorD1Ev at Base 6.0.0 (arch=armel riscv64)_ZN9__gnu_cxx24__concurrence_lock_errorD2Ev at Base 6.0.0 @@ -790,8 +795,8 @@ libproj.so.15 #PACKAGE# #MINVER# (arch=armel riscv64)_ZN9__gnu_cxx26__concurrence_unlock_errorD2Ev at Base 6.0.0 (arch=armel riscv64)_ZN9__gnu_cxx30__throw_concurrence_lock_errorEv at Base 6.0.0 (arch=armel riscv64)_ZN9__gnu_cxx32__throw_concurrence_unlock_errorEv at Base 6.0.0 - (optional=templinst|arch=!amd64 !arm64 !ia64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !riscv64 !s390x !sparc64)_ZN9__gnu_cxx6__stoaIlicJiEEET0_PFT_PKT1_PPS3_DpT2_EPKcS5_PjS9_ at Base 6.0.0 - 
(optional=templinst|arch=amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 ppc64el riscv64 s390x sparc64)_ZN9__gnu_cxx6__stoaIlicJiEEET0_PFT_PKT1_PPS3_DpT2_EPKcS5_PmS9_ at Base 6.0.0 + (optional=templinst|arch=!alpha !amd64 !arm64 !ia64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !riscv64 !s390x !sparc64)_ZN9__gnu_cxx6__stoaIlicJiEEET0_PFT_PKT1_PPS3_DpT2_EPKcS5_PjS9_ at Base 6.0.0 + (optional=templinst|arch=alpha amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 ppc64el riscv64 s390x sparc64)_ZN9__gnu_cxx6__stoaIlicJiEEET0_PFT_PKT1_PPS3_DpT2_EPKcS5_PmS9_ at Base 6.0.0 _ZNK5osgeo4proj2cs16CoordinateSystem8axisListEv at Base 6.0.0 _ZNK5osgeo4proj2cs20CoordinateSystemAxis12abbreviationB5cxx11Ev at Base 6.0.0 _ZNK5osgeo4proj2cs20CoordinateSystemAxis12maximumValueEv at Base 6.0.0 @@ -802,7 +807,7 @@ libproj.so.15 #PACKAGE# #MINVER# _ZNK5osgeo4proj2cs8Meridian9longitudeEv at Base 6.0.0 _ZNK5osgeo4proj2io12WKTFormatter8isStrictEv at Base 6.0.0 _ZNK5osgeo4proj2io12WKTFormatter8toStringB5cxx11Ev at Base 6.0.0 - _ZNK5osgeo4proj2io13JSONFormatter8toStringB5cxx11Ev at Base 6.2.0~rc1 + _ZNK5osgeo4proj2io13JSONFormatter8toStringB5cxx11Ev at Base 6.2.0 _ZNK5osgeo4proj2io14IWKTExportable11exportToWKTB5cxx11EPNS1_12WKTFormatterE at Base 6.0.0 _ZNK5osgeo4proj2io15DatabaseContext11getMetadataEPKc at Base 6.0.0 _ZNK5osgeo4proj2io15DatabaseContext14getAuthoritiesB5cxx11Ev at Base 6.0.0 @@ -810,7 +815,7 @@ libproj.so.15 #PACKAGE# #MINVER# _ZNK5osgeo4proj2io15DatabaseContext15lookForGridInfoERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERS8_SB_SB_RbSC_SC_ at Base 6.0.0 _ZNK5osgeo4proj2io15DatabaseContext20getDatabaseStructureB5cxx11Ev at Base 6.0.0 _ZNK5osgeo4proj2io15DatabaseContext7getPathB5cxx11Ev at Base 6.0.0 - _ZNK5osgeo4proj2io15IJSONExportable12exportToJSONB5cxx11EPNS1_13JSONFormatterE at Base 6.2.0~rc1 + _ZNK5osgeo4proj2io15IJSONExportable12exportToJSONB5cxx11EPNS1_13JSONFormatterE at Base 6.2.0 _ZNK5osgeo4proj2io16AuthorityFactory11createDatumERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZNK5osgeo4proj2io16AuthorityFactory12createExtentERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZNK5osgeo4proj2io16AuthorityFactory12createObjectERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 @@ -830,8 +835,8 @@ libproj.so.15 #PACKAGE# #MINVER# _ZNK5osgeo4proj2io16AuthorityFactory19createPrimeMeridianERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZNK5osgeo4proj2io16AuthorityFactory19createUnitOfMeasureERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZNK5osgeo4proj2io16AuthorityFactory19createVerticalDatumERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 - (arch=!amd64 !arm64 !ia64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !riscv64 !s390x !sparc64)_ZNK5osgeo4proj2io16AuthorityFactory21createObjectsFromNameERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERKSt6vectorINS2_10ObjectTypeESaISC_EEbj at Base 6.0.0 - (arch=amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 ppc64el riscv64 s390x sparc64)_ZNK5osgeo4proj2io16AuthorityFactory21createObjectsFromNameERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERKSt6vectorINS2_10ObjectTypeESaISC_EEbm at Base 6.0.0 + (arch=!alpha !amd64 !arm64 !ia64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !riscv64 !s390x !sparc64)_ZNK5osgeo4proj2io16AuthorityFactory21createObjectsFromNameERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERKSt6vectorINS2_10ObjectTypeESaISC_EEbj at Base 6.0.0 + (arch=alpha amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 
ppc64el riscv64 s390x sparc64)_ZNK5osgeo4proj2io16AuthorityFactory21createObjectsFromNameERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERKSt6vectorINS2_10ObjectTypeESaISC_EEbm at Base 6.2.0 _ZNK5osgeo4proj2io16AuthorityFactory21listAreaOfUseFromNameERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEb at Base 6.0.0 _ZNK5osgeo4proj2io16AuthorityFactory22createCoordinateSystemERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 6.0.0 _ZNK5osgeo4proj2io16AuthorityFactory24getOfficialNameFromAliasERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESA_SA_bRS8_SB_SB_ at Base 6.0.0 @@ -940,7 +945,7 @@ libproj.so.15 #PACKAGE# #MINVER# _ZNK5osgeo4proj5datum9Ellipsoid8isSphereEv at Base 6.0.0 _ZNK5osgeo4proj6common11ObjectUsage15_isEquivalentToEPKNS0_4util11IComparableENS4_9CriterionE at Base 6.0.0 _ZNK5osgeo4proj6common11ObjectUsage15baseExportToWKTEPNS0_2io12WKTFormatterE at Base 6.0.0 - _ZNK5osgeo4proj6common11ObjectUsage16baseExportToJSONEPNS0_2io13JSONFormatterE at Base 6.2.0~rc1 + _ZNK5osgeo4proj6common11ObjectUsage16baseExportToJSONEPNS0_2io13JSONFormatterE at Base 6.2.0 _ZNK5osgeo4proj6common11ObjectUsage7domainsEv at Base 6.0.0 _ZNK5osgeo4proj6common12ObjectDomain12_exportToWKTEPNS0_2io12WKTFormatterE at Base 6.0.0 _ZNK5osgeo4proj6common12ObjectDomain15_isEquivalentToEPKNS0_4util11IComparableENS4_9CriterionE at Base 6.0.0 @@ -1065,14 +1070,12 @@ libproj.so.15 #PACKAGE# #MINVER# (optional=templinst)_ZNSt10shared_ptrIN5osgeo4proj9operation14ParameterValueEED2Ev at Base 6.0.0 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj2cs16CoordinateSystemELN9__gnu_cxx12_Lock_policyE1EEC1ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj2cs16CoordinateSystemELN9__gnu_cxx12_Lock_policyE1EEC2ERKS6_ at Base 6.1.0~rc1 - (optional=templinst|arch=armhf ia64 m68k mips mips64el mipsel powerpc powerpcspe ppc64 ppc64el s390x sparc64)_ZNSt12__shared_ptrIN5osgeo4proj2cs16CoordinateSystemELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.0.0 - (optional=templinst|arch=armhf ia64 m68k mips mips64el mipsel powerpc powerpcspe ppc64 ppc64el s390x sparc64)_ZNSt12__shared_ptrIN5osgeo4proj2cs16CoordinateSystemELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.0.0 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj3crs11CompoundCRSELN9__gnu_cxx12_Lock_policyE1EEC1ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj3crs11CompoundCRSELN9__gnu_cxx12_Lock_policyE1EEC2ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj3crs11CompoundCRSELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj3crs11CompoundCRSELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.1.0~rc1 - (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj3crs11GeodeticCRSELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.2.0~rc1 - (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj3crs11GeodeticCRSELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.2.0~rc1 + (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj3crs11GeodeticCRSELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.2.0 + (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj3crs11GeodeticCRSELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.2.0 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj3crs13GeographicCRSELN9__gnu_cxx12_Lock_policyE1EEC1ERKS6_ at Base 6.0.0 
(optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj3crs13GeographicCRSELN9__gnu_cxx12_Lock_policyE1EEC2ERKS6_ at Base 6.0.0 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj3crs13GeographicCRSELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.0.0 @@ -1089,14 +1092,14 @@ libproj.so.15 #PACKAGE# #MINVER# (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj4util10BaseObjectELN9__gnu_cxx12_Lock_policyE1EEC2INS1_6common16IdentifiedObjectEvEERKS_IT_LS5_1EE at Base 6.0.0 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj4util10BaseObjectELN9__gnu_cxx12_Lock_policyE2EEC1INS1_6common16IdentifiedObjectEvEERKS_IT_LS5_2EE at Base 6.0.0 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj4util10BaseObjectELN9__gnu_cxx12_Lock_policyE2EEC2INS1_6common16IdentifiedObjectEvEERKS_IT_LS5_2EE at Base 6.0.0 - (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum13PrimeMeridianELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.2.0~rc1 - (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum13PrimeMeridianELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.2.0~rc1 + (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum13PrimeMeridianELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.2.0 + (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum13PrimeMeridianELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.2.0 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj5datum22GeodeticReferenceFrameELN9__gnu_cxx12_Lock_policyE1EEC1ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj5datum22GeodeticReferenceFrameELN9__gnu_cxx12_Lock_policyE1EEC2ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj5datum22GeodeticReferenceFrameELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.1.0~rc1 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj5datum22GeodeticReferenceFrameELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.1.0~rc1 - (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum9EllipsoidELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.2.0~rc1 - (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum9EllipsoidELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.2.0~rc1 + (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum9EllipsoidELN9__gnu_cxx12_Lock_policyE2EEC1ERKS6_ at Base 6.2.0 + (optional=templinst)_ZNSt12__shared_ptrIN5osgeo4proj5datum9EllipsoidELN9__gnu_cxx12_Lock_policyE2EEC2ERKS6_ at Base 6.2.0 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj6common16IdentifiedObjectELN9__gnu_cxx12_Lock_policyE1EEC1INS1_9operation15OperationMethodEvEERKS_IT_LS5_1EE at Base 6.0.0 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN5osgeo4proj6common16IdentifiedObjectELN9__gnu_cxx12_Lock_policyE1EEC2INS1_9operation15OperationMethodEvEERKS_IT_LS5_1EE at Base 6.0.0 (optional=templinst|arch=!armel !riscv64)_ZNSt12__shared_ptrIN5osgeo4proj6common16IdentifiedObjectELN9__gnu_cxx12_Lock_policyE2EEC1INS1_9operation15OperationMethodEvEERKS_IT_LS5_2EE at Base 6.0.0 @@ -1786,7 +1789,8 @@ libproj.so.15 #PACKAGE# #MINVER# (optional=templinst|arch=!armel !riscv64)_ZNSt23_Sp_counted_ptr_inplaceIN5osgeo4proj6common13UnitOfMeasureESaIS3_ELN9__gnu_cxx12_Lock_policyE2EED0Ev at Base 6.0.0 (optional=templinst|arch=!armel !riscv64)_ZNSt23_Sp_counted_ptr_inplaceIN5osgeo4proj6common13UnitOfMeasureESaIS3_ELN9__gnu_cxx12_Lock_policyE2EED1Ev at Base 6.0.0 (optional=templinst|arch=!armel 
!riscv64)_ZNSt23_Sp_counted_ptr_inplaceIN5osgeo4proj6common13UnitOfMeasureESaIS3_ELN9__gnu_cxx12_Lock_policyE2EED2Ev at Base 6.0.0 - (optional=templinst)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES5_St4lessIS5_ESaISt4pairIKS5_S5_EEEixERS9_ at Base 6.2.0~rc1 + (optional=templinst|arch=alpha hurd-i386 i386 kfreebsd-i386)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEPKcSt4lessIS5_ESaISt4pairIKS5_S7_EEEixEOS5_ at Base 6.2.0 + (optional=templinst)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES5_St4lessIS5_ESaISt4pairIKS5_S5_EEEixERS9_ at Base 6.2.0 (optional=templinst)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEdSt4lessIS5_ESaISt4pairIKS5_dEEEixERS9_ at Base 6.0.0 (optional=templinst)_ZNSt3setINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4lessIS5_ESaIS5_EED1Ev at Base 6.0.0 (optional=templinst)_ZNSt3setINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4lessIS5_ESaIS5_EED2Ev at Base 6.0.0 @@ -1812,9 +1816,9 @@ libproj.so.15 #PACKAGE# #MINVER# (optional=templinst)_ZNSt6vectorISt4pairINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES6_ESaIS7_EE17_M_realloc_insertIJS7_EEEvN9__gnu_cxx17__normal_iteratorIPS7_S9_EEDpOT_ at Base 6.0.0 (optional=templinst)_ZNSt6vectorISt4pairINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES6_ESaIS7_EEaSERKS9_ at Base 6.0.0 (optional=templinst)_ZNSt6vectorIbSaIbEE13_M_insert_auxESt13_Bit_iteratorb at Base 6.0.0 - (optional=templinst)_ZNSt6vectorIbSaIbEE9push_backEb at Base 6.2.0~rc1 - (optional=templinst)_ZNSt6vectorIcSaIcEE12emplace_backIJcEEEvDpOT_ at Base 6.2.0~rc1 - (optional=templinst)_ZNSt6vectorIcSaIcEE17_M_realloc_insertIJcEEEvN9__gnu_cxx17__normal_iteratorIPcS1_EEDpOT_ at Base 6.2.0~rc1 + (optional=templinst)_ZNSt6vectorIbSaIbEE9push_backEb at Base 6.2.0 + (optional=templinst)_ZNSt6vectorIcSaIcEE12emplace_backIJcEEEvDpOT_ at Base 6.2.0 + (optional=templinst)_ZNSt6vectorIcSaIcEE17_M_realloc_insertIJcEEEvN9__gnu_cxx17__normal_iteratorIPcS1_EEDpOT_ at Base 6.2.0 (optional=templinst)_ZNSt6vectorIdSaIdEE12emplace_backIJdEEEvDpOT_ at Base 6.0.0 (optional=templinst)_ZNSt6vectorIdSaIdEE17_M_realloc_insertIJdEEEvN9__gnu_cxx17__normal_iteratorIPdS1_EEDpOT_ at Base 6.0.0 (optional=templinst)_ZNSt6vectorIdSaIdEEaSERKS1_ at Base 6.0.0 @@ -1852,6 +1856,7 @@ libproj.so.15 #PACKAGE# #MINVER# (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_dESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE4findERS7_ at Base 6.0.0 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_dESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE8_M_eraseEPSt13_Rb_tree_nodeIS8_E at Base 6.0.0 (optional=templinst)_ZNSt8_Rb_treeISt3setINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4lessIS6_ESaIS6_EESA_St9_IdentityISA_ES7_ISA_ESaISA_EE10_M_insert_IRKSA_NSF_11_Alloc_nodeEEESt17_Rb_tree_iteratorISA_EPSt18_Rb_tree_node_baseSN_OT_RT0_ at Base 6.0.0 + (optional=templinst|arch=!amd64 !arm64 !hppa !m68k !sh4 !x32)_ZNSt8_Rb_treeISt3setINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4lessIS6_ESaIS6_EESA_St9_IdentityISA_ES7_ISA_ESaISA_EE16_M_insert_uniqueIRKSA_EESt4pairISt17_Rb_tree_iteratorISA_EbEOT_ at Base 6.2.0 (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 sh4 x32)_ZNSt8_Rb_treeISt3setINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4lessIS6_ESaIS6_EESA_St9_IdentityISA_ES7_ISA_ESaISA_EE24_M_get_insert_unique_posERKSA_ at Base 6.0.0 
(optional=templinst)_ZNSt8_Rb_treeISt3setINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4lessIS6_ESaIS6_EESA_St9_IdentityISA_ES7_ISA_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 6.0.0 (optional=templinst)_ZNSt8_Rb_treeISt4pairINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES6_ES7_St9_IdentityIS7_ESt4lessIS7_ESaIS7_EE16_M_insert_uniqueIS7_EES0_ISt17_Rb_tree_iteratorIS7_EbEOT_ at Base 6.0.0 @@ -1878,7 +1883,7 @@ libproj.so.15 #PACKAGE# #MINVER# _ZTIN5osgeo4proj2cs8MeridianE at Base 6.0.0 _ZTIN5osgeo4proj2cs9OrdinalCSE at Base 6.0.0 _ZTIN5osgeo4proj2io14IWKTExportableE at Base 6.0.0 - _ZTIN5osgeo4proj2io15IJSONExportableE at Base 6.2.0~rc1 + _ZTIN5osgeo4proj2io15IJSONExportableE at Base 6.2.0 _ZTIN5osgeo4proj2io16FactoryExceptionE at Base 6.0.0 _ZTIN5osgeo4proj2io16ParsingExceptionE at Base 6.0.0 _ZTIN5osgeo4proj2io19FormattingExceptionE at Base 6.0.0 @@ -2076,7 +2081,7 @@ libproj.so.15 #PACKAGE# #MINVER# _ZTSN5osgeo4proj2cs8MeridianE at Base 6.0.0 _ZTSN5osgeo4proj2cs9OrdinalCSE at Base 6.0.0 _ZTSN5osgeo4proj2io14IWKTExportableE at Base 6.0.0 - _ZTSN5osgeo4proj2io15IJSONExportableE at Base 6.2.0~rc1 + _ZTSN5osgeo4proj2io15IJSONExportableE at Base 6.2.0 _ZTSN5osgeo4proj2io16FactoryExceptionE at Base 6.0.0 _ZTSN5osgeo4proj2io16ParsingExceptionE at Base 6.0.0 _ZTSN5osgeo4proj2io19FormattingExceptionE at Base 6.0.0 @@ -2293,7 +2298,7 @@ libproj.so.15 #PACKAGE# #MINVER# _ZTVN5osgeo4proj2cs8MeridianE at Base 6.0.0 _ZTVN5osgeo4proj2cs9OrdinalCSE at Base 6.0.0 _ZTVN5osgeo4proj2io14IWKTExportableE at Base 6.0.0 - _ZTVN5osgeo4proj2io15IJSONExportableE at Base 6.2.0~rc1 + _ZTVN5osgeo4proj2io15IJSONExportableE at Base 6.2.0 _ZTVN5osgeo4proj2io16FactoryExceptionE at Base 6.0.0 _ZTVN5osgeo4proj2io16ParsingExceptionE at Base 6.0.0 _ZTVN5osgeo4proj2io19FormattingExceptionE at Base 6.0.0 @@ -2619,13 +2624,13 @@ libproj.so.15 #PACKAGE# #MINVER# proj_area_destroy at Base 6.0.0 proj_area_set_bbox at Base 6.0.0 proj_as_proj_string at Base 6.0.0 - proj_as_projjson at Base 6.2.0~rc1 + proj_as_projjson at Base 6.2.0 proj_as_wkt at Base 6.0.0 proj_assign_context at Base 6.0.0 - proj_cleanup at Base 6.2.0~rc1 + proj_cleanup at Base 6.2.0 proj_clone at Base 6.0.0 - proj_concatoperation_get_step at Base 6.2.0~rc1 - proj_concatoperation_get_step_count at Base 6.2.0~rc1 + proj_concatoperation_get_step at Base 6.2.0 + proj_concatoperation_get_step_count at Base 6.2.0 proj_context_create at Base 5.0.0 proj_context_destroy at Base 5.0.0 proj_context_errno at Base 5.0.0 @@ -2633,7 +2638,7 @@ libproj.so.15 #PACKAGE# #MINVER# proj_context_get_database_path at Base 6.0.0 proj_context_get_use_proj4_init_rules at Base 6.0.0 proj_context_guess_wkt_dialect at Base 6.0.0 - proj_context_set_autoclose_database at Base 6.2.0~rc1 + proj_context_set_autoclose_database at Base 6.2.0 proj_context_set_database_path at Base 6.0.0 proj_context_set_file_finder at Base 6.0.0 proj_context_set_search_paths at Base 6.0.0 @@ -2721,7 +2726,7 @@ libproj.so.15 #PACKAGE# #MINVER# proj_create_conversion_wagner_vi at Base 6.0.0 proj_create_conversion_wagner_vii at Base 6.0.0 proj_create_crs_to_crs at Base 5.0.0 - proj_create_crs_to_crs_from_pj at Base 6.2.0~rc1 + proj_create_crs_to_crs_from_pj at Base 6.2.0 proj_create_cs at Base 6.0.0 proj_create_ellipsoidal_2D_cs at Base 6.0.0 proj_create_engineering_crs at Base 6.0.0 @@ -2775,12 +2780,12 @@ libproj.so.15 #PACKAGE# #MINVER# proj_get_name at Base 6.0.0 proj_get_non_deprecated at Base 6.0.0 proj_get_prime_meridian at Base 6.0.0 - proj_get_remarks at Base 6.2.0~rc1 - proj_get_scope at 
Base 6.2.0~rc1 + proj_get_remarks at Base 6.2.0 + proj_get_scope at Base 6.2.0 proj_get_source_crs at Base 6.0.0 proj_get_target_crs at Base 6.0.0 proj_get_type at Base 6.0.0 - proj_grid_get_info_from_database at Base 6.2.0~rc1 + proj_grid_get_info_from_database at Base 6.2.0 proj_grid_info at Base 5.0.0 proj_identify at Base 6.0.0 proj_info at Base 5.0.0 @@ -2808,7 +2813,7 @@ libproj.so.15 #PACKAGE# #MINVER# proj_operation_factory_context_set_area_of_interest at Base 6.0.0 proj_operation_factory_context_set_crs_extent_use at Base 6.0.0 proj_operation_factory_context_set_desired_accuracy at Base 6.0.0 - proj_operation_factory_context_set_discard_superseded at Base 6.2.0~rc1 + proj_operation_factory_context_set_discard_superseded at Base 6.2.0 proj_operation_factory_context_set_grid_availability_use at Base 6.0.0 proj_operation_factory_context_set_spatial_criterion at Base 6.0.0 proj_operation_factory_context_set_use_proj_alternative_grid_names at Base 6.0.0 View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/compare/60589ae341d041a44cbd9df48e69d9123e4980da...dba97801681506ef86f9c20285df144455110d43 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/compare/60589ae341d041a44cbd9df48e69d9123e4980da...dba97801681506ef86f9c20285df144455110d43 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 08:55:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 07:55:27 +0000 Subject: [Git][debian-gis-team/proj] Pushed new tag debian/6.2.0-1_exp1 Message-ID: <5d6b796fb36f2_577b2ade5d84e91c20215@godard.mail> Bas Couwenberg pushed new tag debian/6.2.0-1_exp1 at Debian GIS Project / proj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/tree/debian/6.2.0-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 1 08:55:28 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 01 Sep 2019 07:55:28 +0000 Subject: [Git][debian-gis-team/proj] Pushed new tag upstream/6.2.0 Message-ID: <5d6b7970c5afc_577b2ade5d5c1a8c202310@godard.mail> Bas Couwenberg pushed new tag upstream/6.2.0 at Debian GIS Project / proj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/tree/upstream/6.2.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sun Sep 1 09:06:23 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 08:06:23 +0000 Subject: Processing of proj_6.2.0-1~exp1_source.changes Message-ID: proj_6.2.0-1~exp1_source.changes uploaded successfully to localhost along with the files: proj_6.2.0-1~exp1.dsc proj_6.2.0.orig.tar.gz proj_6.2.0-1~exp1.debian.tar.xz proj_6.2.0-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 1 09:44:09 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 08:44:09 +0000 Subject: proj_6.2.0-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 01 Sep 2019 09:38:16 +0200 Source: proj Architecture: source Version: 6.2.0-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: proj (6.2.0-1~exp1) experimental; urgency=medium . * New upstream release. * Don't remove data/null on clean, included upstream too. * Update symbols for other architectures. * Strip pre-releases from symbols version. Checksums-Sha1: 172f7c37ac8f852558f4ba3f3b29e6cb5652cd60 2203 proj_6.2.0-1~exp1.dsc 579174affcdb5c7abdc5830630a36cc43a00fa02 2685319 proj_6.2.0.orig.tar.gz 62c6e8ed0b5a9b2d0b4114fe3cb1f36780d6d923 10209584 proj_6.2.0-1~exp1.debian.tar.xz 5872163f248dffda94962123720b43ffe21ea3fd 8210 proj_6.2.0-1~exp1_amd64.buildinfo Checksums-Sha256: e26279df1f92cd9aa953ec9e4bf273e2a26db78cefa3a486b081f791744a8b9f 2203 proj_6.2.0-1~exp1.dsc b300c0f872f632ad7f8eb60725edbf14f0f8f52db740a3ab23e7b94f1cd22a50 2685319 proj_6.2.0.orig.tar.gz 8c23bf17ec21ca09d78f166c1e8d7f1fba0d763ef1fbf189dd9772c755a51f32 10209584 proj_6.2.0-1~exp1.debian.tar.xz f081dd5d4cfab03bd737c67efe0c68d57a43f37ecffd86bf8a970d1eadc8e74f 8210 proj_6.2.0-1~exp1_amd64.buildinfo Files: de6c77ecc826ed6cb10889f9914d6616 2203 science optional proj_6.2.0-1~exp1.dsc 5cde556545828beaffbe50b1bb038480 2685319 science optional proj_6.2.0.orig.tar.gz 023ffddae05d43dd1e2590b38e01aab1 10209584 science optional proj_6.2.0-1~exp1.debian.tar.xz c2e13803fdd5f4b1cc70d14afab07b77 8210 science optional proj_6.2.0-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1reVEACgkQZ1DxCuiN SvGaOw/7BOajTMbFq+XFD/Z0DOER6vF5RK2WFoh5m4PrAiYaY4EEiqFFUbYe9x7O 9FTob+CCdMZrgsB/bBmn2OBwQEx/VHnwfeQx4SBIMOBzIEsdb+CJS5QeQA80xnzo 5NO95eC5VGx6lGzBDDSKnuG5FJSOrPyAsmy3/XnmK68ZS+XQQvQGdt4FxDiO5Vw3 gd9K0JNzXZVhkHtKiPNDdvNYDmvqkWST5OJm7MxtJyLAuNxBljv3rjggFWDbZJX7 4C1PgAT5+c42mpxE1lEAJ5FXt1/yJBggtOR3LpYLShGaC8WeABRsuOVX+Q3bazKb v5riZZ6LYg7qDWLzFS2Tuynqsm6RTcy2Bab1SgnI3jCaJYOPPaqAle3Cq8CdkXHQ EdXyAKTWx0sd6TvpQAqpnMOyI+//pFQDzWLMiCG5a2dTBxW2z4P0XxfE+Y+8ZBrl X+t2+kraL5PkOLhtXdyg1BiTs406HzciN5d2iZgTQeWFNoxv6U1Gy1VwcxfMtzJq 6yv+673FhSywNYUPCtGDYiFewLyH83OmQam42apHQbh4Gjd5AMyh4x9cmkidV7pX 3mdvsmDiI68N6qLJ6ndFoK354D36L6V4bCMoc/5xqpxPDbP+hwyNRb83TzYZaCV1 b/pR33QpPAkmYQicGM3idfNHS9VaQqpGBlCgnAr3Rdbqf/bNdas= =D2WO -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From ftpmaster at ftp-master.debian.org Sun Sep 1 12:17:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 11:17:25 +0000 Subject: Processing of mapproxy_1.11.0-3+deb10u1_amd64.changes Message-ID: mapproxy_1.11.0-3+deb10u1_amd64.changes uploaded successfully to localhost along with the files: mapproxy_1.11.0-3+deb10u1.dsc mapproxy_1.11.0-3+deb10u1.debian.tar.xz mapproxy-doc_1.11.0-3+deb10u1_all.deb mapproxy_1.11.0-3+deb10u1_all.deb mapproxy_1.11.0-3+deb10u1_amd64.buildinfo python-mapproxy_1.11.0-3+deb10u1_all.deb python3-mapproxy_1.11.0-3+deb10u1_all.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 1 12:21:43 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 11:21:43 +0000 Subject: mapproxy_1.11.0-3+deb10u1_amd64.changes ACCEPTED into proposed-updates->stable-new Message-ID: Mapping buster to stable. Mapping stable to proposed-updates. Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 28 Aug 2019 14:08:20 +0200 Source: mapproxy Binary: mapproxy mapproxy-doc python-mapproxy python3-mapproxy Architecture: source all Version: 1.11.0-3+deb10u1 Distribution: buster Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: mapproxy - open source proxy for geospatial data mapproxy-doc - open source proxy for geospatial data - documentation python-mapproxy - open source proxy for geospatial data - Python 2 module python3-mapproxy - open source proxy for geospatial data - Python 3 module Closes: 935887 Changes: mapproxy (1.11.0-3+deb10u1) buster; urgency=medium . * Update branch in gbp.conf & Vcs-Git URL. * Add upstream patch to fix WMS Capabilties with Python 3.7. 
(closes: #935887) Checksums-Sha1: ce28bf636549207c256f352b692d850b56935fa3 2639 mapproxy_1.11.0-3+deb10u1.dsc e3395f8b81c2b6899e44221df3a7f9157bed1467 27324 mapproxy_1.11.0-3+deb10u1.debian.tar.xz 1bb92a5491635c5f2ffa68c67b2cda2fd9338177 774564 mapproxy-doc_1.11.0-3+deb10u1_all.deb 4292a907f636dd1f8e713826b3f9a4d349773992 26548 mapproxy_1.11.0-3+deb10u1_all.deb b049b541dfbc33ae9e9c6bb4b3bf4b12ad07691f 16430 mapproxy_1.11.0-3+deb10u1_amd64.buildinfo 68e183be442b1d173c811dabfb5f1c5cbddc47a0 450560 python-mapproxy_1.11.0-3+deb10u1_all.deb d3e700f62299f89b1320cacbb99003a5177a7274 450716 python3-mapproxy_1.11.0-3+deb10u1_all.deb Checksums-Sha256: 35f2f9a5ac5563fb86891f1cc2fc0b5c47ed0ef6a871733e8be644937a0c1a2f 2639 mapproxy_1.11.0-3+deb10u1.dsc 200292b51c6accb180f5161a071e8e7d643f222cd360e8415f141920575bc71c 27324 mapproxy_1.11.0-3+deb10u1.debian.tar.xz 121e264e0903ab2c2aef225070c0d559b0b89773d5311c2706981404a87e4abc 774564 mapproxy-doc_1.11.0-3+deb10u1_all.deb ca8bee10ee14794bc4b8f4ce64c13891c052433f2d3f2d7c77a25a85941036a6 26548 mapproxy_1.11.0-3+deb10u1_all.deb 97ee1f261a446dfcc26c52021bfb2165ac176b675a64310297c96d2cc8c4aea0 16430 mapproxy_1.11.0-3+deb10u1_amd64.buildinfo 6ef02768e7dd2bb0ef5d1fd822bc47bcde89685d972e06f7a04711f671980301 450560 python-mapproxy_1.11.0-3+deb10u1_all.deb 91b99dcf82a6aa6f44701f1f068a4146b6810deecec08bfeb8c9a8f2c700f47c 450716 python3-mapproxy_1.11.0-3+deb10u1_all.deb Files: 240d241bae9be7620d50707697449690 2639 python optional mapproxy_1.11.0-3+deb10u1.dsc f2693d9e3c0e68a5a93c63483b5d15bb 27324 python optional mapproxy_1.11.0-3+deb10u1.debian.tar.xz eff06c8bdc3beea14dbd13508687c05d 774564 doc optional mapproxy-doc_1.11.0-3+deb10u1_all.deb 6b2a0817904c32fff600511d1a7a2ebd 26548 web optional mapproxy_1.11.0-3+deb10u1_all.deb b0ffd638b995518b8ffc80e52064716b 16430 python optional mapproxy_1.11.0-3+deb10u1_amd64.buildinfo 6d080ec7566035c5ae3c385c0c89da80 450560 python optional python-mapproxy_1.11.0-3+deb10u1_all.deb 8ad67502244eca82c9a69aba50bc95c5 450716 python optional python3-mapproxy_1.11.0-3+deb10u1_all.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1rpnMACgkQZ1DxCuiN SvEx0g/+JhhJEknHAthy5cJxr0an4UlgKDf449CgHN7fn656JQe2QoMG3Bcbupqg 0VbZWf+5TPsb5iV6m2Ni7KYZY92lifweW5RwQPGQRraTgJywW1WdG/7YiER3bQlN 9/TuhFonWlsTO+oUBmh6Yd4+QswqZB0G4uQzlkZBvj0v7FudfHniWHkJiKlVfSQL 3uhrfVKBWByMZRWZyrnp6SrNJB0t50ggiF14jey6pY2ZApeB8Aoml37y04B0/STt iRk+lHVU8hCe6Tv66StMU4WIVgUkacUFAOmKcJkyj8e3yW3aDebtsPDeFY2ryK6O wA8AXKdDlsN64j4IlLJpROLndN/Wb5S3mklpFZMJxE/f5tPUWegc81RN8ft1k4i1 hkNPTtv+oUBbekCS3vAxW153TzXQAz38oDd/xYP+0EPThiiyAtY1ZL8fXZ5MiK8I E6rysiyX8xFcdXAimIjISHnXF4KbPoDe0HroR61nEBBigxNBKHiZec+9aROQap9A 0HChFP2oN5sNuAXPd1HTbNA2YGoG/0oNg1E9DL5XMHUOoMda5+5HX1g9lVYMr8zH SD4rY5mZ6fTKSko/AcRyUrpVxv2mX8eaa5HNdnwB6wmmQLUkvcqdyBpaeb4o4Qm2 MTv/217ecIqoss+8wmRGMfRbMUu1nU49024beVQ7BP5ToD02Vjo= =sEdm -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From sebastic at xs4all.nl Sun Sep 1 14:05:26 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sun, 1 Sep 2019 15:05:26 +0200 Subject: Bug#939022: [#939022] Re: pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) In-Reply-To: References: <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <23ab9a59-f21b-0099-7d7b-4e2a60f47512@tiscali.it> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: On 8/31/19 7:14 PM, Sebastiaan Couwenberg wrote: >> I'm going to reassign. > > That doesn't seem appropriate, pyresample needs to be updated too. It > does things like this: > > pyresample/test/test_geometry.py: > projections = {'+init=epsg:3006': 'init: epsg:3006'} > > Note the explicit use of init files, this is not going to work correctly > with PROJ 6. A short-term workaround may be to skip these tests when /usr/share/proj/epsg doesn't exist (or when /usr/share/proj/proj.db does exist). Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From owner at bugs.debian.org Sun Sep 1 14:21:09 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 01 Sep 2019 13:21:09 +0000 Subject: Processed: reassign 939022 to src:pyresample, found 939022 in pyresample/1.12.3-5, affects 939022 References: <1567344010-2316-bts-sebastic@debian.org> Message-ID: Processing commands for control at bugs.debian.org: > reassign 939022 src:pyresample Bug #939022 [pyresample] pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Bug reassigned from package 'pyresample' to 'src:pyresample'. No longer marked as found in versions pyresample/1.12.3-5. Ignoring request to alter fixed versions of bug #939022 to the same values previously set > found 939022 pyresample/1.12.3-5 Bug #939022 [src:pyresample] pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Marked as found in versions pyresample/1.12.3-5. > affects 939022 - pyresample Bug #939022 [src:pyresample] pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Removed indication that 939022 affects pyresample > thanks Stopping processing here. Please contact me if you need assistance. -- 939022: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939022 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From ftpmaster at ftp-master.debian.org Sun Sep 1 14:53:06 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 01 Sep 2019 13:53:06 +0000 Subject: mapproxy_1.11.0-3+deb10u1_amd64.changes ACCEPTED into proposed-updates->stable-new, proposed-updates Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 28 Aug 2019 14:08:20 +0200 Source: mapproxy Binary: mapproxy mapproxy-doc python-mapproxy python3-mapproxy Architecture: source all Version: 1.11.0-3+deb10u1 Distribution: buster Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: mapproxy - open source proxy for geospatial data mapproxy-doc - open source proxy for geospatial data - documentation python-mapproxy - open source proxy for geospatial data - Python 2 module python3-mapproxy - open source proxy for geospatial data - Python 3 module Closes: 935887 Changes: mapproxy (1.11.0-3+deb10u1) buster; urgency=medium . * Update branch in gbp.conf & Vcs-Git URL.
* Add upstream patch to fix WMS Capabilties with Python 3.7. (closes: #935887) Checksums-Sha1: ce28bf636549207c256f352b692d850b56935fa3 2639 mapproxy_1.11.0-3+deb10u1.dsc e3395f8b81c2b6899e44221df3a7f9157bed1467 27324 mapproxy_1.11.0-3+deb10u1.debian.tar.xz 1bb92a5491635c5f2ffa68c67b2cda2fd9338177 774564 mapproxy-doc_1.11.0-3+deb10u1_all.deb 4292a907f636dd1f8e713826b3f9a4d349773992 26548 mapproxy_1.11.0-3+deb10u1_all.deb b049b541dfbc33ae9e9c6bb4b3bf4b12ad07691f 16430 mapproxy_1.11.0-3+deb10u1_amd64.buildinfo 68e183be442b1d173c811dabfb5f1c5cbddc47a0 450560 python-mapproxy_1.11.0-3+deb10u1_all.deb d3e700f62299f89b1320cacbb99003a5177a7274 450716 python3-mapproxy_1.11.0-3+deb10u1_all.deb Checksums-Sha256: 35f2f9a5ac5563fb86891f1cc2fc0b5c47ed0ef6a871733e8be644937a0c1a2f 2639 mapproxy_1.11.0-3+deb10u1.dsc 200292b51c6accb180f5161a071e8e7d643f222cd360e8415f141920575bc71c 27324 mapproxy_1.11.0-3+deb10u1.debian.tar.xz 121e264e0903ab2c2aef225070c0d559b0b89773d5311c2706981404a87e4abc 774564 mapproxy-doc_1.11.0-3+deb10u1_all.deb ca8bee10ee14794bc4b8f4ce64c13891c052433f2d3f2d7c77a25a85941036a6 26548 mapproxy_1.11.0-3+deb10u1_all.deb 97ee1f261a446dfcc26c52021bfb2165ac176b675a64310297c96d2cc8c4aea0 16430 mapproxy_1.11.0-3+deb10u1_amd64.buildinfo 6ef02768e7dd2bb0ef5d1fd822bc47bcde89685d972e06f7a04711f671980301 450560 python-mapproxy_1.11.0-3+deb10u1_all.deb 91b99dcf82a6aa6f44701f1f068a4146b6810deecec08bfeb8c9a8f2c700f47c 450716 python3-mapproxy_1.11.0-3+deb10u1_all.deb Files: 240d241bae9be7620d50707697449690 2639 python optional mapproxy_1.11.0-3+deb10u1.dsc f2693d9e3c0e68a5a93c63483b5d15bb 27324 python optional mapproxy_1.11.0-3+deb10u1.debian.tar.xz eff06c8bdc3beea14dbd13508687c05d 774564 doc optional mapproxy-doc_1.11.0-3+deb10u1_all.deb 6b2a0817904c32fff600511d1a7a2ebd 26548 web optional mapproxy_1.11.0-3+deb10u1_all.deb b0ffd638b995518b8ffc80e52064716b 16430 python optional mapproxy_1.11.0-3+deb10u1_amd64.buildinfo 6d080ec7566035c5ae3c385c0c89da80 450560 python optional python-mapproxy_1.11.0-3+deb10u1_all.deb 8ad67502244eca82c9a69aba50bc95c5 450716 python optional python3-mapproxy_1.11.0-3+deb10u1_all.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1rpnMACgkQZ1DxCuiN SvEx0g/+JhhJEknHAthy5cJxr0an4UlgKDf449CgHN7fn656JQe2QoMG3Bcbupqg 0VbZWf+5TPsb5iV6m2Ni7KYZY92lifweW5RwQPGQRraTgJywW1WdG/7YiER3bQlN 9/TuhFonWlsTO+oUBmh6Yd4+QswqZB0G4uQzlkZBvj0v7FudfHniWHkJiKlVfSQL 3uhrfVKBWByMZRWZyrnp6SrNJB0t50ggiF14jey6pY2ZApeB8Aoml37y04B0/STt iRk+lHVU8hCe6Tv66StMU4WIVgUkacUFAOmKcJkyj8e3yW3aDebtsPDeFY2ryK6O wA8AXKdDlsN64j4IlLJpROLndN/Wb5S3mklpFZMJxE/f5tPUWegc81RN8ft1k4i1 hkNPTtv+oUBbekCS3vAxW153TzXQAz38oDd/xYP+0EPThiiyAtY1ZL8fXZ5MiK8I E6rysiyX8xFcdXAimIjISHnXF4KbPoDe0HroR61nEBBigxNBKHiZec+9aROQap9A 0HChFP2oN5sNuAXPd1HTbNA2YGoG/0oNg1E9DL5XMHUOoMda5+5HX1g9lVYMr8zH SD4rY5mZ6fTKSko/AcRyUrpVxv2mX8eaa5HNdnwB6wmmQLUkvcqdyBpaeb4o4Qm2 MTv/217ecIqoss+8wmRGMfRbMUu1nU49024beVQ7BP5ToD02Vjo= =sEdm -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
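A minimal sketch of the short-term workaround suggested for Bug#939022 above (skip the '+init=epsg:XXXX' based tests when the legacy PROJ 'epsg' init file is absent, i.e. on PROJ 6 systems that only ship proj.db). The module, class and test names below are illustrative assumptions, not the actual pyresample test suite:

    import os
    import unittest

    # Assumption: PROJ resource files live in /usr/share/proj unless PROJ_LIB says otherwise.
    PROJ_DATA_DIR = os.environ.get("PROJ_LIB", "/usr/share/proj")
    HAS_LEGACY_EPSG_FILE = os.path.exists(os.path.join(PROJ_DATA_DIR, "epsg"))

    @unittest.skipUnless(HAS_LEGACY_EPSG_FILE,
                         "legacy PROJ 'epsg' init file not available (PROJ 6)")
    class TestLegacyInitStrings(unittest.TestCase):
        def test_init_string(self):
            from pyproj import Proj
            # Old-style init string; only meaningful where the epsg file exists.
            p = Proj(init="epsg:3006")
            x, y = p(16.0, 58.0)  # arbitrary lon/lat, projected to SWEREF99 TM
            self.assertNotEqual((x, y), (16.0, 58.0))

    if __name__ == "__main__":
        unittest.main()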
From owner at bugs.debian.org Sun Sep 1 14:57:25 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 01 Sep 2019 13:57:25 +0000 Subject: Bug#935887: marked as done (python3-mapproxy: Change default template_dir and fix capabilities for Python >= 3.6.7) References: <156689956231.26333.3470700706602159540.reportbug@czpzlvls0006.device.lukacs.cz> Message-ID: Your message dated Sun, 01 Sep 2019 13:53:06 +0000 with message-id and subject line Bug#935887: fixed in mapproxy 1.11.0-3+deb10u1 has caused the Debian Bug report #935887, regarding python3-mapproxy: Change default template_dir and fix capabilities for Python >= 3.6.7 to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 935887: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935887 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Marek Lukács Subject: python3-mapproxy: Change default template_dir and fix capabilities for Python >= 3.6.7 Date: Tue, 27 Aug 2019 11:52:42 +0200 Size: 13217 URL: -------------- next part -------------- An embedded message was scrubbed... From: Bas Couwenberg Subject: Bug#935887: fixed in mapproxy 1.11.0-3+deb10u1 Date: Sun, 01 Sep 2019 13:53:06 +0000 Size: 6780 URL: From antonio.valentino at tiscali.it Sun Sep 1 16:48:01 2019 From: antonio.valentino at tiscali.it (Antonio Valentino) Date: Sun, 1 Sep 2019 17:48:01 +0200 Subject: Bug#939022: [#939022] Re: pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) In-Reply-To: References: <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <23ab9a59-f21b-0099-7d7b-4e2a60f47512@tiscali.it> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: Hi Sebastiaan, I have a workaround for the basemap issue but there are still a couple of tests failing. cheers antonio On 01/09/19 15:05, Sebastiaan Couwenberg wrote: > On 8/31/19 7:14 PM, Sebastiaan Couwenberg wrote: >>> I'm going to reassign. >> >> That doesn't seem appropriate, pyresample needs to be updated too. It >> does things like this: >> >> pyresample/test/test_geometry.py: >> projections = {'+init=epsg:3006': 'init: epsg:3006'} >> >> Note the explicit use of init files, this is not going to work correctly >> with PROJ 6. > > A short-term workaround may be to skip these tests when > /usr/share/proj/epsg doesn't exist (or when /usr/share/proj/proj.db does > exist).
> > Kind Regards, > > Bas > -- Antonio Valentino From gitlab at salsa.debian.org Sun Sep 1 16:56:06 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 01 Sep 2019 15:56:06 +0000 Subject: [Git][debian-gis-team/pyresample][master] Workaround for broken basemap Message-ID: <5d6bea16a83a3_577b3f91d8630ee4247398@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyresample Commits: 55eab16e by Antonio Valentino at 2019-09-01T15:55:42Z Workaround for broken basemap - - - - - 3 changed files: - debian/changelog - + debian/patches/0004-Detect-broken-basemap.patch - debian/patches/series Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,10 @@ pyresample (1.12.3-6) UNRELEASED; urgency=medium * Use debhelper-compat instead of debian/compat. + * debian/patches: + - new 0004-Detect-broken-basemap.patch: workaround for failures + of basemap imports due to incompatibility with new pyproj + versions -- Antonio Valentino Wed, 21 Aug 2019 19:35:41 +0000 ===================================== debian/patches/0004-Detect-broken-basemap.patch ===================================== @@ -0,0 +1,23 @@ +From: Antonio Valentino +Date: Sun, 1 Sep 2019 06:37:12 +0000 +Subject: Detect broken basemap + +https://github.com/matplotlib/basemap/pull/454 +https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939022 +--- + pyresample/test/test_plot.py | 2 +- + 1 file changed, 1 insertion(+), 1 deletion(-) + +diff --git a/pyresample/test/test_plot.py b/pyresample/test/test_plot.py +index 0ea4fa3..bc446e3 100644 +--- a/pyresample/test/test_plot.py ++++ b/pyresample/test/test_plot.py +@@ -29,7 +29,7 @@ except ImportError: + + try: + from mpl_toolkits.basemap import Basemap +-except ImportError: ++except (ImportError, AttributeError): + Basemap = None + + ===================================== debian/patches/series ===================================== @@ -1,3 +1,4 @@ 0001-fix-proj4-initialization.patch 0002-Skip-dask-related-tests-if-dask-is-not-available.patch 0003-Make-xarray-optional-for-testing.patch +0004-Detect-broken-basemap.patch View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/55eab16e711ff56bac48aee6920840d8ff6e86f0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/55eab16e711ff56bac48aee6920840d8ff6e86f0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From antonio.valentino at tiscali.it Sun Sep 1 18:44:35 2019 From: antonio.valentino at tiscali.it (Antonio Valentino) Date: Sun, 1 Sep 2019 19:44:35 +0200 Subject: Bug#939022: [#939022] Re: pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) In-Reply-To: References: <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <23ab9a59-f21b-0099-7d7b-4e2a60f47512@tiscali.it> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: Hi Sebastiaan, after the workaround committed in [1], remaining issues are: ====================================================================== ERROR: test_create_area_def (pyresample.test.test_geometry.TestStackedAreaDefinition) Test create_area_def and the four sub-methods that call it in AreaDefinition. 
---------------------------------------------------------------------- Traceback (most recent call last): File "/home/antonio/debian/git/pyresample/pyresample/test/test_geometry.py", line 1657, in test_create_area_def radius=essentials[1], description=description, units=units, rotation=45)) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 444, in create_area_def center = _convert_units(center, 'center', units, p, proj_dict) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 577, in _convert_units var = _round_poles(var, units, p) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 517, in _round_poles center = p(*center, inverse=True, errcheck=True) File "/home/antonio/debian/git/pyresample/pyresample/_spatial_mp.py", line 157, in __call__ radians=radians, errcheck=errcheck) File "/usr/lib/python3/dist-packages/pyproj/proj.py", line 183, in __call__ self._inv(inx, iny, errcheck=errcheck) File "pyproj/_proj.pyx", line 165, in pyproj._proj.Proj._inv pyproj.exceptions.ProjError: generic error of unknown origin: (Internal Proj Error: proj_crs_get_sub_crs: Object is not a CompoundCRS) ====================================================================== ERROR: test_area_parser_yaml (pyresample.test.test_utils.TestYAMLAreaParser) Test YAML area parser. ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/antonio/debian/git/pyresample/pyresample/test/test_utils.py", line 96, in test_area_parser_yaml 'test_latlong') File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 104, in parse_area_file return _parse_yaml_area_file(area_file_name, *regions) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 157, in _parse_yaml_area_file res.append(create_area_def(**params)) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 444, in create_area_def center = _convert_units(center, 'center', units, p, proj_dict) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 577, in _convert_units var = _round_poles(var, units, p) File "/home/antonio/debian/git/pyresample/pyresample/area_config.py", line 517, in _round_poles center = p(*center, inverse=True, errcheck=True) File "/home/antonio/debian/git/pyresample/pyresample/_spatial_mp.py", line 157, in __call__ radians=radians, errcheck=errcheck) File "/usr/lib/python3/dist-packages/pyproj/proj.py", line 183, in __call__ self._inv(inx, iny, errcheck=errcheck) File "pyproj/_proj.pyx", line 165, in pyproj._proj.Proj._inv pyproj.exceptions.ProjError: generic error of unknown origin: (Internal Proj Error: proj_crs_get_sub_crs: Object is not a CompoundCRS) It seems that the problem can be fixed by upgrading pyproj to v2.3.1 (see [2]). I made a quick test with python3-pyproj 2.3.1+ds-1~exp1 (available in experimental) and I can confirm that all tests pass now. I'm going to add a versioned dependency in pyresample but the upload can only happen once pyproj 2.3.1 is available in unstable. There still seem to be some warnings regarding proj initialization, but as far as I can understand there is some activity upstream on this front, so I think all details will be fully fixed in the next upstream version of pyresample.
[1] https://salsa.debian.org/debian-gis-team/pyresample/commit/55eab16e711ff56bac48aee6920840d8ff6e86f0 [2] https://github.com/pyproj4/pyproj/issues/413 kind regards antonio On Sun, 1 Sep 2019 17:48:01 +0200 Antonio Valentino wrote: > Hi Sebastiaan, > I have a workaround for the basemap issue but there are still a couple > of tests failing. > > cheers > antonio > > Il 01/09/19 15:05, Sebastiaan Couwenberg ha scritto: > > On 8/31/19 7:14 PM, Sebastiaan Couwenberg wrote: > >>> I'm going to reassign. > >> > >> That doesn't seem appropriate, pyresample needs to be updated too. It > >> does things like this: > >> > >> pyresample/test/test_geometry.py: > >> projections = {'+init=epsg:3006': 'init: epsg:3006'} > >> > >> Note the explicit use of init files, this is not going to work correctly > >> with PROJ 6. > > > > As sort term workaround may be to skip these tests when > > /usr/share/proj/epsg doesn't exist (or when /usr/share/proj/proj.db does > > exist). > > > > Kind Regards, > > > > Bas > > > > -- > Antonio Valentino > > -- Antonio Valentino From sebastic at xs4all.nl Sun Sep 1 18:54:59 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sun, 1 Sep 2019 19:54:59 +0200 Subject: Bug#939022: [#939022] Re: pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) In-Reply-To: References: <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <23ab9a59-f21b-0099-7d7b-4e2a60f47512@tiscali.it> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: On 9/1/19 7:44 PM, Antonio Valentino wrote: > It seems that the problem can be fixed by upgrading pyproj to v2.3.1 > (see [2]). > I made a quick test with python3-pyproj 2.3.1+ds-1~exp1 (available in > experimental) and I can confirm that all tests pass now. > > I'm going to add a versioned dependency in pyresample but the upload can > only happen once pyproj 2.3.1 is available in unstable. As that was only uploaded today, I'm waiting for more builds to succeed. Since the autopkgtest failures are likely going to block testing migration of proj and the rdeps rebuilt for the transition. I'm considering move proj 6.2.0 & pyproj 2.3.1 to unstable in the next few days, but I'm first going to wait for the aging of the packages uploaded during the transition. If the migration doesn't happen then, I'll upload to unstable. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From gitlab at salsa.debian.org Sun Sep 1 19:15:13 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 01 Sep 2019 18:15:13 +0000 Subject: [Git][debian-gis-team/pyresample][master] 4 commits: Close #939022 Message-ID: <5d6c0ab16e4ee_577b2ade6143c0c826008d@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyresample Commits: f5fe6753 by Antonio Valentino at 2019-09-01T17:33:17Z Close #939022 - - - - - 805a23be by Antonio Valentino at 2019-09-01T17:48:32Z Depend on pyproj <= 2.3.1 - - - - - abf47ce6 by Antonio Valentino at 2019-09-01T18:04:09Z Drop dependency on mock - - - - - c3b6dea1 by Antonio Valentino at 2019-09-01T18:04:16Z Remove obsolete fields Name from debian/upstream/metadata. 
- - - - - 4 changed files: - debian/changelog - debian/control - − debian/python-pyresample-test.lintian-overrides - debian/upstream/metadata Changes: ===================================== debian/changelog ===================================== @@ -4,7 +4,11 @@ pyresample (1.12.3-6) UNRELEASED; urgency=medium * debian/patches: - new 0004-Detect-broken-basemap.patch: workaround for failures of basemap imports due to incompatibility with new pyproj - versions + versions (Closes: #939022) + * debian/control: + - depend on pyproj >= 2.3.1 + - drop dependency on python3-mock (not really necessary with Python 3) + * Remove obsolete fields Name from debian/upstream/metadata. -- Antonio Valentino Wed, 21 Aug 2019 19:35:41 +0000 ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: cython3, python3-numpy, python3-pil, python3-pykdtree (>= 1.3.1), - python3-pyproj, + python3-pyproj (>= 2.3.1), python3-rasterio, python3-scipy, python3-setuptools, @@ -33,7 +33,7 @@ Architecture: any Depends: python-pyresample-test, python3-configobj, python3-numpy, - python3-pyproj, + python3-pyproj (>= 2.3.1), python3-pykdtree (>= 1.3.1), python3-rasterio, python3-six, @@ -83,8 +83,7 @@ Description: Resampling of remote sensing data in Python (documentation) Package: python-pyresample-test Architecture: all Multi-Arch: foreign -Depends: python3-mock, - ${misc:Depends} +Depends: ${misc:Depends} Description: Resampling of remote sensing data in Python (test suite) Pyresample is a Python package for resampling (reprojection) of earth observing satellite data. It handles both resampling of gridded data ===================================== debian/python-pyresample-test.lintian-overrides deleted ===================================== @@ -1,3 +0,0 @@ -# the package only contains common test data and both packages for python2 -# and python3 depend on it -python-package-depends-on-package-from-other-python-variant Depends: python3-* ===================================== debian/upstream/metadata ===================================== @@ -1,6 +1,4 @@ ---- Bug-Database: https://github.com/pytroll/pyresample/issues Bug-Submit: https://github.com/pytroll/pyresample/issues/new -Name: Pyresample Repository: https://github.com/pytroll/pyresample.git Repository-Browse: https://github.com/pytroll/pyresample View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/compare/55eab16e711ff56bac48aee6920840d8ff6e86f0...c3b6dea16f5ceadc90dfda5bed7178dfb740da61 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/compare/55eab16e711ff56bac48aee6920840d8ff6e86f0...c3b6dea16f5ceadc90dfda5bed7178dfb740da61 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From owner at bugs.debian.org Sun Sep 1 19:21:05 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 01 Sep 2019 18:21:05 +0000 Subject: Processed: Mark #939022 as pending References: <59654065-53e9-06b0-1e9e-fd58ea48d6e5@tiscali.it> Message-ID: Processing commands for control at bugs.debian.org: > tags 939022 + pending Bug #939022 [src:pyresample] pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Added tag(s) pending. > thanks Stopping processing here. Please contact me if you need assistance. 
-- 939022: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939022 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From noreply at release.debian.org Mon Sep 2 05:39:18 2019 From: noreply at release.debian.org (Debian testing watch) Date: Mon, 02 Sep 2019 04:39:18 +0000 Subject: mapproxy 1.11.1-2 MIGRATED to testing Message-ID: FYI: The status of the mapproxy source package in Debian's testing distribution has changed. Previous version: 1.11.1-1 Current version: 1.11.1-2 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Mon Sep 2 06:07:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 05:07:41 +0000 Subject: [Git][debian-gis-team/mapproxy][master] Move from experimental to unstable. Message-ID: <5d6ca39d5f2e8_577b2ade612253702912c2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapproxy Commits: 4cc82917 by Bas Couwenberg at 2019-09-02T04:45:17Z Move from experimental to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mapproxy (1.12.0-1) unstable; urgency=medium + + * Move from experimental to unstable. + + -- Bas Couwenberg Mon, 02 Sep 2019 06:44:53 +0200 + mapproxy (1.12.0-1~exp1) experimental; urgency=medium * New upstream release. View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/commit/4cc82917e12d7f0045e5a99b27e3b0e2e9d0d6d5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/commit/4cc82917e12d7f0045e5a99b27e3b0e2e9d0d6d5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 06:07:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 05:07:46 +0000 Subject: [Git][debian-gis-team/mapproxy] Pushed new tag debian/1.12.0-1 Message-ID: <5d6ca3a2255e5_577b2ade612253702914d2@godard.mail> Bas Couwenberg pushed new tag debian/1.12.0-1 at Debian GIS Project / mapproxy -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/tree/debian/1.12.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Mon Sep 2 06:19:37 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 05:19:37 +0000 Subject: Processing of mapproxy_1.12.0-1_source.changes Message-ID: mapproxy_1.12.0-1_source.changes uploaded successfully to localhost along with the files: mapproxy_1.12.0-1.dsc mapproxy_1.12.0-1.debian.tar.xz mapproxy_1.12.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 2 07:04:24 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 06:04:24 +0000 Subject: mapproxy_1.12.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 02 Sep 2019 06:44:53 +0200 Source: mapproxy Architecture: source Version: 1.12.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: mapproxy (1.12.0-1) unstable; urgency=medium . * Move from experimental to unstable. Checksums-Sha1: 750ba9b138d1d6cc337970321c721226d19a2498 2323 mapproxy_1.12.0-1.dsc 85cf5ac8f5bc97f6e58759fe894e6967035b0057 19424 mapproxy_1.12.0-1.debian.tar.xz bfe8c5064d6008cce4930f52158ab142ecc847bd 15304 mapproxy_1.12.0-1_amd64.buildinfo Checksums-Sha256: 40b9a281946d9d09ee163c74056b98f900c90e580588bf426046f08792eea182 2323 mapproxy_1.12.0-1.dsc bd2fe675d50fe6c578b1fb2e7ac2c9ad0490d91c8c67902bc583843908465572 19424 mapproxy_1.12.0-1.debian.tar.xz ba3e2370d4c523b41cec8dac0179f35b0af16bfce69f503bda7c369b95fa3619 15304 mapproxy_1.12.0-1_amd64.buildinfo Files: ba29c1497d34ffa4d6d1828876fab820 2323 python optional mapproxy_1.12.0-1.dsc 578c0246a6e784876f378647f0c8a643 19424 python optional mapproxy_1.12.0-1.debian.tar.xz d6c54311623e66789005ba9a7e536235 15304 python optional mapproxy_1.12.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1so4oACgkQZ1DxCuiN SvHcAhAAgPN9bmzycPIs3xlZ72Zqle66+FHCqKGDsxwBen9Wr9/FVOms3hu3lwUY +2dpavwG7Zy2pS/Fy4Tb5snK5BIbvD7VIPa+mnTIwz/Wojc8VYjW7EEmz7jrpfpJ O1h/qhbfoFPsWgS7dVCnjPowz59AFrmDIwJcKi1Jrza2nM9R0zBNtRDNYrxJQGLL qt6dQ4C7W+prf4Z7b9ZLmvOk7UjU6BsHoAGu3E82lN4SgBubMR/Cap1FC8wW1oNd 48ifn3C+/q0qj8o8rkCeS4T110lVrr6Lua+hApaQqVmzfgk0ZO9nDKzP1FlcDCkq r50Kb3uKapRRBxMxs39Si/5F6yiZmjFuaV1GDn8qCDUEzX7y8BRv4WLoJwlmS0OJ RtBwS9/BSZvaZbivtAvfEgY35Wc3rvAeUzgnsF/rripJQV/1JxWV9z5Mf1iA4prH J8DOHxUEnlo4UJrLVHkpDxBe4HtBe/ak4l/5kTwKdVXcaEssy3wfCcmL4I3zDnbC VV2VPEu7GEbFBUtbUAM9hBMfFkZTFrjDnLYUybRix3aGKA12yZ9nLOhIZV8EV0gQ el9OnkUIp1K5vgvwnsbajcw4Xuu/i4BmxzDrtcfR334rbhq+2moqbHbLBUjq24la qQvR/4qyZztzeszc1unDqUAqjg8xUwEGedPV8/9dqWQN4CEgyBg= =j9FD -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From gitlab at salsa.debian.org Mon Sep 2 07:14:04 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 02 Sep 2019 06:14:04 +0000 Subject: [Git][debian-gis-team/pyspectral][pristine-tar] pristine-tar data for pyspectral_0.9.0+ds.orig.tar.xz Message-ID: <5d6cb32c80c67_577b2ade61225370292333@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / pyspectral Commits: f91d1273 by Antonio Valentino at 2019-09-02T05:46:17Z pristine-tar data for pyspectral_0.9.0+ds.orig.tar.xz - - - - - 2 changed files: - + pyspectral_0.9.0+ds.orig.tar.xz.delta - + pyspectral_0.9.0+ds.orig.tar.xz.id Changes: ===================================== pyspectral_0.9.0+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/pyspectral_0.9.0+ds.orig.tar.xz.delta differ ===================================== pyspectral_0.9.0+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +6c0062b9e878272a78123ca7d1933b11ac0f7e81 View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/commit/f91d12732f138afd53106a8fff54fd4589f6ada3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/commit/f91d12732f138afd53106a8fff54fd4589f6ada3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 07:14:15 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 02 Sep 2019 06:14:15 +0000 Subject: [Git][debian-gis-team/pyspectral][master] 4 commits: New upstream version 0.9.0+ds Message-ID: <5d6cb337172b0_577b3f91d054954c292495@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyspectral Commits: bdb76ea7 by Antonio Valentino at 2019-09-02T05:46:16Z New upstream version 0.9.0+ds - - - - - a3beaafe by Antonio Valentino at 2019-09-02T05:46:17Z Update upstream source from tag 'upstream/0.9.0+ds' Update to upstream version '0.9.0+ds' with Debian dir 7df66c97398a0f371925d08159b15eb1aef5c607 - - - - - 69437df5 by Antonio Valentino at 2019-09-02T05:46:53Z New upstream release - - - - - a8df3a1b by Antonio Valentino at 2019-09-02T06:04:39Z Set distribution to unstable - - - - - 19 changed files: - CHANGELOG.md - README.md - debian/changelog - doc/37_reflectance.rst - doc/rad_definitions.rst - pyspectral/blackbody.py - pyspectral/config.py - pyspectral/etc/pyspectral.yaml - pyspectral/rsr_reader.py - pyspectral/tests/test_atm_correction_ir.py - pyspectral/tests/test_blackbody.py - pyspectral/tests/test_rad_tb_conversions.py - pyspectral/tests/test_reflectance.py - pyspectral/tests/test_utils.py - pyspectral/utils.py - pyspectral/version.py - rsr_convert_scripts/README.rst - + rsr_convert_scripts/agri_rsr.py - + rsr_convert_scripts/virr_rsr.py Changes: ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,27 @@ +## Version (2019/08/30) + +### Issues Closed + +* [Issue 73](https://github.com/pytroll/pyspectral/issues/73) - Fix blackbody code to work with dask arrays ([PR 74](https://github.com/pytroll/pyspectral/pull/74)) + +In this release 1 issue was closed. 
+ +### Pull Requests Merged + +#### Bugs fixed + +* [PR 80](https://github.com/pytroll/pyspectral/pull/80) - Fix doc tests for python 2&3 +* [PR 79](https://github.com/pytroll/pyspectral/pull/79) - Fix rsr zenodo version +* [PR 74](https://github.com/pytroll/pyspectral/pull/74) - Fix dask compatibility in blackbody functions ([73](https://github.com/pytroll/pyspectral/issues/73)) + +#### Features added + +* [PR 78](https://github.com/pytroll/pyspectral/pull/78) - Add FY-3B VIRR and FY-3C VIRR RSRs +* [PR 77](https://github.com/pytroll/pyspectral/pull/77) - Add FY-4A AGRI support + +In this release 5 pull requests were closed. + + ## Version (2019/06/07) ### Issues Closed ===================================== README.md ===================================== @@ -1,14 +1,11 @@ PySpectral ========== -[![Codacy Badge](https://api.codacy.com/project/badge/Grade/9f039d7d640846ca89be8a78fa11e1f6)](https://www.codacy.com/app/adybbroe/pyspectral?utm_source=github.com&utm_medium=referral&utm_content=pytroll/pyspectral&utm_campaign=badger) [![Build Status](https://travis-ci.org/pytroll/pyspectral.png?branch=master)](https://travis-ci.org/pytroll/pyspectral) [![Build status](https://ci.appveyor.com/api/projects/status/5lm42n0l65l5o9xn?svg=true)](https://ci.appveyor.com/project/pytroll/pyspectral) [![Coverage Status](https://coveralls.io/repos/github/pytroll/pyspectral/badge.svg?branch=master)](https://coveralls.io/github/pytroll/pyspectral?branch=master) -[![Code Health](https://landscape.io/github/pytroll/pyspectral/master/landscape.png)](https://landscape.io/github/pytroll/pyspectral/master) [![PyPI version](https://badge.fury.io/py/pyspectral.svg)](https://badge.fury.io/py/pyspectral) [![Code Climate](https://codeclimate.com/github/pytroll/pyspectral/badges/gpa.svg)](https://codeclimate.com/github/pytroll/pyspectral) -[![Scrutinizer Code Quality](https://scrutinizer-ci.com/g/pytroll/pyspectral/badges/quality-score.png?b=master)](https://scrutinizer-ci.com/g/pytroll/pyspectral/?branch=master) Given a passive sensor on a meteorological satellite PySpectral provides the relative spectral response (rsr) function(s) and offer you some basic ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyspectral (0.9.0+ds-1) unstable; urgency=medium + + * New upstream release. + + -- Antonio Valentino Mon, 02 Sep 2019 06:04:23 +0000 + pyspectral (0.8.9+ds-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. 
===================================== doc/37_reflectance.rst ===================================== @@ -46,7 +46,7 @@ expressed in :math:`W/m^2 sr^{-1} \mu m^{-1}`, or using SI units :math:`W/m^2 sr >>> viirs = RadTbConverter('Suomi-NPP', 'viirs', 'M12') >>> rad37 = viirs.tb2radiance(tb37) >>> print([np.round(rad, 7) for rad in rad37['radiance']]) - [369717.4765726, 355110.5207853, 314684.2788726, 173143.5424898, 116408.0007877] + [369717.4972296, 355110.6414922, 314684.3507084, 173143.4836477, 116408.0022674] >>> rad37['unit'] 'W/m^2 sr^-1 m^-1' @@ -59,7 +59,7 @@ In order to get the total radiance over the band one has to multiply with the eq >>> viirs = RadTbConverter('Suomi-NPP', 'viirs', 'M12') >>> rad37 = viirs.tb2radiance(tb37, normalized=False) >>> print([np.round(rad, 8) for rad in rad37['radiance']]) - [0.07037968, 0.06759909, 0.05990352, 0.03295972, 0.02215951] + [0.07037968, 0.06759911, 0.05990353, 0.03295971, 0.02215951] >>> rad37['unit'] 'W/m^2 sr^-1' @@ -218,17 +218,17 @@ We can try decompose equation :eq:`refl37` above using the example of VIIRS M12 >>> print(np.isnan(nomin)) [False False False False False] >>> print([np.round(val, 8) for val in nomin]) - [0.05083677, 0.04805618, 0.0404157, 0.01279279, 0.00204485] + [0.05083677, 0.0480562, 0.04041571, 0.01279277, 0.00204485] >>> denom = np.cos(np.deg2rad(sunz))/np.pi * sflux - rad11['radiance'] >>> print(np.isnan(denom)) [False False False False False] >>> print([np.round(val, 8) for val in denom]) - [0.23646313, 0.23645682, 0.23650559, 0.23582015, 0.2358661] + [0.23646312, 0.23645681, 0.23650559, 0.23582014, 0.23586609] >>> res = nomin/denom >>> print(np.isnan(res)) [False False False False False] >>> print([np.round(val, 8) for val in res]) - [0.21498817, 0.2032345, 0.17088689, 0.05424807, 0.00866955] + [0.21498817, 0.20323458, 0.17088693, 0.05424801, 0.00866952] Derive the emissive part of the 3.7 micron band @@ -255,5 +255,5 @@ Using the example of the VIIRS M12 band from above this gives the following spec >>> ['{tb:6.3f}'.format(tb=np.round(t, 4)) for t in tb] ['266.996', '267.262', '267.991', '271.033', '271.927'] >>> rad = refl_m12.emissive_part_3x(tb=False) - >>> ['{rad:6.3f}'.format(rad=np.round(r, 3)) for r in rad] - ['80285.149', '81458.022', '84749.639', '99761.400', '104582.030'] + >>> ['{rad:6.3f}'.format(rad=np.round(r, 3)) for r in rad.compute()] + ['80285.150', '81458.022', '84749.638', '99761.401', '104582.031'] ===================================== doc/rad_definitions.rst ===================================== @@ -229,7 +229,7 @@ And using wavelength representation: >>> wvl = 1./wavenumber >>> rad = blackbody(wvl, [300., 301]) >>> print("{0:10.3f} {1:10.3f}".format(rad[0], rad[1])) - 9573178.886 9714689.259 + 9573177.494 9714687.157 Which are the spectral radiances in SI units around :math:`11 \mu m` at temperatures 300 and 301 Kelvin. In units of :math:`mW/m^2\ m^{-1} sr^{-1}` this becomes: ===================================== pyspectral/blackbody.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2013-2018 Adam.Dybbroe +# Copyright (c) 2013-2019 Adam.Dybbroe # Author(s): @@ -23,8 +23,13 @@ """Planck radiation equation""" import numpy as np - import logging + +try: + import dask.array as da +except ImportError: + da = np + LOG = logging.getLogger(__name__) H_PLANCK = 6.62606957 * 1e-34 # SI-unit = [J*s] @@ -84,38 +89,39 @@ def blackbody_wn_rad2temp(wavenumber, radiance): function. 
Wavenumber space""" if np.isscalar(radiance): - rad = np.array([radiance, ], dtype='float64') - else: - rad = np.array(radiance, dtype='float64') + radiance = np.array([radiance], dtype='float64') + elif isinstance(radiance, (list, tuple)): + radiance = np.array(radiance, dtype='float64') if np.isscalar(wavenumber): - wavnum = np.array([wavenumber, ], dtype='float64') - else: + wavnum = np.array([wavenumber], dtype='float64') + elif isinstance(wavenumber, (list, tuple)): wavnum = np.array(wavenumber, dtype='float64') const1 = H_PLANCK * C_SPEED / K_BOLTZMANN const2 = 2 * H_PLANCK * C_SPEED**2 - res = const1 * wavnum / np.log(np.divide(const2 * wavnum**3, rad) + 1.0) + res = const1 * wavnum / np.log( + np.divide(const2 * wavnum**3, radiance) + 1.0) - shape = rad.shape + shape = radiance.shape resshape = res.shape if wavnum.shape[0] == 1: - if rad.shape[0] == 1: + if radiance.shape[0] == 1: return res[0] else: return res[::].reshape(shape) else: - if rad.shape[0] == 1: + if radiance.shape[0] == 1: return res[0, :] else: if len(shape) == 1: - return np.reshape(res, (shape[0], resshape[1])) + return res.reshape((shape[0], resshape[1])) else: - return np.reshape(res, (shape[0], shape[1], resshape[1])) + return res.reshape((shape[0], shape[1], resshape[1])) -def planck(wave, temp, wavelength=True): - """The Planck radiation or Blackbody radiation as a function of wavelength +def planck(wave, temperature, wavelength=True): + """The Planck radiation or Blackbody radiation as a function of wavelength or wavenumber. SI units. _planck(wave, temperature, wavelength=True) wave = Wavelength/wavenumber or a sequence of wavelengths/wavenumbers (m or m^-1) @@ -136,12 +142,12 @@ def planck(wave, temp, wavelength=True): units = ['wavelengths', 'wavenumbers'] if wavelength: LOG.debug("Using {0} when calculating the Blackbody radiance".format( - units[(wavelength == True) - 1])) + units[(wavelength is True) - 1])) - if np.isscalar(temp): - temperature = np.array([temp, ], dtype='float64') - else: - temperature = np.array(temp, dtype='float64') + if np.isscalar(temperature): + temperature = np.array([temperature, ], dtype='float64') + elif isinstance(temperature, (list, tuple)): + temperature = np.array(temperature, dtype='float64') shape = temperature.shape if np.isscalar(wave): @@ -157,13 +163,19 @@ def planck(wave, temp, wavelength=True): nom = 2 * H_PLANCK * (C_SPEED ** 2) * (wln ** 3) arg1 = H_PLANCK * C_SPEED * wln / K_BOLTZMANN - arg2 = np.where(np.greater(np.abs(temperature), EPSILON), - np.array(1. / temperature), -9).reshape(-1, 1) - arg2 = np.ma.masked_array(arg2, mask=arg2 == -9) - LOG.debug("Max and min - arg1: %s %s", str(arg1.max()), str(arg1.min())) - LOG.debug("Max and min - arg2: %s %s", str(arg2.max()), str(arg2.min())) + # use dask functions when needed + np_ = np if isinstance(temperature, np.ndarray) else da + arg2 = np_.where(np.greater(np.abs(temperature), EPSILON), + (1. 
/ temperature), np.nan).reshape(-1, 1) + if isinstance(arg2, np.ndarray): + # don't compute min/max if we have dask arrays + LOG.debug("Max and min - arg1: %s %s", + str(np.nanmax(arg1)), str(np.nanmin(arg1))) + LOG.debug("Max and min - arg2: %s %s", + str(np.nanmax(arg2)), str(np.nanmin(arg2))) + try: - exp_arg = np.multiply(arg1.astype('float32'), arg2.astype('float32')) + exp_arg = np.multiply(arg1.astype('float64'), arg2.astype('float64')) except MemoryError: LOG.warning(("Dimensions used in numpy.multiply probably reached " "limit!\n" @@ -171,9 +183,9 @@ def planck(wave, temp, wavelength=True): "and try running again")) raise - LOG.debug("Max and min before exp: %s %s", str(exp_arg.max()), - str(exp_arg.min())) - if exp_arg.min() < 0: + if isinstance(exp_arg, np.ndarray) and exp_arg.min() < 0: + LOG.debug("Max and min before exp: %s %s", + str(exp_arg.max()), str(exp_arg.min())) LOG.warning("Something is fishy: \n" + "\tDenominator might be zero or negative in radiance derivation:") dubious = np.where(exp_arg < 0)[0] @@ -182,7 +194,6 @@ def planck(wave, temp, wavelength=True): denom = np.exp(exp_arg) - 1 rad = nom / denom - rad = np.where(rad.mask, np.nan, rad.data) radshape = rad.shape if wln.shape[0] == 1: if temperature.shape[0] == 1: @@ -194,9 +205,9 @@ def planck(wave, temp, wavelength=True): return rad[0, :] else: if len(shape) == 1: - return np.reshape(rad, (shape[0], radshape[1])) + return rad.reshape((shape[0], radshape[1])) else: - return np.reshape(rad, (shape[0], shape[1], radshape[1])) + return rad.reshape((shape[0], shape[1], radshape[1])) def blackbody_wn(wavenumber, temp): ===================================== pyspectral/config.py ===================================== @@ -28,7 +28,12 @@ import os from os.path import expanduser from appdirs import AppDirs import yaml -from collections import Mapping +try: + # python 3.3+ + from collections.abc import Mapping +except ImportError: + # deprecated (above can't be done in 2.7) + from collections import Mapping import pkg_resources try: ===================================== pyspectral/etc/pyspectral.yaml ===================================== @@ -209,6 +209,46 @@ download_from_internet: True # ch15: GOES-R_ABI_FM2_SRF_CWG_ch15.txt # ch16: GOES-R_ABI_FM2_SRF_CWG_ch16.txt +# FY-4A-agri: +# path: /path/to/original/fy4a/agri/data +# ch1: FY4A_AGRI_SRF_CH01.txt +# ch2: FY4A_AGRI_SRF_CH02.txt +# ch3: FY4A_AGRI_SRF_CH03.txt +# ch4: FY4A_AGRI_SRF_CH04.txt +# ch5: FY4A_AGRI_SRF_CH05.txt +# ch6: FY4A_AGRI_SRF_CH06.txt +# ch7: FY4A_AGRI_SRF_CH07.txt +# ch8: FY4A_AGRI_SRF_CH08.txt +# ch9: FY4A_AGRI_SRF_CH09.txt +# ch10: FY4A_AGRI_SRF_CH10.txt +# ch11: FY4A_AGRI_SRF_CH11.txt +# ch12: FY4A_AGRI_SRF_CH12.txt +# ch13: FY4A_AGRI_SRF_CH13.txt +# ch14: FY4A_AGRI_SRF_CH14.txt + +#FY-3B-virr: +# path: /Users/davidh/repos/git/pyspectral/virr_srf/FY3B-VIRR +# ch1: ch1.prn +# ch2: ch2.prn +# ch3: ch3.prn +# ch4: ch4.prn +# ch5: ch5.prn +# ch6: ch6.prn +# ch7: ch7.prn +# ch8: ch8.prn +# ch9: ch9.prn +# ch10: ch10.prn + +#FY-3C-virr: +# path: /Users/davidh/repos/git/pyspectral/virr_srf/FY3C_VIRR_SRF +# ch1: FY3C_VIRR_CH01.txt +# ch2: FY3C_VIRR_CH02.txt +# ch6: FY3C_VIRR_CH06.txt +# ch7: FY3C_VIRR_CH07.txt +# ch8: FY3C_VIRR_CH08.txt +# ch9: FY3C_VIRR_CH09.txt +# ch10: FY3C_VIRR_CH10.txt + # FY-3D-mersi-2: # path: /path/to/original/fy3d/mersi2/data # ch1: FY3D_MERSI_SRF_CH01_Pub.txt ===================================== pyspectral/rsr_reader.py ===================================== @@ -27,16 +27,15 @@ import os import numpy as np from glob import glob 
from os.path import expanduser - -import logging -LOG = logging.getLogger(__name__) - from pyspectral.config import get_config from pyspectral.utils import WAVE_NUMBER from pyspectral.utils import WAVE_LENGTH from pyspectral.utils import (INSTRUMENTS, download_rsr) from pyspectral.utils import (RSR_DATA_VERSION_FILENAME, RSR_DATA_VERSION) +import logging +LOG = logging.getLogger(__name__) + class RSRDataBaseClass(object): @@ -168,11 +167,18 @@ class RelativeSpectralResponse(RSRDataBaseClass): no_detectors_message = False with h5py.File(self.filename, 'r') as h5f: - self.band_names = [b.decode('utf-8') for b in h5f.attrs['band_names'].tolist()] - self.description = h5f.attrs['description'].decode('utf-8') + self.band_names = h5f.attrs['band_names'].tolist() + self.description = h5f.attrs['description'] + if not isinstance(self.band_names[0], str): + # byte array in python 3 + self.band_names = [x.decode('utf-8') for x in self.band_names] + self.description = self.description.decode('utf-8') + if not self.platform_name: try: - self.platform_name = h5f.attrs['platform_name'].decode('utf-8') + self.platform_name = h5f.attrs['platform_name'] + if not isinstance(self.platform_name, str): + self.platform_name = self.platform_name.decode('utf-8') except KeyError: LOG.warning("No platform_name in HDF5 file") try: @@ -186,6 +192,8 @@ class RelativeSpectralResponse(RSRDataBaseClass): if not self.instrument: try: self.instrument = h5f.attrs['sensor'].decode('utf-8') + if not isinstance(self.instrument, str): + self.instrument = self.instrument.decode('utf-8') except KeyError: LOG.warning("No sensor name specified in HDF5 file") self.instrument = INSTRUMENTS.get(self.platform_name) ===================================== pyspectral/tests/test_atm_correction_ir.py ===================================== @@ -1,11 +1,11 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2017 Adam.Dybbroe +# Copyright (c) 2017 - 2019 Pytroll # Author(s): -# Adam.Dybbroe +# Adam.Dybbroe # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by @@ -19,27 +19,17 @@ # You should have received a copy of the GNU General Public License # along with this program. If not, see . - -"""Unit tests of the atmospherical correction in the ir spectral range -""" +"""Unit tests of the atmospherical correction in the ir spectral range.""" +import numpy as np +from pyspectral.atm_correction_ir import AtmosphericalCorrection import sys if sys.version_info < (2, 7): import unittest2 as unittest else: import unittest -import numpy as np -from pyspectral.atm_correction_ir import AtmosphericalCorrection -#from mock import patch - -from pyspectral.tests.unittest_helpers import assertNumpyArraysEqual - -# Mock some modules, so we don't need them for tests. 
- -#sys.modules['pyresample'] = MagicMock() - SATZ = np.ma.array([[48.03, 48.03002, 48.03004, 48.03006, 48.03008, 48.0301, 48.03012, 48.03014, 48.03016, 48.03018], [48.09, 48.09002, 48.09004, 48.09006, 48.09008, 48.0901, @@ -125,28 +115,18 @@ RES = np.ma.array([[286.03159412, 286.03162417, 286.03165421, 286.03168426, class TestAtmCorrection(unittest.TestCase): - - """Class for testing pyspectral.atm_correction_ir""" - - def setUp(self): - """Setup the test""" - pass + """Class for testing pyspectral.atm_correction_ir.""" def test_get_correction(self): """Test getting the atm correction""" this = AtmosphericalCorrection('EOS-Terra', 'modis') atm_corr = this.get_correction(SATZ, None, TBS) - assertNumpyArraysEqual(TBS, atm_corr) - - def tearDown(self): - """Clean up""" - pass + np.testing.assert_almost_equal(TBS, atm_corr) def suite(): - """The test suite for test_atm_correction_ir. - """ + """Create the test suite for test_atm_correction_ir.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() mysuite.addTest(loader.loadTestsFromTestCase(TestAtmCorrection)) ===================================== pyspectral/tests/test_blackbody.py ===================================== @@ -1,11 +1,11 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2013, 2014, 2015, 2016, 2017 Adam.Dybbroe +# Copyright (c) 2013 - 2019 Pytroll # Author(s): -# Adam.Dybbroe +# Adam.Dybbroe # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by @@ -31,10 +31,8 @@ from pyspectral.tests.unittest_helpers import assertNumpyArraysEqual import unittest import numpy as np -#RAD_11MICRON_300KELVIN = 9572498.1141643394 -RAD_11MICRON_300KELVIN = 9573177.8811719529 -#RAD_11MICRON_301KELVIN = 9713997.9623772576 -RAD_11MICRON_301KELVIN = 9714688.2959563732 +RAD_11MICRON_300KELVIN = 9573176.935507433 +RAD_11MICRON_301KELVIN = 9714686.576498277 # Radiances in wavenumber space (SI-units) WN_RAD_11MICRON_300KELVIN = 0.00115835441353 @@ -43,18 +41,28 @@ WN_RAD_11MICRON_301KELVIN = 0.00117547716523 __unittest = True -class TestBlackbody(unittest.TestCase): +class CustomScheduler(object): + """Custom dask scheduler that raises an exception if dask is computed too many times.""" + + def __init__(self, max_computes=1): + """Set starting and maximum compute counts.""" + self.max_computes = max_computes + self.total_computes = 0 + + def __call__(self, dsk, keys, **kwargs): + """Compute dask task and keep track of number of times we do so.""" + import dask + self.total_computes += 1 + if self.total_computes > self.max_computes: + raise RuntimeError("Too many dask computations were scheduled: {}".format(self.total_computes)) + return dask.get(dsk, keys, **kwargs) - """Unit testing the blackbody function""" - def setUp(self): - """Set up""" - return +class TestBlackbody(unittest.TestCase): + """Unit testing the blackbody function""" def test_blackbody(self): - """Calculate the blackbody radiation from wavelengths and - temperatures - """ + """Calculate the blackbody radiation from wavelengths and temperatures.""" wavel = 11. * 1E-6 black = blackbody((wavel, ), [300., 301]) self.assertEqual(black.shape[0], 2) @@ -71,14 +79,28 @@ class TestBlackbody(unittest.TestCase): tb_therm = np.array([[300., 301], [299, 298], [279, 286]]) black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, np.ndarray) tb_therm = np.array([[300., 301], [0., 298], [279, 286]]) black = blackbody((10. 
* 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, np.ndarray) + + def test_blackbody_dask(self): + """Calculate the blackbody radiation from wavelengths and temperatures with dask arrays.""" + import dask + import dask.array as da + tb_therm = da.from_array([[300., 301], [299, 298], [279, 286]], chunks=2) + with dask.config.set(scheduler=CustomScheduler(0)): + black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, da.Array) + + tb_therm = da.from_array([[300., 301], [0., 298], [279, 286]], chunks=2) + with dask.config.set(scheduler=CustomScheduler(0)): + black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, da.Array) def test_blackbody_wn(self): - """Calculate the blackbody radiation from wavenumbers and - temperatures - """ + """Calculate the blackbody radiation from wavenumbers and temperatures.""" wavenumber = 90909.1 # 11 micron band black = blackbody_wn((wavenumber, ), [300., 301]) self.assertEqual(black.shape[0], 2) @@ -106,9 +128,24 @@ class TestBlackbody(unittest.TestCase): assertNumpyArraysEqual(t__, expected) - def tearDown(self): - """Clean up""" - return + def test_blackbody_wn_dask(self): + """Test that blackbody rad2temp preserves dask arrays.""" + import dask + import dask.array as da + wavenumber = 90909.1 # 11 micron band + radiances = da.from_array([0.001, 0.0009, 0.0012, 0.0018], chunks=2).reshape(2, 2) + with dask.config.set(scheduler=CustomScheduler(0)): + t__ = blackbody_wn_rad2temp(wavenumber, radiances) + self.assertIsInstance(t__, da.Array) + t__ = t__.compute() + expected = np.array([290.3276916, 283.76115441, + 302.4181330, 333.1414164]).reshape(2, 2) + self.assertAlmostEqual(t__[1, 1], expected[1, 1], 5) + self.assertAlmostEqual(t__[0, 0], expected[0, 0], 5) + self.assertAlmostEqual(t__[0, 1], expected[0, 1], 5) + self.assertAlmostEqual(t__[1, 0], expected[1, 0], 5) + + assertNumpyArraysEqual(t__, expected) def suite(): ===================================== pyspectral/tests/test_rad_tb_conversions.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2014-2018 Adam.Dybbroe +# Copyright (c) 2014-2019 Adam.Dybbroe # Author(s): @@ -347,10 +347,10 @@ class TestRadTbConversions(unittest.TestCase): self.assertTrue(np.allclose(TRUE_RADS * integral, res['radiance'])) res = self.modis.tb2radiance(237., lut=False) - self.assertAlmostEqual(16570.592171157, res['radiance']) + self.assertAlmostEqual(16570.579551068, res['radiance']) res = self.modis.tb2radiance(277., lut=False) - self.assertAlmostEqual(167544.3823631, res['radiance']) + self.assertAlmostEqual(167544.39368663222, res['radiance']) res = self.modis.tb2radiance(1.1, lut=False) self.assertAlmostEqual(0.0, res['radiance']) @@ -362,7 +362,7 @@ class TestRadTbConversions(unittest.TestCase): self.assertAlmostEqual(5.3940515573e-06, res['radiance']) res = self.modis.tb2radiance(200.1, lut=False) - self.assertAlmostEqual(865.09776189, res['radiance']) + self.assertAlmostEqual(865.09759706, res['radiance']) def tearDown(self): """Clean up""" ===================================== pyspectral/tests/test_reflectance.py ===================================== @@ -139,7 +139,10 @@ class TestReflectance(unittest.TestCase): with patch('pyspectral.radiance_tb_conversion.RelativeSpectralResponse') as mymock: instance = mymock.return_value - instance.rsr = TEST_RSR + # VIIRS doesn't have a channel '20' like MODIS so the generic + # mapping this test will end up using will find 'ch20' for VIIRS + viirs_rsr = 
{'ch20': TEST_RSR['20'], '99': TEST_RSR['99']} + instance.rsr = viirs_rsr instance.unit = '1e-6 m' instance.si_scale = 1e-6 @@ -148,7 +151,7 @@ class TestReflectance(unittest.TestCase): refl37 = Calculator('Suomi-NPP', 'viirs', 3.7) self.assertEqual(refl37.bandwavelength, 3.7) - self.assertEqual(refl37.bandname, '20') + self.assertEqual(refl37.bandname, 'ch20') with patch('pyspectral.radiance_tb_conversion.RelativeSpectralResponse') as mymock: instance = mymock.return_value ===================================== pyspectral/tests/test_utils.py ===================================== @@ -115,8 +115,7 @@ class TestUtils(unittest.TestCase): self.rsr = RsrTestData() def test_convert2wavenumber(self): - """Testing the conversion of rsr from wavelength to wavenumber - """ + """Testing the conversion of rsr from wavelength to wavenumber.""" newrsr, info = utils.convert2wavenumber(TEST_RSR) unit = info['unit'] self.assertEqual(unit, 'cm-1') @@ -127,11 +126,7 @@ class TestUtils(unittest.TestCase): self.assertTrue(np.allclose(wvn_res, wvn)) def test_get_bandname_from_wavelength(self): - """Test that it is possible to get the right bandname provided the wavelength - in micro meters - - """ - + """Test the right bandname is found provided the wavelength in micro meters.""" x = utils.get_bandname_from_wavelength('abi', 0.4, self.rsr.rsr) self.assertEqual(x, 'ch1') with self.assertRaises(AttributeError): @@ -146,19 +141,16 @@ class TestUtils(unittest.TestCase): x = utils.get_bandname_from_wavelength('abi', 1.0, self.rsr.rsr) self.assertEqual(x, None) + # uses generic channel mapping where '20' -> 'ch20' bandname = utils.get_bandname_from_wavelength('abi', 3.7, TEST_RSR) - self.assertEqual(bandname, '20') + self.assertEqual(bandname, 'ch20') bandname = utils.get_bandname_from_wavelength('abi', 3.0, TEST_RSR) self.assertIsNone(bandname) - def tearDown(self): - """Clean up""" - pass - def suite(): - """The suite for test_utils.""" + """Create the suite for test_utils.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() mysuite.addTest(loader.loadTestsFromTestCase(TestUtils)) ===================================== pyspectral/utils.py ===================================== @@ -78,9 +78,19 @@ BANDNAMES['generic'] = {'VIS006': 'VIS0.6', 'C15': 'ch15', 'C16': 'ch16', } +# handle arbitrary channel numbers +for chan_num in range(1, 37): + BANDNAMES['generic'][str(chan_num)] = 'ch{:d}'.format(chan_num) -BANDNAMES['avhrr-3'] = {'3b': 'ch3b', - '3a': 'ch3a'} +# MODIS RSR files were made before 'chX' became standard in pyspectral +BANDNAMES['modis'] = {str(chan_num): str(chan_num) for chan_num in range(1, 37)} + +BANDNAMES['avhrr-3'] = {'1': 'ch1', + '2': 'ch2', + '3b': 'ch3b', + '3a': 'ch3a', + '4': 'ch4', + '5': 'ch5'} BANDNAMES['ahi'] = {'B01': 'ch1', 'B02': 'ch2', @@ -120,12 +130,16 @@ INSTRUMENTS = {'NOAA-19': 'avhrr/3', 'Suomi-NPP': 'viirs', 'NOAA-20': 'viirs', 'FY-3D': 'mersi-2', - 'Feng-Yun 3D': 'mersi-2' + 'FY-3C': 'virr', + 'FY-3B': 'virr', + 'Feng-Yun 3D': 'mersi-2', + 'FY-4A': 'agri' } -HTTP_PYSPECTRAL_RSR = "https://zenodo.org/record/2653487/files/pyspectral_rsr_data.tgz" + +HTTP_PYSPECTRAL_RSR = "https://zenodo.org/record/3381130/files/pyspectral_rsr_data.tgz" RSR_DATA_VERSION_FILENAME = "PYSPECTRAL_RSR_VERSION" -RSR_DATA_VERSION = "v1.0.6" +RSR_DATA_VERSION = "v1.0.9" ATM_CORRECTION_LUT_VERSION = {} ATM_CORRECTION_LUT_VERSION['antarctic_aerosol'] = {'version': 'v1.0.1', ===================================== pyspectral/version.py ===================================== @@ -46,9 +46,9 @@ def 
get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (tag: v0.8.9)" - git_full = "b3a1f58f49e2ab7c81c331e379065f24eb21f1ba" - git_date = "2019-06-07 09:27:42 +0200" + git_refnames = " (HEAD -> master, tag: v0.9.0)" + git_full = "2372396547ecace7fbcefc50d5dcbaa32e4a5177" + git_date = "2019-08-30 18:04:19 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== rsr_convert_scripts/README.rst ===================================== @@ -176,3 +176,12 @@ the pyspectral.yaml file: Adam Dybbroe Sat Dec 1 17:39:48 2018 +.. code:: + + %> python virr_rsr.py + +Converting the FY-3B or FY-3C VIRR spectral responses to HDF5. Original files +for FY-3B come as ``.prn`` text files for each channel (ex. ``ch1.prn``). For +FY-3C they come as ``.txt`` text files for channels 1, 2, 6, 7, 8, 9, and 10 +only with names like ``FY3C_VIRR_CH01.txt``. + ===================================== rsr_convert_scripts/agri_rsr.py ===================================== @@ -0,0 +1,108 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- + +# Copyright (c) 2018, 2019 Pytroll + +# Author(s): + +# Xin.Zhang +# Adam.Dybbroe + +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +"""Read the FY-4A AGRI relative spectral responses. 
Data from +http://fy4.nsmc.org.cn/portal/cn/fycv/srf.html +""" +import os +import numpy as np +from pyspectral.utils import INSTRUMENTS +from pyspectral.utils import convert2hdf5 as tohdf5 +from pyspectral.raw_reader import InstrumentRSR +from pyspectral.utils import logging_on, get_logger + +FY4A_BAND_NAMES = ['ch1', 'ch2', 'ch3', 'ch4', 'ch5', 'ch6', 'ch7', 'ch8', + 'ch9', 'ch10', 'ch11', 'ch12', 'ch13', 'ch14'] +BANDNAME_SCALE2MICROMETERS = {'ch1': 0.001, + 'ch2': 0.001, + 'ch3': 0.001, + 'ch4': 1.0, + 'ch5': 1.0, + 'ch6': 1.0, + 'ch7': 1.0, + 'ch8': 1.0, + 'ch9': 1.0, + 'ch10': 1.0, + 'ch11': 1.0, + 'ch12': 1.0, + 'ch13': 1.0, + 'ch14': 1.0} + + +class AGRIRSR(InstrumentRSR): + """Container for the FY-4 AGRI RSR data""" + + def __init__(self, bandname, platform_name): + """Initialise the FY-4 AGRI relative spectral response data""" + super(AGRIRSR, self).__init__(bandname, platform_name, FY4A_BAND_NAMES) + + self.instrument = INSTRUMENTS.get(platform_name, 'agri') + + self._get_options_from_config() + self._get_bandfilenames() + + LOG.debug("Filenames: %s", str(self.filenames)) + if self.filenames[bandname] and os.path.exists(self.filenames[bandname]): + self.requested_band_filename = self.filenames[bandname] + scale = BANDNAME_SCALE2MICROMETERS.get(bandname) + if scale: + self._load(scale=scale) + else: + LOG.error( + "Failed determine the scale used to convert to wavelength in micrometers - channel = %s", bandname) + raise AttributeError('no scale for bandname %s', bandname) + + else: + LOG.warning("Couldn't find an existing file for this band: %s", + str(self.bandname)) + + self.filename = self.requested_band_filename + + def _load(self, scale=0.001): + """Load the AGRI RSR data for the band requested + + Wavelength is given in nanometers. + """ + data = np.genfromtxt(self.requested_band_filename, + unpack=True, + names=['wavelength', + 'response'], + skip_header=0) + + wavelength = data['wavelength'] * scale + response = data['response'] + + self.rsr = {'wavelength': wavelength, 'response': response} + + +def main(): + """Main""" + for platform_name in ["FY-4A", ]: + tohdf5(AGRIRSR, platform_name, FY4A_BAND_NAMES) + + +if __name__ == "__main__": + LOG = get_logger(__name__) + logging_on() + + main() ===================================== rsr_convert_scripts/virr_rsr.py ===================================== @@ -0,0 +1,89 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- +# +# Copyright (c) 2019 Adam.Dybbroe +# +# Author(s): +# +# David Hoese +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . +"""Read the VIRR relative spectral responses. 
+ +Data from http://gsics.nsmc.org.cn/portal/en/fycv/srf.html + +""" + +import os +import numpy as np +from pyspectral.utils import INSTRUMENTS +from pyspectral.utils import convert2hdf5 as tohdf5 +from pyspectral.raw_reader import InstrumentRSR + +import logging +LOG = logging.getLogger(__name__) + +VIRR_BAND_NAMES = { + 'FY-3B': ['ch{:d}'.format(x) for x in range(1, 11)], + 'FY-3C': ['ch1', 'ch2'] + ['ch{:d}'.format(x) for x in range(6, 11)], +} + + +class VirrRSR(InstrumentRSR): + """Container for the FY-3B/FY-3C VIRR RSR data.""" + + def __init__(self, bandname, platform_name): + """Verify that file exists and can be read.""" + super(VirrRSR, self).__init__(bandname, platform_name, VIRR_BAND_NAMES[platform_name]) + + self.instrument = INSTRUMENTS.get(platform_name, 'virr') + self._get_options_from_config() + self._get_bandfilenames() + + LOG.debug("Filenames: %s", str(self.filenames)) + if self.filenames[bandname] and os.path.exists(self.filenames[bandname]): + self.requested_band_filename = self.filenames[bandname] + self._load() + else: + LOG.warning("Couldn't find an existing file for this band: %s", + str(self.bandname)) + + # To be compatible with VIIRS.... + self.filename = self.requested_band_filename + + def _load(self, scale=0.001): + """Load the VIRR RSR data for the band requested. + + Wavelength is given in nanometers. + """ + data = np.genfromtxt(self.requested_band_filename, + unpack=True, + names=['wavelength', + 'response'], + skip_header=0) + + wavelength = data['wavelength'] * scale + response = data['response'] + + self.rsr = {'wavelength': wavelength, 'response': response} + + +def main(): + """Main""" + for platform_name, band_names in VIRR_BAND_NAMES.items(): + tohdf5(VirrRSR, platform_name, band_names) + + +if __name__ == "__main__": + main() View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/compare/c9a3be279065fdb63807a9576f59a156996a9f51...a8df3a1bbb29fa6754bafa6b71920a3c9293f853 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/compare/c9a3be279065fdb63807a9576f59a156996a9f51...a8df3a1bbb29fa6754bafa6b71920a3c9293f853 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 07:14:24 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 02 Sep 2019 06:14:24 +0000 Subject: [Git][debian-gis-team/pyspectral] Pushed new tag upstream/0.9.0+ds Message-ID: <5d6cb340494ed_577b3f91d4b84034292797@godard.mail> Antonio Valentino pushed new tag upstream/0.9.0+ds at Debian GIS Project / pyspectral -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/tree/upstream/0.9.0+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
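As a side note on the blackbody/dask change shipped in this pyspectral
release, here is a minimal sketch of the dask-aware usage exercised by the new
test_blackbody_dask test in the diff above; the temperature values are taken
from that test, while the chunking and the prints are illustrative only:

    import dask.array as da
    from pyspectral.blackbody import blackbody

    # Brightness temperatures in Kelvin, kept lazy as a dask array
    # (same values as in test_blackbody_dask).
    tb_therm = da.from_array([[300., 301.], [299., 298.], [279., 286.]], chunks=2)

    # blackbody() now keeps the computation lazy for dask input:
    # nothing is evaluated until .compute() is called.
    rad = blackbody((10.e-6, 11.e-6), tb_therm)
    print(type(rad))      # dask.array.core.Array
    print(rad.compute())  # spectral radiance in SI units

Plain numpy arrays, lists and scalars are still accepted as before, so
existing callers should not need any changes.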
URL: From gitlab at salsa.debian.org Mon Sep 2 07:14:29 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 02 Sep 2019 06:14:29 +0000 Subject: [Git][debian-gis-team/pyspectral][upstream] New upstream version 0.9.0+ds Message-ID: <5d6cb345e5a43_577b2ade5efcfd34292845@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / pyspectral Commits: bdb76ea7 by Antonio Valentino at 2019-09-02T05:46:16Z New upstream version 0.9.0+ds - - - - - 18 changed files: - CHANGELOG.md - README.md - doc/37_reflectance.rst - doc/rad_definitions.rst - pyspectral/blackbody.py - pyspectral/config.py - pyspectral/etc/pyspectral.yaml - pyspectral/rsr_reader.py - pyspectral/tests/test_atm_correction_ir.py - pyspectral/tests/test_blackbody.py - pyspectral/tests/test_rad_tb_conversions.py - pyspectral/tests/test_reflectance.py - pyspectral/tests/test_utils.py - pyspectral/utils.py - pyspectral/version.py - rsr_convert_scripts/README.rst - + rsr_convert_scripts/agri_rsr.py - + rsr_convert_scripts/virr_rsr.py Changes: ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,27 @@ +## Version (2019/08/30) + +### Issues Closed + +* [Issue 73](https://github.com/pytroll/pyspectral/issues/73) - Fix blackbody code to work with dask arrays ([PR 74](https://github.com/pytroll/pyspectral/pull/74)) + +In this release 1 issue was closed. + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 80](https://github.com/pytroll/pyspectral/pull/80) - Fix doc tests for python 2&3 +* [PR 79](https://github.com/pytroll/pyspectral/pull/79) - Fix rsr zenodo version +* [PR 74](https://github.com/pytroll/pyspectral/pull/74) - Fix dask compatibility in blackbody functions ([73](https://github.com/pytroll/pyspectral/issues/73)) + +#### Features added + +* [PR 78](https://github.com/pytroll/pyspectral/pull/78) - Add FY-3B VIRR and FY-3C VIRR RSRs +* [PR 77](https://github.com/pytroll/pyspectral/pull/77) - Add FY-4A AGRI support + +In this release 5 pull requests were closed. 
+ + ## Version (2019/06/07) ### Issues Closed ===================================== README.md ===================================== @@ -1,14 +1,11 @@ PySpectral ========== -[![Codacy Badge](https://api.codacy.com/project/badge/Grade/9f039d7d640846ca89be8a78fa11e1f6)](https://www.codacy.com/app/adybbroe/pyspectral?utm_source=github.com&utm_medium=referral&utm_content=pytroll/pyspectral&utm_campaign=badger) [![Build Status](https://travis-ci.org/pytroll/pyspectral.png?branch=master)](https://travis-ci.org/pytroll/pyspectral) [![Build status](https://ci.appveyor.com/api/projects/status/5lm42n0l65l5o9xn?svg=true)](https://ci.appveyor.com/project/pytroll/pyspectral) [![Coverage Status](https://coveralls.io/repos/github/pytroll/pyspectral/badge.svg?branch=master)](https://coveralls.io/github/pytroll/pyspectral?branch=master) -[![Code Health](https://landscape.io/github/pytroll/pyspectral/master/landscape.png)](https://landscape.io/github/pytroll/pyspectral/master) [![PyPI version](https://badge.fury.io/py/pyspectral.svg)](https://badge.fury.io/py/pyspectral) [![Code Climate](https://codeclimate.com/github/pytroll/pyspectral/badges/gpa.svg)](https://codeclimate.com/github/pytroll/pyspectral) -[![Scrutinizer Code Quality](https://scrutinizer-ci.com/g/pytroll/pyspectral/badges/quality-score.png?b=master)](https://scrutinizer-ci.com/g/pytroll/pyspectral/?branch=master) Given a passive sensor on a meteorological satellite PySpectral provides the relative spectral response (rsr) function(s) and offer you some basic ===================================== doc/37_reflectance.rst ===================================== @@ -46,7 +46,7 @@ expressed in :math:`W/m^2 sr^{-1} \mu m^{-1}`, or using SI units :math:`W/m^2 sr >>> viirs = RadTbConverter('Suomi-NPP', 'viirs', 'M12') >>> rad37 = viirs.tb2radiance(tb37) >>> print([np.round(rad, 7) for rad in rad37['radiance']]) - [369717.4765726, 355110.5207853, 314684.2788726, 173143.5424898, 116408.0007877] + [369717.4972296, 355110.6414922, 314684.3507084, 173143.4836477, 116408.0022674] >>> rad37['unit'] 'W/m^2 sr^-1 m^-1' @@ -59,7 +59,7 @@ In order to get the total radiance over the band one has to multiply with the eq >>> viirs = RadTbConverter('Suomi-NPP', 'viirs', 'M12') >>> rad37 = viirs.tb2radiance(tb37, normalized=False) >>> print([np.round(rad, 8) for rad in rad37['radiance']]) - [0.07037968, 0.06759909, 0.05990352, 0.03295972, 0.02215951] + [0.07037968, 0.06759911, 0.05990353, 0.03295971, 0.02215951] >>> rad37['unit'] 'W/m^2 sr^-1' @@ -218,17 +218,17 @@ We can try decompose equation :eq:`refl37` above using the example of VIIRS M12 >>> print(np.isnan(nomin)) [False False False False False] >>> print([np.round(val, 8) for val in nomin]) - [0.05083677, 0.04805618, 0.0404157, 0.01279279, 0.00204485] + [0.05083677, 0.0480562, 0.04041571, 0.01279277, 0.00204485] >>> denom = np.cos(np.deg2rad(sunz))/np.pi * sflux - rad11['radiance'] >>> print(np.isnan(denom)) [False False False False False] >>> print([np.round(val, 8) for val in denom]) - [0.23646313, 0.23645682, 0.23650559, 0.23582015, 0.2358661] + [0.23646312, 0.23645681, 0.23650559, 0.23582014, 0.23586609] >>> res = nomin/denom >>> print(np.isnan(res)) [False False False False False] >>> print([np.round(val, 8) for val in res]) - [0.21498817, 0.2032345, 0.17088689, 0.05424807, 0.00866955] + [0.21498817, 0.20323458, 0.17088693, 0.05424801, 0.00866952] Derive the emissive part of the 3.7 micron band @@ -255,5 +255,5 @@ Using the example of the VIIRS M12 band from above this gives the following spec >>> 
['{tb:6.3f}'.format(tb=np.round(t, 4)) for t in tb] ['266.996', '267.262', '267.991', '271.033', '271.927'] >>> rad = refl_m12.emissive_part_3x(tb=False) - >>> ['{rad:6.3f}'.format(rad=np.round(r, 3)) for r in rad] - ['80285.149', '81458.022', '84749.639', '99761.400', '104582.030'] + >>> ['{rad:6.3f}'.format(rad=np.round(r, 3)) for r in rad.compute()] + ['80285.150', '81458.022', '84749.638', '99761.401', '104582.031'] ===================================== doc/rad_definitions.rst ===================================== @@ -229,7 +229,7 @@ And using wavelength representation: >>> wvl = 1./wavenumber >>> rad = blackbody(wvl, [300., 301]) >>> print("{0:10.3f} {1:10.3f}".format(rad[0], rad[1])) - 9573178.886 9714689.259 + 9573177.494 9714687.157 Which are the spectral radiances in SI units around :math:`11 \mu m` at temperatures 300 and 301 Kelvin. In units of :math:`mW/m^2\ m^{-1} sr^{-1}` this becomes: ===================================== pyspectral/blackbody.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2013-2018 Adam.Dybbroe +# Copyright (c) 2013-2019 Adam.Dybbroe # Author(s): @@ -23,8 +23,13 @@ """Planck radiation equation""" import numpy as np - import logging + +try: + import dask.array as da +except ImportError: + da = np + LOG = logging.getLogger(__name__) H_PLANCK = 6.62606957 * 1e-34 # SI-unit = [J*s] @@ -84,38 +89,39 @@ def blackbody_wn_rad2temp(wavenumber, radiance): function. Wavenumber space""" if np.isscalar(radiance): - rad = np.array([radiance, ], dtype='float64') - else: - rad = np.array(radiance, dtype='float64') + radiance = np.array([radiance], dtype='float64') + elif isinstance(radiance, (list, tuple)): + radiance = np.array(radiance, dtype='float64') if np.isscalar(wavenumber): - wavnum = np.array([wavenumber, ], dtype='float64') - else: + wavnum = np.array([wavenumber], dtype='float64') + elif isinstance(wavenumber, (list, tuple)): wavnum = np.array(wavenumber, dtype='float64') const1 = H_PLANCK * C_SPEED / K_BOLTZMANN const2 = 2 * H_PLANCK * C_SPEED**2 - res = const1 * wavnum / np.log(np.divide(const2 * wavnum**3, rad) + 1.0) + res = const1 * wavnum / np.log( + np.divide(const2 * wavnum**3, radiance) + 1.0) - shape = rad.shape + shape = radiance.shape resshape = res.shape if wavnum.shape[0] == 1: - if rad.shape[0] == 1: + if radiance.shape[0] == 1: return res[0] else: return res[::].reshape(shape) else: - if rad.shape[0] == 1: + if radiance.shape[0] == 1: return res[0, :] else: if len(shape) == 1: - return np.reshape(res, (shape[0], resshape[1])) + return res.reshape((shape[0], resshape[1])) else: - return np.reshape(res, (shape[0], shape[1], resshape[1])) + return res.reshape((shape[0], shape[1], resshape[1])) -def planck(wave, temp, wavelength=True): - """The Planck radiation or Blackbody radiation as a function of wavelength +def planck(wave, temperature, wavelength=True): + """The Planck radiation or Blackbody radiation as a function of wavelength or wavenumber. SI units. 
_planck(wave, temperature, wavelength=True) wave = Wavelength/wavenumber or a sequence of wavelengths/wavenumbers (m or m^-1) @@ -136,12 +142,12 @@ def planck(wave, temp, wavelength=True): units = ['wavelengths', 'wavenumbers'] if wavelength: LOG.debug("Using {0} when calculating the Blackbody radiance".format( - units[(wavelength == True) - 1])) + units[(wavelength is True) - 1])) - if np.isscalar(temp): - temperature = np.array([temp, ], dtype='float64') - else: - temperature = np.array(temp, dtype='float64') + if np.isscalar(temperature): + temperature = np.array([temperature, ], dtype='float64') + elif isinstance(temperature, (list, tuple)): + temperature = np.array(temperature, dtype='float64') shape = temperature.shape if np.isscalar(wave): @@ -157,13 +163,19 @@ def planck(wave, temp, wavelength=True): nom = 2 * H_PLANCK * (C_SPEED ** 2) * (wln ** 3) arg1 = H_PLANCK * C_SPEED * wln / K_BOLTZMANN - arg2 = np.where(np.greater(np.abs(temperature), EPSILON), - np.array(1. / temperature), -9).reshape(-1, 1) - arg2 = np.ma.masked_array(arg2, mask=arg2 == -9) - LOG.debug("Max and min - arg1: %s %s", str(arg1.max()), str(arg1.min())) - LOG.debug("Max and min - arg2: %s %s", str(arg2.max()), str(arg2.min())) + # use dask functions when needed + np_ = np if isinstance(temperature, np.ndarray) else da + arg2 = np_.where(np.greater(np.abs(temperature), EPSILON), + (1. / temperature), np.nan).reshape(-1, 1) + if isinstance(arg2, np.ndarray): + # don't compute min/max if we have dask arrays + LOG.debug("Max and min - arg1: %s %s", + str(np.nanmax(arg1)), str(np.nanmin(arg1))) + LOG.debug("Max and min - arg2: %s %s", + str(np.nanmax(arg2)), str(np.nanmin(arg2))) + try: - exp_arg = np.multiply(arg1.astype('float32'), arg2.astype('float32')) + exp_arg = np.multiply(arg1.astype('float64'), arg2.astype('float64')) except MemoryError: LOG.warning(("Dimensions used in numpy.multiply probably reached " "limit!\n" @@ -171,9 +183,9 @@ def planck(wave, temp, wavelength=True): "and try running again")) raise - LOG.debug("Max and min before exp: %s %s", str(exp_arg.max()), - str(exp_arg.min())) - if exp_arg.min() < 0: + if isinstance(exp_arg, np.ndarray) and exp_arg.min() < 0: + LOG.debug("Max and min before exp: %s %s", + str(exp_arg.max()), str(exp_arg.min())) LOG.warning("Something is fishy: \n" + "\tDenominator might be zero or negative in radiance derivation:") dubious = np.where(exp_arg < 0)[0] @@ -182,7 +194,6 @@ def planck(wave, temp, wavelength=True): denom = np.exp(exp_arg) - 1 rad = nom / denom - rad = np.where(rad.mask, np.nan, rad.data) radshape = rad.shape if wln.shape[0] == 1: if temperature.shape[0] == 1: @@ -194,9 +205,9 @@ def planck(wave, temp, wavelength=True): return rad[0, :] else: if len(shape) == 1: - return np.reshape(rad, (shape[0], radshape[1])) + return rad.reshape((shape[0], radshape[1])) else: - return np.reshape(rad, (shape[0], shape[1], radshape[1])) + return rad.reshape((shape[0], shape[1], radshape[1])) def blackbody_wn(wavenumber, temp): ===================================== pyspectral/config.py ===================================== @@ -28,7 +28,12 @@ import os from os.path import expanduser from appdirs import AppDirs import yaml -from collections import Mapping +try: + # python 3.3+ + from collections.abc import Mapping +except ImportError: + # deprecated (above can't be done in 2.7) + from collections import Mapping import pkg_resources try: ===================================== pyspectral/etc/pyspectral.yaml ===================================== @@ -209,6 +209,46 @@ 
download_from_internet: True # ch15: GOES-R_ABI_FM2_SRF_CWG_ch15.txt # ch16: GOES-R_ABI_FM2_SRF_CWG_ch16.txt +# FY-4A-agri: +# path: /path/to/original/fy4a/agri/data +# ch1: FY4A_AGRI_SRF_CH01.txt +# ch2: FY4A_AGRI_SRF_CH02.txt +# ch3: FY4A_AGRI_SRF_CH03.txt +# ch4: FY4A_AGRI_SRF_CH04.txt +# ch5: FY4A_AGRI_SRF_CH05.txt +# ch6: FY4A_AGRI_SRF_CH06.txt +# ch7: FY4A_AGRI_SRF_CH07.txt +# ch8: FY4A_AGRI_SRF_CH08.txt +# ch9: FY4A_AGRI_SRF_CH09.txt +# ch10: FY4A_AGRI_SRF_CH10.txt +# ch11: FY4A_AGRI_SRF_CH11.txt +# ch12: FY4A_AGRI_SRF_CH12.txt +# ch13: FY4A_AGRI_SRF_CH13.txt +# ch14: FY4A_AGRI_SRF_CH14.txt + +#FY-3B-virr: +# path: /Users/davidh/repos/git/pyspectral/virr_srf/FY3B-VIRR +# ch1: ch1.prn +# ch2: ch2.prn +# ch3: ch3.prn +# ch4: ch4.prn +# ch5: ch5.prn +# ch6: ch6.prn +# ch7: ch7.prn +# ch8: ch8.prn +# ch9: ch9.prn +# ch10: ch10.prn + +#FY-3C-virr: +# path: /Users/davidh/repos/git/pyspectral/virr_srf/FY3C_VIRR_SRF +# ch1: FY3C_VIRR_CH01.txt +# ch2: FY3C_VIRR_CH02.txt +# ch6: FY3C_VIRR_CH06.txt +# ch7: FY3C_VIRR_CH07.txt +# ch8: FY3C_VIRR_CH08.txt +# ch9: FY3C_VIRR_CH09.txt +# ch10: FY3C_VIRR_CH10.txt + # FY-3D-mersi-2: # path: /path/to/original/fy3d/mersi2/data # ch1: FY3D_MERSI_SRF_CH01_Pub.txt ===================================== pyspectral/rsr_reader.py ===================================== @@ -27,16 +27,15 @@ import os import numpy as np from glob import glob from os.path import expanduser - -import logging -LOG = logging.getLogger(__name__) - from pyspectral.config import get_config from pyspectral.utils import WAVE_NUMBER from pyspectral.utils import WAVE_LENGTH from pyspectral.utils import (INSTRUMENTS, download_rsr) from pyspectral.utils import (RSR_DATA_VERSION_FILENAME, RSR_DATA_VERSION) +import logging +LOG = logging.getLogger(__name__) + class RSRDataBaseClass(object): @@ -168,11 +167,18 @@ class RelativeSpectralResponse(RSRDataBaseClass): no_detectors_message = False with h5py.File(self.filename, 'r') as h5f: - self.band_names = [b.decode('utf-8') for b in h5f.attrs['band_names'].tolist()] - self.description = h5f.attrs['description'].decode('utf-8') + self.band_names = h5f.attrs['band_names'].tolist() + self.description = h5f.attrs['description'] + if not isinstance(self.band_names[0], str): + # byte array in python 3 + self.band_names = [x.decode('utf-8') for x in self.band_names] + self.description = self.description.decode('utf-8') + if not self.platform_name: try: - self.platform_name = h5f.attrs['platform_name'].decode('utf-8') + self.platform_name = h5f.attrs['platform_name'] + if not isinstance(self.platform_name, str): + self.platform_name = self.platform_name.decode('utf-8') except KeyError: LOG.warning("No platform_name in HDF5 file") try: @@ -186,6 +192,8 @@ class RelativeSpectralResponse(RSRDataBaseClass): if not self.instrument: try: self.instrument = h5f.attrs['sensor'].decode('utf-8') + if not isinstance(self.instrument, str): + self.instrument = self.instrument.decode('utf-8') except KeyError: LOG.warning("No sensor name specified in HDF5 file") self.instrument = INSTRUMENTS.get(self.platform_name) ===================================== pyspectral/tests/test_atm_correction_ir.py ===================================== @@ -1,11 +1,11 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2017 Adam.Dybbroe +# Copyright (c) 2017 - 2019 Pytroll # Author(s): -# Adam.Dybbroe +# Adam.Dybbroe # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by @@ -19,27 +19,17 @@ # You 
should have received a copy of the GNU General Public License # along with this program. If not, see . - -"""Unit tests of the atmospherical correction in the ir spectral range -""" +"""Unit tests of the atmospherical correction in the ir spectral range.""" +import numpy as np +from pyspectral.atm_correction_ir import AtmosphericalCorrection import sys if sys.version_info < (2, 7): import unittest2 as unittest else: import unittest -import numpy as np -from pyspectral.atm_correction_ir import AtmosphericalCorrection -#from mock import patch - -from pyspectral.tests.unittest_helpers import assertNumpyArraysEqual - -# Mock some modules, so we don't need them for tests. - -#sys.modules['pyresample'] = MagicMock() - SATZ = np.ma.array([[48.03, 48.03002, 48.03004, 48.03006, 48.03008, 48.0301, 48.03012, 48.03014, 48.03016, 48.03018], [48.09, 48.09002, 48.09004, 48.09006, 48.09008, 48.0901, @@ -125,28 +115,18 @@ RES = np.ma.array([[286.03159412, 286.03162417, 286.03165421, 286.03168426, class TestAtmCorrection(unittest.TestCase): - - """Class for testing pyspectral.atm_correction_ir""" - - def setUp(self): - """Setup the test""" - pass + """Class for testing pyspectral.atm_correction_ir.""" def test_get_correction(self): """Test getting the atm correction""" this = AtmosphericalCorrection('EOS-Terra', 'modis') atm_corr = this.get_correction(SATZ, None, TBS) - assertNumpyArraysEqual(TBS, atm_corr) - - def tearDown(self): - """Clean up""" - pass + np.testing.assert_almost_equal(TBS, atm_corr) def suite(): - """The test suite for test_atm_correction_ir. - """ + """Create the test suite for test_atm_correction_ir.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() mysuite.addTest(loader.loadTestsFromTestCase(TestAtmCorrection)) ===================================== pyspectral/tests/test_blackbody.py ===================================== @@ -1,11 +1,11 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2013, 2014, 2015, 2016, 2017 Adam.Dybbroe +# Copyright (c) 2013 - 2019 Pytroll # Author(s): -# Adam.Dybbroe +# Adam.Dybbroe # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by @@ -31,10 +31,8 @@ from pyspectral.tests.unittest_helpers import assertNumpyArraysEqual import unittest import numpy as np -#RAD_11MICRON_300KELVIN = 9572498.1141643394 -RAD_11MICRON_300KELVIN = 9573177.8811719529 -#RAD_11MICRON_301KELVIN = 9713997.9623772576 -RAD_11MICRON_301KELVIN = 9714688.2959563732 +RAD_11MICRON_300KELVIN = 9573176.935507433 +RAD_11MICRON_301KELVIN = 9714686.576498277 # Radiances in wavenumber space (SI-units) WN_RAD_11MICRON_300KELVIN = 0.00115835441353 @@ -43,18 +41,28 @@ WN_RAD_11MICRON_301KELVIN = 0.00117547716523 __unittest = True -class TestBlackbody(unittest.TestCase): +class CustomScheduler(object): + """Custom dask scheduler that raises an exception if dask is computed too many times.""" + + def __init__(self, max_computes=1): + """Set starting and maximum compute counts.""" + self.max_computes = max_computes + self.total_computes = 0 + + def __call__(self, dsk, keys, **kwargs): + """Compute dask task and keep track of number of times we do so.""" + import dask + self.total_computes += 1 + if self.total_computes > self.max_computes: + raise RuntimeError("Too many dask computations were scheduled: {}".format(self.total_computes)) + return dask.get(dsk, keys, **kwargs) - """Unit testing the blackbody function""" - def setUp(self): - """Set up""" - return +class 
TestBlackbody(unittest.TestCase): + """Unit testing the blackbody function""" def test_blackbody(self): - """Calculate the blackbody radiation from wavelengths and - temperatures - """ + """Calculate the blackbody radiation from wavelengths and temperatures.""" wavel = 11. * 1E-6 black = blackbody((wavel, ), [300., 301]) self.assertEqual(black.shape[0], 2) @@ -71,14 +79,28 @@ class TestBlackbody(unittest.TestCase): tb_therm = np.array([[300., 301], [299, 298], [279, 286]]) black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, np.ndarray) tb_therm = np.array([[300., 301], [0., 298], [279, 286]]) black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, np.ndarray) + + def test_blackbody_dask(self): + """Calculate the blackbody radiation from wavelengths and temperatures with dask arrays.""" + import dask + import dask.array as da + tb_therm = da.from_array([[300., 301], [299, 298], [279, 286]], chunks=2) + with dask.config.set(scheduler=CustomScheduler(0)): + black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, da.Array) + + tb_therm = da.from_array([[300., 301], [0., 298], [279, 286]], chunks=2) + with dask.config.set(scheduler=CustomScheduler(0)): + black = blackbody((10. * 1E-6, 11.e-6), tb_therm) + self.assertIsInstance(black, da.Array) def test_blackbody_wn(self): - """Calculate the blackbody radiation from wavenumbers and - temperatures - """ + """Calculate the blackbody radiation from wavenumbers and temperatures.""" wavenumber = 90909.1 # 11 micron band black = blackbody_wn((wavenumber, ), [300., 301]) self.assertEqual(black.shape[0], 2) @@ -106,9 +128,24 @@ class TestBlackbody(unittest.TestCase): assertNumpyArraysEqual(t__, expected) - def tearDown(self): - """Clean up""" - return + def test_blackbody_wn_dask(self): + """Test that blackbody rad2temp preserves dask arrays.""" + import dask + import dask.array as da + wavenumber = 90909.1 # 11 micron band + radiances = da.from_array([0.001, 0.0009, 0.0012, 0.0018], chunks=2).reshape(2, 2) + with dask.config.set(scheduler=CustomScheduler(0)): + t__ = blackbody_wn_rad2temp(wavenumber, radiances) + self.assertIsInstance(t__, da.Array) + t__ = t__.compute() + expected = np.array([290.3276916, 283.76115441, + 302.4181330, 333.1414164]).reshape(2, 2) + self.assertAlmostEqual(t__[1, 1], expected[1, 1], 5) + self.assertAlmostEqual(t__[0, 0], expected[0, 0], 5) + self.assertAlmostEqual(t__[0, 1], expected[0, 1], 5) + self.assertAlmostEqual(t__[1, 0], expected[1, 0], 5) + + assertNumpyArraysEqual(t__, expected) def suite(): ===================================== pyspectral/tests/test_rad_tb_conversions.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (c) 2014-2018 Adam.Dybbroe +# Copyright (c) 2014-2019 Adam.Dybbroe # Author(s): @@ -347,10 +347,10 @@ class TestRadTbConversions(unittest.TestCase): self.assertTrue(np.allclose(TRUE_RADS * integral, res['radiance'])) res = self.modis.tb2radiance(237., lut=False) - self.assertAlmostEqual(16570.592171157, res['radiance']) + self.assertAlmostEqual(16570.579551068, res['radiance']) res = self.modis.tb2radiance(277., lut=False) - self.assertAlmostEqual(167544.3823631, res['radiance']) + self.assertAlmostEqual(167544.39368663222, res['radiance']) res = self.modis.tb2radiance(1.1, lut=False) self.assertAlmostEqual(0.0, res['radiance']) @@ -362,7 +362,7 @@ class TestRadTbConversions(unittest.TestCase): self.assertAlmostEqual(5.3940515573e-06, res['radiance']) res 
= self.modis.tb2radiance(200.1, lut=False) - self.assertAlmostEqual(865.09776189, res['radiance']) + self.assertAlmostEqual(865.09759706, res['radiance']) def tearDown(self): """Clean up""" ===================================== pyspectral/tests/test_reflectance.py ===================================== @@ -139,7 +139,10 @@ class TestReflectance(unittest.TestCase): with patch('pyspectral.radiance_tb_conversion.RelativeSpectralResponse') as mymock: instance = mymock.return_value - instance.rsr = TEST_RSR + # VIIRS doesn't have a channel '20' like MODIS so the generic + # mapping this test will end up using will find 'ch20' for VIIRS + viirs_rsr = {'ch20': TEST_RSR['20'], '99': TEST_RSR['99']} + instance.rsr = viirs_rsr instance.unit = '1e-6 m' instance.si_scale = 1e-6 @@ -148,7 +151,7 @@ class TestReflectance(unittest.TestCase): refl37 = Calculator('Suomi-NPP', 'viirs', 3.7) self.assertEqual(refl37.bandwavelength, 3.7) - self.assertEqual(refl37.bandname, '20') + self.assertEqual(refl37.bandname, 'ch20') with patch('pyspectral.radiance_tb_conversion.RelativeSpectralResponse') as mymock: instance = mymock.return_value ===================================== pyspectral/tests/test_utils.py ===================================== @@ -115,8 +115,7 @@ class TestUtils(unittest.TestCase): self.rsr = RsrTestData() def test_convert2wavenumber(self): - """Testing the conversion of rsr from wavelength to wavenumber - """ + """Testing the conversion of rsr from wavelength to wavenumber.""" newrsr, info = utils.convert2wavenumber(TEST_RSR) unit = info['unit'] self.assertEqual(unit, 'cm-1') @@ -127,11 +126,7 @@ class TestUtils(unittest.TestCase): self.assertTrue(np.allclose(wvn_res, wvn)) def test_get_bandname_from_wavelength(self): - """Test that it is possible to get the right bandname provided the wavelength - in micro meters - - """ - + """Test the right bandname is found provided the wavelength in micro meters.""" x = utils.get_bandname_from_wavelength('abi', 0.4, self.rsr.rsr) self.assertEqual(x, 'ch1') with self.assertRaises(AttributeError): @@ -146,19 +141,16 @@ class TestUtils(unittest.TestCase): x = utils.get_bandname_from_wavelength('abi', 1.0, self.rsr.rsr) self.assertEqual(x, None) + # uses generic channel mapping where '20' -> 'ch20' bandname = utils.get_bandname_from_wavelength('abi', 3.7, TEST_RSR) - self.assertEqual(bandname, '20') + self.assertEqual(bandname, 'ch20') bandname = utils.get_bandname_from_wavelength('abi', 3.0, TEST_RSR) self.assertIsNone(bandname) - def tearDown(self): - """Clean up""" - pass - def suite(): - """The suite for test_utils.""" + """Create the suite for test_utils.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() mysuite.addTest(loader.loadTestsFromTestCase(TestUtils)) ===================================== pyspectral/utils.py ===================================== @@ -78,9 +78,19 @@ BANDNAMES['generic'] = {'VIS006': 'VIS0.6', 'C15': 'ch15', 'C16': 'ch16', } +# handle arbitrary channel numbers +for chan_num in range(1, 37): + BANDNAMES['generic'][str(chan_num)] = 'ch{:d}'.format(chan_num) -BANDNAMES['avhrr-3'] = {'3b': 'ch3b', - '3a': 'ch3a'} +# MODIS RSR files were made before 'chX' became standard in pyspectral +BANDNAMES['modis'] = {str(chan_num): str(chan_num) for chan_num in range(1, 37)} + +BANDNAMES['avhrr-3'] = {'1': 'ch1', + '2': 'ch2', + '3b': 'ch3b', + '3a': 'ch3a', + '4': 'ch4', + '5': 'ch5'} BANDNAMES['ahi'] = {'B01': 'ch1', 'B02': 'ch2', @@ -120,12 +130,16 @@ INSTRUMENTS = {'NOAA-19': 'avhrr/3', 'Suomi-NPP': 'viirs', 'NOAA-20': 'viirs', 
'FY-3D': 'mersi-2', - 'Feng-Yun 3D': 'mersi-2' + 'FY-3C': 'virr', + 'FY-3B': 'virr', + 'Feng-Yun 3D': 'mersi-2', + 'FY-4A': 'agri' } -HTTP_PYSPECTRAL_RSR = "https://zenodo.org/record/2653487/files/pyspectral_rsr_data.tgz" + +HTTP_PYSPECTRAL_RSR = "https://zenodo.org/record/3381130/files/pyspectral_rsr_data.tgz" RSR_DATA_VERSION_FILENAME = "PYSPECTRAL_RSR_VERSION" -RSR_DATA_VERSION = "v1.0.6" +RSR_DATA_VERSION = "v1.0.9" ATM_CORRECTION_LUT_VERSION = {} ATM_CORRECTION_LUT_VERSION['antarctic_aerosol'] = {'version': 'v1.0.1', ===================================== pyspectral/version.py ===================================== @@ -46,9 +46,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (tag: v0.8.9)" - git_full = "b3a1f58f49e2ab7c81c331e379065f24eb21f1ba" - git_date = "2019-06-07 09:27:42 +0200" + git_refnames = " (HEAD -> master, tag: v0.9.0)" + git_full = "2372396547ecace7fbcefc50d5dcbaa32e4a5177" + git_date = "2019-08-30 18:04:19 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== rsr_convert_scripts/README.rst ===================================== @@ -176,3 +176,12 @@ the pyspectral.yaml file: Adam Dybbroe Sat Dec 1 17:39:48 2018 +.. code:: + + %> python virr_rsr.py + +Converting the FY-3B or FY-3C VIRR spectral responses to HDF5. Original files +for FY-3B come as ``.prn`` text files for each channel (ex. ``ch1.prn``). For +FY-3C they come as ``.txt`` text files for channels 1, 2, 6, 7, 8, 9, and 10 +only with names like ``FY3C_VIRR_CH01.txt``. + ===================================== rsr_convert_scripts/agri_rsr.py ===================================== @@ -0,0 +1,108 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- + +# Copyright (c) 2018, 2019 Pytroll + +# Author(s): + +# Xin.Zhang +# Adam.Dybbroe + +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +"""Read the FY-4A AGRI relative spectral responses. 
Data from +http://fy4.nsmc.org.cn/portal/cn/fycv/srf.html +""" +import os +import numpy as np +from pyspectral.utils import INSTRUMENTS +from pyspectral.utils import convert2hdf5 as tohdf5 +from pyspectral.raw_reader import InstrumentRSR +from pyspectral.utils import logging_on, get_logger + +FY4A_BAND_NAMES = ['ch1', 'ch2', 'ch3', 'ch4', 'ch5', 'ch6', 'ch7', 'ch8', + 'ch9', 'ch10', 'ch11', 'ch12', 'ch13', 'ch14'] +BANDNAME_SCALE2MICROMETERS = {'ch1': 0.001, + 'ch2': 0.001, + 'ch3': 0.001, + 'ch4': 1.0, + 'ch5': 1.0, + 'ch6': 1.0, + 'ch7': 1.0, + 'ch8': 1.0, + 'ch9': 1.0, + 'ch10': 1.0, + 'ch11': 1.0, + 'ch12': 1.0, + 'ch13': 1.0, + 'ch14': 1.0} + + +class AGRIRSR(InstrumentRSR): + """Container for the FY-4 AGRI RSR data""" + + def __init__(self, bandname, platform_name): + """Initialise the FY-4 AGRI relative spectral response data""" + super(AGRIRSR, self).__init__(bandname, platform_name, FY4A_BAND_NAMES) + + self.instrument = INSTRUMENTS.get(platform_name, 'agri') + + self._get_options_from_config() + self._get_bandfilenames() + + LOG.debug("Filenames: %s", str(self.filenames)) + if self.filenames[bandname] and os.path.exists(self.filenames[bandname]): + self.requested_band_filename = self.filenames[bandname] + scale = BANDNAME_SCALE2MICROMETERS.get(bandname) + if scale: + self._load(scale=scale) + else: + LOG.error( + "Failed determine the scale used to convert to wavelength in micrometers - channel = %s", bandname) + raise AttributeError('no scale for bandname %s', bandname) + + else: + LOG.warning("Couldn't find an existing file for this band: %s", + str(self.bandname)) + + self.filename = self.requested_band_filename + + def _load(self, scale=0.001): + """Load the AGRI RSR data for the band requested + + Wavelength is given in nanometers. + """ + data = np.genfromtxt(self.requested_band_filename, + unpack=True, + names=['wavelength', + 'response'], + skip_header=0) + + wavelength = data['wavelength'] * scale + response = data['response'] + + self.rsr = {'wavelength': wavelength, 'response': response} + + +def main(): + """Main""" + for platform_name in ["FY-4A", ]: + tohdf5(AGRIRSR, platform_name, FY4A_BAND_NAMES) + + +if __name__ == "__main__": + LOG = get_logger(__name__) + logging_on() + + main() ===================================== rsr_convert_scripts/virr_rsr.py ===================================== @@ -0,0 +1,89 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- +# +# Copyright (c) 2019 Adam.Dybbroe +# +# Author(s): +# +# David Hoese +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . +"""Read the VIRR relative spectral responses. 
+ +Data from http://gsics.nsmc.org.cn/portal/en/fycv/srf.html + +""" + +import os +import numpy as np +from pyspectral.utils import INSTRUMENTS +from pyspectral.utils import convert2hdf5 as tohdf5 +from pyspectral.raw_reader import InstrumentRSR + +import logging +LOG = logging.getLogger(__name__) + +VIRR_BAND_NAMES = { + 'FY-3B': ['ch{:d}'.format(x) for x in range(1, 11)], + 'FY-3C': ['ch1', 'ch2'] + ['ch{:d}'.format(x) for x in range(6, 11)], +} + + +class VirrRSR(InstrumentRSR): + """Container for the FY-3B/FY-3C VIRR RSR data.""" + + def __init__(self, bandname, platform_name): + """Verify that file exists and can be read.""" + super(VirrRSR, self).__init__(bandname, platform_name, VIRR_BAND_NAMES[platform_name]) + + self.instrument = INSTRUMENTS.get(platform_name, 'virr') + self._get_options_from_config() + self._get_bandfilenames() + + LOG.debug("Filenames: %s", str(self.filenames)) + if self.filenames[bandname] and os.path.exists(self.filenames[bandname]): + self.requested_band_filename = self.filenames[bandname] + self._load() + else: + LOG.warning("Couldn't find an existing file for this band: %s", + str(self.bandname)) + + # To be compatible with VIIRS.... + self.filename = self.requested_band_filename + + def _load(self, scale=0.001): + """Load the VIRR RSR data for the band requested. + + Wavelength is given in nanometers. + """ + data = np.genfromtxt(self.requested_band_filename, + unpack=True, + names=['wavelength', + 'response'], + skip_header=0) + + wavelength = data['wavelength'] * scale + response = data['response'] + + self.rsr = {'wavelength': wavelength, 'response': response} + + +def main(): + """Main""" + for platform_name, band_names in VIRR_BAND_NAMES.items(): + tohdf5(VirrRSR, platform_name, band_names) + + +if __name__ == "__main__": + main() View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/commit/bdb76ea7177514603c82b55f5f96f170e21171a6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/commit/bdb76ea7177514603c82b55f5f96f170e21171a6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 07:17:39 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 02 Sep 2019 06:17:39 +0000 Subject: [Git][debian-gis-team/trollimage][master] 2 commits: Remove obsolete fields Name, Contact from debian/upstream/metadata. Message-ID: <5d6cb403e4612_577b3f91d054954c29363b@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / trollimage Commits: 6e46a3d0 by Antonio Valentino at 2019-09-02T06:16:07Z Remove obsolete fields Name, Contact from debian/upstream/metadata. - - - - - 57d3249a by Antonio Valentino at 2019-09-02T06:17:25Z Set distribution to unstable - - - - - 2 changed files: - debian/changelog - debian/upstream/metadata Changes: ===================================== debian/changelog ===================================== @@ -1,11 +1,12 @@ -trollimage (1.9.0-2) UNRELEASED; urgency=medium +trollimage (1.9.0-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. * Use debhelper-compat instead of debian/compat. * Set compat to 12. * Set upstream metadata fields: Contact. + * Remove obsolete fields Name, Contact from debian/upstream/metadata. 
- -- Antonio Valentino Wed, 10 Jul 2019 19:28:10 +0200 + -- Antonio Valentino Mon, 02 Sep 2019 06:17:09 +0000 trollimage (1.9.0-1) unstable; urgency=medium ===================================== debian/upstream/metadata ===================================== @@ -1,6 +1,4 @@ Bug-Database: https://github.com/pytroll/trollimage/issues Bug-Submit: https://github.com/pytroll/trollimage/issues/new -Contact: Martin Raspaud -Name: Trollimage Repository: https://github.com/pytroll/trollimage.git Repository-Browse: https://github.com/pytroll/trollimage View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/compare/22756e846eeb7015c6e9b161a1269dbf5536630b...57d3249a227f615b25b3dce3f2895222e5026410 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/compare/22756e846eeb7015c6e9b161a1269dbf5536630b...57d3249a227f615b25b3dce3f2895222e5026410 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 07:29:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 06:29:53 +0000 Subject: [Git][debian-gis-team/pyspectral] Pushed new tag debian/0.9.0+ds-1 Message-ID: <5d6cb6e14509e_577b2ade612253702943e9@godard.mail> Bas Couwenberg pushed new tag debian/0.9.0+ds-1 at Debian GIS Project / pyspectral -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/tree/debian/0.9.0+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 07:38:47 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 06:38:47 +0000 Subject: [Git][debian-gis-team/trollimage] Pushed new tag debian/1.9.0-2 Message-ID: <5d6cb8f7d746d_577b2ade6122537029472e@godard.mail> Bas Couwenberg pushed new tag debian/1.9.0-2 at Debian GIS Project / trollimage -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/tree/debian/1.9.0-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Mon Sep 2 07:39:54 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 06:39:54 +0000 Subject: Processing of pyspectral_0.9.0+ds-1_source.changes Message-ID: pyspectral_0.9.0+ds-1_source.changes uploaded successfully to localhost along with the files: pyspectral_0.9.0+ds-1.dsc pyspectral_0.9.0+ds.orig.tar.xz pyspectral_0.9.0+ds-1.debian.tar.xz pyspectral_0.9.0+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 2 07:49:54 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 06:49:54 +0000 Subject: Processing of trollimage_1.9.0-2_source.changes Message-ID: trollimage_1.9.0-2_source.changes uploaded successfully to localhost along with the files: trollimage_1.9.0-2.dsc trollimage_1.9.0-2.debian.tar.xz trollimage_1.9.0-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 2 08:05:14 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 07:05:14 +0000 Subject: pyspectral_0.9.0+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 02 Sep 2019 06:04:23 +0000 Source: pyspectral Architecture: source Version: 0.9.0+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: pyspectral (0.9.0+ds-1) unstable; urgency=medium . * New upstream release. Checksums-Sha1: 7cc6fd7f2b335dbb0e8cfe5c6b60228ffdb24296 2454 pyspectral_0.9.0+ds-1.dsc 1f43575093616fe277029afb07dbacc241af8690 3607288 pyspectral_0.9.0+ds.orig.tar.xz 0eacbe6469eadf0769ecd1f5decf18227d73cf22 118536 pyspectral_0.9.0+ds-1.debian.tar.xz 56a3951a72c63856bad30834a6c0d975f379fb50 10409 pyspectral_0.9.0+ds-1_amd64.buildinfo Checksums-Sha256: 03db3725a4cb773d26a06fa7c59728d8059343217db3e0be2d87ffa14acd3b03 2454 pyspectral_0.9.0+ds-1.dsc bbfc5a3976466d28308678d2285df2dd93d9cb6c7f5f72d1488b78743a01084b 3607288 pyspectral_0.9.0+ds.orig.tar.xz b57a1d8ebf2819b48fd9f3df633d8ed66973e45649430c9a5f511174ba74bd58 118536 pyspectral_0.9.0+ds-1.debian.tar.xz 0599c33110de99b09669a47788264d49465ae7e782b00156acc357e48bc9d1e5 10409 pyspectral_0.9.0+ds-1_amd64.buildinfo Files: 0b32bdd3ae682e72c4b7a4c861174bd2 2454 python optional pyspectral_0.9.0+ds-1.dsc cd453e4ab9a1000120598d3d464209b7 3607288 python optional pyspectral_0.9.0+ds.orig.tar.xz 04063ce2647330bc0ef6ec05d17b3158 118536 python optional pyspectral_0.9.0+ds-1.debian.tar.xz 8c405f1cb7435d3657be9a896e9454fe 10409 python optional pyspectral_0.9.0+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1sts8ACgkQZ1DxCuiN SvHpSQ//VAAif0BayRivObyCwTG6asx/JELr/RgIribTBQoUl7veGP/bsaW9eF6j q95k7mFNDEinSVl6w34rBcYeLPA2g/gAex6J9olkrk/qIR9mW2Jko+V6ytIwDcOX /ZQVAX1T923hOtppJP3Dn84xbpkogT+RGRxJh1sij55M0U/Ceyz1jsMz2d7lsUAS mfI7ArTurCnn59UhxGJRqtd82WjShfLsI8Omr+uCuDWaQH47T1stb2d/xRMAvW7l mA0qkMfg9HdFZGSOwfJwmoheM8AHZ3OmxUOnOImh9+htIbkzMdtf/jtQ9azx5B2A /5OgDSrFTQ8cb9wrEYoygv/2Xi9eRvtrYJREHigXhlNTVt3YznIlkUEP9Yu47++m +TxN8Vsu3NIMlZ3MAVLVXrUtKqUlDMDOizlJk9hQQVPvAb8COBqbFrCIidZXdJ0Z ZTLhz7n/jtZJZMQpcVvSTB45eJ7XrmdKL0MFyR0Xl11zzfyGc/ObGF1ZabGSW/xF HPO6TYyF7z+0lb/GYBXXxe8B+FFyW1IDaPaIuCyE/n5UP25zs0rgPpwS2DBmTfda niatxo/1RNS+dKAJtHrG/UobkRpZMVOtNtlGscNMZ4Ic8lMgKxZSRHI9MZ1zs6QO 97lH8ptVLehlPWM2e7Cx9WFfWkQjQhUcoV3d7LlnvyH4mH9BzA4= 
=zX8r -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Mon Sep 2 08:05:18 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 07:05:18 +0000 Subject: trollimage_1.9.0-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 02 Sep 2019 06:17:09 +0000 Source: trollimage Architecture: source Version: 1.9.0-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: trollimage (1.9.0-2) unstable; urgency=medium . * Bump Standards-Version to 4.4.0, no changes. * Use debhelper-compat instead of debian/compat. * Set compat to 12. * Set upstream metadata fields: Contact. * Remove obsolete fields Name, Contact from debian/upstream/metadata. Checksums-Sha1: 21331dfbc4ef1d60b0428384562279fba819f396 2151 trollimage_1.9.0-2.dsc e6c8b7493bfd3825bda87555a6491cc0e18dd66f 3096 trollimage_1.9.0-2.debian.tar.xz 9a7df33b0b14cd83dd881af4422c689af71b63f3 9951 trollimage_1.9.0-2_amd64.buildinfo Checksums-Sha256: fc7c029134e0f8489083a0fae545ba237e3e7373a70fdcf2a324692b7504068a 2151 trollimage_1.9.0-2.dsc 8aa4a4e3323ef299ce233b150ac66f7a3b1a15a69e2c65da752f656ec2182bb0 3096 trollimage_1.9.0-2.debian.tar.xz 09f7349f9e6fe558f1dccd9eaa4056e221df0050ac1460fe9b28a373547ebe67 9951 trollimage_1.9.0-2_amd64.buildinfo Files: 460230124df46dac70676df97d9a2e04 2151 python optional trollimage_1.9.0-2.dsc bcfac8142069950f98b863ac7c82f249 3096 python optional trollimage_1.9.0-2.debian.tar.xz 63fcfd7e7f9f045aa26bdbcd1240cefd 9951 python optional trollimage_1.9.0-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1suOUACgkQZ1DxCuiN SvGPcw//WUuPU5rCBbd19Emkg2OBGLq6fRpdBr0ASSb6gJTLwRxL1MMlbqKbQOdz WMcwH15atJMwGmyz0N5Bc+CjRxj5VlgRJQJrgrURCrFTBGOAdxYyBA7DWdqiAC2w 7HDbHzWIBWIyUpfj/26wrtjuZibbUjDsgjPAAU8ulbnc3+W9DdNCel8ntpxvZeiB +xFg3zxlr7uqy7QddBZYd3VZv9r5yau+FbJqPNcLxaEmhV4BIYYlv0AhibJGwzl+ zBdnZWfdqTKrSe/3u9htow0LMPwtKsYkNdZrEVWCvcacdBmnPN2OvtCAfuObPXLN pFqablqSObmlcdTWcByi2dv9aC4TlZfBHj70Qy22lDzL7eoJICBGgqZ63lkxzLgt FJk78ddmE0LIW1bwHFRFLZIaW8r45WlBnQRjRD1B8JgieW31ymG8cLPXpnc2ntwo 1lOePUVbDujjiAVd6RkASm06PohMGVYJeu+rxNdnir0+ubVHh2l8jJhtxYMuWIKc hyX/P/gB8MsJk18g89BhOllmQi3xu+UuIT+sFcQHUR0IWcPJQbahwnuhh6QvRKZn ePFeejv51uN1mR4UCcE1d+zxEUuIHVlkqdcmdpvTNgT367bmHM6EqzWehfcffjJ0 wHE4cSpEQq6T06IzDnOWcZb6rKGi5/pmnfaChr9j8YNK4YO+G+w= =3FYX -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From brenda at acae.co.za Mon Sep 2 09:05:43 2019 From: brenda at acae.co.za (Brenda Mweshi (Workshop Invitation)) Date: Mon, 2 Sep 2019 10:05:43 +0200 Subject: Protocol, Etiquette & Event Management Workshop lnvitation Message-ID: <4772381967056277577829@PROD08> Thank you for Receiving this workshop invitation. You may please unsubscribe here if you no longer wish to receive our emails --- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 1.png Type: image/png Size: 44672 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 2.gif Type: image/gif Size: 646 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: 3.gif Type: image/gif Size: 641 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 4.gif Type: image/gif Size: 650 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 5.gif Type: image/gif Size: 650 bytes Desc: not available URL: From gitlab at salsa.debian.org Mon Sep 2 12:35:03 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 11:35:03 +0000 Subject: [Git][debian-gis-team/python-pyproj][master] Move from experimental to unstable. Message-ID: <5d6cfe6748355_577b3f91d4b840343175e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pyproj Commits: 5220dae9 by Bas Couwenberg at 2019-09-02T11:26:39Z Move from experimental to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-pyproj (2.3.1+ds-1) unstable; urgency=medium + + * Move from experimental to unstable. + + -- Bas Couwenberg Mon, 02 Sep 2019 13:26:21 +0200 + python-pyproj (2.3.1+ds-1~exp1) experimental; urgency=medium * New upstream release. View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/5220dae973d1767f0c171504500ddf279b3876fc -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/5220dae973d1767f0c171504500ddf279b3876fc You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 12:35:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 11:35:10 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag debian/2.3.1+ds-1 Message-ID: <5d6cfe6e46b7e_577b2ade611d5a50317766@godard.mail> Bas Couwenberg pushed new tag debian/2.3.1+ds-1 at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/debian/2.3.1+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 2 12:42:42 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 11:42:42 +0000 Subject: Processing of python-pyproj_2.3.1+ds-1_source.changes Message-ID: python-pyproj_2.3.1+ds-1_source.changes uploaded successfully to localhost along with the files: python-pyproj_2.3.1+ds-1.dsc python-pyproj_2.3.1+ds-1.debian.tar.xz python-pyproj_2.3.1+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Mon Sep 2 12:45:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 11:45:43 +0000 Subject: [Git][debian-gis-team/proj][master] 17 commits: Update branch in gbp.conf & Vcs-Git URL. Message-ID: <5d6d00e71f1dd_577b2ade612253703195db@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / proj Commits: 9ef4d1e9 by Bas Couwenberg at 2019-08-27T03:52:25Z Update branch in gbp.conf & Vcs-Git URL. 
- - - - - 53616d07 by Bas Couwenberg at 2019-08-27T03:54:09Z New upstream version 6.2.0~rc1 - - - - - 44f00d2f by Bas Couwenberg at 2019-08-27T03:54:31Z Update upstream source from tag 'upstream/6.2.0_rc1' Update to upstream version '6.2.0~rc1' with Debian dir ad1112258c93111992454733476a5e15adfade4d - - - - - b32ee154 by Bas Couwenberg at 2019-08-27T03:55:00Z New upstream release candidate. - - - - - 7bedbe18 by Bas Couwenberg at 2019-08-27T03:58:07Z Add license & copyright for nlohmann sources. - - - - - 46e10ca8 by Bas Couwenberg at 2019-08-27T04:17:23Z Update symbols for amd64. - - - - - 9df89c97 by Bas Couwenberg at 2019-08-27T04:17:23Z Set distribution to experimental. - - - - - 60589ae3 by Bas Couwenberg at 2019-08-27T05:32:01Z Don't remove data/null on clean, included upstream too. - - - - - f7998464 by Bas Couwenberg at 2019-09-01T07:34:54Z New upstream version 6.2.0 - - - - - be21fd98 by Bas Couwenberg at 2019-09-01T07:35:14Z Update upstream source from tag 'upstream/6.2.0' Update to upstream version '6.2.0' with Debian dir 4fbddd0ef622549979b2cb4ead687e3fd15e6434 - - - - - 27978788 by Bas Couwenberg at 2019-09-01T07:35:43Z New upstream release. - - - - - b4b1b7a0 by Bas Couwenberg at 2019-09-01T07:37:23Z Update symbols for other architectures. - - - - - 2b51696f by Bas Couwenberg at 2019-09-01T07:38:13Z Strip pre-releases from symbols version. - - - - - dba97801 by Bas Couwenberg at 2019-09-01T07:38:37Z Set distribution to experimental. - - - - - 15ad84e7 by Bas Couwenberg at 2019-09-02T11:06:53Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit 9ef4d1e9d19c6de7b7f5b436602cbe3a8a67bc31. - - - - - c7933564 by Bas Couwenberg at 2019-09-02T11:25:55Z Update symbols for other architectures. - - - - - c05baad6 by Bas Couwenberg at 2019-09-02T11:25:55Z Set distribution to unstable. - - - - - 30 changed files: - CMakeLists.txt - NEWS - README - README.md - cmake/Makefile.am - cmake/Makefile.in - − cmake/ProjSystemInfo.cmake - configure - configure.ac - data/Makefile.am - data/Makefile.in - + data/null - + data/projjson.schema.json - data/sql/alias_name.sql - data/sql/area.sql - data/sql/concatenated_operation.sql - data/sql/customizations.sql - data/sql/geodetic_crs.sql - data/sql/geodetic_datum.sql - data/sql/grid_alternatives.sql - data/sql/grid_transformation.sql - data/sql/helmert_transformation.sql - data/sql/metadata.sql - data/sql/other_transformation.sql - data/sql/projected_crs.sql - data/sql/supersession.sql - data/sql/vertical_crs.sql - data/sql/vertical_datum.sql - debian/changelog - debian/copyright The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/compare/7470f303839e8090530c05fe0894f63d3f52325e...c05baad6167ea4f49d6f2e7bfbd0f712be91483d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/compare/7470f303839e8090530c05fe0894f63d3f52325e...c05baad6167ea4f49d6f2e7bfbd0f712be91483d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 2 12:45:47 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 11:45:47 +0000 Subject: [Git][debian-gis-team/proj] Pushed new tag debian/6.2.0-1 Message-ID: <5d6d00eb9a90f_577b2ade611d5a50319778@godard.mail> Bas Couwenberg pushed new tag debian/6.2.0-1 at Debian GIS Project / proj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/tree/debian/6.2.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 2 12:49:50 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 11:49:50 +0000 Subject: python-pyproj_2.3.1+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 02 Sep 2019 13:26:21 +0200 Source: python-pyproj Architecture: source Version: 2.3.1+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pyproj (2.3.1+ds-1) unstable; urgency=medium . * Move from experimental to unstable. Checksums-Sha1: f24a5a350a69b28f78b03583550ac2169e31a330 2201 python-pyproj_2.3.1+ds-1.dsc ab474b6bd2475ae9b363fd9ae8ed4ec190ed6e08 6068 python-pyproj_2.3.1+ds-1.debian.tar.xz a6f31af2f4049f37d7e697c5443d4c8f67e9d1d0 8610 python-pyproj_2.3.1+ds-1_amd64.buildinfo Checksums-Sha256: 4f8a1bb960c6a958fbba803d4f472a9acf91d46831087c4aea19cf2f8d143919 2201 python-pyproj_2.3.1+ds-1.dsc 40104fb8b8f8ab35a921406d963b67b94fd7e2f6b44574e5ed22012dc662591e 6068 python-pyproj_2.3.1+ds-1.debian.tar.xz 45fdc31c144bfb66b32db423319f8146bbaa5f5e12dd7bb2e08f9015a572c434 8610 python-pyproj_2.3.1+ds-1_amd64.buildinfo Files: e6d25c4d1e184faa36670ea6936cb834 2201 python optional python-pyproj_2.3.1+ds-1.dsc 433e6582fe544b98c67d85b3144e1a09 6068 python optional python-pyproj_2.3.1+ds-1.debian.tar.xz 44119f6b63405c062cb3ebef46c8ecb1 8610 python optional python-pyproj_2.3.1+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1s/lMACgkQZ1DxCuiN SvE2oQ//X1l6LjVz+2a+dMmAryGG7/U7FEFff2/BOxlgjRTkkcj/rgjay6JXl5tk OKZTwZT3qiXWdM61DGitS3d7l/J7Jjtoud2+lHa5tXFgc+BLgBAlg/u/6ub1Hjuj ntG11/ric9SaSLZsm3ItsT2+PRrZJ3zbafasI/p6UQB359K+AYHxVofvLdDws0aT JlAlQNpVhXBtLzuYLtxMhDVxGDgtwoT73jPmW0EuMLU8sJRWcr11bLqPe6sGFNCE yDU5A+jv5VSGP+W+A7yiEpsYnPbJCvDPnMiKQsa4VAUXGwPuPLDiUjf00ZBd1i3F SyVVJO6uV9gvLcy1pyBxKHucYRgVJ9NPv1rEeE1Fxp6+b0ZGwX+1ILJpUGFOcfEY xHExs/4u0gCHMn6ZLgNTPATp55gGDHPpMG2yZdwgRdjV1uKJ1JZB0H6muEF5x46D TwANu7B7Fg17210vRwtdaYrZmuAd2f1niPBAkhiu5xc4Hlq8pykUtLOvpeB7sQh7 zacyTeHTra9MPmzmEn9EhRxvqM+FbbOyQJEQiDg+AJztY1oxiJyUneKPlq7EDyJ6 6yzaLipZU9+Eyy9rx+BWtom8C8Wz7WCBBdf08bw9MFYZ5GzqsC+EuENwiaseZIDX U1WAV17yYd4naEVS2zrdy0aweubyIE/I6G1k4AThu9vVtmwyB80= =lcYC -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From ftpmaster at ftp-master.debian.org Mon Sep 2 12:57:45 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 11:57:45 +0000 Subject: Processing of proj_6.2.0-1_source.changes Message-ID: proj_6.2.0-1_source.changes uploaded successfully to localhost along with the files: proj_6.2.0-1.dsc proj_6.2.0-1.debian.tar.xz proj_6.2.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 2 13:04:50 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 12:04:50 +0000 Subject: proj_6.2.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 02 Sep 2019 13:10:14 +0200 Source: proj Architecture: source Version: 6.2.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: proj (6.2.0-1) unstable; urgency=medium . * Update symbols for other architectures. * Move from experimental to unstable. Checksums-Sha1: c54bdd9e227ae6cd076d4825a41c09556a40cce9 2167 proj_6.2.0-1.dsc f686a97662149691adec337fc53148a153b1185b 10210308 proj_6.2.0-1.debian.tar.xz a22282971c8e4bfeba3955689e01f191b90f71cb 8087 proj_6.2.0-1_amd64.buildinfo Checksums-Sha256: 3e7c6af95f3b5e8f987b8f8b4927fe1044686ef652533d34b85fc1fc4eb6977a 2167 proj_6.2.0-1.dsc 356fb69da850433a3f58986f3d307cc1c9195718080c5fc8677d9ebef84fcab4 10210308 proj_6.2.0-1.debian.tar.xz c106686b9d207d11e0a444f5961e6d9ec64495726f06fa97797414fdee29baa5 8087 proj_6.2.0-1_amd64.buildinfo Files: b33be135a6abe8b97a44d398d0f03033 2167 science optional proj_6.2.0-1.dsc b717cb189a42e73185e58b79bd06e7f7 10210308 science optional proj_6.2.0-1.debian.tar.xz 4474508263bb3fc7ac6b517df57ece61 8087 science optional proj_6.2.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1tANAACgkQZ1DxCuiN SvEE9g/9FxFjIwPq5AnwMumHI+Cw2vSjYMomH6mWquuKmbMN/z5AKtDL3rzIlTes AGugURnvfojfIHaUKanYgx74m7X+XvyYesHvYAlR8fl4QxCW33r3a8qs6kYUIVXC iOahusT0t5gXzistNHI35hdoU/1cUTS4mIyafJnnHhAxJkb3Die/+rh6SQKs9E7t jY3XLh+bfh2LlH46CvaZc7uaF8hV01l7K7gkVLDkaoZKNhw3J7p6WWErThlICTjy POTm8UQcHwr+23ZWq4n9K9VA0+25aVg78LmOBqDrZWwIHqcJEKAJiOyxIopImDTX +OFVdhGIDdGQUwHGS6jnl80wQuxLP3i/88wgQZqHbFt3TsbhEvoI3RKhUcTi0bkS gO8akdIFPYeE29cErLJsjc3iqvug+srqzR5/K9ePbOL9pNrMBljw1pMJ0Cmu8mzE sDI0Z3JcfiDpYSSRNMN8UnH2JR91EzAA6tAtlL5dwwKUSYjB+stkatPv8Jkr7sTd YtuOw2hv5fvpnEn5QAR6jEa6UZ8ewFLPC7t8JuhYso5vu1IPpA98EUp87KcU0EpG 7uSYxGb4d9R5y7rMHa7yBkbXBiaqTec8INpcUh6Owp2CSQgpKwW+dXBwe1p/jzNL WSROmx0FuzZaKwuREOOLR4TjP5WWWbIf53nttkIvf/xgTIoelNI= =P0NM -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From sebastic at xs4all.nl Mon Sep 2 16:19:07 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Mon, 2 Sep 2019 17:19:07 +0200 Subject: Bug#939022: [#939022] Re: pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) In-Reply-To: References: <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <23ab9a59-f21b-0099-7d7b-4e2a60f47512@tiscali.it> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: <81d382be-eaf7-4dcc-c409-d8b51d69126f@xs4all.nl> proj (6.2.0-1) & python-pyproj (2.3.1+ds-1) are now in unstable. 
Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From gitlab at salsa.debian.org Mon Sep 2 17:17:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 16:17:23 +0000 Subject: [Git][debian-gis-team/pyresample][master] Set distribution to unstable. Message-ID: <5d6d4093e3553_577b3f91ca571214353770@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyresample Commits: 3a94bc48 by Bas Couwenberg at 2019-09-02T16:03:46Z Set distribution to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,5 +1,8 @@ -pyresample (1.12.3-6) UNRELEASED; urgency=medium +pyresample (1.12.3-6) unstable; urgency=medium + * Team upload. + + [ Antonio Valentino ] * Use debhelper-compat instead of debian/compat. * debian/patches: - new 0004-Detect-broken-basemap.patch: workaround for failures @@ -10,7 +13,7 @@ pyresample (1.12.3-6) UNRELEASED; urgency=medium - drop dependency on python3-mock (not really necessary with Python 3) * Remove obsolete fields Name from debian/upstream/metadata. - -- Antonio Valentino Wed, 21 Aug 2019 19:35:41 +0000 + -- Bas Couwenberg Mon, 02 Sep 2019 18:03:28 +0200 pyresample (1.12.3-5) unstable; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/3a94bc4893e6beab6b3f3de4b36c9b7ebe33a167 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/3a94bc4893e6beab6b3f3de4b36c9b7ebe33a167 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 2 17:17:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 02 Sep 2019 16:17:32 +0000 Subject: [Git][debian-gis-team/pyresample] Pushed new tag debian/1.12.3-6 Message-ID: <5d6d409c51dc0_577b3f91ca57121435393a@godard.mail> Bas Couwenberg pushed new tag debian/1.12.3-6 at Debian GIS Project / pyresample -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/tree/debian/1.12.3-6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 2 17:29:36 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 16:29:36 +0000 Subject: Processing of pyresample_1.12.3-6_source.changes Message-ID: pyresample_1.12.3-6_source.changes uploaded successfully to localhost along with the files: pyresample_1.12.3-6.dsc pyresample_1.12.3-6.debian.tar.xz pyresample_1.12.3-6_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 2 17:51:12 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 02 Sep 2019 16:51:12 +0000 Subject: pyresample_1.12.3-6_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 02 Sep 2019 18:03:28 +0200 Source: pyresample Architecture: source Version: 1.12.3-6 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 939022 Changes: pyresample (1.12.3-6) unstable; urgency=medium . * Team upload. . [ Antonio Valentino ] * Use debhelper-compat instead of debian/compat. 
* debian/patches: - new 0004-Detect-broken-basemap.patch: workaround for failures of basemap imports due to incompatibility with new pyproj versions (Closes: #939022) * debian/control: - depend on pyproj >= 2.3.1 - drop dependency on python3-mock (not really necessary with Python 3) * Remove obsolete fields Name from debian/upstream/metadata. Checksums-Sha1: 73f6e11f6c4d6b58bf8bcc24398d31b9ab954045 2529 pyresample_1.12.3-6.dsc 2c8dd7d7ca8bb3c284bd6d6bc2531c6590e67718 9184 pyresample_1.12.3-6.debian.tar.xz 6ae74d0f45e5274ead47fe7c22d20b0269911317 12634 pyresample_1.12.3-6_amd64.buildinfo Checksums-Sha256: 3f92b7785b99de1dee14ea77afbdc93eff2c7875444d85a5e8e7f33fd42a7ba0 2529 pyresample_1.12.3-6.dsc 05cdb00cc0c35d41c8629c7bc479c3fefeef84d19ca69fcc706cf907ebceed50 9184 pyresample_1.12.3-6.debian.tar.xz 5c2d20ed647211e824bc4d8855b2349512b7e2718151e12ca1be79f0aabc0b3c 12634 pyresample_1.12.3-6_amd64.buildinfo Files: f6f11a187f050b29c4a06567c16d1d11 2529 python optional pyresample_1.12.3-6.dsc 0abdf65c21e73d6a9f438e5a254b81f8 9184 python optional pyresample_1.12.3-6.debian.tar.xz d84cbb2e5db2b4bebfb85b941dfbb657 12634 python optional pyresample_1.12.3-6_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1tQHQACgkQZ1DxCuiN SvGb/g//eXrH6xtKO2SRZSEad+nJN1YC99qfFkg7iVR9/sjl4QflKIVkrea+burx UsuYYKpxnDO2GQ8uRLs+Ty3i+lyTv0lY2pqr6Snc3YsQZ9037krHsKJWVPHBKS5H dh5HFJIsmAV2fWOv+/BMMwFpy6bUGpcKeUcINUCmsXhp75duJWpACew+G27CKZfN eY6kib3bcFJvzFz9ingT9K3S4QTk13jydLqp0qwn16+IfoLjZwjVY+aKHyGLng3E APVaJQTUA7cXn60T7x19YZFPnLlZGdtIWdLIKlwnc3IZTtfXX0k/leQpzzVa3jat HT6+bFltYULs4lpzQkigJqMjpOWVZP+fLfOInml34etYVsikK9jX3cAn3Q9EjF96 R6bPJ75gSixoW7ToYueCzNmNc/LkJ04owh7xs4wqGywW5JjggIFLo73tmk9z2idO QvBsb2HpVa/mKQinMaXSLAf9lft0ceD5pnedp+fSIbK6lGUROkq3FAFyLve0vRKn PHx9jEmcs97RFL4nCQv6pb1yAOvcRjkODqNwFQwv+zW5BsBU19bRoLpN9LJUUG8q aU+4zt+0mX80Y152PoQ4s87rAgQa0Ew7hw9JVZs17y9k0v8h4l73w57qJZYNEkxL 4I78jxL4dMekiuMvzqLA9hIvivQyX7FES6T+rjmVludLyi1A/l0= =+1iX -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Mon Sep 2 17:54:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 02 Sep 2019 16:54:03 +0000 Subject: Bug#939022: marked as done (pyresample: autopkgtest failure with PROJ 6 (epsg data file removed)) References: <156724515379.21434.13043590679250864051.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: Your message dated Mon, 02 Sep 2019 16:51:12 +0000 with message-id and subject line Bug#939022: fixed in pyresample 1.12.3-6 has caused the Debian Bug report #939022, regarding pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 939022: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939022 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Bas Couwenberg Subject: pyresample: autopkgtest failure with PROJ 6 (epsg data file removed) Date: Sat, 31 Aug 2019 11:52:33 +0200 Size: 4341 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#939022: fixed in pyresample 1.12.3-6 Date: Mon, 02 Sep 2019 16:51:12 +0000 Size: 5547 URL: From mail at kultnet.de Tue Sep 3 05:11:17 2019 From: mail at kultnet.de (KultNet Kultur-Ausschreibungen) Date: Tue, 3 Sep 2019 04:11:17 +0000 (UTC) Subject: Hallo Mitteilung Erinnerung 365867 Message-ID: <20190903041117.1CAFE81541@vps27734.alfahosting-vps.de> An HTML attachment was scrubbed... URL: From noreply at release.debian.org Tue Sep 3 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Tue, 03 Sep 2019 04:39:21 +0000 Subject: osmpbf 1.3.3-12 MIGRATED to testing Message-ID: FYI: The status of the osmpbf source package in Debian's testing distribution has changed. Previous version: 1.3.3-11 Current version: 1.3.3-12 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Tue Sep 3 06:45:00 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Tue, 03 Sep 2019 05:45:00 +0000 Subject: [Git][debian-gis-team/glymur][master] Set distribution to unstable Message-ID: <5d6dfddc8e24b_577b3f91ce8379cc3944b2@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / glymur Commits: 2c3a23d9 by Antonio Valentino at 2019-09-03T05:41:23Z Set distribution to unstable - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,6 @@ -glymur (0.8.18+ds-1) UNRELEASED; urgency=medium +glymur (0.8.18+ds-1) unstable; urgency=medium [ Bas Couwenberg ] - * Team upload. * Repack upstream tarball to exclude .egg-info directory. * Bump Standards-Version to 4.4.0, no changes. @@ -10,7 +9,7 @@ glymur (0.8.18+ds-1) UNRELEASED; urgency=medium * Use debhelper-compat instead of debian/compat. * Remove obsolete fields Name from debian/upstream/metadata. - -- Bas Couwenberg Sun, 07 Jul 2019 19:05:38 +0200 + -- Antonio Valentino Tue, 03 Sep 2019 05:37:57 +0000 glymur (0.8.18-1) unstable; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/glymur/commit/2c3a23d971934c69fb3b832704748b87652d7aff -- View it on GitLab: https://salsa.debian.org/debian-gis-team/glymur/commit/2c3a23d971934c69fb3b832704748b87652d7aff You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 3 12:46:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 03 Sep 2019 11:46:57 +0000 Subject: [Git][debian-gis-team/glymur] Pushed new tag debian/0.8.18+ds-1 Message-ID: <5d6e52b1a1978_577b2ade6179fc4c445652@godard.mail> Bas Couwenberg pushed new tag debian/0.8.18+ds-1 at Debian GIS Project / glymur -- View it on GitLab: https://salsa.debian.org/debian-gis-team/glymur/tree/debian/0.8.18+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Tue Sep 3 12:57:28 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 03 Sep 2019 11:57:28 +0000 Subject: Processing of glymur_0.8.18+ds-1_source.changes Message-ID: glymur_0.8.18+ds-1_source.changes uploaded successfully to localhost along with the files: glymur_0.8.18+ds-1.dsc glymur_0.8.18+ds.orig.tar.xz glymur_0.8.18+ds-1.debian.tar.xz glymur_0.8.18+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Tue Sep 3 13:05:44 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 03 Sep 2019 12:05:44 +0000 Subject: glymur_0.8.18+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 03 Sep 2019 05:37:57 +0000 Source: glymur Architecture: source Version: 0.8.18+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: glymur (0.8.18+ds-1) unstable; urgency=medium . [ Bas Couwenberg ] * Repack upstream tarball to exclude .egg-info directory. * Bump Standards-Version to 4.4.0, no changes. . [ Antonio Valentino ] * Bump debhelper from old 11 to 12. * Use debhelper-compat instead of debian/compat. * Remove obsolete fields Name from debian/upstream/metadata. Checksums-Sha1: 4b2918caf92820a44d1501315cefcf707410a53c 2191 glymur_0.8.18+ds-1.dsc 6e9a01eaf16c6b7bfbd785ba3d0818b0f3d20a88 3367588 glymur_0.8.18+ds.orig.tar.xz 7fe8abd98dcdab4bd6360bd2a7167c686442e70e 3816 glymur_0.8.18+ds-1.debian.tar.xz c09bf33c8bcf660665c6f5a30198a1fdb188ef43 10388 glymur_0.8.18+ds-1_amd64.buildinfo Checksums-Sha256: d7aebcb4ed687488ca1ee117ce0fde34ecbec285d2168887c776f7dcd6f04492 2191 glymur_0.8.18+ds-1.dsc ccda607333d27bdca17f4186be1ca85d761adab447a8d75d22ca9d9df046fa2d 3367588 glymur_0.8.18+ds.orig.tar.xz 056be138843bc54cc3f942b7d8711e0d2679ca3bba6feffa3eff39f949b60430 3816 glymur_0.8.18+ds-1.debian.tar.xz a69fc35ab634a355069d153ef97e2fee8f92231634ba46bcafdeb3bf546ad6dd 10388 glymur_0.8.18+ds-1_amd64.buildinfo Files: 5cb6cbf94f4ad2eca14cf9164010ccaf 2191 python optional glymur_0.8.18+ds-1.dsc e7f6bfec06de33a6319346d0dc301d74 3367588 python optional glymur_0.8.18+ds.orig.tar.xz a8d47e5ab58e7f21d890da97fb04c240 3816 python optional glymur_0.8.18+ds-1.debian.tar.xz 7efe2629ad1cb4a11f493cee18021e71 10388 python optional glymur_0.8.18+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1uUqEACgkQZ1DxCuiN SvF1xBAA27yndtu+vFEEzi9G6K4LJn39vs9Re/OcNZqd//0EUAH0SmXSwjjPmh7Z 2HRv83JG4ZI9v26RQyDqDfvHZKJP1XZwzfvVEgd+KDpV2zzc+8C7nOvfzv1TC+E3 yiGuf3d46KFFmIFyt/GHn51IejeIDyBHIMpNI+NtN2529FRfVPQalpvqqsO4MNCL CXF89Eo9BYG5EQtgANOgD7yS/Qp2kuk13gV1ZPFAp7ygwMMOrpk5C6bZEYe9YEkc qXnOjMnH6eY7fYRiMioAnNs4V5Y1w3l5zNDa3P08IKljqzK1GYkD/FJ8OeEWsqU3 oAekqDieyeS6HbL1Ep8BViSqZsShlK9nC+yukKdFATD9G6ip96xKXTW+//ctjHXz MydXU/5g5wZlYwD8p9s/MMn8CdntymVLNEJPvfcd4zAHT3QVQtHrXaTexZ/5tmUm H0vgX0hP79MaLhcodWXwtbFKL+lBqH1U4PRMBWxzT3JMLinBoFsR1+Dqe1TWZijX AGhuAKUikoql4o4XHuSVJupDKKJs8jf/LYJEFKdYZEABtX6fCXf9W3aWo89fay2d 0yJeuEWBnVI7um2P8u5TnWHqJMeRtKYGer53xreheq9h9G4iZ2XTxUXurrz3XnmA xF8V20NcoKfPfeAZ+448lNpUjxj5K6LKPfOuskG53nPUdDGtMg8= =KL23 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From gitlab at salsa.debian.org Tue Sep 3 19:18:18 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 03 Sep 2019 18:18:18 +0000 Subject: [Git][debian-gis-team/ossim][master] 6 commits: New upstream version 2.9.1 Message-ID: <5d6eae6a7b6ba_577b3f91d859751456491e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ossim Commits: 3467f291 by Bas Couwenberg at 2019-09-03T17:27:57Z New upstream version 2.9.1 - - - - - 3001ec72 by Bas Couwenberg at 2019-09-03T17:28:28Z Update upstream source from tag 'upstream/2.9.1' Update to upstream version '2.9.1' with Debian dir 5e5f6ad1390d9f0a48167c1da44d466538efd406 - - - - - d6793f8c by Bas Couwenberg at 2019-09-03T17:28:51Z New upstream release. - - - - - 98431432 by Bas Couwenberg at 2019-09-03T17:30:03Z Drop patch applied upstream, refresh remaining patch. - - - - - b87c2f44 by Bas Couwenberg at 2019-09-03T18:00:07Z Drop unused override for spelling-error-in-binary. - - - - - 99077a41 by Bas Couwenberg at 2019-09-03T18:00:07Z Set distribution to experimental. - - - - - 30 changed files: - Jenkinsfile - apps/curl_apps/omarDataMgrUtil.cpp - apps/ossim-band-merge/ossim-band-merge.cpp - apps/ossim-create-bitmask/ossim-create-bitmask.cpp - apps/ossim-create-cg/ossim-create-cg.cpp - apps/ossim-create-histo/ossim-create-histo.cpp - apps/ossim-dump-ocg/ossim-dump-ocg.cpp - apps/ossim-envi-cg/ossim-envi-cg.cpp - apps/ossim-icp/ossim-icp.cpp - apps/ossim-image-synth/ossim-image-synth.cpp - apps/ossim-img2rr/ossim-img2rr.cpp - apps/ossim-pc2dem/ossim-pc2dem.cpp - apps/ossim-pixelflip/ossim-pixelflip.cpp - apps/ossim-plot-histo/ossim-plot-histo.cpp - apps/ossim-preproc/ossim-preproc.cpp - apps/ossim-senint/ossim-senint.cpp - debian/changelog - debian/libossim1.lintian-overrides - − debian/patches/0001-Fixed-missing-return.patch - debian/patches/series - debian/patches/spelling-errors.patch - include/ossim/base/BlockIStream.h - include/ossim/base/ossimAdjustableParameterInterface.h - include/ossim/base/ossimAxes.h - include/ossim/base/ossimBilSplitter.h - include/ossim/base/ossimBinaryDataProperty.h - include/ossim/base/ossimBlockIStream.h - include/ossim/base/ossimByteStreamBuffer.h - include/ossim/base/ossimColumnVector3d.h - include/ossim/base/ossimColumnVector4d.h The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/compare/0f3ecdd71a2771d85506708bf0b48b816366649a...99077a415c3c0f1ef1137839488be7b9ffa8826d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/compare/0f3ecdd71a2771d85506708bf0b48b816366649a...99077a415c3c0f1ef1137839488be7b9ffa8826d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Tue Sep 3 19:18:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 03 Sep 2019 18:18:19 +0000 Subject: [Git][debian-gis-team/ossim][pristine-tar] pristine-tar data for ossim_2.9.1.orig.tar.gz Message-ID: <5d6eae6b9da1a_577b3f91d859751456513f@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / ossim Commits: c90d59a8 by Bas Couwenberg at 2019-09-03T17:28:28Z pristine-tar data for ossim_2.9.1.orig.tar.gz - - - - - 2 changed files: - + ossim_2.9.1.orig.tar.gz.delta - + ossim_2.9.1.orig.tar.gz.id Changes: ===================================== ossim_2.9.1.orig.tar.gz.delta ===================================== Binary files /dev/null and b/ossim_2.9.1.orig.tar.gz.delta differ ===================================== ossim_2.9.1.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +fd0a8a61783db119e6d3c7f6f80994b12aa0cbdb View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/c90d59a8cefe6e98de436aed6004795fb516425a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/c90d59a8cefe6e98de436aed6004795fb516425a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 3 19:18:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 03 Sep 2019 18:18:20 +0000 Subject: [Git][debian-gis-team/ossim][upstream] New upstream version 2.9.1 Message-ID: <5d6eae6ce10e9_577b2ade5dd798245653d8@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / ossim Commits: 3467f291 by Bas Couwenberg at 2019-09-03T17:27:57Z New upstream version 2.9.1 - - - - - 30 changed files: - Jenkinsfile - apps/curl_apps/omarDataMgrUtil.cpp - apps/ossim-band-merge/ossim-band-merge.cpp - apps/ossim-create-bitmask/ossim-create-bitmask.cpp - apps/ossim-create-cg/ossim-create-cg.cpp - apps/ossim-create-histo/ossim-create-histo.cpp - apps/ossim-dump-ocg/ossim-dump-ocg.cpp - apps/ossim-envi-cg/ossim-envi-cg.cpp - apps/ossim-icp/ossim-icp.cpp - apps/ossim-image-synth/ossim-image-synth.cpp - apps/ossim-img2rr/ossim-img2rr.cpp - apps/ossim-pc2dem/ossim-pc2dem.cpp - apps/ossim-pixelflip/ossim-pixelflip.cpp - apps/ossim-plot-histo/ossim-plot-histo.cpp - apps/ossim-preproc/ossim-preproc.cpp - apps/ossim-senint/ossim-senint.cpp - include/ossim/base/BlockIStream.h - include/ossim/base/ossimAdjustableParameterInterface.h - include/ossim/base/ossimAxes.h - include/ossim/base/ossimBilSplitter.h - include/ossim/base/ossimBinaryDataProperty.h - include/ossim/base/ossimBlockIStream.h - include/ossim/base/ossimByteStreamBuffer.h - include/ossim/base/ossimColumnVector3d.h - include/ossim/base/ossimColumnVector4d.h - include/ossim/base/ossimDirectoryTree.h - include/ossim/base/ossimDpt.h - include/ossim/base/ossimEcefVector.h - include/ossim/base/ossimFlexLexer.h - include/ossim/base/ossimFpt3d.h The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/3467f29165899434df35662b88baee1f9943aebf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/3467f29165899434df35662b88baee1f9943aebf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Tue Sep 3 19:18:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 03 Sep 2019 18:18:26 +0000 Subject: [Git][debian-gis-team/ossim] Pushed new tag debian/2.9.1-1_exp1 Message-ID: <5d6eae72d5505_577b2ade5da46d14565559@godard.mail> Bas Couwenberg pushed new tag debian/2.9.1-1_exp1 at Debian GIS Project / ossim -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/tree/debian/2.9.1-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 3 19:18:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 03 Sep 2019 18:18:27 +0000 Subject: [Git][debian-gis-team/ossim] Pushed new tag upstream/2.9.1 Message-ID: <5d6eae73d02ac_577b3f91d85975145657bf@godard.mail> Bas Couwenberg pushed new tag upstream/2.9.1 at Debian GIS Project / ossim -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/tree/upstream/2.9.1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Tue Sep 3 19:26:31 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 03 Sep 2019 18:26:31 +0000 Subject: Processing of ossim_2.9.1-1~exp1_source.changes Message-ID: ossim_2.9.1-1~exp1_source.changes uploaded successfully to localhost along with the files: ossim_2.9.1-1~exp1.dsc ossim_2.9.1.orig.tar.gz ossim_2.9.1-1~exp1.debian.tar.xz ossim_2.9.1-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Tue Sep 3 19:49:22 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 03 Sep 2019 18:49:22 +0000 Subject: ossim_2.9.1-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 03 Sep 2019 19:31:32 +0200 Source: ossim Architecture: source Version: 2.9.1-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: ossim (2.9.1-1~exp1) experimental; urgency=medium . * Team upload. * New upstream release. * Drop patch applied upstream, refresh remaining patch. * Drop unused override for spelling-error-in-binary. 
Checksums-Sha1: f36443f781e0a72001eb2a7c090af1c61987ff79 2207 ossim_2.9.1-1~exp1.dsc b5b58e3f9a498084a971b8c6fe0e840a75c26ee6 6859702 ossim_2.9.1.orig.tar.gz 65f5c18161ab4d2f72390991748bfc0ae733d9bd 39840 ossim_2.9.1-1~exp1.debian.tar.xz 5caf8e1d3c665861c8e343415912c54a31b0811b 9199 ossim_2.9.1-1~exp1_amd64.buildinfo Checksums-Sha256: fe467608b383118d37eb84937d5d8d959b04996a9a7f0fb94e34197c51aecd5c 2207 ossim_2.9.1-1~exp1.dsc 0a053ae81fd4257fc664931bf0ed3380992250a146dc977402d4b759221f520b 6859702 ossim_2.9.1.orig.tar.gz 8e25149dd51a5eb271e5bd996621ebc424d34e6a88e9f9350c01429fe3c4c2c6 39840 ossim_2.9.1-1~exp1.debian.tar.xz b055fa7d9f2c614cf2c0e9cd520d9bd11cbd6dd2a7adfb902ebb8d367e7bf049 9199 ossim_2.9.1-1~exp1_amd64.buildinfo Files: 649aee4168b3662c04cf61ed678884ad 2207 science optional ossim_2.9.1-1~exp1.dsc 9e1ed041b470198168100a57f66b7e75 6859702 science optional ossim_2.9.1.orig.tar.gz bbace9d3d2dbccc16690679663d794d7 39840 science optional ossim_2.9.1-1~exp1.debian.tar.xz 4ee88ed63cdd10cc2a241871cce54ffc 9199 science optional ossim_2.9.1-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1urkwACgkQZ1DxCuiN SvHBRhAArUvSMAiFAPK4797RX0fh+grub0+TnyeoXYJoQfg4ShijKQ6kFSnoZG56 QfN7nfVqEI4MCwkdBAb1QyppB6Pw2WOj6MHo1NNyB0297xkrVTnh7M+sFtKU673j rw07hWwGxUZkUSLd+GIreANJp9MwFg2YXzk8eMb9kHeHSLrZGjqb81kYncfeazEl SWT2MFkbEhan1+/WSgvOOOj7OCmLGLYh/04NTMJsKPQMriM4ox3+lhOhHeLvnVRZ G38fbzmaqqGzy0ipFBQAI0fLpItNrM/3XWiuQhmy0vJ+iq2GjkHeu5jhTH0TfcgM /D8dQTb0eDUkFc82E9fr8vtHuD0dE78X71tQfs34XCvwU6REE0HjjhILU/h5D5IF ta680Q7maSinE/b2jHPjGzoKcIsuekUUSxBYeQA8QWaI/HvsprMrkdtNAET5/Bqo TXLAXgpE9mtwUfHb+dhS5rhDvYhwXWzG+m/C4+Hsw37JN8mY+JBJLsY0dChMyU6S 4Il8/4X4DmoyrXmjBglB4OsNLLUYJU5VKLdEKngjm633pYzVIy+RcG4oezsl3Z/z bg4u1FgEJ4NequvBJ6bwi732XnwcngwjhrZUZ+Wspf/y4BWlI3fkljSHB9snLr1x GaWsGy8ANFgLpwf5iXTJ9dwLrtsg59EmZLSMNcKyd5BNyLTDNEw= =x+co -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Wed Sep 4 05:07:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:07:09 +0000 Subject: [Git][debian-gis-team/netcdf4-python][pristine-tar] pristine-tar data for netcdf4-python_1.5.2.orig.tar.gz Message-ID: <5d6f386d411e8_577b2ade5dd7982460679e@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / netcdf4-python Commits: ee58b4b3 by Bas Couwenberg at 2019-09-04T03:55:09Z pristine-tar data for netcdf4-python_1.5.2.orig.tar.gz - - - - - 2 changed files: - + netcdf4-python_1.5.2.orig.tar.gz.delta - + netcdf4-python_1.5.2.orig.tar.gz.id Changes: ===================================== netcdf4-python_1.5.2.orig.tar.gz.delta ===================================== Binary files /dev/null and b/netcdf4-python_1.5.2.orig.tar.gz.delta differ ===================================== netcdf4-python_1.5.2.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +6a4ab2399d975981c9189ae900c8798b60732963 View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/commit/ee58b4b394967f870dcceb613533ad6236ecfad2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/commit/ee58b4b394967f870dcceb613533ad6236ecfad2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Wed Sep 4 05:07:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:07:24 +0000 Subject: [Git][debian-gis-team/netcdf4-python] Pushed new tag debian/1.5.2-1 Message-ID: <5d6f387cc078d_577b2ade5dd7982460713c@godard.mail> Bas Couwenberg pushed new tag debian/1.5.2-1 at Debian GIS Project / netcdf4-python -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/tree/debian/1.5.2-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 05:07:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:07:23 +0000 Subject: [Git][debian-gis-team/netcdf4-python][master] 4 commits: New upstream version 1.5.2 Message-ID: <5d6f387b74f6a_577b2ade5da46d146069e4@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / netcdf4-python Commits: 93a42b17 by Bas Couwenberg at 2019-09-04T03:55:06Z New upstream version 1.5.2 - - - - - 1bc993ba by Bas Couwenberg at 2019-09-04T03:55:10Z Update upstream source from tag 'upstream/1.5.2' Update to upstream version '1.5.2' with Debian dir 265aa714cf7782f3de407679b4f6d0a092482451 - - - - - 4a39dc90 by Bas Couwenberg at 2019-09-04T03:55:23Z New upstream release. - - - - - 16dcd4ba by Bas Couwenberg at 2019-09-04T03:56:33Z Set distribution to unstable. - - - - - 11 changed files: - .appveyor.yml - .travis.yml - Changelog - README.md - debian/changelog - docs/netCDF4/index.html - netCDF4/_netCDF4.pyx - setup.py - test/tst_atts.py - test/tst_endian.py - test/tst_netcdftime.py Changes: ===================================== .appveyor.yml ===================================== @@ -27,8 +27,8 @@ install: - cmd: call %CONDA_INSTALL_LOCN%\Scripts\activate.bat - cmd: conda config --set always_yes yes --set changeps1 no --set show_channel_urls true - cmd: conda update conda - - cmd: conda config --remove channels defaults --force - cmd: conda config --add channels conda-forge --force + - cmd: conda config --set channel_priority strict - cmd: set PYTHONUNBUFFERED=1 - cmd: conda install conda-build vs2008_express_vc_python_patch - cmd: call setup_x64 ===================================== .travis.yml ===================================== @@ -1,6 +1,6 @@ language: python dist: xenial -sudo: true +cache: pip addons: apt: @@ -17,6 +17,7 @@ env: python: - "2.7" + - "3.5" - "3.6" - "3.7" - "3.8-dev" @@ -39,7 +40,6 @@ matrix: - DEPENDS="numpy==1.10.0 cython==0.21 ordereddict==1.1 setuptools==18.0 cftime" # test MPI with latest released version - python: 3.7 - dist: xenial env: - MPI=1 - CC=mpicc.mpich @@ -55,7 +55,6 @@ matrix: - libhdf5-mpich-dev # test MPI with latest released version - python: 3.7 - dist: xenial env: - MPI=1 - CC=mpicc.mpich @@ -72,7 +71,6 @@ matrix: - libhdf5-mpich-dev # test with netcdf-c from github master - python: 3.7 - dist: xenial env: - MPI=1 - CC=mpicc.mpich ===================================== Changelog ===================================== @@ -1,3 +1,17 @@ + version 1.5.2 (not yet released) +============================== + * fix for scaling bug when _Unsigned attribute is set and byteorder of data + does not match native byteorder (issue #930). + * revise documentation for Python 3 (issue #946). + * establish support for Python 2.7, 3.5, 3.6 and 3.7 (issue #948). + * use dict built-in instead of OrderedDict for Python 3.7+ + (pull request #955). 
+ * remove underline ANSI in Dataset string representation (pull request #956). + * remove newlines from string representation (pull request #960). + * fix for issue #957 (size of scalar var is a float since numpy.prod(())=1.0). + * make sure Variable.setncattr fails to set _FillValue (issue #959). + * fix detection of parallel HDF5 support with netcdf-c 4.6.1 (issue #964). + version 1.5.1.2 (tag v1.5.1.2rel) ================================== * fix another slicing bug introduced by the fix to issue #906 (issue #922). ===================================== README.md ===================================== @@ -10,8 +10,10 @@ ## News For details on the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog). +09/03/2019: Version [1.5.2](https://pypi.python.org/pypi/netCDF4/1.5.2) released. Bugfixes, no new features. + 05/06/2019: Version [1.5.1.2](https://pypi.python.org/pypi/netCDF4/1.5.1.2) released. Fixes another slicing -slicing regression ([issue #922)](https://github.com/Unidata/netcdf4-python/issues/922)) introduced in the 1.5.1 release. +regression ([issue #922)](https://github.com/Unidata/netcdf4-python/issues/922)) introduced in the 1.5.1 release. 05/02/2019: Version [1.5.1.1](https://pypi.python.org/pypi/netCDF4/1.5.1.1) released. Fixes incorrect `__version__` module variable in 1.5.1 release, plus a slicing bug ([issue #919)](https://github.com/Unidata/netcdf4-python/issues/919)). ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +netcdf4-python (1.5.2-1) unstable; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Wed, 04 Sep 2019 05:56:15 +0200 + netcdf4-python (1.5.1.2-4) unstable; urgency=medium * Drop Python 2 support. ===================================== docs/netCDF4/index.html ===================================== The diff for this file was not included because it is too large. ===================================== netCDF4/_netCDF4.pyx ===================================== @@ -1,5 +1,5 @@ """ -Version 1.5.1.2 +Version 1.5.2 --------------- - - - @@ -151,7 +151,7 @@ Here's an example: :::python >>> from netCDF4 import Dataset >>> rootgrp = Dataset("test.nc", "w", format="NETCDF4") - >>> print rootgrp.data_model + >>> print(rootgrp.data_model) NETCDF4 >>> rootgrp.close() @@ -182,11 +182,18 @@ in a netCDF 3 file you will get an error message. >>> rootgrp = Dataset("test.nc", "a") >>> fcstgrp = rootgrp.createGroup("forecasts") >>> analgrp = rootgrp.createGroup("analyses") - >>> print rootgrp.groups - OrderedDict([("forecasts", - ), - ("analyses", - )]) + >>> print(rootgrp.groups) + {'forecasts': + group /forecasts: + dimensions(sizes): + variables(dimensions): + groups: , 'analyses': + group /analyses: + dimensions(sizes): + variables(dimensions): + groups: } + + Groups can exist within groups in a `netCDF4.Dataset`, just as directories exist within directories in a unix filesystem. Each `netCDF4.Group` instance @@ -212,40 +219,40 @@ object yields summary information about it's contents. :::python >>> def walktree(top): - >>> values = top.groups.values() - >>> yield values - >>> for value in top.groups.values(): - >>> for children in walktree(value): - >>> yield children - >>> print rootgrp - >>> for children in walktree(rootgrp): - >>> for child in children: - >>> print child - - root group (NETCDF4 file format): - dimensions: - variables: + ... values = top.groups.values() + ... yield values + ... for value in top.groups.values(): + ... 
for children in walktree(value): + ... yield children + >>> print(rootgrp) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): + variables(dimensions): groups: forecasts, analyses - + >>> for children in walktree(rootgrp): + ... for child in children: + ... print(child) + group /forecasts: - dimensions: - variables: + dimensions(sizes): + variables(dimensions): groups: model1, model2 - + group /analyses: - dimensions: - variables: - groups: - + dimensions(sizes): + variables(dimensions): + groups: + group /forecasts/model1: - dimensions: - variables: - groups: - + dimensions(sizes): + variables(dimensions): + groups: + group /forecasts/model2: - dimensions: - variables: - groups: + dimensions(sizes): + variables(dimensions): + groups: ##
3) Dimensions in a netCDF file. @@ -272,11 +279,8 @@ one, and it must be the first (leftmost) dimension of the variable. All of the `netCDF4.Dimension` instances are stored in a python dictionary. :::python - >>> print rootgrp.dimensions - OrderedDict([("level", ), - ("time", ), - ("lat", ), - ("lon", )]) + >>> print(rootgrp.dimensions) + {'level': (unlimited): name = 'level', size = 0, 'time': (unlimited): name = 'time', size = 0, 'lat': : name = 'lat', size = 73, 'lon': : name = 'lon', size = 144} Calling the python `len` function with a `netCDF4.Dimension` instance returns the current size of that dimension. @@ -284,11 +288,11 @@ The `netCDF4.Dimension.isunlimited` method of a `netCDF4.Dimension` instance can be used to determine if the dimensions is unlimited, or appendable. :::python - >>> print len(lon) + >>> print(len(lon)) 144 - >>> print lon.isunlimited() + >>> print(lon.isunlimited()) False - >>> print time.isunlimited() + >>> print(time.isunlimited()) True Printing the `netCDF4.Dimension` object @@ -297,12 +301,11 @@ and whether it is unlimited. :::python >>> for dimobj in rootgrp.dimensions.values(): - >>> print dimobj - (unlimited): name = "level", size = 0 - (unlimited): name = "time", size = 0 - : name = "lat", size = 73 - : name = "lon", size = 144 - (unlimited): name = "time", size = 0 + ... print(dimobj) + (unlimited): name = 'level', size = 0 + (unlimited): name = 'time', size = 0 + : name = 'lat', size = 73 + : name = 'lon', size = 144 `netCDF4.Dimension` names can be changed using the `netCDF4.Datatset.renameDimension` method of a `netCDF4.Dataset` or @@ -348,17 +351,19 @@ used later to access and set variable data and attributes. >>> longitudes = rootgrp.createVariable("lon","f4",("lon",)) >>> # two dimensions unlimited >>> temp = rootgrp.createVariable("temp","f4",("time","level","lat","lon",)) + >>> temp.units = "K" -To get summary info on a `netCDF4.Variable` instance in an interactive session, just print it. +To get summary info on a `netCDF4.Variable` instance in an interactive session, +just print it. :::python - >>> print temp - + >>> print(temp) + float32 temp(time, level, lat, lon) - least_significant_digit: 3 units: K unlimited dimensions: time, level current shape = (0, 0, 73, 144) + filling on, default _FillValue of 9.969209968386869e+36 used You can use a path to create a Variable inside a hierarchy of groups. @@ -371,30 +376,48 @@ You can also query a `netCDF4.Dataset` or `netCDF4.Group` instance directly to o `netCDF4.Variable` instances using paths. 
:::python - >>> print rootgrp["/forecasts/model1"] # a Group instance - + >>> print(rootgrp["/forecasts/model1"]) # a Group instance + group /forecasts/model1: - dimensions(sizes): + dimensions(sizes): variables(dimensions): float32 temp(time,level,lat,lon) - groups: - >>> print rootgrp["/forecasts/model1/temp"] # a Variable instance - + groups: + >>> print(rootgrp["/forecasts/model1/temp"]) # a Variable instance + float32 temp(time, level, lat, lon) path = /forecasts/model1 unlimited dimensions: time, level current shape = (0, 0, 73, 144) - filling on, default _FillValue of 9.96920996839e+36 used + filling on, default _FillValue of 9.969209968386869e+36 used + All of the variables in the `netCDF4.Dataset` or `netCDF4.Group` are stored in a Python dictionary, in the same way as the dimensions: :::python - >>> print rootgrp.variables - OrderedDict([("time", ), - ("level", ), - ("lat", ), - ("lon", ), - ("temp", )]) + >>> print(rootgrp.variables) + {'time': + float64 time(time) + unlimited dimensions: time + current shape = (0,) + filling on, default _FillValue of 9.969209968386869e+36 used, 'level': + int32 level(level) + unlimited dimensions: level + current shape = (0,) + filling on, default _FillValue of -2147483647 used, 'lat': + float32 lat(lat) + unlimited dimensions: + current shape = (73,) + filling on, default _FillValue of 9.969209968386869e+36 used, 'lon': + float32 lon(lon) + unlimited dimensions: + current shape = (144,) + filling on, default _FillValue of 9.969209968386869e+36 used, 'temp': + float32 temp(time, level, lat, lon) + units: K + unlimited dimensions: time, level + current shape = (0, 0, 73, 144) + filling on, default _FillValue of 9.969209968386869e+36 used} `netCDF4.Variable` names can be changed using the `netCDF4.Dataset.renameVariable` method of a `netCDF4.Dataset` @@ -432,9 +455,9 @@ and attributes that cannot (or should not) be modified by the user. :::python >>> for name in rootgrp.ncattrs(): - >>> print "Global attr", name, "=", getattr(rootgrp,name) + ... print("Global attr {} = {}".format(name, getattr(rootgrp, name))) Global attr description = bogus example script - Global attr history = Created Mon Nov 7 10.30:56 2005 + Global attr history = Created Mon Jul 8 14:19:41 2019 Global attr source = netCDF4 python module tutorial The `__dict__` attribute of a `netCDF4.Dataset`, `netCDF4.Group` or `netCDF4.Variable` @@ -442,10 +465,8 @@ instance provides all the netCDF attribute name/value pairs in a python dictionary: :::python - >>> print rootgrp.__dict__ - OrderedDict([(u"description", u"bogus example script"), - (u"history", u"Created Thu Mar 3 19:30:33 2011"), - (u"source", u"netCDF4 python module tutorial")]) + >>> print(rootgrp.__dict__) + {'description': 'bogus example script', 'history': 'Created Mon Jul 8 14:19:41 2019', 'source': 'netCDF4 python module tutorial'} Attributes can be deleted from a netCDF `netCDF4.Dataset`, `netCDF4.Group` or `netCDF4.Variable` using the python `del` statement (i.e. `del grp.foo` @@ -462,7 +483,7 @@ into it? You can just treat it like an array and assign data to a slice. >>> lons = numpy.arange(-180,180,2.5) >>> latitudes[:] = lats >>> longitudes[:] = lons - >>> print "latitudes =\\n",latitudes[:] + >>> print("latitudes =\\n{}".format(latitudes[:])) latitudes = [-90. -87.5 -85. -82.5 -80. -77.5 -75. -72.5 -70. -67.5 -65. -62.5 -60. -57.5 -55. -52.5 -50. -47.5 -45. -42.5 -40. -37.5 -35. -32.5 @@ -480,17 +501,17 @@ assign data outside the currently defined range of indices. 
>>> # append along two unlimited dimensions by assigning to slice. >>> nlats = len(rootgrp.dimensions["lat"]) >>> nlons = len(rootgrp.dimensions["lon"]) - >>> print "temp shape before adding data = ",temp.shape - temp shape before adding data = (0, 0, 73, 144) + >>> print("temp shape before adding data = {}".format(temp.shape)) + temp shape before adding data = (0, 0, 73, 144) >>> >>> from numpy.random import uniform - >>> temp[0:5,0:10,:,:] = uniform(size=(5,10,nlats,nlons)) - >>> print "temp shape after adding data = ",temp.shape - temp shape after adding data = (6, 10, 73, 144) + >>> temp[0:5, 0:10, :, :] = uniform(size=(5, 10, nlats, nlons)) + >>> print("temp shape after adding data = {}".format(temp.shape)) + temp shape after adding data = (5, 10, 73, 144) >>> >>> # levels have grown, but no values yet assigned. - >>> print "levels shape after adding pressure data = ",levels.shape - levels shape after adding pressure data = (10,) + >>> print("levels shape after adding pressure data = {}".format(levels.shape)) + levels shape after adding pressure data = (10,) Note that the size of the levels variable grows when data is appended along the `level` dimension of the variable `temp`, even though no @@ -510,7 +531,8 @@ allowed, and these indices work independently along each dimension (similar to the way vector subscripts work in fortran). This means that :::python - >>> temp[0, 0, [0,1,2,3], [0,1,2,3]] + >>> temp[0, 0, [0,1,2,3], [0,1,2,3]].shape + (4, 4) returns an array of shape (4,4) when slicing a netCDF variable, but for a numpy array it returns an array of shape (4,). @@ -534,12 +556,12 @@ will extract time indices 0,2 and 4, pressure levels Hemisphere longitudes, resulting in a numpy array of shape (3, 3, 36, 71). :::python - >>> print "shape of fancy temp slice = ",tempdat.shape - shape of fancy temp slice = (3, 3, 36, 71) + >>> print("shape of fancy temp slice = {}".format(tempdat.shape)) + shape of fancy temp slice = (3, 3, 36, 71) ***Special note for scalar variables***: To extract data from a scalar variable -`v` with no associated dimensions, use `numpy.asarray(v)` or `v[...]`. The result -will be a numpy scalar array. +`v` with no associated dimensions, use `numpy.asarray(v)` or `v[...]`. +The result will be a numpy scalar array. By default, netcdf4-python returns numpy masked arrays with values equal to the `missing_value` or `_FillValue` variable attributes masked. The @@ -572,14 +594,15 @@ can be used: >>> from netCDF4 import num2date, date2num >>> dates = [datetime(2001,3,1)+n*timedelta(hours=12) for n in range(temp.shape[0])] >>> times[:] = date2num(dates,units=times.units,calendar=times.calendar) - >>> print "time values (in units %s): " % times.units+"\\n",times[:] - time values (in units hours since January 1, 0001): - [ 17533056. 17533068. 17533080. 17533092. 17533104.] + >>> print("time values (in units {}):\\n{}".format(times.units, times[:])) + time values (in units hours since 0001-01-01 00:00:00.0): + [17533104. 17533116. 17533128. 17533140. 17533152.] 
>>> dates = num2date(times[:],units=times.units,calendar=times.calendar) - >>> print "dates corresponding to time values:\\n",dates + >>> print("dates corresponding to time values:\\n{}".format(dates)) dates corresponding to time values: - [2001-03-01 00:00:00 2001-03-01 12:00:00 2001-03-02 00:00:00 - 2001-03-02 12:00:00 2001-03-03 00:00:00] + [real_datetime(2001, 3, 1, 0, 0) real_datetime(2001, 3, 1, 12, 0) + real_datetime(2001, 3, 2, 0, 0) real_datetime(2001, 3, 2, 12, 0) + real_datetime(2001, 3, 3, 0, 0)] `netCDF4.num2date` converts numeric values of time in the specified `units` and `calendar` to datetime objects, and `netCDF4.date2num` does the reverse. @@ -607,22 +630,22 @@ datasets are not supported). :::python >>> for nf in range(10): - >>> f = Dataset("mftest%s.nc" % nf,"w") - >>> f.createDimension("x",None) - >>> x = f.createVariable("x","i",("x",)) - >>> x[0:10] = numpy.arange(nf*10,10*(nf+1)) - >>> f.close() + ... with Dataset("mftest%s.nc" % nf, "w", format="NETCDF4_CLASSIC") as f: + ... _ = f.createDimension("x",None) + ... x = f.createVariable("x","i",("x",)) + ... x[0:10] = numpy.arange(nf*10,10*(nf+1)) Now read all the files back in at once with `netCDF4.MFDataset` :::python >>> from netCDF4 import MFDataset >>> f = MFDataset("mftest*nc") - >>> print f.variables["x"][:] - [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 - 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 - 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 - 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99] + >>> print(f.variables["x"][:]) + [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 + 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 + 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 + 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 + 96 97 98 99] Note that `netCDF4.MFDataset` can only be used to read, not write, multi-file datasets. @@ -673,12 +696,12 @@ In our example, try replacing the line with :::python - >>> temp = dataset.createVariable("temp","f4",("time","level","lat","lon",),zlib=True) + >>> temp = rootgrp.createVariable("temp","f4",("time","level","lat","lon",),zlib=True) and then :::python - >>> temp = dataset.createVariable("temp","f4",("time","level","lat","lon",),zlib=True,least_significant_digit=3) + >>> temp = rootgrp.createVariable("temp","f4",("time","level","lat","lon",),zlib=True,least_significant_digit=3) and see how much smaller the resulting files are. @@ -707,7 +730,7 @@ for storing numpy complex arrays. Here's an example: >>> complex128 = numpy.dtype([("real",numpy.float64),("imag",numpy.float64)]) >>> complex128_t = f.createCompoundType(complex128,"complex128") >>> # create a variable with this data type, write some data to it. - >>> f.createDimension("x_dim",None) + >>> x_dim = f.createDimension("x_dim",None) >>> v = f.createVariable("cmplx_var",complex128_t,"x_dim") >>> data = numpy.empty(size,complex128) # numpy structured array >>> data["real"] = datac.real; data["imag"] = datac.imag @@ -720,11 +743,11 @@ for storing numpy complex arrays. Here's an example: >>> datac2 = numpy.empty(datain.shape,numpy.complex128) >>> # .. fill it with contents of structured array. 
>>> datac2.real = datain["real"]; datac2.imag = datain["imag"] - >>> print datac.dtype,datac # original data - complex128 [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] + >>> print('{}: {}'.format(datac.dtype, datac)) # original data + complex128: [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] >>> - >>> print datac2.dtype,datac2 # data from file - complex128 [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] + >>> print('{}: {}'.format(datac2.dtype, datac2)) # data from file + complex128: [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] Compound types can be nested, but you must create the 'inner' ones first. All possible numpy structured arrays cannot be @@ -735,22 +758,22 @@ in a Python dictionary, just like variables and dimensions. As always, printing objects gives useful summary information in an interactive session: :::python - >>> print f - - root group (NETCDF4 file format): - dimensions: x_dim - variables: cmplx_var - groups: - - >>> print f.variables["cmplx_var"] + >>> print(f) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): x_dim(3) + variables(dimensions): {'names':['real','imag'], 'formats':['>> print(f.variables["cmplx_var"]) + compound cmplx_var(x_dim) - compound data type: [("real", ">> print f.cmptypes - OrderedDict([("complex128", )]) - >>> print f.cmptypes["complex128"] - : name = "complex128", numpy dtype = [(u"real",">> print(f.cmptypes) + {'complex128': : name = 'complex128', numpy dtype = {'names':['real','imag'], 'formats':['>> print(f.cmptypes["complex128"]) + : name = 'complex128', numpy dtype = {'names':['real','imag'], 'formats':['11) Variable-length (vlen) data types. @@ -784,32 +807,37 @@ In this case, they contain 1-D numpy `int32` arrays of random length between :::python >>> import random + >>> random.seed(54321) >>> data = numpy.empty(len(y)*len(x),object) >>> for n in range(len(y)*len(x)): - >>> data[n] = numpy.arange(random.randint(1,10),dtype="int32")+1 + ... 
data[n] = numpy.arange(random.randint(1,10),dtype="int32")+1 >>> data = numpy.reshape(data,(len(y),len(x))) >>> vlvar[:] = data - >>> print "vlen variable =\\n",vlvar[:] + >>> print("vlen variable =\\n{}".format(vlvar[:])) vlen variable = - [[[ 1 2 3 4 5 6 7 8 9 10] [1 2 3 4 5] [1 2 3 4 5 6 7 8]] - [[1 2 3 4 5 6 7] [1 2 3 4 5 6] [1 2 3 4 5]] - [[1 2 3 4 5] [1 2 3 4] [1]] - [[ 1 2 3 4 5 6 7 8 9 10] [ 1 2 3 4 5 6 7 8 9 10] - [1 2 3 4 5 6 7 8]]] - >>> print f - - root group (NETCDF4 file format): - dimensions: x, y - variables: phony_vlen_var - groups: - >>> print f.variables["phony_vlen_var"] - + [[array([1, 2, 3, 4, 5, 6, 7, 8], dtype=int32) array([1, 2], dtype=int32) + array([1, 2, 3, 4], dtype=int32)] + [array([1, 2, 3], dtype=int32) + array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32) + array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32)] + [array([1, 2, 3, 4, 5, 6, 7], dtype=int32) array([1, 2, 3], dtype=int32) + array([1, 2, 3, 4, 5, 6], dtype=int32)] + [array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32) + array([1, 2, 3, 4, 5], dtype=int32) array([1, 2], dtype=int32)]] + >>> print(f) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): x(3), y(4) + variables(dimensions): int32 phony_vlen_var(y,x) + groups: + >>> print(f.variables["phony_vlen_var"]) + vlen phony_vlen_var(y, x) vlen data type: int32 - unlimited dimensions: + unlimited dimensions: current shape = (4, 3) - >>> print f.VLtypes["phony_vlen"] - : name = "phony_vlen", numpy dtype = int32 + >>> print(f.vltypes["phony_vlen"]) + : name = 'phony_vlen', numpy dtype = int32 Numpy object arrays containing python strings can also be written as vlen variables, For vlen strings, you don't need to create a vlen data type. @@ -819,7 +847,7 @@ with fixed length greater than 1) when calling the :::python >>> z = f.createDimension("z",10) - >>> strvar = rootgrp.createVariable("strvar", str, "z") + >>> strvar = f.createVariable("strvar", str, "z") In this example, an object array is filled with random python strings with random lengths between 2 and 12 characters, and the data in the object @@ -829,24 +857,25 @@ array is assigned to the vlen string variable. >>> chars = "1234567890aabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ" >>> data = numpy.empty(10,"O") >>> for n in range(10): - >>> stringlen = random.randint(2,12) - >>> data[n] = "".join([random.choice(chars) for i in range(stringlen)]) + ... stringlen = random.randint(2,12) + ... data[n] = "".join([random.choice(chars) for i in range(stringlen)]) >>> strvar[:] = data - >>> print "variable-length string variable:\\n",strvar[:] + >>> print("variable-length string variable:\\n{}".format(strvar[:])) variable-length string variable: - [aDy29jPt 5DS9X8 jd7aplD b8t4RM jHh8hq KtaPWF9cQj Q1hHN5WoXSiT MMxsVeq tdLUzvVTzj] - >>> print f - - root group (NETCDF4 file format): - dimensions: x, y, z - variables: phony_vlen_var, strvar - groups: - >>> print f.variables["strvar"] - + ['Lh' '25F8wBbMI' '53rmM' 'vvjnb3t63ao' 'qjRBQk6w' 'aJh' 'QF' + 'jtIJbJACaQk4' '3Z5' 'bftIIq'] + >>> print(f) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): x(3), y(4), z(10) + variables(dimensions): int32 phony_vlen_var(y,x), strvar(z) + groups: + >>> print(f.variables["strvar"]) + vlen strvar(z) - vlen data type: - unlimited dimensions: - current size = (10,) + vlen data type: + unlimited dimensions: + current shape = (10,) It is also possible to set contents of vlen string variables with numpy arrays of any string or unicode data type. 
Note, however, that accessing the contents @@ -866,19 +895,14 @@ values and their names are used to define an Enum data type using :::python >>> nc = Dataset('clouds.nc','w') >>> # python dict with allowed values and their names. - >>> enum_dict = {u'Altocumulus': 7, u'Missing': 255, - >>> u'Stratus': 2, u'Clear': 0, - >>> u'Nimbostratus': 6, u'Cumulus': 4, u'Altostratus': 5, - >>> u'Cumulonimbus': 1, u'Stratocumulus': 3} + >>> enum_dict = {'Altocumulus': 7, 'Missing': 255, + ... 'Stratus': 2, 'Clear': 0, + ... 'Nimbostratus': 6, 'Cumulus': 4, 'Altostratus': 5, + ... 'Cumulonimbus': 1, 'Stratocumulus': 3} >>> # create the Enum type called 'cloud_t'. >>> cloud_type = nc.createEnumType(numpy.uint8,'cloud_t',enum_dict) - >>> print cloud_type - : name = 'cloud_t', - numpy dtype = uint8, fields/values ={u'Cumulus': 4, - u'Altocumulus': 7, u'Missing': 255, - u'Stratus': 2, u'Clear': 0, - u'Cumulonimbus': 1, u'Stratocumulus': 3, - u'Nimbostratus': 6, u'Altostratus': 5} + >>> print(cloud_type) + : name = 'cloud_t', numpy dtype = uint8, fields/values ={'Altocumulus': 7, 'Missing': 255, 'Stratus': 2, 'Clear': 0, 'Nimbostratus': 6, 'Cumulus': 4, 'Altostratus': 5, 'Cumulonimbus': 1, 'Stratocumulus': 3} A new variable can be created in the usual way using this data type. Integer data is written to the variable that represents the named @@ -890,30 +914,25 @@ specified names. >>> time = nc.createDimension('time',None) >>> # create a 1d variable of type 'cloud_type'. >>> # The fill_value is set to the 'Missing' named value. - >>> cloud_var = - >>> nc.createVariable('primary_cloud',cloud_type,'time', - >>> fill_value=enum_dict['Missing']) + >>> cloud_var = nc.createVariable('primary_cloud',cloud_type,'time', + ... fill_value=enum_dict['Missing']) >>> # write some data to the variable. - >>> cloud_var[:] = [enum_dict['Clear'],enum_dict['Stratus'], - >>> enum_dict['Cumulus'],enum_dict['Missing'], - >>> enum_dict['Cumulonimbus']] + >>> cloud_var[:] = [enum_dict[k] for k in ['Clear', 'Stratus', 'Cumulus', + ... 'Missing', 'Cumulonimbus']] >>> nc.close() >>> # reopen the file, read the data. >>> nc = Dataset('clouds.nc') >>> cloud_var = nc.variables['primary_cloud'] - >>> print cloud_var - + >>> print(cloud_var) + enum primary_cloud(time) _FillValue: 255 enum data type: uint8 unlimited dimensions: time current shape = (5,) - >>> print cloud_var.datatype.enum_dict - {u'Altocumulus': 7, u'Missing': 255, u'Stratus': 2, - u'Clear': 0, u'Nimbostratus': 6, u'Cumulus': 4, - u'Altostratus': 5, u'Cumulonimbus': 1, - u'Stratocumulus': 3} - >>> print cloud_var[:] + >>> print(cloud_var.datatype.enum_dict) + {'Altocumulus': 7, 'Missing': 255, 'Stratus': 2, 'Clear': 0, 'Nimbostratus': 6, 'Cumulus': 4, 'Altostratus': 5, 'Cumulonimbus': 1, 'Stratocumulus': 3} + >>> print(cloud_var[:]) [0 2 4 -- 1] >>> nc.close() @@ -941,7 +960,7 @@ when a new dataset is created or an existing dataset is opened, use the `parallel` keyword to enable parallel access. :::python - >>> nc = Dataset('parallel_tst.nc','w',parallel=True) + >>> nc = Dataset('parallel_test.nc','w',parallel=True) The optional `comm` keyword may be used to specify a particular MPI communicator (`MPI_COMM_WORLD` is used by default). 
Each process (or rank) @@ -950,7 +969,7 @@ written to a different variable index on each task :::python >>> d = nc.createDimension('dim',4) - >>> v = nc.createVariable('var', numpy.int, 'dim') + >>> v = nc.createVariable('var', np.int, 'dim') >>> v[rank] = rank >>> nc.close() @@ -958,9 +977,9 @@ written to a different variable index on each task netcdf parallel_test { dimensions: dim = 4 ; - variables: + variables: int64 var(dim) ; - data: + data: var = 0, 1, 2, 3 ; } @@ -1010,18 +1029,19 @@ fixed-width byte string array (dtype `S#`), otherwise a numpy unicode (dtype characters with one more dimension. For example, :::python + >>> from netCDF4 import stringtochar >>> nc = Dataset('stringtest.nc','w',format='NETCDF4_CLASSIC') - >>> nc.createDimension('nchars',3) - >>> nc.createDimension('nstrings',None) + >>> _ = nc.createDimension('nchars',3) + >>> _ = nc.createDimension('nstrings',None) >>> v = nc.createVariable('strings','S1',('nstrings','nchars')) >>> datain = numpy.array(['foo','bar'],dtype='S3') >>> v[:] = stringtochar(datain) # manual conversion to char array - >>> v[:] # data returned as char array + >>> print(v[:]) # data returned as char array [[b'f' b'o' b'o'] - [b'b' b'a' b'r']] + [b'b' b'a' b'r']] >>> v._Encoding = 'ascii' # this enables automatic conversion >>> v[:] = datain # conversion to char array done internally - >>> v[:] # data returned in numpy string array + >>> print(v[:]) # data returned in numpy string array ['foo' 'bar'] >>> nc.close() @@ -1044,25 +1064,25 @@ Here's an example: :::python >>> nc = Dataset('compoundstring_example.nc','w') >>> dtype = numpy.dtype([('observation', 'f4'), - ('station_name','S80')]) + ... ('station_name','S10')]) >>> station_data_t = nc.createCompoundType(dtype,'station_data') - >>> nc.createDimension('station',None) + >>> _ = nc.createDimension('station',None) >>> statdat = nc.createVariable('station_obs', station_data_t, ('station',)) >>> data = numpy.empty(2,dtype) >>> data['observation'][:] = (123.,3.14) >>> data['station_name'][:] = ('Boulder','New York') - >>> statdat.dtype # strings actually stored as character arrays - {'names':['observation','station_name'], 'formats':['>> print(statdat.dtype) # strings actually stored as character arrays + {'names':['observation','station_name'], 'formats':['>> statdat[:] = data # strings converted to character arrays internally - >>> statdat[:] # character arrays converted back to strings - [(123. , 'Boulder') ( 3.14, 'New York')] - >>> statdat[:].dtype - {'names':['observation','station_name'], 'formats':['>> print(statdat[:]) # character arrays converted back to strings + [(123. , b'Boulder') ( 3.14, b'New York')] + >>> print(statdat[:].dtype) + {'names':['observation','station_name'], 'formats':['>> statdat.set_auto_chartostring(False) # turn off auto-conversion >>> statdat[:] = data.view(dtype=[('observation', 'f4'),('station_name','S1',10)]) - >>> statdat[:] # now structured array with char array subtype is returned - [(123. , ['B', 'o', 'u', 'l', 'd', 'e', 'r', '', '', '']) - ( 3.14, ['N', 'e', 'w', ' ', 'Y', 'o', 'r', 'k', '', ''])] + >>> print(statdat[:]) # now structured array with char array subtype is returned + [(123. , [b'B', b'o', b'u', b'l', b'd', b'e', b'r', b'', b'', b'']) + ( 3.14, [b'N', b'e', b'w', b' ', b'Y', b'o', b'r', b'k', b'', b''])] >>> nc.close() Note that there is currently no support for mapping numpy structured arrays with @@ -1094,11 +1114,11 @@ approaches. 
>>> v = nc.createVariable('v',numpy.int32,'x') >>> v[0:5] = numpy.arange(5) >>> print(nc) - + root group (NETCDF4 data model, file format HDF5): - dimensions(sizes): x(5) - variables(dimensions): int32 v(x) - groups: + dimensions(sizes): x(5) + variables(dimensions): int32 v(x) + groups: >>> print(nc['v'][:]) [0 1 2 3 4] >>> nc.close() # file saved to disk @@ -1106,16 +1126,16 @@ approaches. >>> # python memory buffer. >>> # read the newly created netcdf file into a python >>> # bytes object. - >>> f = open('diskless_example.nc', 'rb') - >>> nc_bytes = f.read(); f.close() + >>> with open('diskless_example.nc', 'rb') as f: + ... nc_bytes = f.read() >>> # create a netCDF in-memory dataset from the bytes object. >>> nc = Dataset('inmemory.nc', memory=nc_bytes) >>> print(nc) - + root group (NETCDF4 data model, file format HDF5): - dimensions(sizes): x(5) - variables(dimensions): int32 v(x) - groups: + dimensions(sizes): x(5) + variables(dimensions): int32 v(x) + groups: >>> print(nc['v'][:]) [0 1 2 3 4] >>> nc.close() @@ -1129,17 +1149,17 @@ approaches. >>> v[0:5] = numpy.arange(5) >>> nc_buf = nc.close() # close returns memoryview >>> print(type(nc_buf)) - + >>> # save nc_buf to disk, read it back in and check. - >>> f = open('inmemory.nc', 'wb') - >>> f.write(nc_buf); f.close() + >>> with open('inmemory.nc', 'wb') as f: + ... f.write(nc_buf) >>> nc = Dataset('inmemory.nc') >>> print(nc) - + root group (NETCDF4 data model, file format HDF5): - dimensions(sizes): x(5) - variables(dimensions): int32 v(x) - groups: + dimensions(sizes): x(5) + variables(dimensions): int32 v(x) + groups: >>> print(nc['v'][:]) [0 1 2 3 4] >>> nc.close() @@ -1176,28 +1196,23 @@ from cpython.bytes cimport PyBytes_FromStringAndSize # pure python utilities from .utils import (_StartCountStride, _quantize, _find_dim, _walk_grps, _out_array_shape, _sortbylist, _tostr, _safecast, _is_int) -# try to use built-in ordered dict in python >= 2.7 -try: +import sys +if sys.version_info[0:2] < (3, 7): + # Python 3.7+ guarantees order; older versions need OrderedDict from collections import OrderedDict -except ImportError: # or else use drop-in substitute - try: - from ordereddict import OrderedDict - except ImportError: - raise ImportError('please install ordereddict (https://pypi.python.org/pypi/ordereddict)') try: from itertools import izip as zip except ImportError: # python3: zip is already python2's itertools.izip pass -__version__ = "1.5.1.2" +__version__ = "1.5.2" # Initialize numpy import posixpath from cftime import num2date, date2num, date2index import numpy import weakref -import sys import warnings from glob import glob from numpy import ma @@ -1620,9 +1635,15 @@ cdef _get_types(group): ierr = nc_inq_typeids(_grpid, &ntypes, typeids) _ensure_nc_success(ierr) # create empty dictionary for CompoundType instances. - cmptypes = OrderedDict() - vltypes = OrderedDict() - enumtypes = OrderedDict() + if sys.version_info[0:2] < (3, 7): + cmptypes = OrderedDict() + vltypes = OrderedDict() + enumtypes = OrderedDict() + else: + cmptypes = dict() + vltypes = dict() + enumtypes = dict() + if ntypes > 0: for n from 0 <= n < ntypes: xtype = typeids[n] @@ -1678,7 +1699,10 @@ cdef _get_dims(group): ierr = nc_inq_ndims(_grpid, &numdims) _ensure_nc_success(ierr) # create empty dictionary for dimensions. 
- dimensions = OrderedDict() + if sys.version_info[0:2] < (3, 7): + dimensions = OrderedDict() + else: + dimensions = dict() if numdims > 0: dimids = malloc(sizeof(int) * numdims) if group.data_model == 'NETCDF4': @@ -1709,7 +1733,10 @@ cdef _get_grps(group): ierr = nc_inq_grps(_grpid, &numgrps, NULL) _ensure_nc_success(ierr) # create dictionary containing `netCDF4.Group` instances for groups in this group - groups = OrderedDict() + if sys.version_info[0:2] < (3, 7): + groups = OrderedDict() + else: + groups = dict() if numgrps > 0: grpids = malloc(sizeof(int) * numgrps) with nogil: @@ -1739,7 +1766,10 @@ cdef _get_vars(group): ierr = nc_inq_nvars(_grpid, &numvars) _ensure_nc_success(ierr, err_cls=AttributeError) # create empty dictionary for variables. - variables = OrderedDict() + if sys.version_info[0:2] < (3, 7): + variables = OrderedDict() + else: + variables = dict() if numvars > 0: # get variable ids. varids = malloc(sizeof(int) * numvars) @@ -2316,7 +2346,10 @@ strings. if self.data_model == 'NETCDF4': self.groups = _get_grps(self) else: - self.groups = OrderedDict() + if sys.version_info[0:2] < (3, 7): + self.groups = OrderedDict() + else: + self.groups = dict() # these allow Dataset objects to be used via a "with" statement. def __enter__(self): @@ -2386,29 +2419,28 @@ version 4.1.2 or higher of the netcdf C lib, and rebuild netcdf4-python.""" return unicode(self).encode('utf-8') def __unicode__(self): - ncdump = ['%r\n' % type(self)] - dimnames = tuple([_tostr(dimname)+'(%s)'%len(self.dimensions[dimname])\ - for dimname in self.dimensions.keys()]) + ncdump = [repr(type(self))] + dimnames = tuple(_tostr(dimname)+'(%s)'%len(self.dimensions[dimname])\ + for dimname in self.dimensions.keys()) varnames = tuple(\ - [_tostr(self.variables[varname].dtype)+' \033[4m'+_tostr(varname)+'\033[0m'+ + [_tostr(self.variables[varname].dtype)+' '+_tostr(varname)+ (((_tostr(self.variables[varname].dimensions) .replace("u'",""))\ .replace("'",""))\ .replace(", ",","))\ .replace(",)",")") for varname in self.variables.keys()]) - grpnames = tuple([_tostr(grpname) for grpname in self.groups.keys()]) + grpnames = tuple(_tostr(grpname) for grpname in self.groups.keys()) if self.path == '/': - ncdump.append('root group (%s data model, file format %s):\n' % + ncdump.append('root group (%s data model, file format %s):' % (self.data_model, self.disk_format)) else: - ncdump.append('group %s:\n' % self.path) - attrs = [' %s: %s\n' % (name,self.getncattr(name)) for name in\ - self.ncattrs()] - ncdump = ncdump + attrs - ncdump.append(' dimensions(sizes): %s\n' % ', '.join(dimnames)) - ncdump.append(' variables(dimensions): %s\n' % ', '.join(varnames)) - ncdump.append(' groups: %s\n' % ', '.join(grpnames)) - return ''.join(ncdump) + ncdump.append('group %s:' % self.path) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.getncattr(name))) + ncdump.append(' dimensions(sizes): %s' % ', '.join(dimnames)) + ncdump.append(' variables(dimensions): %s' % ', '.join(varnames)) + ncdump.append(' groups: %s' % ', '.join(grpnames)) + return '\n'.join(ncdump) def _close(self, check_err): cdef int ierr = nc_close(self._grpid) @@ -2897,7 +2929,11 @@ attributes.""" values = [] for name in names: values.append(_get_att(self, NC_GLOBAL, name)) - return OrderedDict(zip(names,values)) + gen = zip(names, values) + if sys.version_info[0:2] < (3, 7): + return OrderedDict(gen) + else: + return dict(gen) else: raise AttributeError elif name in _private_atts: @@ -3058,8 +3094,10 @@ this `netCDF4.Dataset` or 
`netCDF4.Group`, as well as for all variables in all its subgroups. **`True_or_False`**: Boolean determining if automatic conversion of -masked arrays with no missing values to regular ararys shall be -applied for all variables. +masked arrays with no missing values to regular numpy arrays shall be +applied for all variables. Default True. Set to False to restore the default behaviour +in versions prior to 1.4.1 (numpy array returned unless missing values are present, +otherwise masked array returned). ***Note***: Calling this function only affects existing variables. Variables created after calling this function will follow @@ -3223,12 +3261,21 @@ Additional read-only class variables: bytestr = _strencode(name) groupname = bytestr _ensure_nc_success(nc_def_grp(parent._grpid, groupname, &self._grpid)) - self.cmptypes = OrderedDict() - self.vltypes = OrderedDict() - self.enumtypes = OrderedDict() - self.dimensions = OrderedDict() - self.variables = OrderedDict() - self.groups = OrderedDict() + if sys.version_info[0:2] < (3, 7): + self.cmptypes = OrderedDict() + self.vltypes = OrderedDict() + self.enumtypes = OrderedDict() + self.dimensions = OrderedDict() + self.variables = OrderedDict() + self.groups = OrderedDict() + else: + self.cmptypes = dict() + self.vltypes = dict() + self.enumtypes = dict() + self.dimensions = dict() + self.variables = dict() + self.groups = dict() + def close(self): """ @@ -3356,9 +3403,11 @@ Read-only class variables: if not dir(self._grp): return 'Dimension object no longer valid' if self.isunlimited(): - return repr(type(self))+" (unlimited): name = '%s', size = %s\n" % (self._name,len(self)) + return "%r (unlimited): name = '%s', size = %s" %\ + (type(self), self._name, len(self)) else: - return repr(type(self))+": name = '%s', size = %s\n" % (self._name,len(self)) + return "%r: name = '%s', size = %s" %\ + (type(self), self._name, len(self)) def __len__(self): # len(`netCDF4.Dimension` instance) returns current size of dimension @@ -3906,37 +3955,32 @@ behavior is similar to Fortran or Matlab, but different than numpy. 
cdef int ierr, no_fill if not dir(self._grp): return 'Variable object no longer valid' - ncdump_var = ['%r\n' % type(self)] - dimnames = tuple([_tostr(dimname) for dimname in self.dimensions]) - attrs = [' %s: %s\n' % (name,self.getncattr(name)) for name in\ - self.ncattrs()] + ncdump = [repr(type(self))] + show_more_dtype = True if self._iscompound: - ncdump_var.append('%s %s(%s)\n' %\ - ('compound',self._name,', '.join(dimnames))) + kind = 'compound' elif self._isvlen: - ncdump_var.append('%s %s(%s)\n' %\ - ('vlen',self._name,', '.join(dimnames))) + kind = 'vlen' elif self._isenum: - ncdump_var.append('%s %s(%s)\n' %\ - ('enum',self._name,', '.join(dimnames))) + kind = 'enum' else: - ncdump_var.append('%s %s(%s)\n' %\ - (self.dtype,self._name,', '.join(dimnames))) - ncdump_var = ncdump_var + attrs - if self._iscompound: - ncdump_var.append('compound data type: %s\n' % self.dtype) - elif self._isvlen: - ncdump_var.append('vlen data type: %s\n' % self.dtype) - elif self._isenum: - ncdump_var.append('enum data type: %s\n' % self.dtype) + show_more_dtype = False + kind = str(self.dtype) + dimnames = tuple(_tostr(dimname) for dimname in self.dimensions) + ncdump.append('%s %s(%s)' %\ + (kind, self._name, ', '.join(dimnames))) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.getncattr(name))) + if show_more_dtype: + ncdump.append('%s data type: %s' % (kind, self.dtype)) unlimdims = [] for dimname in self.dimensions: dim = _find_dim(self._grp, dimname) if dim.isunlimited(): unlimdims.append(dimname) - if (self._grp.path != '/'): ncdump_var.append('path = %s\n' % self._grp.path) - ncdump_var.append('unlimited dimensions: %s\n' % ', '.join(unlimdims)) - ncdump_var.append('current shape = %s\n' % repr(self.shape)) + if (self._grp.path != '/'): ncdump.append('path = %s' % self._grp.path) + ncdump.append('unlimited dimensions: %s' % ', '.join(unlimdims)) + ncdump.append('current shape = %r' % (self.shape,)) if __netcdf4libversion__ < '4.5.1' and\ self._grp.file_format.startswith('NETCDF3'): # issue #908: no_fill not correct for NETCDF3 files before 4.5.1 @@ -3955,15 +3999,15 @@ behavior is similar to Fortran or Matlab, but different than numpy. except AttributeError: fillval = default_fillvals[self.dtype.str[1:]] if self.dtype.str[1:] in ['u1','i1']: - msg = 'filling on, default _FillValue of %s ignored\n' % fillval + msg = 'filling on, default _FillValue of %s ignored' % fillval else: - msg = 'filling on, default _FillValue of %s used\n' % fillval - ncdump_var.append(msg) + msg = 'filling on, default _FillValue of %s used' % fillval + ncdump.append(msg) else: - ncdump_var.append('filling off\n') + ncdump.append('filling off') - return ''.join(ncdump_var) + return '\n'.join(ncdump) def _getdims(self): # Private method to get variables's dimension names @@ -4036,7 +4080,8 @@ behavior is similar to Fortran or Matlab, but different than numpy. property size: """Return the number of stored elements.""" def __get__(self): - return numpy.prod(self.shape) + # issue #957: add int since prod(())=1.0 + return int(numpy.prod(self.shape)) property dimensions: """get variables's dimension names""" @@ -4069,6 +4114,13 @@ netCDF attribute with the same name as one of the reserved python attributes.""" cdef nc_type xtype xtype=-99 + # issue #959 - trying to set _FillValue results in mysterious + # error when close method is called so catch it here. It is + # already caught in __setattr__. 
+ if name == '_FillValue': + msg='_FillValue attribute must be set when variable is '+\ + 'created (using fill_value keyword to createVariable)' + raise AttributeError(msg) if self._grp.data_model != 'NETCDF4': self._grp._redef() _set_att(self._grp, self._varid, name, value, xtype=xtype, force_ncstring=self._ncstring_attrs__) if self._grp.data_model != 'NETCDF4': self._grp._enddef() @@ -4294,7 +4346,12 @@ details.""" values = [] for name in names: values.append(_get_att(self._grp, self._varid, name)) - return OrderedDict(zip(names,values)) + gen = zip(names, values) + if sys.version_info[0:2] < (3, 7): + return OrderedDict(gen) + else: + return dict(gen) + else: raise AttributeError elif name in _private_atts: @@ -4393,7 +4450,7 @@ rename a `netCDF4.Variable` attribute named `oldname` to `newname`.""" if self.scale: # only do this if autoscale option is on. is_unsigned = getattr(self, '_Unsigned', False) if is_unsigned and data.dtype.kind == 'i': - data = data.view('u%s' % data.dtype.itemsize) + data=data.view('%su%s'%(data.dtype.byteorder,data.dtype.itemsize)) if self.scale and self._isprimitive and valid_scaleoffset: # if variable has scale_factor and add_offset attributes, apply @@ -4447,7 +4504,7 @@ rename a `netCDF4.Variable` attribute named `oldname` to `newname`.""" is_unsigned = getattr(self, '_Unsigned', False) is_unsigned_int = is_unsigned and data.dtype.kind == 'i' if self.scale and is_unsigned_int: # only do this if autoscale option is on. - dtype_unsigned_int = 'u%s' % data.dtype.itemsize + dtype_unsigned_int='%su%s' % (data.dtype.byteorder,data.dtype.itemsize) data = data.view(dtype_unsigned_int) # private function for creating a masked array, masking missing_values # and/or _FillValues. @@ -5080,12 +5137,11 @@ The default value of `mask` is `True` turn on or off conversion of data without missing values to regular numpy arrays. -If `always_mask` is set to `True` then a masked array with no missing -values is converted to a regular numpy array. - -The default value of `always_mask` is `True` (conversions to regular -numpy arrays are not performed). - +`always_mask` is a Boolean determining if automatic conversion of +masked arrays with no missing values to regular numpy arrays shall be +applied. Default is True. Set to False to restore the default behaviour +in versions prior to 1.4.1 (numpy array returned unless missing values are present, +otherwise masked array returned). """ self.always_mask = bool(always_mask) @@ -5498,8 +5554,8 @@ the user. return unicode(self).encode('utf-8') def __unicode__(self): - return repr(type(self))+": name = '%s', numpy dtype = %s\n" %\ - (self.name,self.dtype) + return "%r: name = '%s', numpy dtype = %s" %\ + (type(self), self.name, self.dtype) def __reduce__(self): # raise error is user tries to pickle a CompoundType object. @@ -5788,10 +5844,10 @@ the user. def __unicode__(self): if self.dtype == str: - return repr(type(self))+': string type' + return '%r: string type' % (type(self),) else: - return repr(type(self))+": name = '%s', numpy dtype = %s\n" %\ - (self.name, self.dtype) + return "%r: name = '%s', numpy dtype = %s" %\ + (type(self), self.name, self.dtype) def __reduce__(self): # raise error is user tries to pickle a VLType object. @@ -5906,9 +5962,8 @@ the user. 
return unicode(self).encode('utf-8') def __unicode__(self): - return repr(type(self))+\ - ": name = '%s', numpy dtype = %s, fields/values =%s\n" %\ - (self.name, self.dtype, self.enum_dict) + return "%r: name = '%s', numpy dtype = %s, fields/values =%s" %\ + (type(self), self.name, self.dtype, self.enum_dict) def __reduce__(self): # raise error is user tries to pickle a EnumType object. @@ -6089,18 +6144,18 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details): >>> # create a series of netCDF files with a variable sharing >>> # the same unlimited dimension. >>> for nf in range(10): - >>> f = Dataset("mftest%s.nc" % nf,"w",format='NETCDF4_CLASSIC') - >>> f.createDimension("x",None) - >>> x = f.createVariable("x","i",("x",)) - >>> x[0:10] = np.arange(nf*10,10*(nf+1)) - >>> f.close() + ... with Dataset("mftest%s.nc" % nf, "w", format='NETCDF4_CLASSIC') as f: + ... f.createDimension("x",None) + ... x = f.createVariable("x","i",("x",)) + ... x[0:10] = np.arange(nf*10,10*(nf+1)) >>> # now read all those files in at once, in one Dataset. >>> f = MFDataset("mftest*nc") - >>> print f.variables["x"][:] - [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 - 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 - 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 - 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99] + >>> print(f.variables["x"][:]) + [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 + 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 + 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 + 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 + 96 97 98 99] """ def __init__(self, files, check=False, aggdim=None, exclude=[], @@ -6336,22 +6391,21 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details): dset.close() def __repr__(self): - ncdump = ['%r\n' % type(self)] - dimnames = tuple([str(dimname) for dimname in self.dimensions.keys()]) - varnames = tuple([str(varname) for varname in self.variables.keys()]) + ncdump = [repr(type(self))] + dimnames = tuple(str(dimname) for dimname in self.dimensions.keys()) + varnames = tuple(str(varname) for varname in self.variables.keys()) grpnames = () if self.path == '/': - ncdump.append('root group (%s data model, file format %s):\n' % + ncdump.append('root group (%s data model, file format %s):' % (self.data_model[0], self.disk_format[0])) else: - ncdump.append('group %s:\n' % self.path) - attrs = [' %s: %s\n' % (name,self.__dict__[name]) for name in\ - self.ncattrs()] - ncdump = ncdump + attrs - ncdump.append(' dimensions = %s\n' % str(dimnames)) - ncdump.append(' variables = %s\n' % str(varnames)) - ncdump.append(' groups = %s\n' % str(grpnames)) - return ''.join(ncdump) + ncdump.append('group %s:' % self.path) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.__dict__[name])) + ncdump.append(' dimensions = %s' % str(dimnames)) + ncdump.append(' variables = %s' % str(varnames)) + ncdump.append(' groups = %s' % str(grpnames)) + return '\n'.join(ncdump) def __reduce__(self): # raise error is user tries to pickle a MFDataset object. 
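The two small fixes in the hunks above are easy to reproduce with plain numpy; the snippet below is only an illustrative sketch (the array values and the little-endian host are assumptions of the example, not part of the patch):

    import numpy

    # issue #957: numpy.prod of an empty shape is a float, which is why
    # Variable.size now wraps the product in int() for scalar variables
    print(numpy.prod(()))        # -> 1.0

    # issue #930: a view onto non-native-byteorder data must keep the byte
    # order; a native-order view reinterprets the raw bytes (shown here for
    # a little-endian host)
    a = numpy.array([255, 1], dtype=">i2")                             # big-endian int16, as read from disk
    print(a.view("u2"))                                                # native-order view -> [65280 256]
    print(a.view("%su%s" % (a.dtype.byteorder, a.dtype.itemsize)))     # byteorder-preserving view -> [255 1]

The last line builds the same "%su%s" dtype string as the changed lines in the diff, which is what keeps the masked/unsigned conversion correct for big-endian variables.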
@@ -6368,9 +6422,11 @@ class _Dimension(object): return True def __repr__(self): if self.isunlimited(): - return repr(type(self))+" (unlimited): name = '%s', size = %s\n" % (self._name,len(self)) + return "%r (unlimited): name = '%s', size = %s" %\ + (type(self), self._name, len(self)) else: - return repr(type(self))+": name = '%s', size = %s\n" % (self._name,len(self)) + return "%r: name = '%s', size = %s" %\ + (type(self), self._name, len(self)) class _Variable(object): def __init__(self, dset, varname, var, recdimname): @@ -6398,21 +6454,19 @@ class _Variable(object): except: raise AttributeError(name) def __repr__(self): - ncdump_var = ['%r\n' % type(self)] - dimnames = tuple([str(dimname) for dimname in self.dimensions]) - attrs = [' %s: %s\n' % (name,self.__dict__[name]) for name in\ - self.ncattrs()] - ncdump_var.append('%s %s%s\n' %\ - (self.dtype,self._name,dimnames)) - ncdump_var = ncdump_var + attrs + ncdump = [repr(type(self))] + dimnames = tuple(str(dimname) for dimname in self.dimensions) + ncdump.append('%s %s%s' % (self.dtype, self._name, dimnames)) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.__dict__[name])) unlimdims = [] for dimname in self.dimensions: dim = _find_dim(self._grp, dimname) if dim.isunlimited(): unlimdims.append(str(dimname)) - ncdump_var.append('unlimited dimensions = %s\n' % repr(tuple(unlimdims))) - ncdump_var.append('current size = %s\n' % repr(self.shape)) - return ''.join(ncdump_var) + ncdump.append('unlimited dimensions = %r' % (tuple(unlimdims),)) + ncdump.append('current size = %r' % (self.shape,)) + return '\n'.join(ncdump) def __len__(self): if not self._shape: raise TypeError('len() of unsized object') @@ -6564,14 +6618,14 @@ Example usage (See `netCDF4.MFTime.__init__` for more details): >>> f1.close() >>> f2.close() >>> # Read the two files in at once, in one Dataset. - >>> f = MFDataset("mftest*nc") + >>> f = MFDataset("mftest_*nc") >>> t = f.variables["time"] - >>> print t.units + >>> print(t.units) days since 2000-01-01 - >>> print t[32] # The value written in the file, inconsistent with the MF time units. + >>> print(t[32]) # The value written in the file, inconsistent with the MF time units. 
1 >>> T = MFTime(t) - >>> print T[32] + >>> print(T[32]) 32 """ ===================================== setup.py ===================================== @@ -49,13 +49,14 @@ def check_ifnetcdf4(netcdf4_includedir): return isnetcdf4 -def check_api(inc_dirs): +def check_api(inc_dirs,netcdf_lib_version): has_rename_grp = False has_nc_inq_path = False has_nc_inq_format_extended = False has_cdf5_format = False has_nc_open_mem = False has_nc_create_mem = False + has_parallel_support = False has_parallel4_support = False has_pnetcdf_support = False @@ -91,10 +92,20 @@ def check_api(inc_dirs): for line in open(ncmetapath): if line.startswith('#define NC_HAS_CDF5'): has_cdf5_format = bool(int(line.split()[2])) - elif line.startswith('#define NC_HAS_PARALLEL4'): + if line.startswith('#define NC_HAS_PARALLEL'): + has_parallel_support = bool(int(line.split()[2])) + if line.startswith('#define NC_HAS_PARALLEL4'): has_parallel4_support = bool(int(line.split()[2])) - elif line.startswith('#define NC_HAS_PNETCDF'): + if line.startswith('#define NC_HAS_PNETCDF'): has_pnetcdf_support = bool(int(line.split()[2])) + # NC_HAS_PARALLEL4 missing in 4.6.1 (issue #964) + if not has_parallel4_support and has_parallel_support and not has_pnetcdf_support: + has_parallel4_support = True + # for 4.6.1, if NC_HAS_PARALLEL=NC_HAS_PNETCDF=1, guess that + # parallel HDF5 is enabled (must guess since there is no + # NC_HAS_PARALLEL4) + elif netcdf_lib_version == "4.6.1" and not has_parallel4_support and has_parallel_support: + has_parallel4_support = True break return has_rename_grp, has_nc_inq_path, has_nc_inq_format_extended, \ @@ -182,7 +193,7 @@ ncconfig = None use_ncconfig = None if USE_SETUPCFG and os.path.exists(setup_cfg): sys.stdout.write('reading from setup.cfg...\n') - config = configparser.SafeConfigParser() + config = configparser.ConfigParser() config.read(setup_cfg) try: HDF5_dir = config.get("directories", "HDF5_dir") @@ -494,7 +505,8 @@ if 'sdist' not in sys.argv[1:] and 'clean' not in sys.argv[1:]: # this determines whether renameGroup and filepath methods will work. has_rename_grp, has_nc_inq_path, has_nc_inq_format_extended, \ has_cdf5_format, has_nc_open_mem, has_nc_create_mem, \ - has_parallel4_support, has_pnetcdf_support = check_api(inc_dirs) + has_parallel4_support, has_pnetcdf_support = \ + check_api(inc_dirs,netcdf_lib_version) # for netcdf 4.4.x CDF5 format is always enabled. if netcdf_lib_version is not None and\ (netcdf_lib_version > "4.4" and netcdf_lib_version < "4.5"): @@ -584,7 +596,7 @@ else: setup(name="netCDF4", cmdclass=cmdclass, - version="1.5.1.2", + version="1.5.2", long_description="netCDF version 4 has many features not found in earlier versions of the library, such as hierarchical groups, zlib compression, multiple unlimited dimensions, and new data types. It is implemented on top of HDF5. This module implements most of the new features, and can read and write netCDF files compatible with older versions of the library. 
The API is modelled after Scientific.IO.NetCDF, and should be familiar to users of that module.\n\nThis project is hosted on a `GitHub repository `_ where you may access the most up-to-date source.", author="Jeff Whitaker", author_email="jeffrey.s.whitaker at noaa.gov", @@ -597,12 +609,11 @@ setup(name="netCDF4", 'meteorology', 'climate'], classifiers=["Development Status :: 3 - Alpha", "Programming Language :: Python :: 2", - "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", - "Programming Language :: Python :: 3.3", - "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", + "Programming Language :: Python :: 3.6", + "Programming Language :: Python :: 3.7", "Intended Audience :: Science/Research", "License :: OSI Approved", "Topic :: Software Development :: Libraries :: Python Modules", ===================================== test/tst_atts.py ===================================== @@ -7,13 +7,10 @@ import tempfile import warnings import numpy as NP +from collections import OrderedDict from numpy.random.mtrand import uniform -import netCDF4 -try: - from collections import OrderedDict -except ImportError: # or else use drop-in substitute - from ordereddict import OrderedDict +import netCDF4 # test attribute creation. FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name @@ -94,6 +91,19 @@ class VariablesTestCase(unittest.TestCase): v1.seqatt = SEQATT v1.stringseqatt = STRINGSEQATT v1.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING + # issue #959: should not be able to set _FillValue after var creation + try: + v1._FillValue(-999.) + except AttributeError: + pass + else: + raise ValueError('This test should have failed.') + try: + v1.setncattr('_FillValue',-999.) + except AttributeError: + pass + else: + raise ValueError('This test should have failed.') # issue #485 (triggers segfault in C lib # with version 1.2.1 without pull request #486) f.foo = NP.array('bar','S') ===================================== test/tst_endian.py ===================================== @@ -121,6 +121,27 @@ def issue346(file): assert_array_equal(datal,xl) nc.close() +def issue930(file): + # make sure view to unsigned data type (triggered + # by _Unsigned attribute being set) is correct when + # data byte order is non-native. 
+ nc = netCDF4.Dataset(file,'w') + d = nc.createDimension('x',2) + v1 = nc.createVariable('v1','i2','x',endian='big') + v2 = nc.createVariable('v2','i2','x',endian='little') + v1[0] = 255; v1[1] = 1 + v2[0] = 255; v2[1] = 1 + v1._Unsigned="TRUE"; v1.missing_value=np.int16(1) + v2._Unsigned="TRUE"; v2.missing_value=np.int16(1) + nc.close() + nc = netCDF4.Dataset(file) + assert_array_equal(nc['v1'][:],np.ma.masked_array([255,1],mask=[False,True])) + assert_array_equal(nc['v2'][:],np.ma.masked_array([255,1],mask=[False,True])) + nc.set_auto_mask(False) + assert_array_equal(nc['v1'][:],np.array([255,1])) + assert_array_equal(nc['v2'][:],np.array([255,1])) + nc.close() + class EndianTestCase(unittest.TestCase): def setUp(self): @@ -141,6 +162,7 @@ class EndianTestCase(unittest.TestCase): check_byteswap(self.file3, data) issue310(self.file) issue346(self.file2) + issue930(self.file2) if __name__ == '__main__': unittest.main() ===================================== test/tst_netcdftime.py ===================================== @@ -523,7 +523,7 @@ class TestDate2index(unittest.TestCase): :Example: >>> t = TestTime(datetime(1989, 2, 18), 45, 6, 'hours since 1979-01-01') - >>> print num2date(t[1], t.units) + >>> print(num2date(t[1], t.units)) 1989-02-18 06:00:00 """ self.units = units View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/compare/96f6fdf0814ed57daa26d0e75deac0b4950f8fc9...16dcd4ba1e04ad1935a02a2410cb5caaa6b2607f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/compare/96f6fdf0814ed57daa26d0e75deac0b4950f8fc9...16dcd4ba1e04ad1935a02a2410cb5caaa6b2607f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 05:07:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:07:32 +0000 Subject: [Git][debian-gis-team/netcdf4-python] Pushed new tag upstream/1.5.2 Message-ID: <5d6f3884d03fe_577b2ade5dd79824607370@godard.mail> Bas Couwenberg pushed new tag upstream/1.5.2 at Debian GIS Project / netcdf4-python -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/tree/upstream/1.5.2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Wed Sep 4 05:07:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:07:38 +0000 Subject: [Git][debian-gis-team/netcdf4-python][upstream] New upstream version 1.5.2 Message-ID: <5d6f388aa6006_577b2ade5d389d00607439@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf4-python Commits: 93a42b17 by Bas Couwenberg at 2019-09-04T03:55:06Z New upstream version 1.5.2 - - - - - 10 changed files: - .appveyor.yml - .travis.yml - Changelog - README.md - docs/netCDF4/index.html - netCDF4/_netCDF4.pyx - setup.py - test/tst_atts.py - test/tst_endian.py - test/tst_netcdftime.py Changes: ===================================== .appveyor.yml ===================================== @@ -27,8 +27,8 @@ install: - cmd: call %CONDA_INSTALL_LOCN%\Scripts\activate.bat - cmd: conda config --set always_yes yes --set changeps1 no --set show_channel_urls true - cmd: conda update conda - - cmd: conda config --remove channels defaults --force - cmd: conda config --add channels conda-forge --force + - cmd: conda config --set channel_priority strict - cmd: set PYTHONUNBUFFERED=1 - cmd: conda install conda-build vs2008_express_vc_python_patch - cmd: call setup_x64 ===================================== .travis.yml ===================================== @@ -1,6 +1,6 @@ language: python dist: xenial -sudo: true +cache: pip addons: apt: @@ -17,6 +17,7 @@ env: python: - "2.7" + - "3.5" - "3.6" - "3.7" - "3.8-dev" @@ -39,7 +40,6 @@ matrix: - DEPENDS="numpy==1.10.0 cython==0.21 ordereddict==1.1 setuptools==18.0 cftime" # test MPI with latest released version - python: 3.7 - dist: xenial env: - MPI=1 - CC=mpicc.mpich @@ -55,7 +55,6 @@ matrix: - libhdf5-mpich-dev # test MPI with latest released version - python: 3.7 - dist: xenial env: - MPI=1 - CC=mpicc.mpich @@ -72,7 +71,6 @@ matrix: - libhdf5-mpich-dev # test with netcdf-c from github master - python: 3.7 - dist: xenial env: - MPI=1 - CC=mpicc.mpich ===================================== Changelog ===================================== @@ -1,3 +1,17 @@ + version 1.5.2 (not yet released) +============================== + * fix for scaling bug when _Unsigned attribute is set and byteorder of data + does not match native byteorder (issue #930). + * revise documentation for Python 3 (issue #946). + * establish support for Python 2.7, 3.5, 3.6 and 3.7 (issue #948). + * use dict built-in instead of OrderedDict for Python 3.7+ + (pull request #955). + * remove underline ANSI in Dataset string representation (pull request #956). + * remove newlines from string representation (pull request #960). + * fix for issue #957 (size of scalar var is a float since numpy.prod(())=1.0). + * make sure Variable.setncattr fails to set _FillValue (issue #959). + * fix detection of parallel HDF5 support with netcdf-c 4.6.1 (issue #964). + version 1.5.1.2 (tag v1.5.1.2rel) ================================== * fix another slicing bug introduced by the fix to issue #906 (issue #922). ===================================== README.md ===================================== @@ -10,8 +10,10 @@ ## News For details on the latest updates, see the [Changelog](https://github.com/Unidata/netcdf4-python/blob/master/Changelog). +09/03/2019: Version [1.5.2](https://pypi.python.org/pypi/netCDF4/1.5.2) released. Bugfixes, no new features. + 05/06/2019: Version [1.5.1.2](https://pypi.python.org/pypi/netCDF4/1.5.1.2) released. 
Fixes another slicing -slicing regression ([issue #922)](https://github.com/Unidata/netcdf4-python/issues/922)) introduced in the 1.5.1 release. +regression ([issue #922)](https://github.com/Unidata/netcdf4-python/issues/922)) introduced in the 1.5.1 release. 05/02/2019: Version [1.5.1.1](https://pypi.python.org/pypi/netCDF4/1.5.1.1) released. Fixes incorrect `__version__` module variable in 1.5.1 release, plus a slicing bug ([issue #919)](https://github.com/Unidata/netcdf4-python/issues/919)). ===================================== docs/netCDF4/index.html ===================================== The diff for this file was not included because it is too large. ===================================== netCDF4/_netCDF4.pyx ===================================== @@ -1,5 +1,5 @@ """ -Version 1.5.1.2 +Version 1.5.2 --------------- - - - @@ -151,7 +151,7 @@ Here's an example: :::python >>> from netCDF4 import Dataset >>> rootgrp = Dataset("test.nc", "w", format="NETCDF4") - >>> print rootgrp.data_model + >>> print(rootgrp.data_model) NETCDF4 >>> rootgrp.close() @@ -182,11 +182,18 @@ in a netCDF 3 file you will get an error message. >>> rootgrp = Dataset("test.nc", "a") >>> fcstgrp = rootgrp.createGroup("forecasts") >>> analgrp = rootgrp.createGroup("analyses") - >>> print rootgrp.groups - OrderedDict([("forecasts", - ), - ("analyses", - )]) + >>> print(rootgrp.groups) + {'forecasts': + group /forecasts: + dimensions(sizes): + variables(dimensions): + groups: , 'analyses': + group /analyses: + dimensions(sizes): + variables(dimensions): + groups: } + + Groups can exist within groups in a `netCDF4.Dataset`, just as directories exist within directories in a unix filesystem. Each `netCDF4.Group` instance @@ -212,40 +219,40 @@ object yields summary information about it's contents. :::python >>> def walktree(top): - >>> values = top.groups.values() - >>> yield values - >>> for value in top.groups.values(): - >>> for children in walktree(value): - >>> yield children - >>> print rootgrp - >>> for children in walktree(rootgrp): - >>> for child in children: - >>> print child - - root group (NETCDF4 file format): - dimensions: - variables: + ... values = top.groups.values() + ... yield values + ... for value in top.groups.values(): + ... for children in walktree(value): + ... yield children + >>> print(rootgrp) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): + variables(dimensions): groups: forecasts, analyses - + >>> for children in walktree(rootgrp): + ... for child in children: + ... print(child) + group /forecasts: - dimensions: - variables: + dimensions(sizes): + variables(dimensions): groups: model1, model2 - + group /analyses: - dimensions: - variables: - groups: - + dimensions(sizes): + variables(dimensions): + groups: + group /forecasts/model1: - dimensions: - variables: - groups: - + dimensions(sizes): + variables(dimensions): + groups: + group /forecasts/model2: - dimensions: - variables: - groups: + dimensions(sizes): + variables(dimensions): + groups: ##
3) Dimensions in a netCDF file. @@ -272,11 +279,8 @@ one, and it must be the first (leftmost) dimension of the variable. All of the `netCDF4.Dimension` instances are stored in a python dictionary. :::python - >>> print rootgrp.dimensions - OrderedDict([("level", ), - ("time", ), - ("lat", ), - ("lon", )]) + >>> print(rootgrp.dimensions) + {'level': (unlimited): name = 'level', size = 0, 'time': (unlimited): name = 'time', size = 0, 'lat': : name = 'lat', size = 73, 'lon': : name = 'lon', size = 144} Calling the python `len` function with a `netCDF4.Dimension` instance returns the current size of that dimension. @@ -284,11 +288,11 @@ The `netCDF4.Dimension.isunlimited` method of a `netCDF4.Dimension` instance can be used to determine if the dimensions is unlimited, or appendable. :::python - >>> print len(lon) + >>> print(len(lon)) 144 - >>> print lon.isunlimited() + >>> print(lon.isunlimited()) False - >>> print time.isunlimited() + >>> print(time.isunlimited()) True Printing the `netCDF4.Dimension` object @@ -297,12 +301,11 @@ and whether it is unlimited. :::python >>> for dimobj in rootgrp.dimensions.values(): - >>> print dimobj - (unlimited): name = "level", size = 0 - (unlimited): name = "time", size = 0 - : name = "lat", size = 73 - : name = "lon", size = 144 - (unlimited): name = "time", size = 0 + ... print(dimobj) + (unlimited): name = 'level', size = 0 + (unlimited): name = 'time', size = 0 + : name = 'lat', size = 73 + : name = 'lon', size = 144 `netCDF4.Dimension` names can be changed using the `netCDF4.Datatset.renameDimension` method of a `netCDF4.Dataset` or @@ -348,17 +351,19 @@ used later to access and set variable data and attributes. >>> longitudes = rootgrp.createVariable("lon","f4",("lon",)) >>> # two dimensions unlimited >>> temp = rootgrp.createVariable("temp","f4",("time","level","lat","lon",)) + >>> temp.units = "K" -To get summary info on a `netCDF4.Variable` instance in an interactive session, just print it. +To get summary info on a `netCDF4.Variable` instance in an interactive session, +just print it. :::python - >>> print temp - + >>> print(temp) + float32 temp(time, level, lat, lon) - least_significant_digit: 3 units: K unlimited dimensions: time, level current shape = (0, 0, 73, 144) + filling on, default _FillValue of 9.969209968386869e+36 used You can use a path to create a Variable inside a hierarchy of groups. @@ -371,30 +376,48 @@ You can also query a `netCDF4.Dataset` or `netCDF4.Group` instance directly to o `netCDF4.Variable` instances using paths. 
:::python - >>> print rootgrp["/forecasts/model1"] # a Group instance - + >>> print(rootgrp["/forecasts/model1"]) # a Group instance + group /forecasts/model1: - dimensions(sizes): + dimensions(sizes): variables(dimensions): float32 temp(time,level,lat,lon) - groups: - >>> print rootgrp["/forecasts/model1/temp"] # a Variable instance - + groups: + >>> print(rootgrp["/forecasts/model1/temp"]) # a Variable instance + float32 temp(time, level, lat, lon) path = /forecasts/model1 unlimited dimensions: time, level current shape = (0, 0, 73, 144) - filling on, default _FillValue of 9.96920996839e+36 used + filling on, default _FillValue of 9.969209968386869e+36 used + All of the variables in the `netCDF4.Dataset` or `netCDF4.Group` are stored in a Python dictionary, in the same way as the dimensions: :::python - >>> print rootgrp.variables - OrderedDict([("time", ), - ("level", ), - ("lat", ), - ("lon", ), - ("temp", )]) + >>> print(rootgrp.variables) + {'time': + float64 time(time) + unlimited dimensions: time + current shape = (0,) + filling on, default _FillValue of 9.969209968386869e+36 used, 'level': + int32 level(level) + unlimited dimensions: level + current shape = (0,) + filling on, default _FillValue of -2147483647 used, 'lat': + float32 lat(lat) + unlimited dimensions: + current shape = (73,) + filling on, default _FillValue of 9.969209968386869e+36 used, 'lon': + float32 lon(lon) + unlimited dimensions: + current shape = (144,) + filling on, default _FillValue of 9.969209968386869e+36 used, 'temp': + float32 temp(time, level, lat, lon) + units: K + unlimited dimensions: time, level + current shape = (0, 0, 73, 144) + filling on, default _FillValue of 9.969209968386869e+36 used} `netCDF4.Variable` names can be changed using the `netCDF4.Dataset.renameVariable` method of a `netCDF4.Dataset` @@ -432,9 +455,9 @@ and attributes that cannot (or should not) be modified by the user. :::python >>> for name in rootgrp.ncattrs(): - >>> print "Global attr", name, "=", getattr(rootgrp,name) + ... print("Global attr {} = {}".format(name, getattr(rootgrp, name))) Global attr description = bogus example script - Global attr history = Created Mon Nov 7 10.30:56 2005 + Global attr history = Created Mon Jul 8 14:19:41 2019 Global attr source = netCDF4 python module tutorial The `__dict__` attribute of a `netCDF4.Dataset`, `netCDF4.Group` or `netCDF4.Variable` @@ -442,10 +465,8 @@ instance provides all the netCDF attribute name/value pairs in a python dictionary: :::python - >>> print rootgrp.__dict__ - OrderedDict([(u"description", u"bogus example script"), - (u"history", u"Created Thu Mar 3 19:30:33 2011"), - (u"source", u"netCDF4 python module tutorial")]) + >>> print(rootgrp.__dict__) + {'description': 'bogus example script', 'history': 'Created Mon Jul 8 14:19:41 2019', 'source': 'netCDF4 python module tutorial'} Attributes can be deleted from a netCDF `netCDF4.Dataset`, `netCDF4.Group` or `netCDF4.Variable` using the python `del` statement (i.e. `del grp.foo` @@ -462,7 +483,7 @@ into it? You can just treat it like an array and assign data to a slice. >>> lons = numpy.arange(-180,180,2.5) >>> latitudes[:] = lats >>> longitudes[:] = lons - >>> print "latitudes =\\n",latitudes[:] + >>> print("latitudes =\\n{}".format(latitudes[:])) latitudes = [-90. -87.5 -85. -82.5 -80. -77.5 -75. -72.5 -70. -67.5 -65. -62.5 -60. -57.5 -55. -52.5 -50. -47.5 -45. -42.5 -40. -37.5 -35. -32.5 @@ -480,17 +501,17 @@ assign data outside the currently defined range of indices. 
>>> # append along two unlimited dimensions by assigning to slice. >>> nlats = len(rootgrp.dimensions["lat"]) >>> nlons = len(rootgrp.dimensions["lon"]) - >>> print "temp shape before adding data = ",temp.shape - temp shape before adding data = (0, 0, 73, 144) + >>> print("temp shape before adding data = {}".format(temp.shape)) + temp shape before adding data = (0, 0, 73, 144) >>> >>> from numpy.random import uniform - >>> temp[0:5,0:10,:,:] = uniform(size=(5,10,nlats,nlons)) - >>> print "temp shape after adding data = ",temp.shape - temp shape after adding data = (6, 10, 73, 144) + >>> temp[0:5, 0:10, :, :] = uniform(size=(5, 10, nlats, nlons)) + >>> print("temp shape after adding data = {}".format(temp.shape)) + temp shape after adding data = (5, 10, 73, 144) >>> >>> # levels have grown, but no values yet assigned. - >>> print "levels shape after adding pressure data = ",levels.shape - levels shape after adding pressure data = (10,) + >>> print("levels shape after adding pressure data = {}".format(levels.shape)) + levels shape after adding pressure data = (10,) Note that the size of the levels variable grows when data is appended along the `level` dimension of the variable `temp`, even though no @@ -510,7 +531,8 @@ allowed, and these indices work independently along each dimension (similar to the way vector subscripts work in fortran). This means that :::python - >>> temp[0, 0, [0,1,2,3], [0,1,2,3]] + >>> temp[0, 0, [0,1,2,3], [0,1,2,3]].shape + (4, 4) returns an array of shape (4,4) when slicing a netCDF variable, but for a numpy array it returns an array of shape (4,). @@ -534,12 +556,12 @@ will extract time indices 0,2 and 4, pressure levels Hemisphere longitudes, resulting in a numpy array of shape (3, 3, 36, 71). :::python - >>> print "shape of fancy temp slice = ",tempdat.shape - shape of fancy temp slice = (3, 3, 36, 71) + >>> print("shape of fancy temp slice = {}".format(tempdat.shape)) + shape of fancy temp slice = (3, 3, 36, 71) ***Special note for scalar variables***: To extract data from a scalar variable -`v` with no associated dimensions, use `numpy.asarray(v)` or `v[...]`. The result -will be a numpy scalar array. +`v` with no associated dimensions, use `numpy.asarray(v)` or `v[...]`. +The result will be a numpy scalar array. By default, netcdf4-python returns numpy masked arrays with values equal to the `missing_value` or `_FillValue` variable attributes masked. The @@ -572,14 +594,15 @@ can be used: >>> from netCDF4 import num2date, date2num >>> dates = [datetime(2001,3,1)+n*timedelta(hours=12) for n in range(temp.shape[0])] >>> times[:] = date2num(dates,units=times.units,calendar=times.calendar) - >>> print "time values (in units %s): " % times.units+"\\n",times[:] - time values (in units hours since January 1, 0001): - [ 17533056. 17533068. 17533080. 17533092. 17533104.] + >>> print("time values (in units {}):\\n{}".format(times.units, times[:])) + time values (in units hours since 0001-01-01 00:00:00.0): + [17533104. 17533116. 17533128. 17533140. 17533152.] 
>>> dates = num2date(times[:],units=times.units,calendar=times.calendar) - >>> print "dates corresponding to time values:\\n",dates + >>> print("dates corresponding to time values:\\n{}".format(dates)) dates corresponding to time values: - [2001-03-01 00:00:00 2001-03-01 12:00:00 2001-03-02 00:00:00 - 2001-03-02 12:00:00 2001-03-03 00:00:00] + [real_datetime(2001, 3, 1, 0, 0) real_datetime(2001, 3, 1, 12, 0) + real_datetime(2001, 3, 2, 0, 0) real_datetime(2001, 3, 2, 12, 0) + real_datetime(2001, 3, 3, 0, 0)] `netCDF4.num2date` converts numeric values of time in the specified `units` and `calendar` to datetime objects, and `netCDF4.date2num` does the reverse. @@ -607,22 +630,22 @@ datasets are not supported). :::python >>> for nf in range(10): - >>> f = Dataset("mftest%s.nc" % nf,"w") - >>> f.createDimension("x",None) - >>> x = f.createVariable("x","i",("x",)) - >>> x[0:10] = numpy.arange(nf*10,10*(nf+1)) - >>> f.close() + ... with Dataset("mftest%s.nc" % nf, "w", format="NETCDF4_CLASSIC") as f: + ... _ = f.createDimension("x",None) + ... x = f.createVariable("x","i",("x",)) + ... x[0:10] = numpy.arange(nf*10,10*(nf+1)) Now read all the files back in at once with `netCDF4.MFDataset` :::python >>> from netCDF4 import MFDataset >>> f = MFDataset("mftest*nc") - >>> print f.variables["x"][:] - [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 - 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 - 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 - 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99] + >>> print(f.variables["x"][:]) + [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 + 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 + 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 + 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 + 96 97 98 99] Note that `netCDF4.MFDataset` can only be used to read, not write, multi-file datasets. @@ -673,12 +696,12 @@ In our example, try replacing the line with :::python - >>> temp = dataset.createVariable("temp","f4",("time","level","lat","lon",),zlib=True) + >>> temp = rootgrp.createVariable("temp","f4",("time","level","lat","lon",),zlib=True) and then :::python - >>> temp = dataset.createVariable("temp","f4",("time","level","lat","lon",),zlib=True,least_significant_digit=3) + >>> temp = rootgrp.createVariable("temp","f4",("time","level","lat","lon",),zlib=True,least_significant_digit=3) and see how much smaller the resulting files are. @@ -707,7 +730,7 @@ for storing numpy complex arrays. Here's an example: >>> complex128 = numpy.dtype([("real",numpy.float64),("imag",numpy.float64)]) >>> complex128_t = f.createCompoundType(complex128,"complex128") >>> # create a variable with this data type, write some data to it. - >>> f.createDimension("x_dim",None) + >>> x_dim = f.createDimension("x_dim",None) >>> v = f.createVariable("cmplx_var",complex128_t,"x_dim") >>> data = numpy.empty(size,complex128) # numpy structured array >>> data["real"] = datac.real; data["imag"] = datac.imag @@ -720,11 +743,11 @@ for storing numpy complex arrays. Here's an example: >>> datac2 = numpy.empty(datain.shape,numpy.complex128) >>> # .. fill it with contents of structured array. 
>>> datac2.real = datain["real"]; datac2.imag = datain["imag"] - >>> print datac.dtype,datac # original data - complex128 [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] + >>> print('{}: {}'.format(datac.dtype, datac)) # original data + complex128: [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] >>> - >>> print datac2.dtype,datac2 # data from file - complex128 [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] + >>> print('{}: {}'.format(datac2.dtype, datac2)) # data from file + complex128: [ 0.54030231+0.84147098j -0.84147098+0.54030231j -0.54030231-0.84147098j] Compound types can be nested, but you must create the 'inner' ones first. All possible numpy structured arrays cannot be @@ -735,22 +758,22 @@ in a Python dictionary, just like variables and dimensions. As always, printing objects gives useful summary information in an interactive session: :::python - >>> print f - - root group (NETCDF4 file format): - dimensions: x_dim - variables: cmplx_var - groups: - - >>> print f.variables["cmplx_var"] + >>> print(f) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): x_dim(3) + variables(dimensions): {'names':['real','imag'], 'formats':['>> print(f.variables["cmplx_var"]) + compound cmplx_var(x_dim) - compound data type: [("real", ">> print f.cmptypes - OrderedDict([("complex128", )]) - >>> print f.cmptypes["complex128"] - : name = "complex128", numpy dtype = [(u"real",">> print(f.cmptypes) + {'complex128': : name = 'complex128', numpy dtype = {'names':['real','imag'], 'formats':['>> print(f.cmptypes["complex128"]) + : name = 'complex128', numpy dtype = {'names':['real','imag'], 'formats':['11) Variable-length (vlen) data types. @@ -784,32 +807,37 @@ In this case, they contain 1-D numpy `int32` arrays of random length between :::python >>> import random + >>> random.seed(54321) >>> data = numpy.empty(len(y)*len(x),object) >>> for n in range(len(y)*len(x)): - >>> data[n] = numpy.arange(random.randint(1,10),dtype="int32")+1 + ... 
data[n] = numpy.arange(random.randint(1,10),dtype="int32")+1 >>> data = numpy.reshape(data,(len(y),len(x))) >>> vlvar[:] = data - >>> print "vlen variable =\\n",vlvar[:] + >>> print("vlen variable =\\n{}".format(vlvar[:])) vlen variable = - [[[ 1 2 3 4 5 6 7 8 9 10] [1 2 3 4 5] [1 2 3 4 5 6 7 8]] - [[1 2 3 4 5 6 7] [1 2 3 4 5 6] [1 2 3 4 5]] - [[1 2 3 4 5] [1 2 3 4] [1]] - [[ 1 2 3 4 5 6 7 8 9 10] [ 1 2 3 4 5 6 7 8 9 10] - [1 2 3 4 5 6 7 8]]] - >>> print f - - root group (NETCDF4 file format): - dimensions: x, y - variables: phony_vlen_var - groups: - >>> print f.variables["phony_vlen_var"] - + [[array([1, 2, 3, 4, 5, 6, 7, 8], dtype=int32) array([1, 2], dtype=int32) + array([1, 2, 3, 4], dtype=int32)] + [array([1, 2, 3], dtype=int32) + array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32) + array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32)] + [array([1, 2, 3, 4, 5, 6, 7], dtype=int32) array([1, 2, 3], dtype=int32) + array([1, 2, 3, 4, 5, 6], dtype=int32)] + [array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32) + array([1, 2, 3, 4, 5], dtype=int32) array([1, 2], dtype=int32)]] + >>> print(f) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): x(3), y(4) + variables(dimensions): int32 phony_vlen_var(y,x) + groups: + >>> print(f.variables["phony_vlen_var"]) + vlen phony_vlen_var(y, x) vlen data type: int32 - unlimited dimensions: + unlimited dimensions: current shape = (4, 3) - >>> print f.VLtypes["phony_vlen"] - : name = "phony_vlen", numpy dtype = int32 + >>> print(f.vltypes["phony_vlen"]) + : name = 'phony_vlen', numpy dtype = int32 Numpy object arrays containing python strings can also be written as vlen variables, For vlen strings, you don't need to create a vlen data type. @@ -819,7 +847,7 @@ with fixed length greater than 1) when calling the :::python >>> z = f.createDimension("z",10) - >>> strvar = rootgrp.createVariable("strvar", str, "z") + >>> strvar = f.createVariable("strvar", str, "z") In this example, an object array is filled with random python strings with random lengths between 2 and 12 characters, and the data in the object @@ -829,24 +857,25 @@ array is assigned to the vlen string variable. >>> chars = "1234567890aabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ" >>> data = numpy.empty(10,"O") >>> for n in range(10): - >>> stringlen = random.randint(2,12) - >>> data[n] = "".join([random.choice(chars) for i in range(stringlen)]) + ... stringlen = random.randint(2,12) + ... data[n] = "".join([random.choice(chars) for i in range(stringlen)]) >>> strvar[:] = data - >>> print "variable-length string variable:\\n",strvar[:] + >>> print("variable-length string variable:\\n{}".format(strvar[:])) variable-length string variable: - [aDy29jPt 5DS9X8 jd7aplD b8t4RM jHh8hq KtaPWF9cQj Q1hHN5WoXSiT MMxsVeq tdLUzvVTzj] - >>> print f - - root group (NETCDF4 file format): - dimensions: x, y, z - variables: phony_vlen_var, strvar - groups: - >>> print f.variables["strvar"] - + ['Lh' '25F8wBbMI' '53rmM' 'vvjnb3t63ao' 'qjRBQk6w' 'aJh' 'QF' + 'jtIJbJACaQk4' '3Z5' 'bftIIq'] + >>> print(f) + + root group (NETCDF4 data model, file format HDF5): + dimensions(sizes): x(3), y(4), z(10) + variables(dimensions): int32 phony_vlen_var(y,x), strvar(z) + groups: + >>> print(f.variables["strvar"]) + vlen strvar(z) - vlen data type: - unlimited dimensions: - current size = (10,) + vlen data type: + unlimited dimensions: + current shape = (10,) It is also possible to set contents of vlen string variables with numpy arrays of any string or unicode data type. 
Note, however, that accessing the contents @@ -866,19 +895,14 @@ values and their names are used to define an Enum data type using :::python >>> nc = Dataset('clouds.nc','w') >>> # python dict with allowed values and their names. - >>> enum_dict = {u'Altocumulus': 7, u'Missing': 255, - >>> u'Stratus': 2, u'Clear': 0, - >>> u'Nimbostratus': 6, u'Cumulus': 4, u'Altostratus': 5, - >>> u'Cumulonimbus': 1, u'Stratocumulus': 3} + >>> enum_dict = {'Altocumulus': 7, 'Missing': 255, + ... 'Stratus': 2, 'Clear': 0, + ... 'Nimbostratus': 6, 'Cumulus': 4, 'Altostratus': 5, + ... 'Cumulonimbus': 1, 'Stratocumulus': 3} >>> # create the Enum type called 'cloud_t'. >>> cloud_type = nc.createEnumType(numpy.uint8,'cloud_t',enum_dict) - >>> print cloud_type - : name = 'cloud_t', - numpy dtype = uint8, fields/values ={u'Cumulus': 4, - u'Altocumulus': 7, u'Missing': 255, - u'Stratus': 2, u'Clear': 0, - u'Cumulonimbus': 1, u'Stratocumulus': 3, - u'Nimbostratus': 6, u'Altostratus': 5} + >>> print(cloud_type) + : name = 'cloud_t', numpy dtype = uint8, fields/values ={'Altocumulus': 7, 'Missing': 255, 'Stratus': 2, 'Clear': 0, 'Nimbostratus': 6, 'Cumulus': 4, 'Altostratus': 5, 'Cumulonimbus': 1, 'Stratocumulus': 3} A new variable can be created in the usual way using this data type. Integer data is written to the variable that represents the named @@ -890,30 +914,25 @@ specified names. >>> time = nc.createDimension('time',None) >>> # create a 1d variable of type 'cloud_type'. >>> # The fill_value is set to the 'Missing' named value. - >>> cloud_var = - >>> nc.createVariable('primary_cloud',cloud_type,'time', - >>> fill_value=enum_dict['Missing']) + >>> cloud_var = nc.createVariable('primary_cloud',cloud_type,'time', + ... fill_value=enum_dict['Missing']) >>> # write some data to the variable. - >>> cloud_var[:] = [enum_dict['Clear'],enum_dict['Stratus'], - >>> enum_dict['Cumulus'],enum_dict['Missing'], - >>> enum_dict['Cumulonimbus']] + >>> cloud_var[:] = [enum_dict[k] for k in ['Clear', 'Stratus', 'Cumulus', + ... 'Missing', 'Cumulonimbus']] >>> nc.close() >>> # reopen the file, read the data. >>> nc = Dataset('clouds.nc') >>> cloud_var = nc.variables['primary_cloud'] - >>> print cloud_var - + >>> print(cloud_var) + enum primary_cloud(time) _FillValue: 255 enum data type: uint8 unlimited dimensions: time current shape = (5,) - >>> print cloud_var.datatype.enum_dict - {u'Altocumulus': 7, u'Missing': 255, u'Stratus': 2, - u'Clear': 0, u'Nimbostratus': 6, u'Cumulus': 4, - u'Altostratus': 5, u'Cumulonimbus': 1, - u'Stratocumulus': 3} - >>> print cloud_var[:] + >>> print(cloud_var.datatype.enum_dict) + {'Altocumulus': 7, 'Missing': 255, 'Stratus': 2, 'Clear': 0, 'Nimbostratus': 6, 'Cumulus': 4, 'Altostratus': 5, 'Cumulonimbus': 1, 'Stratocumulus': 3} + >>> print(cloud_var[:]) [0 2 4 -- 1] >>> nc.close() @@ -941,7 +960,7 @@ when a new dataset is created or an existing dataset is opened, use the `parallel` keyword to enable parallel access. :::python - >>> nc = Dataset('parallel_tst.nc','w',parallel=True) + >>> nc = Dataset('parallel_test.nc','w',parallel=True) The optional `comm` keyword may be used to specify a particular MPI communicator (`MPI_COMM_WORLD` is used by default). 
Each process (or rank) @@ -950,7 +969,7 @@ written to a different variable index on each task :::python >>> d = nc.createDimension('dim',4) - >>> v = nc.createVariable('var', numpy.int, 'dim') + >>> v = nc.createVariable('var', np.int, 'dim') >>> v[rank] = rank >>> nc.close() @@ -958,9 +977,9 @@ written to a different variable index on each task netcdf parallel_test { dimensions: dim = 4 ; - variables: + variables: int64 var(dim) ; - data: + data: var = 0, 1, 2, 3 ; } @@ -1010,18 +1029,19 @@ fixed-width byte string array (dtype `S#`), otherwise a numpy unicode (dtype characters with one more dimension. For example, :::python + >>> from netCDF4 import stringtochar >>> nc = Dataset('stringtest.nc','w',format='NETCDF4_CLASSIC') - >>> nc.createDimension('nchars',3) - >>> nc.createDimension('nstrings',None) + >>> _ = nc.createDimension('nchars',3) + >>> _ = nc.createDimension('nstrings',None) >>> v = nc.createVariable('strings','S1',('nstrings','nchars')) >>> datain = numpy.array(['foo','bar'],dtype='S3') >>> v[:] = stringtochar(datain) # manual conversion to char array - >>> v[:] # data returned as char array + >>> print(v[:]) # data returned as char array [[b'f' b'o' b'o'] - [b'b' b'a' b'r']] + [b'b' b'a' b'r']] >>> v._Encoding = 'ascii' # this enables automatic conversion >>> v[:] = datain # conversion to char array done internally - >>> v[:] # data returned in numpy string array + >>> print(v[:]) # data returned in numpy string array ['foo' 'bar'] >>> nc.close() @@ -1044,25 +1064,25 @@ Here's an example: :::python >>> nc = Dataset('compoundstring_example.nc','w') >>> dtype = numpy.dtype([('observation', 'f4'), - ('station_name','S80')]) + ... ('station_name','S10')]) >>> station_data_t = nc.createCompoundType(dtype,'station_data') - >>> nc.createDimension('station',None) + >>> _ = nc.createDimension('station',None) >>> statdat = nc.createVariable('station_obs', station_data_t, ('station',)) >>> data = numpy.empty(2,dtype) >>> data['observation'][:] = (123.,3.14) >>> data['station_name'][:] = ('Boulder','New York') - >>> statdat.dtype # strings actually stored as character arrays - {'names':['observation','station_name'], 'formats':['>> print(statdat.dtype) # strings actually stored as character arrays + {'names':['observation','station_name'], 'formats':['>> statdat[:] = data # strings converted to character arrays internally - >>> statdat[:] # character arrays converted back to strings - [(123. , 'Boulder') ( 3.14, 'New York')] - >>> statdat[:].dtype - {'names':['observation','station_name'], 'formats':['>> print(statdat[:]) # character arrays converted back to strings + [(123. , b'Boulder') ( 3.14, b'New York')] + >>> print(statdat[:].dtype) + {'names':['observation','station_name'], 'formats':['>> statdat.set_auto_chartostring(False) # turn off auto-conversion >>> statdat[:] = data.view(dtype=[('observation', 'f4'),('station_name','S1',10)]) - >>> statdat[:] # now structured array with char array subtype is returned - [(123. , ['B', 'o', 'u', 'l', 'd', 'e', 'r', '', '', '']) - ( 3.14, ['N', 'e', 'w', ' ', 'Y', 'o', 'r', 'k', '', ''])] + >>> print(statdat[:]) # now structured array with char array subtype is returned + [(123. , [b'B', b'o', b'u', b'l', b'd', b'e', b'r', b'', b'', b'']) + ( 3.14, [b'N', b'e', b'w', b' ', b'Y', b'o', b'r', b'k', b'', b''])] >>> nc.close() Note that there is currently no support for mapping numpy structured arrays with @@ -1094,11 +1114,11 @@ approaches. 
>>> v = nc.createVariable('v',numpy.int32,'x') >>> v[0:5] = numpy.arange(5) >>> print(nc) - + root group (NETCDF4 data model, file format HDF5): - dimensions(sizes): x(5) - variables(dimensions): int32 v(x) - groups: + dimensions(sizes): x(5) + variables(dimensions): int32 v(x) + groups: >>> print(nc['v'][:]) [0 1 2 3 4] >>> nc.close() # file saved to disk @@ -1106,16 +1126,16 @@ approaches. >>> # python memory buffer. >>> # read the newly created netcdf file into a python >>> # bytes object. - >>> f = open('diskless_example.nc', 'rb') - >>> nc_bytes = f.read(); f.close() + >>> with open('diskless_example.nc', 'rb') as f: + ... nc_bytes = f.read() >>> # create a netCDF in-memory dataset from the bytes object. >>> nc = Dataset('inmemory.nc', memory=nc_bytes) >>> print(nc) - + root group (NETCDF4 data model, file format HDF5): - dimensions(sizes): x(5) - variables(dimensions): int32 v(x) - groups: + dimensions(sizes): x(5) + variables(dimensions): int32 v(x) + groups: >>> print(nc['v'][:]) [0 1 2 3 4] >>> nc.close() @@ -1129,17 +1149,17 @@ approaches. >>> v[0:5] = numpy.arange(5) >>> nc_buf = nc.close() # close returns memoryview >>> print(type(nc_buf)) - + >>> # save nc_buf to disk, read it back in and check. - >>> f = open('inmemory.nc', 'wb') - >>> f.write(nc_buf); f.close() + >>> with open('inmemory.nc', 'wb') as f: + ... f.write(nc_buf) >>> nc = Dataset('inmemory.nc') >>> print(nc) - + root group (NETCDF4 data model, file format HDF5): - dimensions(sizes): x(5) - variables(dimensions): int32 v(x) - groups: + dimensions(sizes): x(5) + variables(dimensions): int32 v(x) + groups: >>> print(nc['v'][:]) [0 1 2 3 4] >>> nc.close() @@ -1176,28 +1196,23 @@ from cpython.bytes cimport PyBytes_FromStringAndSize # pure python utilities from .utils import (_StartCountStride, _quantize, _find_dim, _walk_grps, _out_array_shape, _sortbylist, _tostr, _safecast, _is_int) -# try to use built-in ordered dict in python >= 2.7 -try: +import sys +if sys.version_info[0:2] < (3, 7): + # Python 3.7+ guarantees order; older versions need OrderedDict from collections import OrderedDict -except ImportError: # or else use drop-in substitute - try: - from ordereddict import OrderedDict - except ImportError: - raise ImportError('please install ordereddict (https://pypi.python.org/pypi/ordereddict)') try: from itertools import izip as zip except ImportError: # python3: zip is already python2's itertools.izip pass -__version__ = "1.5.1.2" +__version__ = "1.5.2" # Initialize numpy import posixpath from cftime import num2date, date2num, date2index import numpy import weakref -import sys import warnings from glob import glob from numpy import ma @@ -1620,9 +1635,15 @@ cdef _get_types(group): ierr = nc_inq_typeids(_grpid, &ntypes, typeids) _ensure_nc_success(ierr) # create empty dictionary for CompoundType instances. - cmptypes = OrderedDict() - vltypes = OrderedDict() - enumtypes = OrderedDict() + if sys.version_info[0:2] < (3, 7): + cmptypes = OrderedDict() + vltypes = OrderedDict() + enumtypes = OrderedDict() + else: + cmptypes = dict() + vltypes = dict() + enumtypes = dict() + if ntypes > 0: for n from 0 <= n < ntypes: xtype = typeids[n] @@ -1678,7 +1699,10 @@ cdef _get_dims(group): ierr = nc_inq_ndims(_grpid, &numdims) _ensure_nc_success(ierr) # create empty dictionary for dimensions. 
- dimensions = OrderedDict() + if sys.version_info[0:2] < (3, 7): + dimensions = OrderedDict() + else: + dimensions = dict() if numdims > 0: dimids = malloc(sizeof(int) * numdims) if group.data_model == 'NETCDF4': @@ -1709,7 +1733,10 @@ cdef _get_grps(group): ierr = nc_inq_grps(_grpid, &numgrps, NULL) _ensure_nc_success(ierr) # create dictionary containing `netCDF4.Group` instances for groups in this group - groups = OrderedDict() + if sys.version_info[0:2] < (3, 7): + groups = OrderedDict() + else: + groups = dict() if numgrps > 0: grpids = malloc(sizeof(int) * numgrps) with nogil: @@ -1739,7 +1766,10 @@ cdef _get_vars(group): ierr = nc_inq_nvars(_grpid, &numvars) _ensure_nc_success(ierr, err_cls=AttributeError) # create empty dictionary for variables. - variables = OrderedDict() + if sys.version_info[0:2] < (3, 7): + variables = OrderedDict() + else: + variables = dict() if numvars > 0: # get variable ids. varids = malloc(sizeof(int) * numvars) @@ -2316,7 +2346,10 @@ strings. if self.data_model == 'NETCDF4': self.groups = _get_grps(self) else: - self.groups = OrderedDict() + if sys.version_info[0:2] < (3, 7): + self.groups = OrderedDict() + else: + self.groups = dict() # these allow Dataset objects to be used via a "with" statement. def __enter__(self): @@ -2386,29 +2419,28 @@ version 4.1.2 or higher of the netcdf C lib, and rebuild netcdf4-python.""" return unicode(self).encode('utf-8') def __unicode__(self): - ncdump = ['%r\n' % type(self)] - dimnames = tuple([_tostr(dimname)+'(%s)'%len(self.dimensions[dimname])\ - for dimname in self.dimensions.keys()]) + ncdump = [repr(type(self))] + dimnames = tuple(_tostr(dimname)+'(%s)'%len(self.dimensions[dimname])\ + for dimname in self.dimensions.keys()) varnames = tuple(\ - [_tostr(self.variables[varname].dtype)+' \033[4m'+_tostr(varname)+'\033[0m'+ + [_tostr(self.variables[varname].dtype)+' '+_tostr(varname)+ (((_tostr(self.variables[varname].dimensions) .replace("u'",""))\ .replace("'",""))\ .replace(", ",","))\ .replace(",)",")") for varname in self.variables.keys()]) - grpnames = tuple([_tostr(grpname) for grpname in self.groups.keys()]) + grpnames = tuple(_tostr(grpname) for grpname in self.groups.keys()) if self.path == '/': - ncdump.append('root group (%s data model, file format %s):\n' % + ncdump.append('root group (%s data model, file format %s):' % (self.data_model, self.disk_format)) else: - ncdump.append('group %s:\n' % self.path) - attrs = [' %s: %s\n' % (name,self.getncattr(name)) for name in\ - self.ncattrs()] - ncdump = ncdump + attrs - ncdump.append(' dimensions(sizes): %s\n' % ', '.join(dimnames)) - ncdump.append(' variables(dimensions): %s\n' % ', '.join(varnames)) - ncdump.append(' groups: %s\n' % ', '.join(grpnames)) - return ''.join(ncdump) + ncdump.append('group %s:' % self.path) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.getncattr(name))) + ncdump.append(' dimensions(sizes): %s' % ', '.join(dimnames)) + ncdump.append(' variables(dimensions): %s' % ', '.join(varnames)) + ncdump.append(' groups: %s' % ', '.join(grpnames)) + return '\n'.join(ncdump) def _close(self, check_err): cdef int ierr = nc_close(self._grpid) @@ -2897,7 +2929,11 @@ attributes.""" values = [] for name in names: values.append(_get_att(self, NC_GLOBAL, name)) - return OrderedDict(zip(names,values)) + gen = zip(names, values) + if sys.version_info[0:2] < (3, 7): + return OrderedDict(gen) + else: + return dict(gen) else: raise AttributeError elif name in _private_atts: @@ -3058,8 +3094,10 @@ this `netCDF4.Dataset` or 
`netCDF4.Group`, as well as for all variables in all its subgroups. **`True_or_False`**: Boolean determining if automatic conversion of -masked arrays with no missing values to regular ararys shall be -applied for all variables. +masked arrays with no missing values to regular numpy arrays shall be +applied for all variables. Default True. Set to False to restore the default behaviour +in versions prior to 1.4.1 (numpy array returned unless missing values are present, +otherwise masked array returned). ***Note***: Calling this function only affects existing variables. Variables created after calling this function will follow @@ -3223,12 +3261,21 @@ Additional read-only class variables: bytestr = _strencode(name) groupname = bytestr _ensure_nc_success(nc_def_grp(parent._grpid, groupname, &self._grpid)) - self.cmptypes = OrderedDict() - self.vltypes = OrderedDict() - self.enumtypes = OrderedDict() - self.dimensions = OrderedDict() - self.variables = OrderedDict() - self.groups = OrderedDict() + if sys.version_info[0:2] < (3, 7): + self.cmptypes = OrderedDict() + self.vltypes = OrderedDict() + self.enumtypes = OrderedDict() + self.dimensions = OrderedDict() + self.variables = OrderedDict() + self.groups = OrderedDict() + else: + self.cmptypes = dict() + self.vltypes = dict() + self.enumtypes = dict() + self.dimensions = dict() + self.variables = dict() + self.groups = dict() + def close(self): """ @@ -3356,9 +3403,11 @@ Read-only class variables: if not dir(self._grp): return 'Dimension object no longer valid' if self.isunlimited(): - return repr(type(self))+" (unlimited): name = '%s', size = %s\n" % (self._name,len(self)) + return "%r (unlimited): name = '%s', size = %s" %\ + (type(self), self._name, len(self)) else: - return repr(type(self))+": name = '%s', size = %s\n" % (self._name,len(self)) + return "%r: name = '%s', size = %s" %\ + (type(self), self._name, len(self)) def __len__(self): # len(`netCDF4.Dimension` instance) returns current size of dimension @@ -3906,37 +3955,32 @@ behavior is similar to Fortran or Matlab, but different than numpy. 
cdef int ierr, no_fill if not dir(self._grp): return 'Variable object no longer valid' - ncdump_var = ['%r\n' % type(self)] - dimnames = tuple([_tostr(dimname) for dimname in self.dimensions]) - attrs = [' %s: %s\n' % (name,self.getncattr(name)) for name in\ - self.ncattrs()] + ncdump = [repr(type(self))] + show_more_dtype = True if self._iscompound: - ncdump_var.append('%s %s(%s)\n' %\ - ('compound',self._name,', '.join(dimnames))) + kind = 'compound' elif self._isvlen: - ncdump_var.append('%s %s(%s)\n' %\ - ('vlen',self._name,', '.join(dimnames))) + kind = 'vlen' elif self._isenum: - ncdump_var.append('%s %s(%s)\n' %\ - ('enum',self._name,', '.join(dimnames))) + kind = 'enum' else: - ncdump_var.append('%s %s(%s)\n' %\ - (self.dtype,self._name,', '.join(dimnames))) - ncdump_var = ncdump_var + attrs - if self._iscompound: - ncdump_var.append('compound data type: %s\n' % self.dtype) - elif self._isvlen: - ncdump_var.append('vlen data type: %s\n' % self.dtype) - elif self._isenum: - ncdump_var.append('enum data type: %s\n' % self.dtype) + show_more_dtype = False + kind = str(self.dtype) + dimnames = tuple(_tostr(dimname) for dimname in self.dimensions) + ncdump.append('%s %s(%s)' %\ + (kind, self._name, ', '.join(dimnames))) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.getncattr(name))) + if show_more_dtype: + ncdump.append('%s data type: %s' % (kind, self.dtype)) unlimdims = [] for dimname in self.dimensions: dim = _find_dim(self._grp, dimname) if dim.isunlimited(): unlimdims.append(dimname) - if (self._grp.path != '/'): ncdump_var.append('path = %s\n' % self._grp.path) - ncdump_var.append('unlimited dimensions: %s\n' % ', '.join(unlimdims)) - ncdump_var.append('current shape = %s\n' % repr(self.shape)) + if (self._grp.path != '/'): ncdump.append('path = %s' % self._grp.path) + ncdump.append('unlimited dimensions: %s' % ', '.join(unlimdims)) + ncdump.append('current shape = %r' % (self.shape,)) if __netcdf4libversion__ < '4.5.1' and\ self._grp.file_format.startswith('NETCDF3'): # issue #908: no_fill not correct for NETCDF3 files before 4.5.1 @@ -3955,15 +3999,15 @@ behavior is similar to Fortran or Matlab, but different than numpy. except AttributeError: fillval = default_fillvals[self.dtype.str[1:]] if self.dtype.str[1:] in ['u1','i1']: - msg = 'filling on, default _FillValue of %s ignored\n' % fillval + msg = 'filling on, default _FillValue of %s ignored' % fillval else: - msg = 'filling on, default _FillValue of %s used\n' % fillval - ncdump_var.append(msg) + msg = 'filling on, default _FillValue of %s used' % fillval + ncdump.append(msg) else: - ncdump_var.append('filling off\n') + ncdump.append('filling off') - return ''.join(ncdump_var) + return '\n'.join(ncdump) def _getdims(self): # Private method to get variables's dimension names @@ -4036,7 +4080,8 @@ behavior is similar to Fortran or Matlab, but different than numpy. property size: """Return the number of stored elements.""" def __get__(self): - return numpy.prod(self.shape) + # issue #957: add int since prod(())=1.0 + return int(numpy.prod(self.shape)) property dimensions: """get variables's dimension names""" @@ -4069,6 +4114,13 @@ netCDF attribute with the same name as one of the reserved python attributes.""" cdef nc_type xtype xtype=-99 + # issue #959 - trying to set _FillValue results in mysterious + # error when close method is called so catch it here. It is + # already caught in __setattr__. 
+ if name == '_FillValue': + msg='_FillValue attribute must be set when variable is '+\ + 'created (using fill_value keyword to createVariable)' + raise AttributeError(msg) if self._grp.data_model != 'NETCDF4': self._grp._redef() _set_att(self._grp, self._varid, name, value, xtype=xtype, force_ncstring=self._ncstring_attrs__) if self._grp.data_model != 'NETCDF4': self._grp._enddef() @@ -4294,7 +4346,12 @@ details.""" values = [] for name in names: values.append(_get_att(self._grp, self._varid, name)) - return OrderedDict(zip(names,values)) + gen = zip(names, values) + if sys.version_info[0:2] < (3, 7): + return OrderedDict(gen) + else: + return dict(gen) + else: raise AttributeError elif name in _private_atts: @@ -4393,7 +4450,7 @@ rename a `netCDF4.Variable` attribute named `oldname` to `newname`.""" if self.scale: # only do this if autoscale option is on. is_unsigned = getattr(self, '_Unsigned', False) if is_unsigned and data.dtype.kind == 'i': - data = data.view('u%s' % data.dtype.itemsize) + data=data.view('%su%s'%(data.dtype.byteorder,data.dtype.itemsize)) if self.scale and self._isprimitive and valid_scaleoffset: # if variable has scale_factor and add_offset attributes, apply @@ -4447,7 +4504,7 @@ rename a `netCDF4.Variable` attribute named `oldname` to `newname`.""" is_unsigned = getattr(self, '_Unsigned', False) is_unsigned_int = is_unsigned and data.dtype.kind == 'i' if self.scale and is_unsigned_int: # only do this if autoscale option is on. - dtype_unsigned_int = 'u%s' % data.dtype.itemsize + dtype_unsigned_int='%su%s' % (data.dtype.byteorder,data.dtype.itemsize) data = data.view(dtype_unsigned_int) # private function for creating a masked array, masking missing_values # and/or _FillValues. @@ -5080,12 +5137,11 @@ The default value of `mask` is `True` turn on or off conversion of data without missing values to regular numpy arrays. -If `always_mask` is set to `True` then a masked array with no missing -values is converted to a regular numpy array. - -The default value of `always_mask` is `True` (conversions to regular -numpy arrays are not performed). - +`always_mask` is a Boolean determining if automatic conversion of +masked arrays with no missing values to regular numpy arrays shall be +applied. Default is True. Set to False to restore the default behaviour +in versions prior to 1.4.1 (numpy array returned unless missing values are present, +otherwise masked array returned). """ self.always_mask = bool(always_mask) @@ -5498,8 +5554,8 @@ the user. return unicode(self).encode('utf-8') def __unicode__(self): - return repr(type(self))+": name = '%s', numpy dtype = %s\n" %\ - (self.name,self.dtype) + return "%r: name = '%s', numpy dtype = %s" %\ + (type(self), self.name, self.dtype) def __reduce__(self): # raise error is user tries to pickle a CompoundType object. @@ -5788,10 +5844,10 @@ the user. def __unicode__(self): if self.dtype == str: - return repr(type(self))+': string type' + return '%r: string type' % (type(self),) else: - return repr(type(self))+": name = '%s', numpy dtype = %s\n" %\ - (self.name, self.dtype) + return "%r: name = '%s', numpy dtype = %s" %\ + (type(self), self.name, self.dtype) def __reduce__(self): # raise error is user tries to pickle a VLType object. @@ -5906,9 +5962,8 @@ the user. 
return unicode(self).encode('utf-8') def __unicode__(self): - return repr(type(self))+\ - ": name = '%s', numpy dtype = %s, fields/values =%s\n" %\ - (self.name, self.dtype, self.enum_dict) + return "%r: name = '%s', numpy dtype = %s, fields/values =%s" %\ + (type(self), self.name, self.dtype, self.enum_dict) def __reduce__(self): # raise error is user tries to pickle a EnumType object. @@ -6089,18 +6144,18 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details): >>> # create a series of netCDF files with a variable sharing >>> # the same unlimited dimension. >>> for nf in range(10): - >>> f = Dataset("mftest%s.nc" % nf,"w",format='NETCDF4_CLASSIC') - >>> f.createDimension("x",None) - >>> x = f.createVariable("x","i",("x",)) - >>> x[0:10] = np.arange(nf*10,10*(nf+1)) - >>> f.close() + ... with Dataset("mftest%s.nc" % nf, "w", format='NETCDF4_CLASSIC') as f: + ... f.createDimension("x",None) + ... x = f.createVariable("x","i",("x",)) + ... x[0:10] = np.arange(nf*10,10*(nf+1)) >>> # now read all those files in at once, in one Dataset. >>> f = MFDataset("mftest*nc") - >>> print f.variables["x"][:] - [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 - 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 - 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 - 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99] + >>> print(f.variables["x"][:]) + [ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 + 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 + 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 + 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 + 96 97 98 99] """ def __init__(self, files, check=False, aggdim=None, exclude=[], @@ -6336,22 +6391,21 @@ Example usage (See `netCDF4.MFDataset.__init__` for more details): dset.close() def __repr__(self): - ncdump = ['%r\n' % type(self)] - dimnames = tuple([str(dimname) for dimname in self.dimensions.keys()]) - varnames = tuple([str(varname) for varname in self.variables.keys()]) + ncdump = [repr(type(self))] + dimnames = tuple(str(dimname) for dimname in self.dimensions.keys()) + varnames = tuple(str(varname) for varname in self.variables.keys()) grpnames = () if self.path == '/': - ncdump.append('root group (%s data model, file format %s):\n' % + ncdump.append('root group (%s data model, file format %s):' % (self.data_model[0], self.disk_format[0])) else: - ncdump.append('group %s:\n' % self.path) - attrs = [' %s: %s\n' % (name,self.__dict__[name]) for name in\ - self.ncattrs()] - ncdump = ncdump + attrs - ncdump.append(' dimensions = %s\n' % str(dimnames)) - ncdump.append(' variables = %s\n' % str(varnames)) - ncdump.append(' groups = %s\n' % str(grpnames)) - return ''.join(ncdump) + ncdump.append('group %s:' % self.path) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.__dict__[name])) + ncdump.append(' dimensions = %s' % str(dimnames)) + ncdump.append(' variables = %s' % str(varnames)) + ncdump.append(' groups = %s' % str(grpnames)) + return '\n'.join(ncdump) def __reduce__(self): # raise error is user tries to pickle a MFDataset object. 
@@ -6368,9 +6422,11 @@ class _Dimension(object): return True def __repr__(self): if self.isunlimited(): - return repr(type(self))+" (unlimited): name = '%s', size = %s\n" % (self._name,len(self)) + return "%r (unlimited): name = '%s', size = %s" %\ + (type(self), self._name, len(self)) else: - return repr(type(self))+": name = '%s', size = %s\n" % (self._name,len(self)) + return "%r: name = '%s', size = %s" %\ + (type(self), self._name, len(self)) class _Variable(object): def __init__(self, dset, varname, var, recdimname): @@ -6398,21 +6454,19 @@ class _Variable(object): except: raise AttributeError(name) def __repr__(self): - ncdump_var = ['%r\n' % type(self)] - dimnames = tuple([str(dimname) for dimname in self.dimensions]) - attrs = [' %s: %s\n' % (name,self.__dict__[name]) for name in\ - self.ncattrs()] - ncdump_var.append('%s %s%s\n' %\ - (self.dtype,self._name,dimnames)) - ncdump_var = ncdump_var + attrs + ncdump = [repr(type(self))] + dimnames = tuple(str(dimname) for dimname in self.dimensions) + ncdump.append('%s %s%s' % (self.dtype, self._name, dimnames)) + for name in self.ncattrs(): + ncdump.append(' %s: %s' % (name, self.__dict__[name])) unlimdims = [] for dimname in self.dimensions: dim = _find_dim(self._grp, dimname) if dim.isunlimited(): unlimdims.append(str(dimname)) - ncdump_var.append('unlimited dimensions = %s\n' % repr(tuple(unlimdims))) - ncdump_var.append('current size = %s\n' % repr(self.shape)) - return ''.join(ncdump_var) + ncdump.append('unlimited dimensions = %r' % (tuple(unlimdims),)) + ncdump.append('current size = %r' % (self.shape,)) + return '\n'.join(ncdump) def __len__(self): if not self._shape: raise TypeError('len() of unsized object') @@ -6564,14 +6618,14 @@ Example usage (See `netCDF4.MFTime.__init__` for more details): >>> f1.close() >>> f2.close() >>> # Read the two files in at once, in one Dataset. - >>> f = MFDataset("mftest*nc") + >>> f = MFDataset("mftest_*nc") >>> t = f.variables["time"] - >>> print t.units + >>> print(t.units) days since 2000-01-01 - >>> print t[32] # The value written in the file, inconsistent with the MF time units. + >>> print(t[32]) # The value written in the file, inconsistent with the MF time units. 
1 >>> T = MFTime(t) - >>> print T[32] + >>> print(T[32]) 32 """ ===================================== setup.py ===================================== @@ -49,13 +49,14 @@ def check_ifnetcdf4(netcdf4_includedir): return isnetcdf4 -def check_api(inc_dirs): +def check_api(inc_dirs,netcdf_lib_version): has_rename_grp = False has_nc_inq_path = False has_nc_inq_format_extended = False has_cdf5_format = False has_nc_open_mem = False has_nc_create_mem = False + has_parallel_support = False has_parallel4_support = False has_pnetcdf_support = False @@ -91,10 +92,20 @@ def check_api(inc_dirs): for line in open(ncmetapath): if line.startswith('#define NC_HAS_CDF5'): has_cdf5_format = bool(int(line.split()[2])) - elif line.startswith('#define NC_HAS_PARALLEL4'): + if line.startswith('#define NC_HAS_PARALLEL'): + has_parallel_support = bool(int(line.split()[2])) + if line.startswith('#define NC_HAS_PARALLEL4'): has_parallel4_support = bool(int(line.split()[2])) - elif line.startswith('#define NC_HAS_PNETCDF'): + if line.startswith('#define NC_HAS_PNETCDF'): has_pnetcdf_support = bool(int(line.split()[2])) + # NC_HAS_PARALLEL4 missing in 4.6.1 (issue #964) + if not has_parallel4_support and has_parallel_support and not has_pnetcdf_support: + has_parallel4_support = True + # for 4.6.1, if NC_HAS_PARALLEL=NC_HAS_PNETCDF=1, guess that + # parallel HDF5 is enabled (must guess since there is no + # NC_HAS_PARALLEL4) + elif netcdf_lib_version == "4.6.1" and not has_parallel4_support and has_parallel_support: + has_parallel4_support = True break return has_rename_grp, has_nc_inq_path, has_nc_inq_format_extended, \ @@ -182,7 +193,7 @@ ncconfig = None use_ncconfig = None if USE_SETUPCFG and os.path.exists(setup_cfg): sys.stdout.write('reading from setup.cfg...\n') - config = configparser.SafeConfigParser() + config = configparser.ConfigParser() config.read(setup_cfg) try: HDF5_dir = config.get("directories", "HDF5_dir") @@ -494,7 +505,8 @@ if 'sdist' not in sys.argv[1:] and 'clean' not in sys.argv[1:]: # this determines whether renameGroup and filepath methods will work. has_rename_grp, has_nc_inq_path, has_nc_inq_format_extended, \ has_cdf5_format, has_nc_open_mem, has_nc_create_mem, \ - has_parallel4_support, has_pnetcdf_support = check_api(inc_dirs) + has_parallel4_support, has_pnetcdf_support = \ + check_api(inc_dirs,netcdf_lib_version) # for netcdf 4.4.x CDF5 format is always enabled. if netcdf_lib_version is not None and\ (netcdf_lib_version > "4.4" and netcdf_lib_version < "4.5"): @@ -584,7 +596,7 @@ else: setup(name="netCDF4", cmdclass=cmdclass, - version="1.5.1.2", + version="1.5.2", long_description="netCDF version 4 has many features not found in earlier versions of the library, such as hierarchical groups, zlib compression, multiple unlimited dimensions, and new data types. It is implemented on top of HDF5. This module implements most of the new features, and can read and write netCDF files compatible with older versions of the library. 
The API is modelled after Scientific.IO.NetCDF, and should be familiar to users of that module.\n\nThis project is hosted on a `GitHub repository `_ where you may access the most up-to-date source.", author="Jeff Whitaker", author_email="jeffrey.s.whitaker at noaa.gov", @@ -597,12 +609,11 @@ setup(name="netCDF4", 'meteorology', 'climate'], classifiers=["Development Status :: 3 - Alpha", "Programming Language :: Python :: 2", - "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", - "Programming Language :: Python :: 3.3", - "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", + "Programming Language :: Python :: 3.6", + "Programming Language :: Python :: 3.7", "Intended Audience :: Science/Research", "License :: OSI Approved", "Topic :: Software Development :: Libraries :: Python Modules", ===================================== test/tst_atts.py ===================================== @@ -7,13 +7,10 @@ import tempfile import warnings import numpy as NP +from collections import OrderedDict from numpy.random.mtrand import uniform -import netCDF4 -try: - from collections import OrderedDict -except ImportError: # or else use drop-in substitute - from ordereddict import OrderedDict +import netCDF4 # test attribute creation. FILE_NAME = tempfile.NamedTemporaryFile(suffix='.nc', delete=False).name @@ -94,6 +91,19 @@ class VariablesTestCase(unittest.TestCase): v1.seqatt = SEQATT v1.stringseqatt = STRINGSEQATT v1.setncattr_string('stringseqatt_array',STRINGSEQATT) # array of NC_STRING + # issue #959: should not be able to set _FillValue after var creation + try: + v1._FillValue(-999.) + except AttributeError: + pass + else: + raise ValueError('This test should have failed.') + try: + v1.setncattr('_FillValue',-999.) + except AttributeError: + pass + else: + raise ValueError('This test should have failed.') # issue #485 (triggers segfault in C lib # with version 1.2.1 without pull request #486) f.foo = NP.array('bar','S') ===================================== test/tst_endian.py ===================================== @@ -121,6 +121,27 @@ def issue346(file): assert_array_equal(datal,xl) nc.close() +def issue930(file): + # make sure view to unsigned data type (triggered + # by _Unsigned attribute being set) is correct when + # data byte order is non-native. 
+ nc = netCDF4.Dataset(file,'w') + d = nc.createDimension('x',2) + v1 = nc.createVariable('v1','i2','x',endian='big') + v2 = nc.createVariable('v2','i2','x',endian='little') + v1[0] = 255; v1[1] = 1 + v2[0] = 255; v2[1] = 1 + v1._Unsigned="TRUE"; v1.missing_value=np.int16(1) + v2._Unsigned="TRUE"; v2.missing_value=np.int16(1) + nc.close() + nc = netCDF4.Dataset(file) + assert_array_equal(nc['v1'][:],np.ma.masked_array([255,1],mask=[False,True])) + assert_array_equal(nc['v2'][:],np.ma.masked_array([255,1],mask=[False,True])) + nc.set_auto_mask(False) + assert_array_equal(nc['v1'][:],np.array([255,1])) + assert_array_equal(nc['v2'][:],np.array([255,1])) + nc.close() + class EndianTestCase(unittest.TestCase): def setUp(self): @@ -141,6 +162,7 @@ class EndianTestCase(unittest.TestCase): check_byteswap(self.file3, data) issue310(self.file) issue346(self.file2) + issue930(self.file2) if __name__ == '__main__': unittest.main() ===================================== test/tst_netcdftime.py ===================================== @@ -523,7 +523,7 @@ class TestDate2index(unittest.TestCase): :Example: >>> t = TestTime(datetime(1989, 2, 18), 45, 6, 'hours since 1979-01-01') - >>> print num2date(t[1], t.units) + >>> print(num2date(t[1], t.units)) 1989-02-18 06:00:00 """ self.units = units View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/commit/93a42b17d930f90f145eb723f6057438185da39f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/commit/93a42b17d930f90f145eb723f6057438185da39f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 05:19:10 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 04:19:10 +0000 Subject: Processing of netcdf4-python_1.5.2-1_source.changes Message-ID: netcdf4-python_1.5.2-1_source.changes uploaded successfully to localhost along with the files: netcdf4-python_1.5.2-1.dsc netcdf4-python_1.5.2.orig.tar.gz netcdf4-python_1.5.2-1.debian.tar.xz netcdf4-python_1.5.2-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Wed Sep 4 05:28:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:28:35 +0000 Subject: [Git][debian-gis-team/ossim][master] Move from experimental to unstable. Message-ID: <5d6f3d73bf52_577b2ade618b8570608459@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ossim Commits: 4329a697 by Bas Couwenberg at 2019-09-04T04:08:04Z Move from experimental to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +ossim (2.9.1-1) unstable; urgency=medium + + * Team upload. + * Move from experimental to unstable. + + -- Bas Couwenberg Wed, 04 Sep 2019 06:07:36 +0200 + ossim (2.9.1-1~exp1) experimental; urgency=medium * Team upload. View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/4329a697fb39187a4e9a13ef54a0f8a17b8e05f5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/4329a697fb39187a4e9a13ef54a0f8a17b8e05f5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Wed Sep 4 05:28:39 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 04:28:39 +0000 Subject: [Git][debian-gis-team/ossim] Pushed new tag debian/2.9.1-1 Message-ID: <5d6f3d77f1b19_577b2ade5dd798246086c5@godard.mail> Bas Couwenberg pushed new tag debian/2.9.1-1 at Debian GIS Project / ossim -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/tree/debian/2.9.1-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 05:36:24 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 04:36:24 +0000 Subject: netcdf4-python_1.5.2-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 05:56:15 +0200 Source: netcdf4-python Architecture: source Version: 1.5.2-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: netcdf4-python (1.5.2-1) unstable; urgency=medium . * New upstream release. Checksums-Sha1: 25e4ed4be3e47f094de0ff7543fb141a112fca63 2168 netcdf4-python_1.5.2-1.dsc 9bfd457b5a20a13ec470791faaecb0258b2d411c 808295 netcdf4-python_1.5.2.orig.tar.gz 6294c9ade105d8018544c69424f849f71fcdbd05 4840 netcdf4-python_1.5.2-1.debian.tar.xz 60db71c258d36f6f8a5d615bee672b6b43218cfb 8716 netcdf4-python_1.5.2-1_amd64.buildinfo Checksums-Sha256: 6aea20dfda80157da81caec1c4de203ce97e1a3ac87a00582ccc6bba619e0727 2168 netcdf4-python_1.5.2-1.dsc ba9aacc77056ac2477ec663978e7789e95b83e571aba270e3e160879f3959bee 808295 netcdf4-python_1.5.2.orig.tar.gz f3c831b8881b94ea3594910671a76bf0262dc6bcfc4ec086182b77123ed437f6 4840 netcdf4-python_1.5.2-1.debian.tar.xz c34a180abd84750b1a8370dff475d2aa9b44d9e09b4429ec9ac8db6a7336a989 8716 netcdf4-python_1.5.2-1_amd64.buildinfo Files: da91d84b752f61edc27176f0c894968b 2168 science optional netcdf4-python_1.5.2-1.dsc c15fad6fbc7640a1085913de3f432ec4 808295 science optional netcdf4-python_1.5.2.orig.tar.gz 94f93ababe7424234ab3c8f80aeaa044 4840 science optional netcdf4-python_1.5.2-1.debian.tar.xz 9b91cab48360c82b7c1f220720129587 8716 science optional netcdf4-python_1.5.2-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1vOEsACgkQZ1DxCuiN SvEeERAAyWzufAzxuSgf04+BOLvKJQNN1GAY5tcgKTUQiTdzbHhAeq1k9eflnhwG SyW4nDm/YI9tuSJK6YRP1tsc4ZI3Zz1f3qPZempuCNwOcxe+dlvGfBN6eVe0lOlp vegwU/slkQYoKD7ho8ZczBr1KYYxx6MViuxQlpuFEqrbhadMj3KM0MiQ7NqhZwc1 eiqTYxo8Dds/JHaibsZ2OL96bSSf8OzR7x25wVbiD5MKXrZG9xwl1LPkq/BD2tYV JobGRQGJKH3VvLlMGhrmKY81T+cVCz61m57nSBB0zt2f3pYU4ZtISVCfQvzCfenz jmAHMvSmbwmo6u24iw8pxMKCr8JOQ1CtIu11fq5Z90U6qx3XadTd9rGuO/AIz8bQ w+sG+1FAYm0ucBcdK04+T5QJfuMcdVXpsNN+qxIobyG2arKm/N+iNm05V+8JpFyg P4Sn0zayZeBnp204bvJds7ajVWsIatlU2AbgNsCCsUh8r+kil+E4hMHiv5DhsXfx dfCbg37xtxS864uUaeNFdT0RDSjEvidWAPKjQ/oSmuHm7lXIphp42qrZocHncCkm 9dartgoL0XwaUYoIEFiBDRk8DAS/3gXi1pg2W9Q/RHewYEwhlozNfv0rqqbBduQK 2d7VGeY9Kc0OEtfNKVPVVId5ndxj4SL1hOT4XBg/IjRn7ZG/PxY= =KSeF -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
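For reference, the _FillValue change in the netCDF4 1.5.2 diff above (issue #959) means the fill value can only be declared via the fill_value keyword of createVariable; assigning the attribute afterwards now raises AttributeError instead of failing obscurely when the file is closed. A minimal sketch of the new behaviour -- assuming netCDF4 >= 1.5.2 and an arbitrary scratch file name, not code taken from the upload itself:

    import netCDF4

    with netCDF4.Dataset("fillvalue-demo.nc", "w", format="NETCDF4_CLASSIC") as nc:
        nc.createDimension("x", 3)
        # Correct: declare the fill value when the variable is created.
        v = nc.createVariable("v", "f4", ("x",), fill_value=-999.0)
        v[0] = 1.0  # elements 1 and 2 stay at _FillValue
        try:
            # Rejected since 1.5.2 (issue #959); previously this only blew up
            # later with a confusing error on close.
            v.setncattr("_FillValue", -999.0)
        except AttributeError as exc:
            print("rejected as expected:", exc)

    with netCDF4.Dataset("fillvalue-demo.nc") as nc:
        # With default auto-masking, unwritten elements come back masked.
        print(nc["v"][:])

The same release also makes the unsigned-integer view byte-order aware (issue #930), so big-endian i2 variables tagged with _Unsigned now read back correctly, as exercised by the new issue930 test in tst_endian.py.
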
From ftpmaster at ftp-master.debian.org Wed Sep 4 05:39:17 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 04:39:17 +0000 Subject: Processing of ossim_2.9.1-1_source.changes Message-ID: ossim_2.9.1-1_source.changes uploaded successfully to localhost along with the files: ossim_2.9.1-1.dsc ossim_2.9.1-1.debian.tar.xz ossim_2.9.1-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From noreply at release.debian.org Wed Sep 4 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:21 +0000 Subject: doris 5.0.3~beta+dfsg-13 MIGRATED to testing Message-ID: FYI: The status of the doris source package in Debian's testing distribution has changed. Previous version: 5.0.3~beta+dfsg-12 Current version: 5.0.3~beta+dfsg-13 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:23 +0000 Subject: libgeotiff 1.5.1-1 MIGRATED to testing Message-ID: FYI: The status of the libgeotiff source package in Debian's testing distribution has changed. Previous version: (not in testing) Current version: 1.5.1-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:24 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:24 +0000 Subject: osm-gps-map 1.1.0-6 MIGRATED to testing Message-ID: FYI: The status of the osm-gps-map source package in Debian's testing distribution has changed. Previous version: 1.1.0-5 Current version: 1.1.0-6 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:23 +0000 Subject: mapnik 3.0.22+ds1-1 MIGRATED to testing Message-ID: FYI: The status of the mapnik source package in Debian's testing distribution has changed. Previous version: 3.0.22+ds-2 Current version: 3.0.22+ds1-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:24 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:24 +0000 Subject: osm2pgsql 1.0.0+ds-1 MIGRATED to testing Message-ID: FYI: The status of the osm2pgsql source package in Debian's testing distribution has changed. Previous version: 0.96.0+ds-3 Current version: 1.0.0+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
From noreply at release.debian.org Wed Sep 4 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:23 +0000 Subject: mapnik-vector-tile 1.6.1+dfsg-8 MIGRATED to testing Message-ID: FYI: The status of the mapnik-vector-tile source package in Debian's testing distribution has changed. Previous version: 1.6.1+dfsg-7 Current version: 1.6.1+dfsg-8 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:26 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:26 +0000 Subject: qgis 3.4.11+dfsg-1 MIGRATED to testing Message-ID: FYI: The status of the qgis source package in Debian's testing distribution has changed. Previous version: 3.4.10+dfsg-1 Current version: 3.4.11+dfsg-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:25 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:25 +0000 Subject: proj 6.2.0-1 MIGRATED to testing Message-ID: FYI: The status of the proj source package in Debian's testing distribution has changed. Previous version: 5.2.0-1 Current version: 6.2.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 4 05:39:26 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 04 Sep 2019 04:39:26 +0000 Subject: python-pyproj 2.3.1+ds-1 MIGRATED to testing Message-ID: FYI: The status of the python-pyproj source package in Debian's testing distribution has changed. Previous version: 1.9.6-2 Current version: 2.3.1+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From ftpmaster at ftp-master.debian.org Wed Sep 4 05:49:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 04:49:25 +0000 Subject: ossim_2.9.1-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 06:07:36 +0200 Source: ossim Architecture: source Version: 2.9.1-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: ossim (2.9.1-1) unstable; urgency=medium . * Team upload. * Move from experimental to unstable. 
Checksums-Sha1: 365b6a4e3a127aff4c15d14f1a96210b2990b739 2187 ossim_2.9.1-1.dsc 74c5eeeeae100ef97d037e49d589b1344a3891db 39844 ossim_2.9.1-1.debian.tar.xz c7410b6b2fca62eae8e39e77064cf3b4ad062e93 9104 ossim_2.9.1-1_amd64.buildinfo Checksums-Sha256: aabe3fa7adb4569414c9852228814914164923f7d5220f8092dd5c055b69284d 2187 ossim_2.9.1-1.dsc f20c88dfeb0e6e368ee855fefd5776d27db72ac36a0106cdccb736b5d5cbad65 39844 ossim_2.9.1-1.debian.tar.xz 0438e033feb8ecb25fb8a0676393b6924728890383ed5734e88df5a97ab54e31 9104 ossim_2.9.1-1_amd64.buildinfo Files: 62e1b175b1c7c43cbcbdb4ebe853f373 2187 science optional ossim_2.9.1-1.dsc 6b983f6335d1ff7ff3980a100cbf135d 39844 science optional ossim_2.9.1-1.debian.tar.xz 2e427c0d5bfb0d468603a0050da3cb98 9104 science optional ossim_2.9.1-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1vPVMACgkQZ1DxCuiN SvH6yRAAgtZQwCL5nYGyM5HZzl3yi94gmT5k6zOLJB/RBrbjAhByGCvr0efXWVCK WFEHo8HmaJK/5hNMf2hIvllE0B0QQsSmJbOZLUcLysif2ue+sqOHG9R+zX9E7v9n Ys4xoWVX2W4EScVuLS8fb36/PdXm0SB5AvTNuRFWsIdCt9rS3vb6kn5BEUvSR3o0 BX8D81tZyHVk6FMNrD5w9XpLso+HtDumF59qgegk8PcgJDIi+59xLLdoHOB9nKbB YiXk+KdENSVm7kmvFz6oryYKkzqfFH4irMFptIs0dnxbxN8kI5myKFrzTaWJs3Qo 7Xcs+/UGwD/bTwj6eQxoTEhKx7lXQc7nZxhUW4Qc7AxfvInX4sLP3oIuhnnqiHDU puIOYKBhCgIJ2TxQ2fC8sYrF0fN59RkspVppqxgEkDMCziH13lbfhWYcSAwuVbZV kK0qbQLlYvfwXOuCriUFvryMCkBlDRHMTliOGFG/YHk/wFmpCv05A/F2GYkpuVtx xfLuxmi71Lw1n7t2d379E6329/qr0Y8icSp1qWwvgK/XE1SWe5nwjgCpUGI/Zlbq 5j+A6dOjc4vio6kNNRMqwYetWR7wtdlmzVs+Tk+p0K2DXh9slWbmeGxF4yx96T16 azLpPIvkOB5EMXZ6PSZM66IQBgsYno0xz/BwxmZHRdbStp75ZCc= =pp9T -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Wed Sep 4 06:38:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 05:38:49 +0000 Subject: [Git][debian-gis-team/osm2pgsql] Pushed new branch buster-backports Message-ID: <5d6f4de993dba_577b2ade618b8570610132@godard.mail> Bas Couwenberg pushed new branch buster-backports at Debian GIS Project / osm2pgsql -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osm2pgsql/tree/buster-backports You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 06:39:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 05:39:26 +0000 Subject: [Git][debian-gis-team/osm2pgsql] Pushed new tag debian/1.0.0+ds-1_bpo10+1 Message-ID: <5d6f4e0e226e9_577b2ade5da46d1461036a@godard.mail> Bas Couwenberg pushed new tag debian/1.0.0+ds-1_bpo10+1 at Debian GIS Project / osm2pgsql -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osm2pgsql/tree/debian/1.0.0+ds-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 06:50:06 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 05:50:06 +0000 Subject: Processing of osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.changes Message-ID: osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.changes uploaded successfully to localhost along with the files: osm2pgsql_1.0.0+ds-1~bpo10+1.dsc osm2pgsql_1.0.0+ds-1~bpo10+1.debian.tar.xz osm2pgsql-dbgsym_1.0.0+ds-1~bpo10+1_amd64.deb osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.buildinfo osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 07:04:29 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 06:04:29 +0000 Subject: osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.changes is NEW Message-ID: binary:osm2pgsql is NEW. binary:osm2pgsql is NEW. source:osm2pgsql is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. [1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From gitlab at salsa.debian.org Wed Sep 4 08:56:58 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 07:56:58 +0000 Subject: [Git][debian-gis-team/pdal][master] 20 commits: Update branch in gbp.conf & Vcs-Git URL. Message-ID: <5d6f6e4adb07a_577b2ade5da46d1462063b@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pdal Commits: 58e3553f by Bas Couwenberg at 2019-08-22T19:29:20Z Update branch in gbp.conf & Vcs-Git URL. - - - - - d61807da by Bas Couwenberg at 2019-08-22T19:29:39Z New upstream version 2.0.0+ds - - - - - 6242fbff by Bas Couwenberg at 2019-08-22T19:31:35Z Update upstream source from tag 'upstream/2.0.0+ds' Update to upstream version '2.0.0+ds' with Debian dir 8a2320417b298152a37dd231b0d2695c1a5682cb - - - - - 434fa34f by Bas Couwenberg at 2019-08-23T04:27:07Z New upstream release. - - - - - 07cde787 by Bas Couwenberg at 2019-08-23T04:45:37Z Drop libjsoncpp-dev build dependency, add libzstd-dev. - - - - - a224ad0f by Bas Couwenberg at 2019-08-23T04:47:33Z Drop greyhound plugin, removed upstream. - - - - - 4190f756 by Bas Couwenberg at 2019-08-23T06:41:24Z Update copyright file. Changes: - Update copyright years for existing copyright holders - Add new copyright holders - Add license & copyright for libE57Format & CRCpp - Drop license & copyright for jsoncpp, removed upstream - Add license & copyright for nlohmann - - - - - e2c4854f by Bas Couwenberg at 2019-08-23T06:41:30Z Drop obsolete config.{guess,sub} update for gtest. - - - - - a78942ff by Bas Couwenberg at 2019-08-23T06:42:40Z Drop unused override for spelling-error-in-binary. - - - - - f85afb8f by Bas Couwenberg at 2019-08-23T07:13:01Z Update symbols for amd64. - - - - - 07aa3b54 by Bas Couwenberg at 2019-08-23T07:13:01Z Set distribution to experimental. 
- - - - - 854b7623 by Bas Couwenberg at 2019-08-23T13:32:05Z New upstream version 2.0.1+ds - - - - - fa4f867d by Bas Couwenberg at 2019-08-23T13:33:59Z Update upstream source from tag 'upstream/2.0.1+ds' Update to upstream version '2.0.1+ds' with Debian dir f582da6a946524d4aa0c4811dca75bb66ce92568 - - - - - 5a2ff768 by Bas Couwenberg at 2019-08-23T13:34:45Z New upstream release. - - - - - 4bd6dda8 by Bas Couwenberg at 2019-08-23T13:37:12Z Rename library packages for SONAME bump. - - - - - fb4edfa3 by Bas Couwenberg at 2019-08-23T14:02:34Z Update symbols for amd64. - - - - - 9747f29b by Bas Couwenberg at 2019-08-23T14:02:37Z Set distribution to experimental. - - - - - e8c27913 by Bas Couwenberg at 2019-09-04T06:35:07Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit 58e3553fe471f3553acf857e8491261663a43615. - - - - - f9cb79b7 by Bas Couwenberg at 2019-09-04T07:16:22Z Update symbols for other architectures. - - - - - ab7688a7 by Bas Couwenberg at 2019-09-04T07:16:22Z Set distribution to unstable. - - - - - 30 changed files: - .travis.yml - CMakeLists.txt - HOWTORELEASE.txt - PDALConfig.cmake.in - RELEASENOTES.txt - apps/CMakeLists.txt - apps/pdal.cpp - cmake/arbiter.cmake - cmake/config.cmake - cmake/examples/hobu-config.sh - cmake/examples/hobu-windows.bat - cmake/gdal.cmake - − cmake/json.cmake - cmake/laszip.cmake - cmake/macros.cmake - + cmake/modules/FindFbx.cmake - − cmake/modules/FindJSONCPP.cmake - + cmake/nlohmann.cmake - + cmake/openssl.cmake - cmake/options.cmake - cmake/policies.cmake - cmake/python.cmake - cmake/unix_compiler_options.cmake - cmake/zstd.cmake - debian/changelog - debian/control - debian/copyright - debian/libpdal-base8.install → debian/libpdal-base9.install - debian/libpdal-base8.lintian-overrides → debian/libpdal-base9.lintian-overrides - debian/libpdal-base8.symbols → debian/libpdal-base9.symbols The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/pdal/compare/006c153034ad7ca7d54e1569fc2b048d887afa3d...ab7688a7a133bbb08271ec3d98ea5995129ed6be -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pdal/compare/006c153034ad7ca7d54e1569fc2b048d887afa3d...ab7688a7a133bbb08271ec3d98ea5995129ed6be You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 08:57:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 07:57:07 +0000 Subject: [Git][debian-gis-team/pdal] Pushed new tag debian/2.0.1+ds-1 Message-ID: <5d6f6e534b5b1_577b2ade618b85706208a4@godard.mail> Bas Couwenberg pushed new tag debian/2.0.1+ds-1 at Debian GIS Project / pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pdal/tree/debian/2.0.1+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Wed Sep 4 09:00:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 08:00:16 +0000 Subject: [Git][debian-gis-team/qgis][buster-backports] 11 commits: New upstream version 3.4.11+dfsg Message-ID: <5d6f6f10ca538_577b2ade5dd7982462276@godard.mail> Bas Couwenberg pushed to branch buster-backports at Debian GIS Project / qgis Commits: 3126e21f by Bas Couwenberg at 2019-08-17T06:57:45Z New upstream version 3.4.11+dfsg - - - - - 337c4de7 by Bas Couwenberg at 2019-08-17T07:01:16Z Update upstream source from tag 'upstream/3.4.11+dfsg' Update to upstream version '3.4.11+dfsg' with Debian dir e3569b03a04939623d767040381c704af1c1e136 - - - - - 90b584a2 by Bas Couwenberg at 2019-08-17T07:12:10Z New upstream release. - - - - - 5d0a43bf by Bas Couwenberg at 2019-08-17T07:24:32Z Merge upstream packaging changes. - - - - - a2ce174f by Bas Couwenberg at 2019-08-17T09:44:35Z Update symbols for amd64. - - - - - ab6173c0 by Bas Couwenberg at 2019-08-17T09:59:11Z Add lintian override for spelling-error-in-binary false positive. - - - - - 934144ad by Bas Couwenberg at 2019-08-17T09:59:11Z Set distribution to experimental. - - - - - 2566b063 by Bas Couwenberg at 2019-08-24T14:10:17Z Add Breaks/Replaces to fix upgrade from 2.18.18. (closes: #935613) - - - - - 4377fa51 by Bas Couwenberg at 2019-08-30T12:13:31Z Update symbols for other architectures. - - - - - cd8c6be3 by Bas Couwenberg at 2019-08-30T12:13:58Z Set distribution to unstable. - - - - - 0e550057 by Bas Couwenberg at 2019-09-04T04:55:01Z Rebuild for buster-backports. - - - - - 30 changed files: - .ci/travis/scripts/ctest2travis.py - CMakeLists.txt - CTestConfig.cmake - ChangeLog - INSTALL - debian/changelog - debian/control - debian/libqgis-3d3.4.10.install → debian/libqgis-3d3.4.11.install - debian/libqgis-3d3.4.10.symbols → debian/libqgis-3d3.4.11.symbols - debian/libqgis-analysis3.4.10.install → debian/libqgis-analysis3.4.11.install - debian/libqgis-analysis3.4.10.symbols → debian/libqgis-analysis3.4.11.symbols - debian/libqgis-app3.4.10.install → debian/libqgis-app3.4.11.install - debian/libqgis-core3.4.10.lintian-overrides → debian/libqgis-app3.4.11.lintian-overrides - debian/libqgis-app3.4.10.symbols → debian/libqgis-app3.4.11.symbols - debian/libqgis-core3.4.10.install → debian/libqgis-core3.4.11.install - debian/libqgis-app3.4.10.lintian-overrides → debian/libqgis-core3.4.11.lintian-overrides - debian/libqgis-core3.4.10.symbols → debian/libqgis-core3.4.11.symbols - debian/libqgis-gui3.4.10.install → debian/libqgis-gui3.4.11.install - debian/libqgis-gui3.4.10.symbols → debian/libqgis-gui3.4.11.symbols - debian/libqgis-native3.4.10.install → debian/libqgis-native3.4.11.install - debian/libqgis-native3.4.10.symbols → debian/libqgis-native3.4.11.symbols - debian/libqgis-server3.4.10.install → debian/libqgis-server3.4.11.install - debian/libqgis-server3.4.10.symbols → debian/libqgis-server3.4.11.symbols - debian/libqgisgrass7-3.4.10.install → debian/libqgisgrass7-3.4.11.install - debian/libqgisgrass7-3.4.10.lintian-overrides → debian/libqgisgrass7-3.4.11.lintian-overrides - debian/libqgisgrass7-3.4.10.symbols → debian/libqgisgrass7-3.4.11.symbols - debian/libqgispython3.4.10.install → debian/libqgispython3.4.11.install - debian/libqgispython3.4.10.symbols → debian/libqgispython3.4.11.symbols - debian/rules - doc/INSTALL.html The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/779fa91a454ef7008401789dc47b21965d6e17cd...0e550057a25f3d7fe18895e9f2e104549ec291b3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/779fa91a454ef7008401789dc47b21965d6e17cd...0e550057a25f3d7fe18895e9f2e104549ec291b3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 09:00:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 08:00:20 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag debian/3.4.11+dfsg-1_bpo10+1 Message-ID: <5d6f6f14afecf_577b2ade5d389d0062293@godard.mail> Bas Couwenberg pushed new tag debian/3.4.11+dfsg-1_bpo10+1 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/debian/3.4.11+dfsg-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 09:06:52 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 08:06:52 +0000 Subject: Processing of pdal_2.0.1+ds-1_source.changes Message-ID: pdal_2.0.1+ds-1_source.changes uploaded successfully to localhost along with the files: pdal_2.0.1+ds-1.dsc pdal_2.0.1+ds-1.debian.tar.xz pdal_2.0.1+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 09:07:26 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 08:07:26 +0000 Subject: Processing of qgis_3.4.11+dfsg-1~bpo10+1_amd64.changes Message-ID: qgis_3.4.11+dfsg-1~bpo10+1_amd64.changes uploaded successfully to localhost along with the files: qgis_3.4.11+dfsg-1~bpo10+1.dsc qgis_3.4.11+dfsg-1~bpo10+1.debian.tar.xz libqgis-3d3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-3d3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-analysis3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-analysis3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-app3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-app3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-core3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-core3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-customwidgets-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-customwidgets_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-dev_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-gui3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-gui3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-native3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-native3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-server3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgis-server3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgisgrass7-3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgisgrass7-3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgispython3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb libqgispython3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb python3-qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb python3-qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb python3-qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-api-doc_3.4.11+dfsg-1~bpo10+1_all.deb qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-plugin-grass-common_3.4.11+dfsg-1~bpo10+1_all.deb qgis-plugin-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 
qgis-plugin-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-provider-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-provider-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-providers-common_3.4.11+dfsg-1~bpo10+1_all.deb qgis-providers-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-providers_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-server-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis-server_3.4.11+dfsg-1~bpo10+1_amd64.deb qgis_3.4.11+dfsg-1~bpo10+1_amd64.buildinfo qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 09:40:46 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 08:40:46 +0000 Subject: pdal_2.0.1+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 08:38:46 +0200 Source: pdal Architecture: source Version: 2.0.1+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: pdal (2.0.1+ds-1) unstable; urgency=medium . * Update symbols for other architectures. * Move from experimental to unstable. Checksums-Sha1: b7bff0cf0015b8c2743067dfa651bb66727156a9 3109 pdal_2.0.1+ds-1.dsc 961fa356636e04850843478c16662cefa8c42035 76888 pdal_2.0.1+ds-1.debian.tar.xz cfcfe53d8954015f18e351c00007b08fa88a305f 21283 pdal_2.0.1+ds-1_amd64.buildinfo Checksums-Sha256: f097a2b2439f4832025bd5e4932f6efe7a477a25bc5c28e0e6d08f56ce1ce6e2 3109 pdal_2.0.1+ds-1.dsc 982652d5da91c69bd9dcc97663ee70c98deedfdd0cc1a8a37276a475a44170bd 76888 pdal_2.0.1+ds-1.debian.tar.xz 824bac0da7e97c80fda7adc2c5a312f71a510dc5d6765c188e093c96da18303e 21283 pdal_2.0.1+ds-1_amd64.buildinfo Files: d907bdccb030c2ce814d4780bb7c7342 3109 science optional pdal_2.0.1+ds-1.dsc af5ad5e2d3026c5ce67f3bc842679b41 76888 science optional pdal_2.0.1+ds-1.debian.tar.xz 45248f1f89b95a19f795449f783f268f 21283 science optional pdal_2.0.1+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1vbjIACgkQZ1DxCuiN SvFWEw//bjs29HEDzX1O3ymGG2VkQLwTHoKdjXtrmJEaITAquLpbM/fqXlSIK98w XfTA/iGmT8yGOz1Z7fRkznSfMfXVkZde1kcSwbkLl3M3YR74M3yIsvNV3w7QNT/R Z8s3oKOi8p6pluv6h0lJzYBaQyixWEKt7GuGheLqRisV6kzgBURjIz8+Qj6WM/jT ivuUpViZTd1DS5AcdYX6dSVnHYOnVRXgpyNl67D9nsJkXPa84TR1cPMG7XF/SkbV +l/Jl/HPzv0Uk0vNMxvve0CQDm8nPW2DQSajmHMVpxzWYZeDSen8y7+o66vG2Ljd PXfIm4sX/CT/isbU2wJNxvfmZv/+fWh4/aKwKIvlQbMofDdFfD+r1Tq/kcBil+FF mfp+BvyH27Zr/paozQfLGIcUcggTHdQEhqVbEZx1UIH152RT1Ha0R1IK5GwN7+IV sxJvxMvKrc6cLU5sZsU479nAEGC8DKl+sRn9JJMP5L3tMCFz4XigTYEWjAOAiGi1 tj/bD41dUIN+ilVA0J7zFwkHsAYi5R6BO5SNuILz7dP74a1gioKtD7JDJZBuDpa1 peP4EEGGV1Bi6u6GcuFXjE3n9B2JFCtsK67L/mnB1DE89azSigJ+31NISaVYnr7o U3lY/ctNSU+9jnhamaCd1vWe2efmS3ck0e2bURDUbEDOUXj0NNc= =l+p4 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Wed Sep 4 09:43:18 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 08:43:18 +0000 Subject: qgis_3.4.11+dfsg-1~bpo10+1_amd64.changes is NEW Message-ID: binary:libqgis-3d3.4.11 is NEW. binary:libqgis-analysis3.4.11 is NEW. binary:libqgis-app3.4.11 is NEW. binary:libqgis-core3.4.11 is NEW. binary:libqgis-gui3.4.11 is NEW. binary:libqgis-native3.4.11 is NEW. binary:libqgis-server3.4.11 is NEW. binary:libqgisgrass7-3.4.11 is NEW. binary:libqgispython3.4.11 is NEW. binary:libqgis-gui3.4.11 is NEW. binary:libqgis-3d3.4.11 is NEW. binary:libqgis-native3.4.11 is NEW. 
binary:libqgis-server3.4.11 is NEW. binary:libqgis-core3.4.11 is NEW. binary:libqgis-analysis3.4.11 is NEW. binary:libqgispython3.4.11 is NEW. binary:libqgisgrass7-3.4.11 is NEW. binary:libqgis-app3.4.11 is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. [1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From myon at debian.org Wed Sep 4 11:18:17 2019 From: myon at debian.org (Christoph Berg) Date: Wed, 4 Sep 2019 12:18:17 +0200 Subject: Bug#939384: qgis-providers: proj_create: crs not found Message-ID: <20190904101817.GA15957@msg.df7cb.de> Package: qgis-providers Version: 3.4.11+dfsg-1 Severity: normal Hi, while upgrading my bullseye system: workrave (1.10.34-2+b1) wird eingerichtet ... qgis-providers (3.4.11+dfsg-1) wird eingerichtet ... proj_create: crs not found proj_create: crs not found proj_create: crs not found proj_create: crs not found ... a lot more of those ... proj_create: crs not found proj_create: crs not found proj_create: crs not found libqgis-customwidgets (3.4.11+dfsg-1) wird eingerichtet ... python3-qgis-common (3.4.11+dfsg-1) wird eingerichtet ... -- System Information: Debian Release: bullseye/sid APT prefers testing APT policy: (700, 'testing'), (600, 'unstable'), (150, 'experimental') Architecture: amd64 (x86_64) Kernel: Linux 4.19.0-5-amd64 (SMP w/8 CPU cores) Kernel taint flags: TAINT_OOT_MODULE, TAINT_UNSIGNED_MODULE Locale: LANG=de_DE.utf8, LC_CTYPE=de_DE.utf8 (charmap=UTF-8), LANGUAGE=de:en_US:en (charmap=UTF-8) Shell: /bin/sh linked to /bin/dash Init: systemd (via /run/systemd/system) LSM: AppArmor: enabled Versions of packages qgis-providers depends on: ii dpkg 1.19.7 ii libc6 2.28-10 ii libexpat1 2.2.7-1 ii libgcc1 1:9.2.1-4 ii libgdal20 2.4.2+dfsg-1+b2 ii libhdf5-103 1.10.4+repack-10 ii libnetcdf13 1:4.6.2-1+b1 ii libpq5 12~beta3-1.pgdg+1 ii libqca-qt5-2 2.2.1-2 ii libqca-qt5-2-plugins 2.2.1-2 ii libqgis-core3.4.11 3.4.11+dfsg-1 ii libqgis-gui3.4.11 3.4.11+dfsg-1 ii libqscintilla2-qt5-13 2.10.4+dfsg-2.1 ii libqt5core5a [qtbase-abi-5-11-3] 5.11.3+dfsg1-4 ii libqt5gui5 5.11.3+dfsg1-4 ii libqt5network5 5.11.3+dfsg1-4 ii libqt5sql5 5.11.3+dfsg1-4 ii libqt5sql5-sqlite 5.11.3+dfsg1-4 ii libqt5widgets5 5.11.3+dfsg1-4 ii libqt5xml5 5.11.3+dfsg1-4 ii libspatialite7 4.3.0a-6+b1 ii libsqlite3-0 3.29.0-2 ii libstdc++6 9.2.1-4 ii qgis-providers-common 3.4.11+dfsg-1 qgis-providers recommends no packages. qgis-providers suggests no packages. -- no debconf information Christoph From owner at bugs.debian.org Wed Sep 4 11:51:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 04 Sep 2019 10:51:03 +0000 Subject: Processed: Re: Bug#939384: qgis-providers: proj_create: crs not found References: Message-ID: Processing commands for control at bugs.debian.org: > tags 939384 wontfix Bug #939384 [qgis-providers] qgis-providers: proj_create: crs not found Added tag(s) wontfix. > thanks Stopping processing here. Please contact me if you need assistance. 
-- 939384: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939384 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From owner at bugs.debian.org Wed Sep 4 11:57:09 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 04 Sep 2019 10:57:09 +0000 Subject: Bug#939384: marked as done (qgis-providers: proj_create: crs not found) References: <20190904101817.GA15957@msg.df7cb.de> Message-ID: Your message dated Wed, 4 Sep 2019 12:48:54 +0200 with message-id and subject line Re: Bug#939384: qgis-providers: proj_create: crs not found has caused the Debian Bug report #939384, regarding qgis-providers: proj_create: crs not found to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 939384: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939384 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Christoph Berg Subject: qgis-providers: proj_create: crs not found Date: Wed, 4 Sep 2019 12:18:17 +0200 Size: 4489 URL: -------------- next part -------------- An embedded message was scrubbed... From: Sebastiaan Couwenberg Subject: Re: Bug#939384: qgis-providers: proj_create: crs not found Date: Wed, 4 Sep 2019 12:48:54 +0200 Size: 7465 URL: From sebastic at xs4all.nl Wed Sep 4 11:53:19 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Wed, 4 Sep 2019 12:53:19 +0200 Subject: Bug#939384: qgis-providers: proj_create: crs not found In-Reply-To: References: <20190904101817.GA15957@msg.df7cb.de> <20190904101817.GA15957@msg.df7cb.de> Message-ID: <9f8ee4fe-1837-d5ec-7d13-3bd264bf9e85@xs4all.nl> On 9/4/19 12:48 PM, Sebastiaan Couwenberg wrote: > On 9/4/19 12:18 PM, Christoph Berg wrote: >> while upgrading my bullseye system: >> >> workrave (1.10.34-2+b1) wird eingerichtet ... >> qgis-providers (3.4.11+dfsg-1) wird eingerichtet ... >> proj_create: crs not found >> ... a lot more of those ... >> proj_create: crs not found >> libqgis-customwidgets (3.4.11+dfsg-1) wird eingerichtet ... >> python3-qgis-common (3.4.11+dfsg-1) wird eingerichtet ... > Not much we can do about this. PROJ 6 just migrated to testing and we'll > need to update a lot more of the stack to get them to support it fully. > > QGIS 3.4 has partial support for PROJ 6, 3.8 has full support. But > before we get to that we need to update GDAL to 3.x for its PROJ 6 support. > > This is just output from crssync which is unable to sync some projection > definitions from GDAL/PROJ to the QGIS srs.db, and can be safely ignored. If you want to see which projections these are, enable PROJ debug output, e.g.: # PROJ_DEBUG=5 /usr/lib/qgis/crssync --verbose 2>&1 | less > There may be changes for PROJ 6 in upcoming QGIS 3.4.x releases, but > full support will take some time until we upgrade to the 3.10 LTR in > early 2020. 
Kind Regards,

Bas

--
 GPG Key ID: 4096R/6750F10AE88D4AF1
Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1

From sebastic at xs4all.nl Wed Sep 4 12:18:22 2019
From: sebastic at xs4all.nl (Sebastiaan Couwenberg)
Date: Wed, 4 Sep 2019 13:18:22 +0200
Subject: Bug#939384: qgis-providers: proj_create: crs not found
In-Reply-To: <20190904105810.GC1402@msg.df7cb.de>
References: <20190904101817.GA15957@msg.df7cb.de> <20190904105810.GC1402@msg.df7cb.de> <20190904101817.GA15957@msg.df7cb.de>
Message-ID: <2e124b4f-8aae-0aa0-41db-4c375576dbb6@xs4all.nl>

On 9/4/19 12:58 PM, Christoph Berg wrote:
> Re: Sebastiaan Couwenberg 2019-09-04
>> Not much we can do about this. PROJ 6 just migrated to testing and we'll
>> need to update a lot more of the stack to get them to support it fully.
>
> Oh, if that's just a transient thing then it's not a big deal at all.
>
> We just shouldn't release it in that state, it really printed 2 or 3
> screens full of that which would be quite embarrassing to see in
> stable.

bullseye will release sometime in 2021, plenty of time to update the rest of
the stack to versions with full support for PROJ 6. This will include at
least GDAL >= 3.x, QGIS >= 3.10.x, PostGIS >= 3.x, GRASS >= 7.8, which all
have support for PROJ 6.

PROJ 7 is scheduled for release in March 2020, and will drop support for
proj_api.h which many other projects still use, most importantly spatialite.
And it doesn't look like it will see a release that supports the proj.h API
(introduced in PROJ 5) before that time. So I don't think we can release
bullseye with PROJ 7, but PROJ 6 seems very doable.

With the first steps taken to get PROJ 6 into bullseye by transitioning to
PROJ 6.2.0 and libgeotiff 1.5.1 we now need to continue with the rest of the
stack.

GDAL 3 is ready in experimental, but not all rdeps are ready for it yet (no
big blockers though). GRASS 7.8 will publish the final release soon with the
major changes including support for PROJ 6 and the switch to Python 3.
PostGIS 3.x will take a bit more time until the final release, but should
happen sometime next year at the latest I expect. And we'll switch to QGIS
3.10 when it becomes the next LTR in early 2020 as mentioned before.

A major change to the stack like PROJ 6 needs quite some time to settle,
please be patient while we update the other packages for it.

Kind Regards,

Bas

--
 GPG Key ID: 4096R/6750F10AE88D4AF1
Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1

From plugwash at debian.org Wed Sep 4 14:38:51 2019
From: plugwash at debian.org (Peter Michael Green)
Date: Wed, 4 Sep 2019 14:38:51 +0100
Subject: Bug#939399: libgeotiff FTBFS in bullseye (possibly armhf specific),
 test discrepancies.
Message-ID: <682488ac-d594-aea3-e5bd-a10d27a01b64@debian.org>

package: libgeotiff
version: 1.5.1-1
severity: serious
tags: bullseye

Hi, libgeotiff just failed to build in raspbian bullseye with the following
message.
> ============================================ > Running ../test/testlistgeo using ../bin/listgeo: > ============================================ > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > proj_create: unrecognized format / unknown name > diff testlistgeo_out with testlistgeo_out.dist > --- testlistgeo_out 2019-09-04 06:58:26.979704475 +0000 > +++ ../test/testlistgeo_out.dist 2019-09-04 06:57:50.000000000 +0000 > @@ -1697,11 +1697,11 @@ > Keyed_Information: > GTModelTypeGeoKey (Short,1): ModelTypeProjected > GTRasterTypeGeoKey (Short,1): RasterPixelIsArea > - ProjectedCSTypeGeoKey (Short,1): Code-3035 (ETRS89-extended / LAEA Europe) > + ProjectedCSTypeGeoKey (Short,1): Code-3035 (ETRS89 / LAEA Europe) > End_Of_Keys. > End_Of_Geotiff. > > -PCS = 3035 (ETRS89-extended / LAEA Europe) > +PCS = 3035 (ETRS89 / LAEA Europe) > Projection = 19986 (Europe Equal Area 2001) > Projection Method: CT_LambertAzimEqualArea > ProjCenterLatGeoKey: 52.000000 ( 52d 0' 0.00"N) > > PROBLEMS HAVE OCCURRED > test file testlistgeo_out saved It also failed in the same way on the Debian reproducible builds site for bullseye armhf. The reproducible builds site has not yet tested it on other architectures in bullseye. It seems to be fine in unstable. From gitlab at salsa.debian.org Wed Sep 4 15:10:40 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 14:10:40 +0000 Subject: [Git][debian-gis-team/libgeotiff][master] 2 commits: Add patch to fix FTFBS with PROJ 6.2.0. (closes: #939399) Message-ID: <5d6fc5e0e84e6_577b2ade61046388727672@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libgeotiff Commits: 5e708726 by Bas Couwenberg at 2019-09-04T14:09:01Z Add patch to fix FTFBS with PROJ 6.2.0. (closes: #939399) - - - - - 7ce0df28 by Bas Couwenberg at 2019-09-04T14:09:02Z Set distribution to unstable. - - - - - 3 changed files: - debian/changelog - + debian/patches/proj-6.2.patch - debian/patches/series Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +libgeotiff (1.5.1-2) unstable; urgency=medium + + * Add patch to fix FTFBS with PROJ 6.2.0. + (closes: #939399) + + -- Bas Couwenberg Wed, 04 Sep 2019 16:05:10 +0200 + libgeotiff (1.5.1-1) unstable; urgency=medium * Update gbp.conf to use --source-only-changes by default. ===================================== debian/patches/proj-6.2.patch ===================================== @@ -0,0 +1,21 @@ +Description: Fix FTBFS with PROJ 6.2.0. +Author: Bas Couwenberg +Bug: https://github.com/OSGeo/libgeotiff/issues/22 +Bug-Debian: https://bugs.debian.org/939399 + +--- a/test/testlistgeo_out.dist ++++ b/test/testlistgeo_out.dist +@@ -1697,11 +1697,11 @@ Geotiff_Information: + Keyed_Information: + GTModelTypeGeoKey (Short,1): ModelTypeProjected + GTRasterTypeGeoKey (Short,1): RasterPixelIsArea +- ProjectedCSTypeGeoKey (Short,1): Code-3035 (ETRS89 / LAEA Europe) ++ ProjectedCSTypeGeoKey (Short,1): Code-3035 (ETRS89-extended / LAEA Europe) + End_Of_Keys. + End_Of_Geotiff. 
+ +-PCS = 3035 (ETRS89 / LAEA Europe) ++PCS = 3035 (ETRS89-extended / LAEA Europe) + Projection = 19986 (Europe Equal Area 2001) + Projection Method: CT_LambertAzimEqualArea + ProjCenterLatGeoKey: 52.000000 ( 52d 0' 0.00"N) ===================================== debian/patches/series ===================================== @@ -1,2 +1,3 @@ 0001-GTIFDecToDMS-fix-rounding-issue-refs-16.patch 0001-GTIFDecToDMS-avoid-invalid-double-int-cast-on-invali.patch +proj-6.2.patch View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeotiff/compare/09d245bf0c05726bbb42b5c3aa55c20ec77ea5f6...7ce0df2867dcc846c67d7ad404d537f3c554ef36 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeotiff/compare/09d245bf0c05726bbb42b5c3aa55c20ec77ea5f6...7ce0df2867dcc846c67d7ad404d537f3c554ef36 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 15:10:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 14:10:46 +0000 Subject: [Git][debian-gis-team/libgeotiff] Pushed new tag debian/1.5.1-2 Message-ID: <5d6fc5e6f13b0_577b3f91b50c79bc727855@godard.mail> Bas Couwenberg pushed new tag debian/1.5.1-2 at Debian GIS Project / libgeotiff -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeotiff/tree/debian/1.5.1-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 15:13:06 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 14:13:06 +0000 Subject: netcdf-fortran_4.5.0+ds-1~exp1_amd64.changes ACCEPTED into experimental, experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 30 Aug 2019 07:54:45 +0200 Source: netcdf-fortran Binary: libnetcdff-dev libnetcdff-doc libnetcdff7 libnetcdff7-dbgsym Architecture: source amd64 all Version: 4.5.0+ds-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: libnetcdff-dev - creation, access, and sharing of scientific data in Fortran libnetcdff-doc - NetCDF Fortran documentation libnetcdff7 - Fortran interface for scientific data access to large binary data Closes: 922645 Changes: netcdf-fortran (4.5.0+ds-1~exp1) experimental; urgency=medium . * New upstream release. * Add fortran-compiler as alternative build dependency. (closes: #922645) * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. * Update copyright file, changes: - Update copyright years for UCAR - Drop license & copyright for removed files * Rename library package for SONAME bump. * Refresh patches. * Update symbols for 4.5.0. 
Checksums-Sha1: c88fdc5e203b920b97ba17f118d509b0c613f779 2399 netcdf-fortran_4.5.0+ds-1~exp1.dsc e412491554473e18446b909c54effd4ea440f2d9 669496 netcdf-fortran_4.5.0+ds.orig.tar.xz e06bbd63fb3246e4a1f4830def4704121ce040b9 10196 netcdf-fortran_4.5.0+ds-1~exp1.debian.tar.xz 02d181c2ea81ae0f2987f012d524c80e9697b1f9 269796 libnetcdff-dev_4.5.0+ds-1~exp1_amd64.deb 94e9e1f4b90d40bd70a5a80275ad5ca4fc347e41 275640 libnetcdff-doc_4.5.0+ds-1~exp1_all.deb 224ef9a8c4d885a53cd3d0348e3858d8d88f5b1d 286316 libnetcdff7-dbgsym_4.5.0+ds-1~exp1_amd64.deb 06a432a7d2247127a4467e34e6be9c9f040d0b27 102220 libnetcdff7_4.5.0+ds-1~exp1_amd64.deb ee41d9fcf008f3ea97ac5ddc31a3a7409c858ec0 10033 netcdf-fortran_4.5.0+ds-1~exp1_amd64.buildinfo Checksums-Sha256: 94e2a12a807d208cee68611c043cb032fcd15f08e71c68eb652cacb2be34474f 2399 netcdf-fortran_4.5.0+ds-1~exp1.dsc 14b4a0e16a90ea12ee085ef377e2b7170a0394f9bf493945092f861966277955 669496 netcdf-fortran_4.5.0+ds.orig.tar.xz ca45284e242a57a9ac0c2b36c8d1475281e7c5240524be0c9c8e4c9dd9bfd4bd 10196 netcdf-fortran_4.5.0+ds-1~exp1.debian.tar.xz 41d13f0622b56e02ea9cc819a0a0ca93784095f95100251a1716314a7b87a45f 269796 libnetcdff-dev_4.5.0+ds-1~exp1_amd64.deb 3b82915e6651b490943f7a0302912108b3bd1da8a36efc8be7b8b0983eb383bc 275640 libnetcdff-doc_4.5.0+ds-1~exp1_all.deb bbe445ab7bdd2ffdd773a2dd492ab29824bd38116632927bd8688060b433d384 286316 libnetcdff7-dbgsym_4.5.0+ds-1~exp1_amd64.deb eb61da12de4c8aa9b612e9bec426f7a01287df636d24c1152f44debd67eddf6d 102220 libnetcdff7_4.5.0+ds-1~exp1_amd64.deb e1cad4e8e0cdfc9fdad1eff5a9b78dac941c2a3d4ed324586f9fb4b9340a7cf5 10033 netcdf-fortran_4.5.0+ds-1~exp1_amd64.buildinfo Files: a572766b925cd5dc92d4865fffcaad76 2399 science optional netcdf-fortran_4.5.0+ds-1~exp1.dsc 1e128d571f9ec76ebc8c95a477f9cbe2 669496 science optional netcdf-fortran_4.5.0+ds.orig.tar.xz 104ed2138802b6958c6d3fa5604a430a 10196 science optional netcdf-fortran_4.5.0+ds-1~exp1.debian.tar.xz 769510d9f07a69e8a50c08fc5ed7e3d2 269796 libdevel optional libnetcdff-dev_4.5.0+ds-1~exp1_amd64.deb a926c7f346ff9b25ab16fbfbe8cbb4f6 275640 doc optional libnetcdff-doc_4.5.0+ds-1~exp1_all.deb 3ecd6c5708a5267b0216d3db4c3d0688 286316 debug optional libnetcdff7-dbgsym_4.5.0+ds-1~exp1_amd64.deb 31d786e6d26b87d5f2d378dfb3d53c09 102220 libs optional libnetcdff7_4.5.0+ds-1~exp1_amd64.deb 4ebc1e977882d4c2b930a8b9b4b34f7c 10033 science optional netcdf-fortran_4.5.0+ds-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1o0uYACgkQZ1DxCuiN SvF1aQ//fDeInj2jAweQ0o17oX9zhi/BCOIuIlS0XZN8mzWOBfBWlIel1Ojk/o4i w4lNKri9PM5Ah3LhA+HQqR6TYAuYpwuRrXcR4qPR1KQhFQbpjF6daMCdJFh2oMq/ uAlrELrtna0LxOlHf7q/XVTHa6EsTbJFKK4fRqrXaAFBDfYbmLoBaPSQs5Ep82YQ lUOAAcpVvWTHHcWCXnllrTAK8ZBtrwls+pKPI/hObcSgDx1hfdHAf9pdLh3cXEZ/ +QeDfx+HcjXZYc9TirSx8malq4wLmtUelIaEknkcPn+/cmIPZ8vLlG2m8uBqSmia wxLOSDExU1zz0NAmbZZd9qhUjCHms0kR1yR6LJPsRj1q69LhhkP+e5f2KOOUAWdy sw41Ao0vpKwnHV2oxX0qCP4ljPHE/MnzDhFjQDJ1jZqLpfeDCjFHZvpoZAyYkZ2f dagcpxFdgfshOyQqj0GlbcpZyNK8ATrNIWKVmxbtitW3ignlWFU/915I3L6g0TCq q5AFu+hhyt+tTBrxEvwI/wIucYy1qvi3ABt2uTs6ihJF1kAEbzUDJWLjfHMVHQgm e37wJZR7ZoBomtuGbcyszYfDjt1cWfn/ug1dR2iTBIzA4bm37ZH6WvVGOnQYJJ5W U7FHO3sOaluhmF85xV+qID2mynf/9uLkD42VUjlQn/KiaW4TUhc= =lHul -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From sebastic at xs4all.nl Wed Sep 4 15:07:01 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Wed, 4 Sep 2019 16:07:01 +0200 Subject: Bug#939399: libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies. 
In-Reply-To: <682488ac-d594-aea3-e5bd-a10d27a01b64@debian.org> References: <682488ac-d594-aea3-e5bd-a10d27a01b64@debian.org> <682488ac-d594-aea3-e5bd-a10d27a01b64@debian.org> Message-ID: <5ccb7ee2-820b-e102-26f0-7207ff48709e@xs4all.nl> Control: tags -1 pending Control: forwarded -1 https://github.com/OSGeo/libgeotiff/issues/22 On 9/4/19 3:38 PM, Peter Michael Green wrote: >> diff testlistgeo_out with testlistgeo_out.dist >> --- testlistgeo_out    2019-09-04 06:58:26.979704475 +0000 >> +++ ../test/testlistgeo_out.dist    2019-09-04 06:57:50.000000000 +0000 >> @@ -1697,11 +1697,11 @@ >>      Keyed_Information: >>         GTModelTypeGeoKey (Short,1): ModelTypeProjected >>         GTRasterTypeGeoKey (Short,1): RasterPixelIsArea >> -      ProjectedCSTypeGeoKey (Short,1): Code-3035 (ETRS89-extended / >> LAEA Europe) >> +      ProjectedCSTypeGeoKey (Short,1): Code-3035 (ETRS89 / LAEA Europe) >>         End_Of_Keys. >>      End_Of_Geotiff. >>   -PCS = 3035 (ETRS89-extended / LAEA Europe) >> +PCS = 3035 (ETRS89 / LAEA Europe) >>   Projection = 19986 (Europe Equal Area 2001) >>   Projection Method: CT_LambertAzimEqualArea >>      ProjCenterLatGeoKey: 52.000000 ( 52d 0' 0.00"N) >> >> PROBLEMS HAVE OCCURRED >> test file testlistgeo_out saved This is caused by PROJ 6.2.0 and easy to workaround with a patch. I've forwarded the issue upstream for a more permanent fix in a future release. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From owner at bugs.debian.org Wed Sep 4 15:15:12 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 04 Sep 2019 14:15:12 +0000 Subject: Bug#922645: marked as done (netcdf-fortran: Please change build-depends to gfortran | fortran-compiler) References: <155052085269.9219.11046020505913049143.reportbug@debian.sceal.ie> Message-ID: Your message dated Wed, 04 Sep 2019 14:13:06 +0000 with message-id and subject line Bug#922645: fixed in netcdf-fortran 4.5.0+ds-1~exp1 has caused the Debian Bug report #922645, regarding netcdf-fortran: Please change build-depends to gfortran | fortran-compiler to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 922645: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=922645 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Alastair McKinstry Subject: netcdf-fortran: Please change build-depends to gfortran | fortran-compiler Date: Mon, 18 Feb 2019 20:14:12 +0000 Size: 3542 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#922645: fixed in netcdf-fortran 4.5.0+ds-1~exp1 Date: Wed, 04 Sep 2019 14:13:06 +0000 Size: 7448 URL: From owner at bugs.debian.org Wed Sep 4 15:21:06 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 04 Sep 2019 14:21:06 +0000 Subject: Processed: reassign 939399 to src:libgeotiff, found 939399 in libgeotiff/1.5.1-1 References: <1567606606-3338-bts-sebastic@debian.org> Message-ID: Processing commands for control at bugs.debian.org: > reassign 939399 src:libgeotiff Bug #939399 [libgeotiff\] libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies. Warning: Unknown package 'libgeotiff\' Bug reassigned from package 'libgeotiff\' to 'src:libgeotiff'. No longer marked as found in versions 1.5.1-1. Ignoring request to alter fixed versions of bug #939399 to the same values previously set > found 939399 libgeotiff/1.5.1-1 Bug #939399 [src:libgeotiff] libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies. Marked as found in versions libgeotiff/1.5.1-1. > thanks Stopping processing here. Please contact me if you need assistance. -- 939399: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939399 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From ftpmaster at ftp-master.debian.org Wed Sep 4 15:22:44 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 14:22:44 +0000 Subject: Processing of libgeotiff_1.5.1-2_source.changes Message-ID: libgeotiff_1.5.1-2_source.changes uploaded successfully to localhost along with the files: libgeotiff_1.5.1-2.dsc libgeotiff_1.5.1-2.debian.tar.xz libgeotiff_1.5.1-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 15:39:06 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 14:39:06 +0000 Subject: libgeotiff_1.5.1-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 16:05:10 +0200 Source: libgeotiff Architecture: source Version: 1.5.1-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 939399 Changes: libgeotiff (1.5.1-2) unstable; urgency=medium . * Add patch to fix FTFBS with PROJ 6.2.0. 
(closes: #939399) Checksums-Sha1: 07f3d1edd72e7544ab8a1ea3c77cf4fadf7ef728 2223 libgeotiff_1.5.1-2.dsc 2de5ee79440c231d0dfebb10e25d9cf697a89844 10148 libgeotiff_1.5.1-2.debian.tar.xz 7c6fdaa7a1d7fe561af81d905b6a25bc2552def4 7881 libgeotiff_1.5.1-2_amd64.buildinfo Checksums-Sha256: 7a7fe15770ff01f81c34df577bb27a91615ed36b299b72b70f3d5fc86ed86454 2223 libgeotiff_1.5.1-2.dsc e5892c06ab5fb7c3d3539a2cd6192994ba8ca31054f10b55e6442bbed1d2f20c 10148 libgeotiff_1.5.1-2.debian.tar.xz bdf1fc757fc4d4c532290fa6d80b0b3e7b47bce71f9c62450a4f3e6ccc4fbb1d 7881 libgeotiff_1.5.1-2_amd64.buildinfo Files: 2b3ad1abc804010b3be94eab68dd5005 2223 science optional libgeotiff_1.5.1-2.dsc f5be020c098265a726d34053e2327f7a 10148 science optional libgeotiff_1.5.1-2.debian.tar.xz 6fd96decf5b831b0bffe24d003ebd46e 7881 science optional libgeotiff_1.5.1-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1vxcwACgkQZ1DxCuiN SvHNjRAAiPqG/BgqIyouNKnk/hPTpF7eUNOsY679UkI4g6EXOnCSIYLf0VNRiPeu UWMybOOezkctMvA22Ru5v64AT0ZnxWjBy6abIRD2qM5ricQwtH/w9fGE8DYTKlys H5W69/GuWNhUNG6VX1B0dWz0RbvbHfn8FuKP6gV9IfuszsNgy6zOh4bdanblzkYL eRTsTXaYQSaAt/WO23I/yH5FSRZAKXUu5W7wsmxRWF9LTGIe/Pu9cLMLnqTfBmE7 Pc899jcOi+5rnV50bGLV+ONCUtFT6A9RarTfq1bJRLJtowImKXlo7WAidu9MhGOJ EfysctJL5y8/MgQYZvQXM3SAapLR5trTxR0hpQPqYU98XucJbJifTU/O6dxUt4qn 9s5IJUGjGP7HzPsaUZ1kQZ8iliR1x2yVWNy9cIKXrvoJRkGH/SKXT65M5qnDdZfS KnRktz5svmgh81ZslXCUhz7G6uAyt+oNfjuqT68LsvF/6MY6nrWS9RCI+70uGilP ITloEPXvl8G/L3yIk5n9pCrarLvOIdMLUAViJuZiSF84fEL9sotIvEs0/vxiAikb TUNN8OcClKHwjqPhvACCTXiVdlpM+JH910stF4nm1zp3wYNDWtVeVlvbLyNgEPi/ yOXqFFpFSWEPGFQ8Rz0AFaucoB/eBkr+OkhdsIxIcyPfmCcVw6M= =vZKG -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Wed Sep 4 15:42:05 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 04 Sep 2019 14:42:05 +0000 Subject: Bug#939399: marked as done (libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies.) References: <682488ac-d594-aea3-e5bd-a10d27a01b64@debian.org> Message-ID: Your message dated Wed, 04 Sep 2019 14:39:07 +0000 with message-id and subject line Bug#939399: fixed in libgeotiff 1.5.1-2 has caused the Debian Bug report #939399, regarding libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies. to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 939399: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939399 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Peter Michael Green Subject: libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies. Date: Wed, 4 Sep 2019 14:38:51 +0100 Size: 3617 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#939399: fixed in libgeotiff 1.5.1-2 Date: Wed, 04 Sep 2019 14:39:07 +0000 Size: 5131 URL: From owner at bugs.debian.org Wed Sep 4 16:54:46 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 04 Sep 2019 15:54:46 +0000 Subject: Processed: closing 936766, closing 936826, closing 936965, closing 937046, closing 937220, closing 937248 ... References: Message-ID: Processing commands for control at bugs.debian.org: > close 936766 0.2.0-2 Bug #936766 [src:jinja2-time] jinja2-time: Python2 removal in sid/bullseye Marked as fixed in versions jinja2-time/0.2.0-2. Bug #936766 [src:jinja2-time] jinja2-time: Python2 removal in sid/bullseye Marked Bug as done > close 936826 1.3.1-2 Bug #936826 [src:lazy-object-proxy] lazy-object-proxy: Python2 removal in sid/bullseye Marked as fixed in versions lazy-object-proxy/1.3.1-2. Bug #936826 [src:lazy-object-proxy] lazy-object-proxy: Python2 removal in sid/bullseye Marked Bug as done > close 936965 1.4.3-2 Bug #936965 [src:logilab-common] logilab-common: Python2 removal in sid/bullseye Marked as fixed in versions logilab-common/1.4.3-2. Bug #936965 [src:logilab-common] logilab-common: Python2 removal in sid/bullseye Marked Bug as done > close 937046 0.12.0-1 Bug #937046 [src:migrate] migrate: Python2 removal in sid/bullseye Marked as fixed in versions migrate/0.12.0-1. Bug #937046 [src:migrate] migrate: Python2 removal in sid/bullseye Marked Bug as done > close 937220 1.1.0-6 Bug #937220 [src:osm-gps-map] osm-gps-map: Python2 removal in sid/bullseye Marked as fixed in versions osm-gps-map/1.1.0-6. Bug #937220 [src:osm-gps-map] osm-gps-map: Python2 removal in sid/bullseye Marked Bug as done > close 937248 12.0.1-1 Bug #937248 [src:path.py] path.py: Python2 removal in sid/bullseye Marked as fixed in versions path.py/12.0.1-1. Bug #937248 [src:path.py] path.py: Python2 removal in sid/bullseye Marked Bug as done > close 937309 1.5.0-1 Bug #937309 [src:portalocker] portalocker: Python2 removal in sid/bullseye Marked as fixed in versions portalocker/1.5.0-1. Bug #937309 [src:portalocker] portalocker: Python2 removal in sid/bullseye Marked Bug as done > close 937315 0.7.2-1.1 Bug #937315 [src:powerline-taskwarrior] powerline-taskwarrior: Python2 removal in sid/bullseye Marked as fixed in versions powerline-taskwarrior/0.7.2-1.1. Bug #937315 [src:powerline-taskwarrior] powerline-taskwarrior: Python2 removal in sid/bullseye Marked Bug as done > close 937573 2.4.2-1 Bug #937573 [src:python-amqp] python-amqp: Python2 removal in sid/bullseye Marked as fixed in versions python-amqp/2.4.2-1. Bug #937573 [src:python-amqp] python-amqp: Python2 removal in sid/bullseye Marked Bug as done > close 937629 0.6.1+repack1-1 Bug #937629 [src:python-caldav] python-caldav: Python2 removal in sid/bullseye Marked as fixed in versions python-caldav/0.6.1+repack1-1. Bug #937629 [src:python-caldav] python-caldav: Python2 removal in sid/bullseye Marked Bug as done > close 937649 2.14.1-1 Bug #937649 [src:python-cliff] python-cliff: Python2 removal in sid/bullseye Marked as fixed in versions python-cliff/2.14.1-1. Bug #937649 [src:python-cliff] python-cliff: Python2 removal in sid/bullseye Marked Bug as done > close 937670 1.9.3-3 Bug #937670 [src:python-crontab] python-crontab: Python2 removal in sid/bullseye Marked as fixed in versions python-crontab/1.9.3-3. 
Bug #937670 [src:python-crontab] python-crontab: Python2 removal in sid/bullseye Marked Bug as done > close 937693 1.21.0-1 Bug #937693 [src:python-debtcollector] python-debtcollector: Python2 removal in sid/bullseye Marked as fixed in versions python-debtcollector/1.21.0-1. Bug #937693 [src:python-debtcollector] python-debtcollector: Python2 removal in sid/bullseye Marked Bug as done > close 937699 2.0.6-2 Bug #937699 [src:python-deprecation] python-deprecation: Python2 removal in sid/bullseye Marked as fixed in versions python-deprecation/2.0.6-2. Bug #937699 [src:python-deprecation] python-deprecation: Python2 removal in sid/bullseye Marked Bug as done > close 937716 0.7.1-1 Bug #937716 [src:python-dogpile.cache] python-dogpile.cache: Python2 removal in sid/bullseye Marked as fixed in versions python-dogpile.cache/0.7.1-1. Bug #937716 [src:python-dogpile.cache] python-dogpile.cache: Python2 removal in sid/bullseye Marked Bug as done > close 937722 0.13.2-3 Bug #937722 [src:python-ecdsa] python-ecdsa: Python2 removal in sid/bullseye Marked as fixed in versions python-ecdsa/0.13.2-3. Bug #937722 [src:python-ecdsa] python-ecdsa: Python2 removal in sid/bullseye Marked Bug as done > close 937775 1.40.0-5 Bug #937775 [src:python-gabbi] python-gabbi: Python2 removal in sid/bullseye Marked as fixed in versions python-gabbi/1.40.0-5. Bug #937775 [src:python-gabbi] python-gabbi: Python2 removal in sid/bullseye Marked Bug as done > close 937798 1.7.11-1 Bug #937798 [src:python-googleapi] python-googleapi: Python2 removal in sid/bullseye Marked as fixed in versions python-googleapi/1.7.11-1. Bug #937798 [src:python-googleapi] python-googleapi: Python2 removal in sid/bullseye Marked Bug as done > close 937805 1.1.0-3 Bug #937805 [src:python-hacking] python-hacking: Python2 removal in sid/bullseye Marked as fixed in versions python-hacking/1.1.0-3. Bug #937805 [src:python-hacking] python-hacking: Python2 removal in sid/bullseye Marked Bug as done > thanks Stopping processing here. Please contact me if you need assistance. 
-- 936766: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=936766 936826: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=936826 936965: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=936965 937046: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937046 937220: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937220 937248: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937248 937309: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937309 937315: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937315 937573: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937573 937629: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937629 937649: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937649 937670: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937670 937693: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937693 937699: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937699 937716: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937716 937722: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937722 937775: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937775 937798: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937798 937805: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937805 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Wed Sep 4 18:40:51 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 17:40:51 +0000 Subject: [Git][debian-gis-team/python-pdal][master] Set distribution to unstable. Message-ID: <5d6ff7232e0db_577b3f91b50c79bc753525@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pdal Commits: ab8e6273 by Bas Couwenberg at 2019-09-04T17:15:09Z Set distribution to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,8 +1,9 @@ -python-pdal (2.1.8+ds-3) UNRELEASED; urgency=medium +python-pdal (2.1.8+ds-3) unstable; urgency=medium * Add filenamemangle to distinguish it from pdal releases. + * No change rebuild with PDAL 2.0.1. - -- Bas Couwenberg Thu, 22 Aug 2019 21:19:18 +0200 + -- Bas Couwenberg Wed, 04 Sep 2019 19:13:49 +0200 python-pdal (2.1.8+ds-2) unstable; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/ab8e6273e083cb527782fda0eecc2dae1ea37df4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/ab8e6273e083cb527782fda0eecc2dae1ea37df4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 18:41:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 17:41:15 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag debian/2.1.8+ds-3 Message-ID: <5d6ff73b5519f_577b2ade613071bc753717@godard.mail> Bas Couwenberg pushed new tag debian/2.1.8+ds-3 at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/debian/2.1.8+ds-3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 18:54:07 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 17:54:07 +0000 Subject: Processing of python-pdal_2.1.8+ds-3_source.changes Message-ID: python-pdal_2.1.8+ds-3_source.changes uploaded successfully to localhost along with the files: python-pdal_2.1.8+ds-3.dsc python-pdal_2.1.8+ds-3.debian.tar.xz python-pdal_2.1.8+ds-3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 19:06:06 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 18:06:06 +0000 Subject: python-pdal_2.1.8+ds-3_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 19:13:49 +0200 Source: python-pdal Architecture: source Version: 2.1.8+ds-3 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pdal (2.1.8+ds-3) unstable; urgency=medium . * Add filenamemangle to distinguish it from pdal releases. * No change rebuild with PDAL 2.0.1. Checksums-Sha1: 12ddd9391a8507a1e572d0fd1c25b7241263e59b 2103 python-pdal_2.1.8+ds-3.dsc 7d27de86656252bdd8f78ad8caba991aefce291d 4556 python-pdal_2.1.8+ds-3.debian.tar.xz b848a254968f3cb50ad4da52e5bcc2f6a14793fc 13695 python-pdal_2.1.8+ds-3_amd64.buildinfo Checksums-Sha256: eb7ed19cc1d5004e990398043aafac94161a02f8b5ba60cb69915ebd944f4680 2103 python-pdal_2.1.8+ds-3.dsc b95cb9bb8969758c1a469e0226c9aef47b768de295054b7f7480003ecb51b936 4556 python-pdal_2.1.8+ds-3.debian.tar.xz 01dac76f46e300972dfcbe491a6a172a85436f49fe7ac9f1df46d923ad683c98 13695 python-pdal_2.1.8+ds-3_amd64.buildinfo Files: 71c4b24e67fff27de1df361391a72fb8 2103 science optional python-pdal_2.1.8+ds-3.dsc 38a05e8d44a721654c84ac36d601f6dc 4556 science optional python-pdal_2.1.8+ds-3.debian.tar.xz 867fcd99a89025311fb4862392af5b3e 13695 science optional python-pdal_2.1.8+ds-3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1v9w8ACgkQZ1DxCuiN SvEQIRAAj9Du47yoY54bTM7aiDuGyUk59uhlSec8BJ29wTrvygwXVZE8usRgTjpc pr2xByqrVFaPDrS8xVN2M2DysZZCccjiQJISElA0znIL7V2RQytD994qB/1LTWTx TPPl05OMtTUT1OP6y45/vIo0sH1BBwWs75B1gqx0+IEi7V9Hkpp/klRrilT9ufxb grfswGStYVvI5Mp7mTlCwsQiWDigbTDaILkIQ+ldku6YIA8cQEYthsPCFgrjmtQi S/K1Z0vlUPqRG3k/VVkvFYJuTa2Z9bUs3iF4zKg11A406gZciVvNM0ydIhK4TNj8 YbSn1Rn94jLyRxwJ5mRiGF2HVw2UgjSf8scTcqGz94GK82ydqfOjSPtyFv/r95SD iXmxXe7RGplJFMaWHgguYk5HUeRLQ4dWbKvLuRLEaPthenFn+fvM92O7nPYCkDsv UM8eLNsDQhYl5lNXl2N7DURqpiRyKLLx92fi7jVEr/jfi6aNxRg8epEuD+ngoSie FDB6enYtDUdmkXSmfoxhqaOWXYOLoxwGZoD3a7aYZkk2+D5Lryh78dym0Ju9fRDj szyZr9pr/QlMyJkfw377e+nSZGl3xwUuW4elvq1z3mGLsTgaqbO16ofytxbC7Db9 ajVyzibCTP7FsvRqi7DkuiMGIYyBdzLtMYkWzKtsknFtIqrDX04= =tkd0 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Wed Sep 4 19:16:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 18:16:30 +0000 Subject: [Git][debian-gis-team/grass][master] 2 commits: Drop unused override for spelling-error-in-binary. Message-ID: <5d6fff7e2707e_577b3f91b50d432475626e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / grass Commits: aed1bbb2 by Bas Couwenberg at 2019-09-04T17:45:29Z Drop unused override for spelling-error-in-binary. - - - - - 28498784 by Bas Couwenberg at 2019-09-04T17:45:40Z Set distribution to unstable. 
- - - - - 2 changed files: - debian/changelog - debian/grass-core.lintian-overrides Changes: ===================================== debian/changelog ===================================== @@ -1,8 +1,10 @@ -grass (7.6.1-4) UNRELEASED; urgency=medium +grass (7.6.1-4) unstable; urgency=medium * Update PIE hardening conditional, trusty is EOL. + * Drop unused override for spelling-error-in-binary. + * No change rebuild with PDAL 2.0.1. - -- Bas Couwenberg Tue, 16 Jul 2019 13:49:04 +0200 + -- Bas Couwenberg Wed, 04 Sep 2019 19:16:06 +0200 grass (7.6.1-3) unstable; urgency=medium ===================================== debian/grass-core.lintian-overrides ===================================== @@ -51,9 +51,6 @@ dependency-on-python-version-marked-for-end-of-life (Depends: python) # Not a problem package-contains-documentation-outside-usr-share-doc usr/lib/grass*/etc/license -# False positive, part of NormalX, NormalY, NormalZ -spelling-error-in-binary usr/lib/grass*/bin/v.in.pdal NormalY Normally - # https://trac.osgeo.org/grass/ticket/3786 file-references-package-build-path View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/compare/df99cd2ae70bbce13168193b0b11af2c73d2c196...28498784fc4acb75e241ff5e2bc2d4260afd18d2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/compare/df99cd2ae70bbce13168193b0b11af2c73d2c196...28498784fc4acb75e241ff5e2bc2d4260afd18d2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 19:16:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 18:16:34 +0000 Subject: [Git][debian-gis-team/grass] Pushed new tag debian/7.6.1-4 Message-ID: <5d6fff829d90e_577b3f91b50c79bc7564c0@godard.mail> Bas Couwenberg pushed new tag debian/7.6.1-4 at Debian GIS Project / grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/tree/debian/7.6.1-4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 19:24:14 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 18:24:14 +0000 Subject: Processing of grass_7.6.1-4_source.changes Message-ID: grass_7.6.1-4_source.changes uploaded successfully to localhost along with the files: grass_7.6.1-4.dsc grass_7.6.1-4.debian.tar.xz grass_7.6.1-4_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 19:34:47 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 18:34:47 +0000 Subject: grass_7.6.1-4_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 19:16:06 +0200 Source: grass Architecture: source Version: 7.6.1-4 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: grass (7.6.1-4) unstable; urgency=medium . * Update PIE hardening conditional, trusty is EOL. * Drop unused override for spelling-error-in-binary. * No change rebuild with PDAL 2.0.1. 
Checksums-Sha1: 82331ddc52583a4ab138b3b7bfa8c01557627b5b 2789 grass_7.6.1-4.dsc a68cbdaa9382d7ac37a3fb10de40cccb3f94f521 35304 grass_7.6.1-4.debian.tar.xz 1b4eb6ab43e4982086c4299c3fc525df86932104 22168 grass_7.6.1-4_amd64.buildinfo Checksums-Sha256: 62740d9942a484bcb6b4992d55c24928babf4c6732cfc5750e87b69ea9424606 2789 grass_7.6.1-4.dsc 29c25b1f3b1102af76a3cd3f827983d38f7199c5e2040a58db39c24767936ce0 35304 grass_7.6.1-4.debian.tar.xz fe5259876bf6c7a406ac6cadb632dd677d101af1f7dedd88dd25855606764614 22168 grass_7.6.1-4_amd64.buildinfo Files: 9564ccbb4345583d14b5b97e44be036b 2789 science optional grass_7.6.1-4.dsc 778144dc4bb506ccb342b015c7a41178 35304 science optional grass_7.6.1-4.debian.tar.xz b1b39745be148f16f0db82ad430ee95e 22168 science optional grass_7.6.1-4_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1v/xIACgkQZ1DxCuiN SvEb1xAAlQD0bpmnQHYYuZUXWiBh2ckxjboy9Po4zm9tI4ZjYvbyUxMcYmbHWEo8 QKWwIUbXDEcM2lrfva+Hi17p3EAgDUllZSErTOpylEjjcsmMDJYmvlK0UJ8crkLX zd7pUju8ngBZXUCwFJqPLAHtOX6/qKefc3qpKgZFojpxK9157QOyn2sfTcYMzuo3 Crueg5KVk9cDM6PnR2nOHagzpzK4c5bvtg40hzlRlQDzFc19z/voB9q1Ina7SpZL OdOU1ePqb2QfswjpTlq1dvKVoHz7Y4z6O63SvEOAVoLJy7nvZMrhyDpNzVA45ahT tBQ391QKhC59y6GiemLi33s2IltubbrgnSflS6BE+IsM8vO90womf6HO7F8VmzaE Lp24LpQgsPh2omq6eX0HSvFPM2cMg5aaHDZpx8bS0b3hX0boWs1v78e5CL0l2HD3 qZVrFu4zrocpr0BXywqou1XRNsKveg2jDujoGVUd0UyEztpDHs92NUvDYG/P11j6 gQrTckvE9GkKGAsPFMk+6ST3C2/YYLXRlRwTluBxL4GpXqBQ7I2g824rUtDR+1w3 9+l8IG97yzqCQ87p6efcbYdoFXZoi57xljKyRbnxXYENwpZloz/EdnSgtM9f70kO rIphfIrN1rVRVIEOaz0LhmHko/TBHKmbXlmoF3g09qltKAGVgcE= =PQq8 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Wed Sep 4 19:53:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 18:53:07 +0000 Subject: [Git][debian-gis-team/grass][experimental] Drop unused override for spelling-error-in-binary. Message-ID: <5d7008133b0cb_577b2ade610463887582b2@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / grass Commits: c3cfceda by Bas Couwenberg at 2019-09-04T18:52:57Z Drop unused override for spelling-error-in-binary. - - - - - 2 changed files: - debian/changelog - debian/grass-core.lintian-overrides Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ grass (7.8.0~rc1-1~exp2) UNRELEASED; urgency=medium * Recommend both git & subversion for g.extension. + * Drop unused override for spelling-error-in-binary. -- Bas Couwenberg Sat, 31 Aug 2019 10:23:18 +0200 ===================================== debian/grass-core.lintian-overrides ===================================== @@ -48,9 +48,6 @@ extra-license-file usr/share/doc/grass-core/COPYING # Not a problem package-contains-documentation-outside-usr-share-doc usr/lib/grass*/etc/license -# False positive, part of NormalX, NormalY, NormalZ -spelling-error-in-binary usr/lib/grass*/bin/v.in.pdal NormalY Normally - # https://trac.osgeo.org/grass/ticket/3786 file-references-package-build-path View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/c3cfceda654e082602e0d9fdcfe10de66edcdcc1 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/c3cfceda654e082602e0d9fdcfe10de66edcdcc1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Wed Sep 4 21:06:08 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Wed, 04 Sep 2019 20:06:08 +0000 Subject: [Git][debian-gis-team/python-hdf4][master] 2 commits: Remove obsolete fields Name, Contact from debian/upstream/metadata. Message-ID: <5d7019301ce1b_577b3f91ce1ecce87939fa@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / python-hdf4 Commits: ba6fe5de by Antonio Valentino at 2019-09-04T20:00:39Z Remove obsolete fields Name, Contact from debian/upstream/metadata. - - - - - 32e7d80d by Antonio Valentino at 2019-09-04T20:04:25Z Set distribution to unstable - - - - - 2 changed files: - debian/changelog - debian/upstream/metadata Changes: ===================================== debian/changelog ===================================== @@ -1,11 +1,12 @@ -python-hdf4 (0.10.1-2) UNRELEASED; urgency=medium +python-hdf4 (0.10.1-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. * Use debhelper-compat instead of debian/compat. * Set compat to 12. * Set upstream metadata fields: Contact. + * Remove obsolete fields Name, Contact from debian/upstream/metadata. - -- Antonio Valentino Wed, 10 Jul 2019 19:07:15 +0200 + -- Antonio Valentino Wed, 04 Sep 2019 20:04:00 +0000 python-hdf4 (0.10.1-1) unstable; urgency=medium ===================================== debian/upstream/metadata ===================================== @@ -1,6 +1,4 @@ Bug-Database: https://github.com/fhs/python-hdf4/issues Bug-Submit: https://github.com/fhs/python-hdf4/issues/new -Contact: Andre Gosselin et al. -Name: python-hdf4 Repository: https://github.com/fhs/python-hdf4.git Repository-Browse: https://github.com/fhs/python-hdf4 View it on GitLab: https://salsa.debian.org/debian-gis-team/python-hdf4/compare/c7e04202c928d9106e31263d0b4bb8bb7b19d848...32e7d80d3bbbc181b2f728d998ecce6445549795 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-hdf4/compare/c7e04202c928d9106e31263d0b4bb8bb7b19d848...32e7d80d3bbbc181b2f728d998ecce6445549795 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 4 21:10:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 04 Sep 2019 20:10:59 +0000 Subject: [Git][debian-gis-team/python-hdf4] Pushed new tag debian/0.10.1-2 Message-ID: <5d701a53a39c3_577b3f91ce1ecce879564e@godard.mail> Bas Couwenberg pushed new tag debian/0.10.1-2 at Debian GIS Project / python-hdf4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-hdf4/tree/debian/0.10.1-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Wed Sep 4 21:20:19 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 20:20:19 +0000 Subject: Processing of python-hdf4_0.10.1-2_source.changes Message-ID: python-hdf4_0.10.1-2_source.changes uploaded successfully to localhost along with the files: python-hdf4_0.10.1-2.dsc python-hdf4_0.10.1-2.debian.tar.xz python-hdf4_0.10.1-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 4 21:43:36 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 04 Sep 2019 20:43:36 +0000 Subject: python-hdf4_0.10.1-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 20:04:00 +0000 Source: python-hdf4 Architecture: source Version: 0.10.1-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: python-hdf4 (0.10.1-2) unstable; urgency=medium . * Bump Standards-Version to 4.4.0, no changes. * Use debhelper-compat instead of debian/compat. * Set compat to 12. * Set upstream metadata fields: Contact. * Remove obsolete fields Name, Contact from debian/upstream/metadata. Checksums-Sha1: 448e24d1a7b3f3cecbd250fed1b3b66ccb48fb53 2145 python-hdf4_0.10.1-2.dsc 17325662ee0415d8bd366657a0fed9c36adcf4ef 3176 python-hdf4_0.10.1-2.debian.tar.xz cf3b4ec663d831ea8766513b70c8616c991dd19c 8664 python-hdf4_0.10.1-2_amd64.buildinfo Checksums-Sha256: 3f466af85ae681a4e958403058d6e694f2c1c14712dd52a2c5928791874fe74b 2145 python-hdf4_0.10.1-2.dsc ea8932ca1c07151ae05b5e02cbe37c0a15209d4457a18cb486cb793ee4d1ddbe 3176 python-hdf4_0.10.1-2.debian.tar.xz a14f495d6e0f88704217295b0b26a8b00f9b9e26a1df891ebf9950fbc626fe47 8664 python-hdf4_0.10.1-2_amd64.buildinfo Files: 117a7b1f68c57d7cf2ea9cec11fd3380 2145 python optional python-hdf4_0.10.1-2.dsc e3c2bafb6594a00564d2d8ee0d7a979f 3176 python optional python-hdf4_0.10.1-2.debian.tar.xz 4d8295b8130f1c7db095f299ee8d6ef7 8664 python optional python-hdf4_0.10.1-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1wGjwACgkQZ1DxCuiN SvHmTBAAz4TCLkklb6X5D3G7U+qYtaAPsGkSO2GG1ctyeq0wxq+262VK7wQi0BDb 6qRQhEWZBXeAFGZkmAOIc7Xh7YldxiNkh4L/c3moe+jdaGxvqvYA8zMlYoKqe4tb dXMSJW6oYHpj5A9XNYfTMehpST6e4j/V3CtpEfigGteZSc+ai4VFZXZ4gewJVs39 cs8TTgrCfkPq2+JpSYg1ezujDQiOMsf9k6z8SIvRaMOQoq+3sVjLEuUiJivwI/jS KvY4UKE6EOI0Y+Oq5115eT2CBqianDJiQVdHdvvPLsISirSmbpC16QB7m649TO5W oIxz26SM34BxcmorHrClVG2SN01nGT0P18spGLI07DDcMO/emxQ6BUtNwL/ixAjh LVCLX14jDeyDa4+6Knx0Uq5GUY/YCo+83b/ddxr+3U51liovC3JvB1kZ5tC3bU+Y v0eNt+S5gWlhw1lvZMgcZPYNlHEK3DBnPrUz1gqzFw3ekh2IY+EeXYHgGvEUm49L KR/ebs2nDY6B3x8POmlr/KzVsjBFGX8D6Uw1YUwUufBqz5c3EE05sxOsUALOBXL6 Ne6khi084waW0HQEYsQ7C0lsPkotxMf0Xfe/mmU39SUL42G0JA3w/A3PBNuUJSXF 6nfM0zWwqjHZpLSL6FY0wNUfzg1PM1upL9UsrMavdNpNmFuOekw= =sG0i -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From jdg at debian.org Wed Sep 4 22:38:42 2019 From: jdg at debian.org (Julian Gilbey) Date: Wed, 04 Sep 2019 22:38:42 +0100 Subject: Your Debian package(s) will not migrate to testing Message-ID: Dear Debian GIS Project, This is a courtesy message about your package(s) which is/are stuck in sid, and will not migrate to testing, in case you are unaware; they are listed below. 
The release managers announced on 7th July 2019 in their email to debian-devel-announce and debian-release (Message-ID: <20190707014700.GF15255 at powdarrmonkey.net>) that only source-only uploads would transition to testing. Here is the relevant part of what they wrote: ~~~~~ No binary maintainer uploads for bullseye ========================================= The release of buster also means the bullseye release cycle is about to begin. From now on, we will no longer allow binaries uploaded by maintainers to migrate to testing. This means that you will need to do source-only uploads if you want them to reach bullseye. Q: I already did a binary upload, do I need to do a new (source-only) upload? A: Yes (preferably with other changes, not just a version bump). Q: I needed to do a binary upload because my upload went to the NEW queue, do I need to do a new (source-only) upload for it to reach bullseye? A: Yes. We also suggest going through NEW in experimental instead of unstable where possible, to avoid disruption in unstable. Q: Does this also apply to contrib and non-free? A: No. Not all packages in contrib and non-free can be built on the buildds, so maintainer uploads will still be allowed to migrate for packages outside main. ~~~~~ To perform a source-only build and upload, run dpkg-buildpackage -S and then upload the relevant files in the normal way (for example, using dupload or dput). Your package(s) involved is/are as follows: Source package: openstreetmap-carto Version: 2.45.1-1 One relevant line of the excuse file (including the uploader): Not built on buildd: arch all binaries uploaded by sebastic, a new source-only upload is needed to allow migration Transition verdict: REJECTED_PERMANENTLY I hope this is of help to you! Best wishes, Julian From gitlab at salsa.debian.org Thu Sep 5 05:17:39 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:17:39 +0000 Subject: [Git][debian-gis-team/netcdf-fortran][pristine-tar] pristine-tar data for netcdf-fortran_4.5.1+ds.orig.tar.xz Message-ID: <5d708c63dd278_577b3f91d8580544838390@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / netcdf-fortran Commits: 3915c2f3 by Bas Couwenberg at 2019-09-05T04:02:15Z pristine-tar data for netcdf-fortran_4.5.1+ds.orig.tar.xz - - - - - 2 changed files: - + netcdf-fortran_4.5.1+ds.orig.tar.xz.delta - + netcdf-fortran_4.5.1+ds.orig.tar.xz.id Changes: ===================================== netcdf-fortran_4.5.1+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/netcdf-fortran_4.5.1+ds.orig.tar.xz.delta differ ===================================== netcdf-fortran_4.5.1+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +c6d79debfcc1980ce1f5834ebff530b9940d8b5f View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/3915c2f3204b2bde8aa8a206c86bdf11c8515ccf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/3915c2f3204b2bde8aa8a206c86bdf11c8515ccf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From gitlab at salsa.debian.org Thu Sep 5 05:18:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:18:05 +0000 Subject: [Git][debian-gis-team/netcdf-fortran] Pushed new tag debian/4.5.1+ds-1_exp1 Message-ID: <5d708c7d4ced8_577b2ade5f375484838618@godard.mail> Bas Couwenberg pushed new tag debian/4.5.1+ds-1_exp1 at Debian GIS Project / netcdf-fortran -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/tree/debian/4.5.1+ds-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:18:13 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:18:13 +0000 Subject: [Git][debian-gis-team/netcdf-fortran] Pushed new tag upstream/4.5.1+ds Message-ID: <5d708c858335e_577b2ade611bfd408388ef@godard.mail> Bas Couwenberg pushed new tag upstream/4.5.1+ds at Debian GIS Project / netcdf-fortran -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/tree/upstream/4.5.1+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:18:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:18:17 +0000 Subject: [Git][debian-gis-team/netcdf-fortran][experimental] 4 commits: New upstream version 4.5.1+ds Message-ID: <5d708c89886cc_577b2ade61196f448389d7@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / netcdf-fortran Commits: c78bcb88 by Bas Couwenberg at 2019-09-05T04:02:11Z New upstream version 4.5.1+ds - - - - - 4a937c25 by Bas Couwenberg at 2019-09-05T04:02:15Z Update upstream source from tag 'upstream/4.5.1+ds' Update to upstream version '4.5.1+ds' with Debian dir ce29883e2b42e74c67e5b5b58b3d33c63f24288d - - - - - 94c1d943 by Bas Couwenberg at 2019-09-05T04:02:27Z New upstream release. - - - - - ded57453 by Bas Couwenberg at 2019-09-05T04:04:00Z Set distribution to experimental. 
- - - - - 21 changed files: - CMakeExtras/Makefile.am - CMakeExtras/Makefile.in - CMakeLists.txt - Makefile.in - RELEASE_NOTES.md - aclocal.m4 - configure - configure.ac - debian/changelog - docs/CMakeLists.txt - docs/Makefile.in - examples/F77/Makefile.in - examples/F90/Makefile.in - examples/Makefile.in - fortran/Makefile.in - libsrc/Makefile.in - nf-config.cmake.in - nf03_test/Makefile.in - nf03_test4/Makefile.in - nf_test/Makefile.in - nf_test4/Makefile.in Changes: ===================================== CMakeExtras/Makefile.am ===================================== @@ -8,4 +8,4 @@ # Russ Rew, Ward Fisher EXTRA_DIST = sizeof_ptrdiff_t.c test_c_ptrdiff_t.f90 \ MatchNetCDFFortranTypes.cmake check_relax_coord_bound.c \ -check_pnetcdf.c check_parallel4.c +check_pnetcdf.c check_parallel4.c check_cdf5.c ===================================== CMakeExtras/Makefile.in ===================================== @@ -191,6 +191,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -284,12 +285,12 @@ valgrind_tools = @valgrind_tools@ # Russ Rew, Ward Fisher EXTRA_DIST = sizeof_ptrdiff_t.c test_c_ptrdiff_t.f90 \ MatchNetCDFFortranTypes.cmake check_relax_coord_bound.c \ -check_pnetcdf.c check_parallel4.c +check_pnetcdf.c check_parallel4.c check_cdf5.c all: all-am .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -313,9 +314,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== CMakeLists.txt ===================================== @@ -17,7 +17,7 @@ set(PACKAGE "${NC4F_CTEST_PROJECT_NAME}" CACHE STRING "") #Project Version SET(NC4F_VERSION_MAJOR 4) SET(NC4F_VERSION_MINOR 5) -SET(NC4F_VERSION_PATCH 0) +SET(NC4F_VERSION_PATCH 1) SET(NC4F_VERSION_NOTE "") SET(NC4F_VERSION ${NC4F_VERSION_MAJOR}.${NC4F_VERSION_MINOR}.${NC4F_VERSION_PATCH}${NC4F_VERSION_NOTE}) SET(VERSION ${NC4F_VERSION}) @@ -75,6 +75,16 @@ ENDIF() FIND_PACKAGE(Doxygen) FIND_PROGRAM(NC_DOT NAMES dot) +# A basic script used to convert m4 files +FIND_PROGRAM(NF_M4 NAMES m4 m4.exe) +IF(NF_M4) + MESSAGE(STATUS "Found m4: ${NF_M4}") + SET(HAVE_M4 TRUE) +ELSE() + MESSAGE(STATUS "m4 not found.") +ENDIF() + + # For CMAKE_INSTALL_LIBDIR INCLUDE(GNUInstallDirs) @@ -535,25 +545,32 @@ CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nccreate "" USE_NETCDF_V2) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_set_log_level "" USE_LOGGING) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} oc_open "" BUILD_DAP) +SET(HAS_NC4 FALSE) IF(USE_NETCDF4) SET(NC_BUILD_V4 TRUE) SET(HAVE_NC_DEF_OPAQUE TRUE) + SET(HAS_NC4 TRUE) ENDIF(USE_NETCDF4) +SET(HAS_NC2 FALSE) IF(USE_NETCDF_V2) SET(HAVE_NCCREATE TRUE) + SET(USE_NC2 TRUE) + SET(HAS_NC2 TRUE) ENDIF(USE_NETCDF_V2) -SET(STATUS_LOGGING OFF) +SET(STATUS_LOGGING FALSE) IF(USE_LOGGING) 
SET(HAVE_NC_SET_LOG_LEVEL TRUE) SET(STATUS_LOGGING ON) ENDIF(USE_LOGGING) SET(ENABLE_DAP OFF) +SET(HAS_DAP FALSE) IF(BUILD_DAP) SET(HAVE_OC_OPEN TRUE) SET(ENABLE_DAP ON) + SET(HAS_DAP TRUE) ENDIF(BUILD_DAP) ## @@ -682,6 +699,18 @@ SET(BUILD_F90 "ON") SET(BUILD_V2 "ON") SET(BUILD_F03 "OFF") +IF(BUILD_F90) + SET(HAS_F90 TRUE) +ELSE() + SET(HAS_F90 FALSE) +ENDIF(BUILD_F90) + +IF(BUILD_F03) + SET(HAS_F03 TRUE) +ELSE() + SET(HAS_F03 FALSE) +ENDIF(BUILD_F03) + IF(ENABLE_FORTRAN_TYPE_CHECKS) # Do tests to determine which Fortran types correspond to NCBYTE, NCSHORT, ... # The configure file got these by testing an F77 program, invoking @@ -712,14 +741,8 @@ ENDIF(ENABLE_FORTRAN_TYPE_CHECKS) # configure_file("${NC4F_SOURCE_DIR}/config.h.cmake.in" # "${NC4F_BINARY_DIR}/config.h") -# # For now, just copy a stub file -# FILE(COPY "${NC4F_SOURCE_DIR}/nf-config.cmake.in" -# DESTINATION "${NC4F_BINARY_DIR}" -# FILE_PERMISSIONS OWNER_READ OWNER_WRITE OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE) -# FILE(RENAME "${NC4F_BINARY_DIR}/nf-config.cmake.in" "${NC4F_BINARY_DIR}/nf-config") - # make sure previous two files get cleaned up... -SET_DIRECTORY_PROPERTIES (DIRECTORY PROPERTY ADDITIONAL_MAKE_CLEAN_FILES ${netcdf-fortran_BINARY_DIR}/config.h ${netcdf-fortran_BINARY_DIR}/nf-config) +#SET_DIRECTORY_PROPERTIES (DIRECTORY PROPERTY ADDITIONAL_MAKE_CLEAN_FILES ${netcdf-fortran_BINARY_DIR}/config.h ${netcdf-fortran_BINARY_DIR}/nf-config) ## # Configuration for post-install RPath @@ -768,7 +791,13 @@ CONFIGURE_FILE("${CMAKE_SOURCE_DIR}/CTestConfig.cmake.in" "${CMAKE_SOURCE_DIR}/CTestConfig.cmake" @ONLY) - +# Generate nf-config form template. +CONFIGURE_FILE("${CMAKE_SOURCE_DIR}/nf-config.cmake.in" + "${CMAKE_BINARY_DIR}/tmp/nf-config" @ONLY + NEWLINE_STYLE LF) +FILE(COPY "${CMAKE_BINARY_DIR}/tmp/nf-config" + DESTINATION ${CMAKE_BINARY_DIR}/ + FILE_PERMISSIONS OWNER_READ OWNER_WRITE OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE) ### # Allow the user to specify libraries # to link against, similar to automakes 'LIBS' variable. 
@@ -841,6 +870,21 @@ IF(CMAKE_LINK_FLAGS) SET(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} ${CMAKE_LINK_FLAGS}") ENDIF() +# Set +SET(prefix ${CMAKE_INSTALL_PREFIX}) +SET(exec_prefix ${CMAKE_INSTALL_PREFIX}) +SET(libdir ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_LIBDIR}) +SET(includedir ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}) +SET(CC ${CMAKE_C_COMPILER}) +SET(FC ${CMAKE_Fortran_COMPILER}) + +configure_file( + ${CMAKE_SOURCE_DIR}/netcdf-fortran.pc.in + ${CMAKE_BINARY_DIR}/netcdf-fortran.pc @ONLY) + +INSTALL(FILES ${CMAKE_BINARY_DIR}/netcdf-fortran.pc + DESTINATION ${CMAKE_INSTALL_LIBDIR}/pkgconfig + COMPONENT utilities) INSTALL(PROGRAMS ${NC4F_BINARY_DIR}/nf-config DESTINATION "${CMAKE_INSTALL_BINDIR}" ===================================== Makefile.in ===================================== @@ -306,6 +306,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -427,7 +428,7 @@ all: all-recursive .SUFFIXES: am--refresh: Makefile @: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -453,9 +454,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) $(SHELL) ./config.status --recheck -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) $(am__cd) $(srcdir) && $(AUTOCONF) -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) $(am__cd) $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS) $(am__aclocal_m4_deps): nf-config: $(top_builddir)/config.status $(srcdir)/nf-config.in ===================================== RELEASE_NOTES.md ===================================== @@ -6,7 +6,18 @@ Release Notes {#nf_release_notes} This file contains a high-level description of this package's evolution. Entries are in reverse chronological order (most recent first). -## 4.5.1 - TBD +## 4.5.2 - TBD + +## 4.5.1 - September 4, 2019 + +### Requirements + +* netCDF-C: 4.6.0 or greater + +### Changes + +* Corrected an issue where a cmake-specific file wasn't being captured by `make dist`. +* Corrected an issue where nf-config wasn't being generated by cmake-based builds. Corrected a couple of other missing files. See [Github #108](https://github.com/Unidata/netcdf-fortran/issues/108) for more information. ## 4.5.0 - August 28, 2019 @@ -19,25 +30,11 @@ Entries are in reverse chronological order (most recent first). * Moved netCDF classic F90 API tests to new subdirectory nf03_test. * Moved netCDF-4 F77 API tests to new subdirectory nf_test4. * Moved netCDF-4 F90 API tests to new subdirectory nf03_test4. -* Fixed bug which caused parallel I/O tests to not be run. See - [#155](https://github.com/Unidata/netcdf-fortran/issues/155) and - [#157](https://github.com/Unidata/netcdf-fortran/issues/157). -* Fixed bug in the setting of file cache preemption for netCDF-4 - files. See - [#146](https://github.com/Unidata/netcdf-fortran/issues/146). -* Removed many near-duplicate tests files, now they are created at - build time with sed. See - [#165](https://github.com/Unidata/netcdf-fortran/issues/165). 
-* Removed no longer needed configure options --enable-dll (see - [#161](https://github.com/Unidata/netcdf-fortran/issues/161)), - --enable-extra-tests (see - [#114](https://github.com/Unidata/netcdf-fortran/issues/114)), - --enable-extra-example-tests(see - [#126](https://github.com/Unidata/netcdf-fortran/issues/126)), and - --enable-valgrind (see - [#118](https://github.com/Unidata/netcdf-fortran/issues/118)). -* Moved handling of F77 man page to the docs directory. See - [#141](https://github.com/Unidata/netcdf-fortran/issues/141) +* Fixed bug which caused parallel I/O tests to not be run. See [#155](https://github.com/Unidata/netcdf-fortran/issues/155) and [#157](https://github.com/Unidata/netcdf-fortran/issues/157). +* Fixed bug in the setting of file cache preemption for netCDF-4 files. See [#146](https://github.com/Unidata/netcdf-fortran/issues/146). +* Removed many near-duplicate tests files, now they are created at build time with sed. See [#165](https://github.com/Unidata/netcdf-fortran/issues/165). +* Removed no longer needed configure options --enable-dll (see [#161](https://github.com/Unidata/netcdf-fortran/issues/161)), `--enable-extra-tests` (see [#114](https://github.com/Unidata/netcdf-fortran/issues/114)), `--enable-extra-example-tests` (see [#126](https://github.com/Unidata/netcdf-fortran/issues/126)), and `--enable-valgrind` (see [#118](https://github.com/Unidata/netcdf-fortran/issues/118)). +* Moved handling of F77 man page to the docs directory. See [#141](https://github.com/Unidata/netcdf-fortran/issues/141). ## 4.4.5 - Release Jan 9, 2019 ===================================== aclocal.m4 ===================================== @@ -652,6 +652,42 @@ fi rmdir .tst 2>/dev/null AC_SUBST([am__leading_dot])]) +# Add --enable-maintainer-mode option to configure. -*- Autoconf -*- +# From Jim Meyering + +# Copyright (C) 1996-2018 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_MAINTAINER_MODE([DEFAULT-MODE]) +# ---------------------------------- +# Control maintainer-specific portions of Makefiles. +# Default is to disable them, unless 'enable' is passed literally. +# For symmetry, 'disable' may be passed as well. Anyway, the user +# can override the default with the --enable/--disable switch. +AC_DEFUN([AM_MAINTAINER_MODE], +[m4_case(m4_default([$1], [disable]), + [enable], [m4_define([am_maintainer_other], [disable])], + [disable], [m4_define([am_maintainer_other], [enable])], + [m4_define([am_maintainer_other], [enable]) + m4_warn([syntax], [unexpected argument to AM@&t at _MAINTAINER_MODE: $1])]) +AC_MSG_CHECKING([whether to enable maintainer-specific portions of Makefiles]) + dnl maintainer-mode's default is 'disable' unless 'enable' is passed + AC_ARG_ENABLE([maintainer-mode], + [AS_HELP_STRING([--]am_maintainer_other[-maintainer-mode], + am_maintainer_other[ make rules and dependencies not useful + (and sometimes confusing) to the casual installer])], + [USE_MAINTAINER_MODE=$enableval], + [USE_MAINTAINER_MODE=]m4_if(am_maintainer_other, [enable], [no], [yes])) + AC_MSG_RESULT([$USE_MAINTAINER_MODE]) + AM_CONDITIONAL([MAINTAINER_MODE], [test $USE_MAINTAINER_MODE = yes]) + MAINT=$MAINTAINER_MODE_TRUE + AC_SUBST([MAINT])dnl +] +) + # Check to see how 'make' treats includes. -*- Autoconf -*- # Copyright (C) 2001-2018 Free Software Foundation, Inc. 
===================================== configure ===================================== @@ -1,6 +1,6 @@ #! /bin/sh # Guess values for system-dependent variables and create Makefiles. -# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.0. +# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.1. # # Report bugs to . # @@ -590,8 +590,8 @@ MAKEFLAGS= # Identity of this package. PACKAGE_NAME='netCDF-Fortran' PACKAGE_TARNAME='netcdf-fortran' -PACKAGE_VERSION='4.5.0' -PACKAGE_STRING='netCDF-Fortran 4.5.0' +PACKAGE_VERSION='4.5.1' +PACKAGE_STRING='netCDF-Fortran 4.5.1' PACKAGE_BUGREPORT='support-netcdf at unidata.ucar.edu' PACKAGE_URL='' @@ -734,6 +734,9 @@ CPPFLAGS LDFLAGS CFLAGS CC +MAINT +MAINTAINER_MODE_FALSE +MAINTAINER_MODE_TRUE AM_BACKSLASH AM_DEFAULT_VERBOSITY AM_DEFAULT_V @@ -816,6 +819,7 @@ ac_subst_files='' ac_user_opts=' enable_option_checking enable_silent_rules +enable_maintainer_mode enable_dependency_tracking enable_valgrind enable_valgrind_memcheck @@ -1396,7 +1400,7 @@ if test "$ac_init_help" = "long"; then # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures netCDF-Fortran 4.5.0 to adapt to many kinds of systems. +\`configure' configures netCDF-Fortran 4.5.1 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... @@ -1467,7 +1471,7 @@ fi if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of netCDF-Fortran 4.5.0:";; + short | recursive ) echo "Configuration of netCDF-Fortran 4.5.1:";; esac cat <<\_ACEOF @@ -1477,6 +1481,9 @@ Optional Features: --enable-FEATURE[=ARG] include FEATURE [ARG=yes] --enable-silent-rules less verbose build output (undo: "make V=1") --disable-silent-rules verbose build output (undo: "make V=0") + --enable-maintainer-mode + enable make rules and dependencies not useful (and + sometimes confusing) to the casual installer --enable-dependency-tracking do not reject slow dependency extractors --disable-dependency-tracking @@ -1616,7 +1623,7 @@ fi test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -netCDF-Fortran configure 4.5.0 +netCDF-Fortran configure 4.5.1 generated by GNU Autoconf 2.69 Copyright (C) 2012 Free Software Foundation, Inc. @@ -2421,7 +2428,7 @@ cat >config.log <<_ACEOF This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by netCDF-Fortran $as_me 4.5.0, which was +It was created by netCDF-Fortran $as_me 4.5.1, which was generated by GNU Autoconf 2.69. Invocation command line was $ $0 $@ @@ -2772,11 +2779,11 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu # Create the VERSION file, which contains the package version from # AC_INIT. -echo -n 4.5.0>VERSION +echo -n 4.5.1>VERSION -{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.0" >&5 -$as_echo "$as_me: netCDF-Fortran 4.5.0" >&6;} +{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.1" >&5 +$as_echo "$as_me: netCDF-Fortran 4.5.1" >&6;} # Keep libtool macros in an m4 directory. @@ -3411,7 +3418,7 @@ fi # Define the identity of the package. PACKAGE='netcdf-fortran' - VERSION='4.5.0' + VERSION='4.5.1' cat >>confdefs.h <<_ACEOF @@ -3505,6 +3512,29 @@ END fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether to enable maintainer-specific portions of Makefiles" >&5 +$as_echo_n "checking whether to enable maintainer-specific portions of Makefiles... 
" >&6; } + # Check whether --enable-maintainer-mode was given. +if test "${enable_maintainer_mode+set}" = set; then : + enableval=$enable_maintainer_mode; USE_MAINTAINER_MODE=$enableval +else + USE_MAINTAINER_MODE=no +fi + + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $USE_MAINTAINER_MODE" >&5 +$as_echo "$USE_MAINTAINER_MODE" >&6; } + if test $USE_MAINTAINER_MODE = yes; then + MAINTAINER_MODE_TRUE= + MAINTAINER_MODE_FALSE='#' +else + MAINTAINER_MODE_TRUE='#' + MAINTAINER_MODE_FALSE= +fi + + MAINT=$MAINTAINER_MODE_TRUE + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking user options" >&5 $as_echo "$as_me: checking user options" >&6;} @@ -24138,6 +24168,10 @@ else am__EXEEXT_FALSE= fi +if test -z "${MAINTAINER_MODE_TRUE}" && test -z "${MAINTAINER_MODE_FALSE}"; then + as_fn_error $? "conditional \"MAINTAINER_MODE\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi if test -z "${AMDEP_TRUE}" && test -z "${AMDEP_FALSE}"; then as_fn_error $? "conditional \"AMDEP\" was never defined. Usually this means the macro was only invoked conditionally." "$LINENO" 5 @@ -24607,7 +24641,7 @@ cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by netCDF-Fortran $as_me 4.5.0, which was +This file was extended by netCDF-Fortran $as_me 4.5.1, which was generated by GNU Autoconf 2.69. Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -24668,7 +24702,7 @@ _ACEOF cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ -netCDF-Fortran config.status 4.5.0 +netCDF-Fortran config.status 4.5.1 configured by $0, generated by GNU Autoconf 2.69, with options \\"\$ac_cs_config\\" ===================================== configure.ac ===================================== @@ -9,7 +9,7 @@ AC_PREREQ([2.59]) # Initialize with name, version, and support email address. -AC_INIT([netCDF-Fortran], [4.5.0], [support-netcdf at unidata.ucar.edu]) +AC_INIT([netCDF-Fortran], [4.5.1], [support-netcdf at unidata.ucar.edu]) # Create the VERSION file, which contains the package version from # AC_INIT. @@ -29,6 +29,7 @@ AC_CANONICAL_TARGET # This call is required by automake. AM_INIT_AUTOMAKE([foreign dist-zip subdir-objects]) +AM_MAINTAINER_MODE() AC_MSG_NOTICE([checking user options]) ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +netcdf-fortran (4.5.1+ds-1~exp1) experimental; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Thu, 05 Sep 2019 06:03:45 +0200 + netcdf-fortran (4.5.0+ds-1~exp1) experimental; urgency=medium * New upstream release. 
===================================== docs/CMakeLists.txt ===================================== @@ -23,6 +23,13 @@ IF(ENABLE_DOXYGEN) FILE(COPY ${IMG_FILES} DESTINATION ${CMAKE_CURRENT_BINARY_DIR}/html/) ENDIF() -SET(CUR_EXTRA_DIST ${CUR_EXTRA_DIST} CMakeLists.txt Makefile.am netcdf.m4 DoxygenLayout.xml Doxyfile.in footer.html mainpage.doc tutorial.doc install.doc dispatch.doc guide.doc types.doc notes.doc cdl.doc architecture.doc internal.doc Doxyfile.in.cmake windows-binaries.md Building-with-CMake.md) +IF(HAVE_M4) +IF(NOT MSVC) + ADD_CUSTOM_TARGET(manpage ALL + COMMAND ${NF_M4} '${CMAKE_CURRENT_SOURCE_DIR}/netcdf.m4' > '${CMAKE_CURRENT_BINARY_DIR}/netcdf_fortran.3' + WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR} + ) -ADD_EXTRA_DIST("${CUR_EXTRA_DIST}") + INSTALL(FILES ${CMAKE_CURRENT_BINARY_DIR}/netcdf_fortran.3 DESTINATION "share/man/man3" COMPONENT documentation) +ENDIF(NOT MSVC) +ENDIF(HAVE_M4) \ No newline at end of file ===================================== docs/Makefile.in ===================================== @@ -227,6 +227,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -343,7 +344,7 @@ MAINTAINERCLEANFILES = netcdf_fortran-man.html stamp-* version* all: all-am .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -367,9 +368,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): Doxyfile: $(top_builddir)/config.status $(srcdir)/Doxyfile.in ===================================== examples/F77/Makefile.in ===================================== @@ -484,6 +484,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -601,7 +602,7 @@ all: all-am .SUFFIXES: .SUFFIXES: .f .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -625,9 +626,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== examples/F90/Makefile.in ===================================== @@ -548,6 
+548,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -685,7 +686,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .f90 .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -709,9 +710,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== examples/Makefile.in ===================================== @@ -249,6 +249,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -345,7 +346,7 @@ EXTRA_DIST = CMakeLists.txt all: all-recursive .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -369,9 +370,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== fortran/Makefile.in ===================================== @@ -418,6 +418,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -601,7 +602,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .F90 .c .f90 .lo .o .obj -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -625,9 +626,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== 
libsrc/Makefile.in ===================================== @@ -191,6 +191,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -288,7 +289,7 @@ EXTRA_DIST = CMakeLists.txt all: all-am .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -312,9 +313,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf-config.cmake.in ===================================== @@ -3,160 +3,130 @@ # This forms the basis for the nf-config utility, which tells you # various things about the netCDF Fortran installation. -echo "nf-config not yet implemented for cmake builds" -exit 1 - -# prefix=@CMAKE_INSTALL_PREFIX@ -# exec_prefix=@CMAKE_INSTALL_PREFIX@ -# libdir=@CMAKE_INSTALL_PREFIX@/lib -# includedir=@CMAKE_INSTALL_PREFIX@/include -# -# cc="@CMAKE_C_COMPILER@" -# fc="@CMAKE_FORTRAN_COMPILER@" -# cflags="-I at CMAKE_INSTALL_PREFIX@/include @CMAKE_C_FLAGS@ @CMAKE_CPP_FLAGS@" -# fflags="@MOD_FLAG@${includedir}" -# -# has_dap="@HAS_DAP@" -# has_nc2="@HAS_NC2@" -# has_nc4="@HAS_NC4@" -# has_f90="@HAS_F90@" -# has_f03="@HAS_F03@" -# flibs="-L${libdir} @NC_FLIBS@" -# version="@PACKAGE_NAME@ @PACKAGE_VERSION@" -# -# usage() -# { -# cat < $cc" -# echo " --cflags -> $cflags" -# echo -# echo " --fc -> $fc" -# echo " --fflags -> $fflags" -# echo " --flibs -> $flibs" -# echo " --has-f90 -> $has_f90" -# echo " --has-f03 -> $has_f03" -# echo -# echo " --has-nc2 -> $has_nc2" -# echo " --has-nc4 -> $has_nc4" -# echo -# echo " --prefix -> $prefix" -# echo " --includedir-> $includedir" -# echo " --version -> $version" -# echo -# } -# -# if test $# -eq 0; then -# usage 1 -# fi -# -# while test $# -gt 0; do -# case "$1" in -# # this deals with options in the style -# # --option=value and extracts the value part -# # [not currently used] -# -*=*) value=`echo "$1" | sed 's/[-_a-zA-Z0-9]*=//'` ;; -# *) value= ;; -# esac -# -# case "$1" in -# -# --help) -# usage 0 -# ;; -# -# --all) -# all -# ;; -# -# --cc) -# echo $cc -# ;; -# -# --fc) -# echo $fc -# ;; -# -# --cflags) -# echo $cflags -# ;; -# -# --fflags) -# echo $fflags -# ;; -# -# --has-dap) -# echo $has_dap -# ;; -# -# --has-nc2) -# echo $has_nc2 -# ;; -# -# --has-nc4) -# echo $has_nc4 -# ;; -# -# --has-f90) -# echo $has_f90 -# ;; -# -# --has-f03) -# echo $has_f03 -# ;; -# -# --flibs) -# echo $flibs -# ;; -# -# --prefix) -# echo "${CMAKE_INSTALL_PREFIX}" -# ;; -# -# --includedir) -# echo "${includedir}" -# ;; -# -# --version) -# echo $version -# ;; -# -# *) -# echo "unknown option: $1" -# usage -# exit 1 -# ;; -# esac -# shift -# done -# -# exit 0 + +prefix=@CMAKE_INSTALL_PREFIX@ +exec_prefix=@CMAKE_INSTALL_PREFIX@ +libdir=@CMAKE_INSTALL_PREFIX@/lib +includedir=@CMAKE_INSTALL_PREFIX@/include +# 
+cc="@CMAKE_C_COMPILER@" +fc="@CMAKE_Fortran_COMPILER@" +cflags="-I at CMAKE_INSTALL_PREFIX@/include @CMAKE_C_FLAGS@ @CMAKE_CPP_FLAGS@" +fflags="@MOD_FLAG@${includedir}" +# +has_dap="@HAS_DAP@" +has_nc2="@HAS_NC2@" +has_nc4="@HAS_NC4@" +has_f90="@HAS_F90@" +has_f03="@HAS_F03@" +flibs="-L${libdir} @NC_FLIBS@" +version="@PACKAGE_NAME@ @PACKAGE_VERSION@" + + usage() + { + echo + echo "This $version has been built with the following features: " + echo + echo " --cc -> $cc" + echo " --cflags -> $cflags" + echo + echo " --fc -> $fc" + echo " --fflags -> $fflags" + echo " --flibs -> $flibs" + echo " --has-f90 -> $has_f90" + echo " --has-f03 -> $has_f03" + echo + echo " --has-nc2 -> $has_nc2" + echo " --has-nc4 -> $has_nc4" + echo + echo " --prefix -> $prefix" + echo " --includedir-> $includedir" + echo " --version -> $version" + echo + } + + if test $# -eq 0; then + usage 1 + fi + + while test $# -gt 0; do + case "$1" in + #this deals with options in the style + #--option=value and extracts the value part + #[not currently used] + -*=*) value=`echo "$1" | sed 's/[-_a-zA-Z0-9]*=//'` ;; + *) value= ;; + esac + + case "$1" in + + --help) + usage 0 + ;; + + --all) + all + ;; + + --cc) + echo $cc + ;; + + --fc) + echo $fc + ;; + + --cflags) + echo $cflags + ;; + + --fflags) + echo $fflags + ;; + + --has-dap) + echo $has_dap + ;; + + --has-nc2) + echo $has_nc2 + ;; + + --has-nc4) + echo $has_nc4 + ;; + + --has-f90) + echo $has_f90 + ;; + + --has-f03) + echo $has_f03 + ;; + + --flibs) + echo $flibs + ;; + + --prefix) + echo "${CMAKE_INSTALL_PREFIX}" + ;; + + --includedir) + echo "${includedir}" + ;; + + --version) + echo $version + ;; + + *) + echo "unknown option: $1" + usage + exit 1 + ;; + esac + shift + done + + exit 0 ===================================== nf03_test/Makefile.in ===================================== @@ -471,6 +471,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -575,7 +576,7 @@ all: all-am .SUFFIXES: .SUFFIXES: .F90 .f90 .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -599,9 +600,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf03_test4/Makefile.in ===================================== @@ -610,6 +610,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -741,7 +742,7 @@ all: all-am .SUFFIXES: .SUFFIXES: .F90 .f90 .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ 
case '$(am__configure_deps)' in \ *$$dep*) \ @@ -765,9 +766,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf_test/Makefile.in ===================================== @@ -568,6 +568,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -733,7 +734,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .F .F90 .c .f .f90 .lo .log .m4 .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -757,9 +758,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf_test4/Makefile.in ===================================== @@ -655,6 +655,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -819,7 +820,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .F .f .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -843,9 +844,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/compare/a26c1c0ce15b63a01f629e33a836bbe1b32b8e94...ded57453dfda30c4b3b33adb04b9190bc9d3ae76 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/compare/a26c1c0ce15b63a01f629e33a836bbe1b32b8e94...ded57453dfda30c4b3b33adb04b9190bc9d3ae76 You're receiving this email because of your account on salsa.debian.org. 
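For context, the practical effect of the CMakeLists.txt changes in the commit above is that cmake-based builds of netcdf-fortran now generate the nf-config helper (from nf-config.cmake.in) and install a netcdf-fortran.pc pkg-config file under libdir/pkgconfig, so downstream Fortran code should be able to pick up build flags the same way it already could from autotools builds. A minimal sketch of how a consumer might use this; the compiler name gfortran and the source file example.f90 are illustrative only, and the pkg-config line assumes the installed netcdf-fortran.pc exposes the usual Cflags/Libs fields:

  # query the flags reported by the newly generated nf-config helper
  nf-config --fflags
  nf-config --flibs

  # or build via the newly installed pkg-config file (netcdf-fortran.pc)
  gfortran example.f90 $(pkg-config --cflags --libs netcdf-fortran) -o example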
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:18:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:18:30 +0000 Subject: [Git][debian-gis-team/netcdf-fortran][upstream] New upstream version 4.5.1+ds Message-ID: <5d708c96d5437_577b3f91d85805448391eb@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf-fortran Commits: c78bcb88 by Bas Couwenberg at 2019-09-05T04:02:11Z New upstream version 4.5.1+ds - - - - - 20 changed files: - CMakeExtras/Makefile.am - CMakeExtras/Makefile.in - CMakeLists.txt - Makefile.in - RELEASE_NOTES.md - aclocal.m4 - configure - configure.ac - docs/CMakeLists.txt - docs/Makefile.in - examples/F77/Makefile.in - examples/F90/Makefile.in - examples/Makefile.in - fortran/Makefile.in - libsrc/Makefile.in - nf-config.cmake.in - nf03_test/Makefile.in - nf03_test4/Makefile.in - nf_test/Makefile.in - nf_test4/Makefile.in Changes: ===================================== CMakeExtras/Makefile.am ===================================== @@ -8,4 +8,4 @@ # Russ Rew, Ward Fisher EXTRA_DIST = sizeof_ptrdiff_t.c test_c_ptrdiff_t.f90 \ MatchNetCDFFortranTypes.cmake check_relax_coord_bound.c \ -check_pnetcdf.c check_parallel4.c +check_pnetcdf.c check_parallel4.c check_cdf5.c ===================================== CMakeExtras/Makefile.in ===================================== @@ -191,6 +191,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -284,12 +285,12 @@ valgrind_tools = @valgrind_tools@ # Russ Rew, Ward Fisher EXTRA_DIST = sizeof_ptrdiff_t.c test_c_ptrdiff_t.f90 \ MatchNetCDFFortranTypes.cmake check_relax_coord_bound.c \ -check_pnetcdf.c check_parallel4.c +check_pnetcdf.c check_parallel4.c check_cdf5.c all: all-am .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -313,9 +314,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== CMakeLists.txt ===================================== @@ -17,7 +17,7 @@ set(PACKAGE "${NC4F_CTEST_PROJECT_NAME}" CACHE STRING "") #Project Version SET(NC4F_VERSION_MAJOR 4) SET(NC4F_VERSION_MINOR 5) -SET(NC4F_VERSION_PATCH 0) +SET(NC4F_VERSION_PATCH 1) SET(NC4F_VERSION_NOTE "") SET(NC4F_VERSION ${NC4F_VERSION_MAJOR}.${NC4F_VERSION_MINOR}.${NC4F_VERSION_PATCH}${NC4F_VERSION_NOTE}) SET(VERSION ${NC4F_VERSION}) @@ -75,6 +75,16 @@ ENDIF() FIND_PACKAGE(Doxygen) FIND_PROGRAM(NC_DOT NAMES dot) +# A basic script used to convert m4 files +FIND_PROGRAM(NF_M4 NAMES m4 m4.exe) +IF(NF_M4) + MESSAGE(STATUS "Found m4: ${NF_M4}") + SET(HAVE_M4 TRUE) +ELSE() + MESSAGE(STATUS "m4 not found.") +ENDIF() + + # For CMAKE_INSTALL_LIBDIR INCLUDE(GNUInstallDirs) @@ -535,25 +545,32 
@@ CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nccreate "" USE_NETCDF_V2) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_set_log_level "" USE_LOGGING) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} oc_open "" BUILD_DAP) +SET(HAS_NC4 FALSE) IF(USE_NETCDF4) SET(NC_BUILD_V4 TRUE) SET(HAVE_NC_DEF_OPAQUE TRUE) + SET(HAS_NC4 TRUE) ENDIF(USE_NETCDF4) +SET(HAS_NC2 FALSE) IF(USE_NETCDF_V2) SET(HAVE_NCCREATE TRUE) + SET(USE_NC2 TRUE) + SET(HAS_NC2 TRUE) ENDIF(USE_NETCDF_V2) -SET(STATUS_LOGGING OFF) +SET(STATUS_LOGGING FALSE) IF(USE_LOGGING) SET(HAVE_NC_SET_LOG_LEVEL TRUE) SET(STATUS_LOGGING ON) ENDIF(USE_LOGGING) SET(ENABLE_DAP OFF) +SET(HAS_DAP FALSE) IF(BUILD_DAP) SET(HAVE_OC_OPEN TRUE) SET(ENABLE_DAP ON) + SET(HAS_DAP TRUE) ENDIF(BUILD_DAP) ## @@ -682,6 +699,18 @@ SET(BUILD_F90 "ON") SET(BUILD_V2 "ON") SET(BUILD_F03 "OFF") +IF(BUILD_F90) + SET(HAS_F90 TRUE) +ELSE() + SET(HAS_F90 FALSE) +ENDIF(BUILD_F90) + +IF(BUILD_F03) + SET(HAS_F03 TRUE) +ELSE() + SET(HAS_F03 FALSE) +ENDIF(BUILD_F03) + IF(ENABLE_FORTRAN_TYPE_CHECKS) # Do tests to determine which Fortran types correspond to NCBYTE, NCSHORT, ... # The configure file got these by testing an F77 program, invoking @@ -712,14 +741,8 @@ ENDIF(ENABLE_FORTRAN_TYPE_CHECKS) # configure_file("${NC4F_SOURCE_DIR}/config.h.cmake.in" # "${NC4F_BINARY_DIR}/config.h") -# # For now, just copy a stub file -# FILE(COPY "${NC4F_SOURCE_DIR}/nf-config.cmake.in" -# DESTINATION "${NC4F_BINARY_DIR}" -# FILE_PERMISSIONS OWNER_READ OWNER_WRITE OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE) -# FILE(RENAME "${NC4F_BINARY_DIR}/nf-config.cmake.in" "${NC4F_BINARY_DIR}/nf-config") - # make sure previous two files get cleaned up... -SET_DIRECTORY_PROPERTIES (DIRECTORY PROPERTY ADDITIONAL_MAKE_CLEAN_FILES ${netcdf-fortran_BINARY_DIR}/config.h ${netcdf-fortran_BINARY_DIR}/nf-config) +#SET_DIRECTORY_PROPERTIES (DIRECTORY PROPERTY ADDITIONAL_MAKE_CLEAN_FILES ${netcdf-fortran_BINARY_DIR}/config.h ${netcdf-fortran_BINARY_DIR}/nf-config) ## # Configuration for post-install RPath @@ -768,7 +791,13 @@ CONFIGURE_FILE("${CMAKE_SOURCE_DIR}/CTestConfig.cmake.in" "${CMAKE_SOURCE_DIR}/CTestConfig.cmake" @ONLY) - +# Generate nf-config form template. +CONFIGURE_FILE("${CMAKE_SOURCE_DIR}/nf-config.cmake.in" + "${CMAKE_BINARY_DIR}/tmp/nf-config" @ONLY + NEWLINE_STYLE LF) +FILE(COPY "${CMAKE_BINARY_DIR}/tmp/nf-config" + DESTINATION ${CMAKE_BINARY_DIR}/ + FILE_PERMISSIONS OWNER_READ OWNER_WRITE OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE) ### # Allow the user to specify libraries # to link against, similar to automakes 'LIBS' variable. 
@@ -841,6 +870,21 @@ IF(CMAKE_LINK_FLAGS) SET(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} ${CMAKE_LINK_FLAGS}") ENDIF() +# Set +SET(prefix ${CMAKE_INSTALL_PREFIX}) +SET(exec_prefix ${CMAKE_INSTALL_PREFIX}) +SET(libdir ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_LIBDIR}) +SET(includedir ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_INCLUDEDIR}) +SET(CC ${CMAKE_C_COMPILER}) +SET(FC ${CMAKE_Fortran_COMPILER}) + +configure_file( + ${CMAKE_SOURCE_DIR}/netcdf-fortran.pc.in + ${CMAKE_BINARY_DIR}/netcdf-fortran.pc @ONLY) + +INSTALL(FILES ${CMAKE_BINARY_DIR}/netcdf-fortran.pc + DESTINATION ${CMAKE_INSTALL_LIBDIR}/pkgconfig + COMPONENT utilities) INSTALL(PROGRAMS ${NC4F_BINARY_DIR}/nf-config DESTINATION "${CMAKE_INSTALL_BINDIR}" ===================================== Makefile.in ===================================== @@ -306,6 +306,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -427,7 +428,7 @@ all: all-recursive .SUFFIXES: am--refresh: Makefile @: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -453,9 +454,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) $(SHELL) ./config.status --recheck -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) $(am__cd) $(srcdir) && $(AUTOCONF) -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) $(am__cd) $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS) $(am__aclocal_m4_deps): nf-config: $(top_builddir)/config.status $(srcdir)/nf-config.in ===================================== RELEASE_NOTES.md ===================================== @@ -6,7 +6,18 @@ Release Notes {#nf_release_notes} This file contains a high-level description of this package's evolution. Entries are in reverse chronological order (most recent first). -## 4.5.1 - TBD +## 4.5.2 - TBD + +## 4.5.1 - September 4, 2019 + +### Requirements + +* netCDF-C: 4.6.0 or greater + +### Changes + +* Corrected an issue where a cmake-specific file wasn't being captured by `make dist`. +* Corrected an issue where nf-config wasn't being generated by cmake-based builds. Corrected a couple of other missing files. See [Github #108](https://github.com/Unidata/netcdf-fortran/issues/108) for more information. ## 4.5.0 - August 28, 2019 @@ -19,25 +30,11 @@ Entries are in reverse chronological order (most recent first). * Moved netCDF classic F90 API tests to new subdirectory nf03_test. * Moved netCDF-4 F77 API tests to new subdirectory nf_test4. * Moved netCDF-4 F90 API tests to new subdirectory nf03_test4. -* Fixed bug which caused parallel I/O tests to not be run. See - [#155](https://github.com/Unidata/netcdf-fortran/issues/155) and - [#157](https://github.com/Unidata/netcdf-fortran/issues/157). -* Fixed bug in the setting of file cache preemption for netCDF-4 - files. See - [#146](https://github.com/Unidata/netcdf-fortran/issues/146). -* Removed many near-duplicate tests files, now they are created at - build time with sed. See - [#165](https://github.com/Unidata/netcdf-fortran/issues/165). 
-* Removed no longer needed configure options --enable-dll (see - [#161](https://github.com/Unidata/netcdf-fortran/issues/161)), - --enable-extra-tests (see - [#114](https://github.com/Unidata/netcdf-fortran/issues/114)), - --enable-extra-example-tests(see - [#126](https://github.com/Unidata/netcdf-fortran/issues/126)), and - --enable-valgrind (see - [#118](https://github.com/Unidata/netcdf-fortran/issues/118)). -* Moved handling of F77 man page to the docs directory. See - [#141](https://github.com/Unidata/netcdf-fortran/issues/141) +* Fixed bug which caused parallel I/O tests to not be run. See [#155](https://github.com/Unidata/netcdf-fortran/issues/155) and [#157](https://github.com/Unidata/netcdf-fortran/issues/157). +* Fixed bug in the setting of file cache preemption for netCDF-4 files. See [#146](https://github.com/Unidata/netcdf-fortran/issues/146). +* Removed many near-duplicate tests files, now they are created at build time with sed. See [#165](https://github.com/Unidata/netcdf-fortran/issues/165). +* Removed no longer needed configure options --enable-dll (see [#161](https://github.com/Unidata/netcdf-fortran/issues/161)), `--enable-extra-tests` (see [#114](https://github.com/Unidata/netcdf-fortran/issues/114)), `--enable-extra-example-tests` (see [#126](https://github.com/Unidata/netcdf-fortran/issues/126)), and `--enable-valgrind` (see [#118](https://github.com/Unidata/netcdf-fortran/issues/118)). +* Moved handling of F77 man page to the docs directory. See [#141](https://github.com/Unidata/netcdf-fortran/issues/141). ## 4.4.5 - Release Jan 9, 2019 ===================================== aclocal.m4 ===================================== @@ -652,6 +652,42 @@ fi rmdir .tst 2>/dev/null AC_SUBST([am__leading_dot])]) +# Add --enable-maintainer-mode option to configure. -*- Autoconf -*- +# From Jim Meyering + +# Copyright (C) 1996-2018 Free Software Foundation, Inc. +# +# This file is free software; the Free Software Foundation +# gives unlimited permission to copy and/or distribute it, +# with or without modifications, as long as this notice is preserved. + +# AM_MAINTAINER_MODE([DEFAULT-MODE]) +# ---------------------------------- +# Control maintainer-specific portions of Makefiles. +# Default is to disable them, unless 'enable' is passed literally. +# For symmetry, 'disable' may be passed as well. Anyway, the user +# can override the default with the --enable/--disable switch. +AC_DEFUN([AM_MAINTAINER_MODE], +[m4_case(m4_default([$1], [disable]), + [enable], [m4_define([am_maintainer_other], [disable])], + [disable], [m4_define([am_maintainer_other], [enable])], + [m4_define([am_maintainer_other], [enable]) + m4_warn([syntax], [unexpected argument to AM@&t at _MAINTAINER_MODE: $1])]) +AC_MSG_CHECKING([whether to enable maintainer-specific portions of Makefiles]) + dnl maintainer-mode's default is 'disable' unless 'enable' is passed + AC_ARG_ENABLE([maintainer-mode], + [AS_HELP_STRING([--]am_maintainer_other[-maintainer-mode], + am_maintainer_other[ make rules and dependencies not useful + (and sometimes confusing) to the casual installer])], + [USE_MAINTAINER_MODE=$enableval], + [USE_MAINTAINER_MODE=]m4_if(am_maintainer_other, [enable], [no], [yes])) + AC_MSG_RESULT([$USE_MAINTAINER_MODE]) + AM_CONDITIONAL([MAINTAINER_MODE], [test $USE_MAINTAINER_MODE = yes]) + MAINT=$MAINTAINER_MODE_TRUE + AC_SUBST([MAINT])dnl +] +) + # Check to see how 'make' treats includes. -*- Autoconf -*- # Copyright (C) 2001-2018 Free Software Foundation, Inc. 
===================================== configure ===================================== @@ -1,6 +1,6 @@ #! /bin/sh # Guess values for system-dependent variables and create Makefiles. -# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.0. +# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.1. # # Report bugs to . # @@ -590,8 +590,8 @@ MAKEFLAGS= # Identity of this package. PACKAGE_NAME='netCDF-Fortran' PACKAGE_TARNAME='netcdf-fortran' -PACKAGE_VERSION='4.5.0' -PACKAGE_STRING='netCDF-Fortran 4.5.0' +PACKAGE_VERSION='4.5.1' +PACKAGE_STRING='netCDF-Fortran 4.5.1' PACKAGE_BUGREPORT='support-netcdf at unidata.ucar.edu' PACKAGE_URL='' @@ -734,6 +734,9 @@ CPPFLAGS LDFLAGS CFLAGS CC +MAINT +MAINTAINER_MODE_FALSE +MAINTAINER_MODE_TRUE AM_BACKSLASH AM_DEFAULT_VERBOSITY AM_DEFAULT_V @@ -816,6 +819,7 @@ ac_subst_files='' ac_user_opts=' enable_option_checking enable_silent_rules +enable_maintainer_mode enable_dependency_tracking enable_valgrind enable_valgrind_memcheck @@ -1396,7 +1400,7 @@ if test "$ac_init_help" = "long"; then # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures netCDF-Fortran 4.5.0 to adapt to many kinds of systems. +\`configure' configures netCDF-Fortran 4.5.1 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... @@ -1467,7 +1471,7 @@ fi if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of netCDF-Fortran 4.5.0:";; + short | recursive ) echo "Configuration of netCDF-Fortran 4.5.1:";; esac cat <<\_ACEOF @@ -1477,6 +1481,9 @@ Optional Features: --enable-FEATURE[=ARG] include FEATURE [ARG=yes] --enable-silent-rules less verbose build output (undo: "make V=1") --disable-silent-rules verbose build output (undo: "make V=0") + --enable-maintainer-mode + enable make rules and dependencies not useful (and + sometimes confusing) to the casual installer --enable-dependency-tracking do not reject slow dependency extractors --disable-dependency-tracking @@ -1616,7 +1623,7 @@ fi test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -netCDF-Fortran configure 4.5.0 +netCDF-Fortran configure 4.5.1 generated by GNU Autoconf 2.69 Copyright (C) 2012 Free Software Foundation, Inc. @@ -2421,7 +2428,7 @@ cat >config.log <<_ACEOF This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by netCDF-Fortran $as_me 4.5.0, which was +It was created by netCDF-Fortran $as_me 4.5.1, which was generated by GNU Autoconf 2.69. Invocation command line was $ $0 $@ @@ -2772,11 +2779,11 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu # Create the VERSION file, which contains the package version from # AC_INIT. -echo -n 4.5.0>VERSION +echo -n 4.5.1>VERSION -{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.0" >&5 -$as_echo "$as_me: netCDF-Fortran 4.5.0" >&6;} +{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.1" >&5 +$as_echo "$as_me: netCDF-Fortran 4.5.1" >&6;} # Keep libtool macros in an m4 directory. @@ -3411,7 +3418,7 @@ fi # Define the identity of the package. PACKAGE='netcdf-fortran' - VERSION='4.5.0' + VERSION='4.5.1' cat >>confdefs.h <<_ACEOF @@ -3505,6 +3512,29 @@ END fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking whether to enable maintainer-specific portions of Makefiles" >&5 +$as_echo_n "checking whether to enable maintainer-specific portions of Makefiles... 
" >&6; } + # Check whether --enable-maintainer-mode was given. +if test "${enable_maintainer_mode+set}" = set; then : + enableval=$enable_maintainer_mode; USE_MAINTAINER_MODE=$enableval +else + USE_MAINTAINER_MODE=no +fi + + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $USE_MAINTAINER_MODE" >&5 +$as_echo "$USE_MAINTAINER_MODE" >&6; } + if test $USE_MAINTAINER_MODE = yes; then + MAINTAINER_MODE_TRUE= + MAINTAINER_MODE_FALSE='#' +else + MAINTAINER_MODE_TRUE='#' + MAINTAINER_MODE_FALSE= +fi + + MAINT=$MAINTAINER_MODE_TRUE + + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking user options" >&5 $as_echo "$as_me: checking user options" >&6;} @@ -24138,6 +24168,10 @@ else am__EXEEXT_FALSE= fi +if test -z "${MAINTAINER_MODE_TRUE}" && test -z "${MAINTAINER_MODE_FALSE}"; then + as_fn_error $? "conditional \"MAINTAINER_MODE\" was never defined. +Usually this means the macro was only invoked conditionally." "$LINENO" 5 +fi if test -z "${AMDEP_TRUE}" && test -z "${AMDEP_FALSE}"; then as_fn_error $? "conditional \"AMDEP\" was never defined. Usually this means the macro was only invoked conditionally." "$LINENO" 5 @@ -24607,7 +24641,7 @@ cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by netCDF-Fortran $as_me 4.5.0, which was +This file was extended by netCDF-Fortran $as_me 4.5.1, which was generated by GNU Autoconf 2.69. Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -24668,7 +24702,7 @@ _ACEOF cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ -netCDF-Fortran config.status 4.5.0 +netCDF-Fortran config.status 4.5.1 configured by $0, generated by GNU Autoconf 2.69, with options \\"\$ac_cs_config\\" ===================================== configure.ac ===================================== @@ -9,7 +9,7 @@ AC_PREREQ([2.59]) # Initialize with name, version, and support email address. -AC_INIT([netCDF-Fortran], [4.5.0], [support-netcdf at unidata.ucar.edu]) +AC_INIT([netCDF-Fortran], [4.5.1], [support-netcdf at unidata.ucar.edu]) # Create the VERSION file, which contains the package version from # AC_INIT. @@ -29,6 +29,7 @@ AC_CANONICAL_TARGET # This call is required by automake. 
AM_INIT_AUTOMAKE([foreign dist-zip subdir-objects]) +AM_MAINTAINER_MODE() AC_MSG_NOTICE([checking user options]) ===================================== docs/CMakeLists.txt ===================================== @@ -23,6 +23,13 @@ IF(ENABLE_DOXYGEN) FILE(COPY ${IMG_FILES} DESTINATION ${CMAKE_CURRENT_BINARY_DIR}/html/) ENDIF() -SET(CUR_EXTRA_DIST ${CUR_EXTRA_DIST} CMakeLists.txt Makefile.am netcdf.m4 DoxygenLayout.xml Doxyfile.in footer.html mainpage.doc tutorial.doc install.doc dispatch.doc guide.doc types.doc notes.doc cdl.doc architecture.doc internal.doc Doxyfile.in.cmake windows-binaries.md Building-with-CMake.md) +IF(HAVE_M4) +IF(NOT MSVC) + ADD_CUSTOM_TARGET(manpage ALL + COMMAND ${NF_M4} '${CMAKE_CURRENT_SOURCE_DIR}/netcdf.m4' > '${CMAKE_CURRENT_BINARY_DIR}/netcdf_fortran.3' + WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR} + ) -ADD_EXTRA_DIST("${CUR_EXTRA_DIST}") + INSTALL(FILES ${CMAKE_CURRENT_BINARY_DIR}/netcdf_fortran.3 DESTINATION "share/man/man3" COMPONENT documentation) +ENDIF(NOT MSVC) +ENDIF(HAVE_M4) \ No newline at end of file ===================================== docs/Makefile.in ===================================== @@ -227,6 +227,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -343,7 +344,7 @@ MAINTAINERCLEANFILES = netcdf_fortran-man.html stamp-* version* all: all-am .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -367,9 +368,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): Doxyfile: $(top_builddir)/config.status $(srcdir)/Doxyfile.in ===================================== examples/F77/Makefile.in ===================================== @@ -484,6 +484,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -601,7 +602,7 @@ all: all-am .SUFFIXES: .SUFFIXES: .f .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -625,9 +626,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): 
===================================== examples/F90/Makefile.in ===================================== @@ -548,6 +548,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -685,7 +686,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .f90 .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -709,9 +710,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== examples/Makefile.in ===================================== @@ -249,6 +249,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -345,7 +346,7 @@ EXTRA_DIST = CMakeLists.txt all: all-recursive .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -369,9 +370,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== fortran/Makefile.in ===================================== @@ -418,6 +418,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -601,7 +602,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .F90 .c .f90 .lo .o .obj -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -625,9 +626,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd 
$(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== libsrc/Makefile.in ===================================== @@ -191,6 +191,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -288,7 +289,7 @@ EXTRA_DIST = CMakeLists.txt all: all-am .SUFFIXES: -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -312,9 +313,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf-config.cmake.in ===================================== @@ -3,160 +3,130 @@ # This forms the basis for the nf-config utility, which tells you # various things about the netCDF Fortran installation. -echo "nf-config not yet implemented for cmake builds" -exit 1 - -# prefix=@CMAKE_INSTALL_PREFIX@ -# exec_prefix=@CMAKE_INSTALL_PREFIX@ -# libdir=@CMAKE_INSTALL_PREFIX@/lib -# includedir=@CMAKE_INSTALL_PREFIX@/include -# -# cc="@CMAKE_C_COMPILER@" -# fc="@CMAKE_FORTRAN_COMPILER@" -# cflags="-I at CMAKE_INSTALL_PREFIX@/include @CMAKE_C_FLAGS@ @CMAKE_CPP_FLAGS@" -# fflags="@MOD_FLAG@${includedir}" -# -# has_dap="@HAS_DAP@" -# has_nc2="@HAS_NC2@" -# has_nc4="@HAS_NC4@" -# has_f90="@HAS_F90@" -# has_f03="@HAS_F03@" -# flibs="-L${libdir} @NC_FLIBS@" -# version="@PACKAGE_NAME@ @PACKAGE_VERSION@" -# -# usage() -# { -# cat < $cc" -# echo " --cflags -> $cflags" -# echo -# echo " --fc -> $fc" -# echo " --fflags -> $fflags" -# echo " --flibs -> $flibs" -# echo " --has-f90 -> $has_f90" -# echo " --has-f03 -> $has_f03" -# echo -# echo " --has-nc2 -> $has_nc2" -# echo " --has-nc4 -> $has_nc4" -# echo -# echo " --prefix -> $prefix" -# echo " --includedir-> $includedir" -# echo " --version -> $version" -# echo -# } -# -# if test $# -eq 0; then -# usage 1 -# fi -# -# while test $# -gt 0; do -# case "$1" in -# # this deals with options in the style -# # --option=value and extracts the value part -# # [not currently used] -# -*=*) value=`echo "$1" | sed 's/[-_a-zA-Z0-9]*=//'` ;; -# *) value= ;; -# esac -# -# case "$1" in -# -# --help) -# usage 0 -# ;; -# -# --all) -# all -# ;; -# -# --cc) -# echo $cc -# ;; -# -# --fc) -# echo $fc -# ;; -# -# --cflags) -# echo $cflags -# ;; -# -# --fflags) -# echo $fflags -# ;; -# -# --has-dap) -# echo $has_dap -# ;; -# -# --has-nc2) -# echo $has_nc2 -# ;; -# -# --has-nc4) -# echo $has_nc4 -# ;; -# -# --has-f90) -# echo $has_f90 -# ;; -# -# --has-f03) -# echo $has_f03 -# ;; -# -# --flibs) -# echo $flibs -# ;; -# -# --prefix) -# echo "${CMAKE_INSTALL_PREFIX}" -# ;; -# -# --includedir) -# echo "${includedir}" -# ;; -# -# --version) -# echo $version -# ;; -# -# *) -# echo "unknown option: $1" -# usage -# exit 1 -# ;; -# esac -# shift -# done -# -# exit 0 + +prefix=@CMAKE_INSTALL_PREFIX@ 
+exec_prefix=@CMAKE_INSTALL_PREFIX@ +libdir=@CMAKE_INSTALL_PREFIX@/lib +includedir=@CMAKE_INSTALL_PREFIX@/include +# +cc="@CMAKE_C_COMPILER@" +fc="@CMAKE_Fortran_COMPILER@" +cflags="-I at CMAKE_INSTALL_PREFIX@/include @CMAKE_C_FLAGS@ @CMAKE_CPP_FLAGS@" +fflags="@MOD_FLAG@${includedir}" +# +has_dap="@HAS_DAP@" +has_nc2="@HAS_NC2@" +has_nc4="@HAS_NC4@" +has_f90="@HAS_F90@" +has_f03="@HAS_F03@" +flibs="-L${libdir} @NC_FLIBS@" +version="@PACKAGE_NAME@ @PACKAGE_VERSION@" + + usage() + { + echo + echo "This $version has been built with the following features: " + echo + echo " --cc -> $cc" + echo " --cflags -> $cflags" + echo + echo " --fc -> $fc" + echo " --fflags -> $fflags" + echo " --flibs -> $flibs" + echo " --has-f90 -> $has_f90" + echo " --has-f03 -> $has_f03" + echo + echo " --has-nc2 -> $has_nc2" + echo " --has-nc4 -> $has_nc4" + echo + echo " --prefix -> $prefix" + echo " --includedir-> $includedir" + echo " --version -> $version" + echo + } + + if test $# -eq 0; then + usage 1 + fi + + while test $# -gt 0; do + case "$1" in + #this deals with options in the style + #--option=value and extracts the value part + #[not currently used] + -*=*) value=`echo "$1" | sed 's/[-_a-zA-Z0-9]*=//'` ;; + *) value= ;; + esac + + case "$1" in + + --help) + usage 0 + ;; + + --all) + all + ;; + + --cc) + echo $cc + ;; + + --fc) + echo $fc + ;; + + --cflags) + echo $cflags + ;; + + --fflags) + echo $fflags + ;; + + --has-dap) + echo $has_dap + ;; + + --has-nc2) + echo $has_nc2 + ;; + + --has-nc4) + echo $has_nc4 + ;; + + --has-f90) + echo $has_f90 + ;; + + --has-f03) + echo $has_f03 + ;; + + --flibs) + echo $flibs + ;; + + --prefix) + echo "${CMAKE_INSTALL_PREFIX}" + ;; + + --includedir) + echo "${includedir}" + ;; + + --version) + echo $version + ;; + + *) + echo "unknown option: $1" + usage + exit 1 + ;; + esac + shift + done + + exit 0 ===================================== nf03_test/Makefile.in ===================================== @@ -471,6 +471,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -575,7 +576,7 @@ all: all-am .SUFFIXES: .SUFFIXES: .F90 .f90 .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -599,9 +600,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf03_test4/Makefile.in ===================================== @@ -610,6 +610,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -741,7 +742,7 @@ all: all-am .SUFFIXES: .SUFFIXES: .F90 .f90 .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am 
$(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -765,9 +766,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf_test/Makefile.in ===================================== @@ -568,6 +568,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -733,7 +734,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .F .F90 .c .f .f90 .lo .log .m4 .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -757,9 +758,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): ===================================== nf_test4/Makefile.in ===================================== @@ -655,6 +655,7 @@ LIPO = @LIPO@ LN_S = @LN_S@ LTLIBOBJS = @LTLIBOBJS@ LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@ +MAINT = @MAINT@ MAKEINFO = @MAKEINFO@ MANIFEST_TOOL = @MANIFEST_TOOL@ MKDIR_P = @MKDIR_P@ @@ -819,7 +820,7 @@ all: $(BUILT_SOURCES) .SUFFIXES: .SUFFIXES: .F .f .lo .log .o .obj .test .test$(EXEEXT) .trs -$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps) +$(srcdir)/Makefile.in: @MAINTAINER_MODE_TRUE@ $(srcdir)/Makefile.am $(am__configure_deps) @for dep in $?; do \ case '$(am__configure_deps)' in \ *$$dep*) \ @@ -843,9 +844,9 @@ Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status $(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(top_srcdir)/configure: $(am__configure_deps) +$(top_srcdir)/configure: @MAINTAINER_MODE_TRUE@ $(am__configure_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh -$(ACLOCAL_M4): $(am__aclocal_m4_deps) +$(ACLOCAL_M4): @MAINTAINER_MODE_TRUE@ $(am__aclocal_m4_deps) cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh $(am__aclocal_m4_deps): View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/c78bcb88441842c505e23dfe8e688608df1d067d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/c78bcb88441842c505e23dfe8e688608df1d067d You're receiving this email because of your account on 
salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Thu Sep 5 05:29:21 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 04:29:21 +0000 Subject: Processing of netcdf-fortran_4.5.1+ds-1~exp1_source.changes Message-ID: netcdf-fortran_4.5.1+ds-1~exp1_source.changes uploaded successfully to localhost along with the files: netcdf-fortran_4.5.1+ds-1~exp1.dsc netcdf-fortran_4.5.1+ds.orig.tar.xz netcdf-fortran_4.5.1+ds-1~exp1.debian.tar.xz netcdf-fortran_4.5.1+ds-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Thu Sep 5 05:34:33 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 04:34:33 +0000 Subject: netcdf-fortran_4.5.1+ds-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 05 Sep 2019 06:03:45 +0200 Source: netcdf-fortran Architecture: source Version: 4.5.1+ds-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: netcdf-fortran (4.5.1+ds-1~exp1) experimental; urgency=medium . * New upstream release. Checksums-Sha1: d3b085d62c4d4f72f6003b1083e5cfcd0de3c79b 2399 netcdf-fortran_4.5.1+ds-1~exp1.dsc c99db6e2efddab509182a348d3760e15aa98b301 669832 netcdf-fortran_4.5.1+ds.orig.tar.xz 2390f1a045f0a9facd4db01d08a3b227c3619856 10216 netcdf-fortran_4.5.1+ds-1~exp1.debian.tar.xz cad2c2c0fe9b47ad945087b6952dafc56a07e394 10038 netcdf-fortran_4.5.1+ds-1~exp1_amd64.buildinfo Checksums-Sha256: 00e278e1eb6499a42a3ff8ea90d3b5579436c2f3005aa9878ba83842797399c0 2399 netcdf-fortran_4.5.1+ds-1~exp1.dsc ec123b8dd14662b3b331f19fe03d2e9bcd6f30ecabbea44c583aaff1491847e8 669832 netcdf-fortran_4.5.1+ds.orig.tar.xz 9a95cb973d622cf697f2d74009ff123a1157b9368ede6e36d7a83138b14c4f17 10216 netcdf-fortran_4.5.1+ds-1~exp1.debian.tar.xz 150168c546937debb3b9e581eaba134b5825bf4ffe7cb06cc096617fbad0accd 10038 netcdf-fortran_4.5.1+ds-1~exp1_amd64.buildinfo Files: 0dfc6f981b13214492f7483b30cc14c1 2399 science optional netcdf-fortran_4.5.1+ds-1~exp1.dsc 4b4db02789480c4ae0200e225310e341 669832 science optional netcdf-fortran_4.5.1+ds.orig.tar.xz cf5cf888c8b881ca1de83722eb4f2333 10216 science optional netcdf-fortran_4.5.1+ds-1~exp1.debian.tar.xz 7bd52ae6647a10a524b604486e1cde74 10038 science optional netcdf-fortran_4.5.1+ds-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1wjEIACgkQZ1DxCuiN SvEECw//T9EcSUCAXT1i8NggHrHXt7pCaqD5whNZJEOkxcjsUXTM70Xp5oIMxhYT X+Tp3I94//rCHOWJl+zFs8heYX7SEnkMuJiazuhpLIixQdEAKKHrbX2R0k/lOeOK Gi6FEK0LG4VPUNK346RKtWN08QN8aGwT0E+90LiXqLZEsrGe4kpA7As4aFL0dwKI jT2NszZ40ujHbjRNV2N8tcl+mUMKiP73DveiadIXOoLLB/3M7tBsZqmvj4LWwU0E zAw/izy5evyHWy49O1BKJh2k3xGU3uyKpRvSLohljZiWK50P2uFT/9FrCh4Dua+S WM7NF66oKw2JNtZtcrYJ9nejUi0xcr9pUZ11Yc+Q8ac5plM2yMc/5fb27Yo42F96 wVOhFl5C2Fxu37HMmtids7G2dAf5oflPFv/lT8MNJaJQLIOBfj+i1dtdgsd2RTBd Qn8WZTRm/Sj8ptQw9hnLrzCT7OZYWAT1MtUryEoXd3W2dK2TtRp/qJ/YQzR5VDG4 z5MegHY1WoApfD9C3AJSwf2RqcCxixvy/BJ9/1Y1LTdxrcDI25ZyFEpw+Rt+8pg+ /Oxeu+YzmdZT9cXdAwUJhwSrKKL67ABylRzhbrvwadS26hlnmcsEXnvb45ip7Zb7 MQIPulKOScPw/1h3Rb+3UQ3n+84yxseR3Sf9CNGaLaKmBbncPPQ= =GLF9 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
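The nf-config script restored in the netcdf-fortran commit above exposes an installation's compiler and link settings through query options (--fc, --fflags, --flibs, --cflags, --version, plus the --has-* feature flags listed in its usage text). A minimal sketch of how a build helper might consume those options; it assumes only that an nf-config generated from this template is on PATH and uses nothing beyond the option names shown in the commit:

    #!/usr/bin/env python3
    # Minimal sketch (not part of the package): query an installed
    # nf-config for netCDF-Fortran build settings. The option names come
    # from the usage text in the restored script; that the script is on
    # PATH is an assumption of this example.
    import shutil
    import subprocess

    def nf_config(option):
        # Run `nf-config <option>` and return its stripped stdout.
        return subprocess.check_output(["nf-config", option], text=True).strip()

    if shutil.which("nf-config") is None:
        raise SystemExit("nf-config not found on PATH")

    print("version:", nf_config("--version"))
    print("FC     =", nf_config("--fc"))
    print("FFLAGS =", nf_config("--fflags"))
    print("FLIBS  =", nf_config("--flibs"))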
From gitlab at salsa.debian.org Thu Sep 5 05:36:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:36:25 +0000 Subject: [Git][debian-gis-team/python-affine][master] 4 commits: New upstream version 2.3.0 Message-ID: <5d7090c9baf6_577b2ade61196f448407c9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-affine Commits: f476055d by Bas Couwenberg at 2019-09-05T04:18:17Z New upstream version 2.3.0 - - - - - cde681ff by Bas Couwenberg at 2019-09-05T04:18:18Z Update upstream source from tag 'upstream/2.3.0' Update to upstream version '2.3.0' with Debian dir df10dc9ccf89fa32c9fa99da2acff8ad1e44a30f - - - - - 1eacfa0a by Bas Couwenberg at 2019-09-05T04:18:50Z New upstream release. - - - - - 542a040e by Bas Couwenberg at 2019-09-05T04:19:48Z Set distribution to unstable. - - - - - 7 changed files: - .travis.yml - AUTHORS.txt - CHANGES.txt - affine/__init__.py - affine/tests/test_transform.py - debian/changelog - tox.ini Changes: ===================================== .travis.yml ===================================== @@ -1,9 +1,10 @@ -sudo: false +dist: xenial language: python cache: pip python: - 2.7 - 3.6 + - 3.7 install: - pip install -r requirements.txt - pip install .[test] @@ -17,4 +18,4 @@ deploy: tags: true provider: pypi distributions: "sdist bdist_wheel" - user: seang + user: __token__ ===================================== AUTHORS.txt ===================================== @@ -3,7 +3,7 @@ Authors - Sean Gillies - Steven Ring -- Mike Toews +- Mike Taves - Kevin Wurster - Todd Small - Juan Luis Cano Rodríguez ===================================== CHANGES.txt ===================================== @@ -1,6 +1,19 @@ CHANGES ======= +2.3.0 (2019-09-04) +------------------ + +Deprecations: + +- Right multiplication like vector * matrix is deprecated and will raise + AffineError in version 3.0.0. + +Bug fixes: + +- Change signature of Affine constructor to help users of PyCharm (#45). +- The Affine class docstring has been improved. + 2.2.2 (2018-12-20) ------------------ - Affine.itransform computed the wrong results for arrays with rotation or ===================================== affine/__init__.py ===================================== @@ -1,12 +1,5 @@ """Affine transformation matrices -The 3x3 augmented affine transformation matrix for transformations in two -dimensions is illustrated below. - - | x' | | a b c | | x | - | y' | = | d e f | | y | - | 1 | | 0 0 1 | | 1 | - The Affine package is derived from Casey Duncan's Planar package. See the copyright statement below. """ @@ -43,11 +36,12 @@ from __future__ import division from collections import namedtuple import math +import warnings __all__ = ['Affine'] __author__ = "Sean Gillies" -__version__ = "2.2.2" +__version__ = "2.3.0" EPSILON = 1e-5 @@ -123,10 +117,36 @@ class Affine( namedtuple('Affine', ('a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i'))): """Two dimensional affine transform for 2D linear mapping. - Parallel lines are preserved by these transforms. Affine transforms - can perform any combination of translations, scales/flips, shears, - and rotations. Class methods are provided to conveniently compose - transforms from these operations. + Parameters + ---------- + a, b, c, d, e, f : float + Coefficients of an augmented affine transformation matrix + + | x' | | a b c | | x | + | y' | = | d e f | | y | + | 1 | | 0 0 1 | | 1 | + + `a`, `b`, and `c` are the elements of the first row of the + matrix. `d`, `e`, and `f` are the elements of the second row. 
+ + Attributes + ---------- + a, b, c, d, e, f, g, h, i : float + The coefficients of the 3x3 augumented affine transformation + matrix + + | x' | | a b c | | x | + | y' | = | d e f | | y | + | 1 | | g h i | | 1 | + + `g`, `h`, and `i` are always 0, 0, and 1. + + The Affine package is derived from Casey Duncan's Planar package. + See the copyright statement below. Parallel lines are preserved by + these transforms. Affine transforms can perform any combination of + translations, scales/flips, shears, and rotations. Class methods + are provided to conveniently compose transforms from these + operations. Internally the transform is stored as a 3x3 transformation matrix. The transform may be constructed directly by specifying the first @@ -140,25 +160,19 @@ class Affine( matrices and vectors in general, but provides a convenience for users of this class. - :param members: 6 floats for the first two matrix rows. - :type members: float """ precision = EPSILON - def __new__(cls, *members): + def __new__(cls, a, b, c, d, e, f): """Create a new object Parameters ---------- - members : list of float - Affine matrix members a, b, c, d, e, f + a, b, c, d, e, f : float + Elements of an augmented affine transformation matrix. """ - if len(members) == 6: - mat3x3 = [x * 1.0 for x in members] + [0.0, 0.0, 1.0] - return tuple.__new__(cls, mat3x3) - else: - raise TypeError( - "Expected 6 coefficients, found %d" % len(members)) + mat3x3 = [x * 1.0 for x in [a, b, c, d, e, f]] + [0.0, 0.0, 1.0] + return tuple.__new__(cls, mat3x3) @classmethod def from_gdal(cls, c, a, b, f, d, e): @@ -265,7 +279,10 @@ class Affine( @classmethod def permutation(cls, *scaling): - """Create the permutation transform. For 2x2 matrices, there is only one permutation matrix that is not the identity. + """Create the permutation transform + + For 2x2 matrices, there is only one permutation matrix that is + not the identity. :rtype: Affine """ @@ -491,12 +508,17 @@ class Affine( def __rmul__(self, other): """Right hand multiplication + .. deprecated:: 2.3.0 + Right multiplication will be prohibited in version 3.0. This method + will raise AffineError. + Notes ----- We should not be called if other is an affine instance This is just a guarantee, since we would potentially return the wrong answer in that case. """ + warnings.warn("Right multiplication will be prohibited in version 3.0", DeprecationWarning, stacklevel=2) assert not isinstance(other, Affine) return self.__mul__(other) ===================================== affine/tests/test_transform.py ===================================== @@ -499,8 +499,9 @@ def test_mul_tuple(): def test_rmul_tuple(): - t = Affine(1, 2, 3, 4, 5, 6) - (2.0, 2.0) * t + with pytest.warns(DeprecationWarning): + t = Affine(1, 2, 3, 4, 5, 6) + (2.0, 2.0) * t def test_transform_precision(): ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-affine (2.3.0-1) unstable; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Thu, 05 Sep 2019 06:19:35 +0200 + python-affine (2.2.2-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. 
===================================== tox.ini ===================================== @@ -1,6 +1,6 @@ [tox] envlist = - py27,py36 + py27,py36,py37 [testenv] usedevelop = true View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/compare/3b87567a22497cfa424c1b16147aa0d7af4495b1...542a040e41296624866b749ab94d442479caea73 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/compare/3b87567a22497cfa424c1b16147aa0d7af4495b1...542a040e41296624866b749ab94d442479caea73 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:36:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:36:26 +0000 Subject: [Git][debian-gis-team/python-affine][pristine-tar] pristine-tar data for python-affine_2.3.0.orig.tar.gz Message-ID: <5d7090ca7c161_577b2ade5f3754848409c1@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-affine Commits: 0a9c9b63 by Bas Couwenberg at 2019-09-05T04:18:18Z pristine-tar data for python-affine_2.3.0.orig.tar.gz - - - - - 2 changed files: - + python-affine_2.3.0.orig.tar.gz.delta - + python-affine_2.3.0.orig.tar.gz.id Changes: ===================================== python-affine_2.3.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/python-affine_2.3.0.orig.tar.gz.delta differ ===================================== python-affine_2.3.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +4bddeef687f86c44c58130a4afb38848ea31a7ac View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/commit/0a9c9b638b219fd3eda38be205d6369a8593680b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/commit/0a9c9b638b219fd3eda38be205d6369a8593680b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:36:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:36:30 +0000 Subject: [Git][debian-gis-team/python-affine][upstream] New upstream version 2.3.0 Message-ID: <5d7090ce3c88a_577b2ade611bfd4084118d@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-affine Commits: f476055d by Bas Couwenberg at 2019-09-05T04:18:17Z New upstream version 2.3.0 - - - - - 6 changed files: - .travis.yml - AUTHORS.txt - CHANGES.txt - affine/__init__.py - affine/tests/test_transform.py - tox.ini Changes: ===================================== .travis.yml ===================================== @@ -1,9 +1,10 @@ -sudo: false +dist: xenial language: python cache: pip python: - 2.7 - 3.6 + - 3.7 install: - pip install -r requirements.txt - pip install .[test] @@ -17,4 +18,4 @@ deploy: tags: true provider: pypi distributions: "sdist bdist_wheel" - user: seang + user: __token__ ===================================== AUTHORS.txt ===================================== @@ -3,7 +3,7 @@ Authors - Sean Gillies - Steven Ring -- Mike Toews +- Mike Taves - Kevin Wurster - Todd Small - Juan Luis Cano Rodríguez ===================================== CHANGES.txt ===================================== @@ -1,6 +1,19 @@ CHANGES ======= +2.3.0 (2019-09-04) +------------------ + +Deprecations: + +- Right multiplication like vector * matrix is deprecated and will raise + AffineError in version 3.0.0. 
+ +Bug fixes: + +- Change signature of Affine constructor to help users of PyCharm (#45). +- The Affine class docstring has been improved. + 2.2.2 (2018-12-20) ------------------ - Affine.itransform computed the wrong results for arrays with rotation or ===================================== affine/__init__.py ===================================== @@ -1,12 +1,5 @@ """Affine transformation matrices -The 3x3 augmented affine transformation matrix for transformations in two -dimensions is illustrated below. - - | x' | | a b c | | x | - | y' | = | d e f | | y | - | 1 | | 0 0 1 | | 1 | - The Affine package is derived from Casey Duncan's Planar package. See the copyright statement below. """ @@ -43,11 +36,12 @@ from __future__ import division from collections import namedtuple import math +import warnings __all__ = ['Affine'] __author__ = "Sean Gillies" -__version__ = "2.2.2" +__version__ = "2.3.0" EPSILON = 1e-5 @@ -123,10 +117,36 @@ class Affine( namedtuple('Affine', ('a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i'))): """Two dimensional affine transform for 2D linear mapping. - Parallel lines are preserved by these transforms. Affine transforms - can perform any combination of translations, scales/flips, shears, - and rotations. Class methods are provided to conveniently compose - transforms from these operations. + Parameters + ---------- + a, b, c, d, e, f : float + Coefficients of an augmented affine transformation matrix + + | x' | | a b c | | x | + | y' | = | d e f | | y | + | 1 | | 0 0 1 | | 1 | + + `a`, `b`, and `c` are the elements of the first row of the + matrix. `d`, `e`, and `f` are the elements of the second row. + + Attributes + ---------- + a, b, c, d, e, f, g, h, i : float + The coefficients of the 3x3 augumented affine transformation + matrix + + | x' | | a b c | | x | + | y' | = | d e f | | y | + | 1 | | g h i | | 1 | + + `g`, `h`, and `i` are always 0, 0, and 1. + + The Affine package is derived from Casey Duncan's Planar package. + See the copyright statement below. Parallel lines are preserved by + these transforms. Affine transforms can perform any combination of + translations, scales/flips, shears, and rotations. Class methods + are provided to conveniently compose transforms from these + operations. Internally the transform is stored as a 3x3 transformation matrix. The transform may be constructed directly by specifying the first @@ -140,25 +160,19 @@ class Affine( matrices and vectors in general, but provides a convenience for users of this class. - :param members: 6 floats for the first two matrix rows. - :type members: float """ precision = EPSILON - def __new__(cls, *members): + def __new__(cls, a, b, c, d, e, f): """Create a new object Parameters ---------- - members : list of float - Affine matrix members a, b, c, d, e, f + a, b, c, d, e, f : float + Elements of an augmented affine transformation matrix. """ - if len(members) == 6: - mat3x3 = [x * 1.0 for x in members] + [0.0, 0.0, 1.0] - return tuple.__new__(cls, mat3x3) - else: - raise TypeError( - "Expected 6 coefficients, found %d" % len(members)) + mat3x3 = [x * 1.0 for x in [a, b, c, d, e, f]] + [0.0, 0.0, 1.0] + return tuple.__new__(cls, mat3x3) @classmethod def from_gdal(cls, c, a, b, f, d, e): @@ -265,7 +279,10 @@ class Affine( @classmethod def permutation(cls, *scaling): - """Create the permutation transform. For 2x2 matrices, there is only one permutation matrix that is not the identity. 
+ """Create the permutation transform + + For 2x2 matrices, there is only one permutation matrix that is + not the identity. :rtype: Affine """ @@ -491,12 +508,17 @@ class Affine( def __rmul__(self, other): """Right hand multiplication + .. deprecated:: 2.3.0 + Right multiplication will be prohibited in version 3.0. This method + will raise AffineError. + Notes ----- We should not be called if other is an affine instance This is just a guarantee, since we would potentially return the wrong answer in that case. """ + warnings.warn("Right multiplication will be prohibited in version 3.0", DeprecationWarning, stacklevel=2) assert not isinstance(other, Affine) return self.__mul__(other) ===================================== affine/tests/test_transform.py ===================================== @@ -499,8 +499,9 @@ def test_mul_tuple(): def test_rmul_tuple(): - t = Affine(1, 2, 3, 4, 5, 6) - (2.0, 2.0) * t + with pytest.warns(DeprecationWarning): + t = Affine(1, 2, 3, 4, 5, 6) + (2.0, 2.0) * t def test_transform_precision(): ===================================== tox.ini ===================================== @@ -1,6 +1,6 @@ [tox] envlist = - py27,py36 + py27,py36,py37 [testenv] usedevelop = true View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/commit/f476055da79eebc02b2531328f7fb19419b6998b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/commit/f476055da79eebc02b2531328f7fb19419b6998b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:36:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:36:31 +0000 Subject: [Git][debian-gis-team/python-affine] Pushed new tag debian/2.3.0-1 Message-ID: <5d7090cf55024_577b2ade611bfd40841338@godard.mail> Bas Couwenberg pushed new tag debian/2.3.0-1 at Debian GIS Project / python-affine -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/tree/debian/2.3.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 05:36:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 04:36:32 +0000 Subject: [Git][debian-gis-team/python-affine] Pushed new tag upstream/2.3.0 Message-ID: <5d7090d054a48_577b2ade61196f44841543@godard.mail> Bas Couwenberg pushed new tag upstream/2.3.0 at Debian GIS Project / python-affine -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/tree/upstream/2.3.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From noreply at release.debian.org Thu Sep 5 05:39:18 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 05 Sep 2019 04:39:18 +0000 Subject: pyresample 1.12.3-6 MIGRATED to testing Message-ID: FYI: The status of the pyresample source package in Debian's testing distribution has changed. Previous version: 1.12.3-5 Current version: 1.12.3-6 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
From ftpmaster at ftp-master.debian.org Thu Sep 5 05:44:27 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 04:44:27 +0000 Subject: Processing of python-affine_2.3.0-1_source.changes Message-ID: python-affine_2.3.0-1_source.changes uploaded successfully to localhost along with the files: python-affine_2.3.0-1.dsc python-affine_2.3.0.orig.tar.gz python-affine_2.3.0-1.debian.tar.xz python-affine_2.3.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Thu Sep 5 05:49:15 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 04:49:15 +0000 Subject: python-affine_2.3.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 05 Sep 2019 06:19:35 +0200 Source: python-affine Architecture: source Version: 2.3.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-affine (2.3.0-1) unstable; urgency=medium . * New upstream release. Checksums-Sha1: d798b9a9e3d32a58a69a73b970345d1d8b0ef5cd 2070 python-affine_2.3.0-1.dsc 49f2afc0072c3aca5936083aa3c8ad26368b7339 14860 python-affine_2.3.0.orig.tar.gz 3b42d45f089e40bc25bb34e4a8a151a6a3be718f 3296 python-affine_2.3.0-1.debian.tar.xz 0ec13669759370d040b24ebdc4aeb63f06488ea6 7254 python-affine_2.3.0-1_amd64.buildinfo Checksums-Sha256: 2600e10bd43eaa5e5c16a0658dceb20c2a4080653e1642a1929e5ed77f52e342 2070 python-affine_2.3.0-1.dsc 505ec1bd32b092423cfc4bc3c4ac0e68b79f1d7ff452f0b81ddd21fa8aca9c91 14860 python-affine_2.3.0.orig.tar.gz 2ec0b5c01511ae02f7a1b5a3a7fc75d7133716d4085841acb7533e6c60c9c8d0 3296 python-affine_2.3.0-1.debian.tar.xz 283ee50dfbbd64a1bb5602251d512f471a5b540ee9fa014f51e381a51baed6ed 7254 python-affine_2.3.0-1_amd64.buildinfo Files: 04060af6348622a85c7a6454bccb3468 2070 python optional python-affine_2.3.0-1.dsc 32bb6a5fec7fbe95455c992368edbf09 14860 python optional python-affine_2.3.0.orig.tar.gz 773203fd70f8b43a5c536f365139b428 3296 python optional python-affine_2.3.0-1.debian.tar.xz 7d8fb2fe181388bea0b27b315bb377b6 7254 python optional python-affine_2.3.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1wkEcACgkQZ1DxCuiN SvFD9hAAip3/oIhRArfyqQ9lF1z9Wq6ZjXmEDgCReGd4crZYkUXTMqX8oq1i2sle DzRtCDAp/0r4OwzbXAO9J5nU+R6NbUKGVKfrY+7WZhwe5KWrNDAgArdxGoF+0C29 lWC/lJR+wcgWg2Ep9g4p35cGEXZ77e6dF6UQ/C4u9jycJC78MfTl0aUHHH8L/V/k sBcjpodVzGszOCoX7G/D4aNsicgUhxPd+ONjei+Td5IUPVu65iXILFGXMvMC5rPK 1jBUFCOIvQlqhT7c9Q6Uyf4OzmPeY+jCsTLV5JVel6HMWrpLrjgIbaKBzFVKxiXh dg82Xp+aIs76LEkjKX1qxBRACSz23MR0rMjx8kXRNgso9O41NeaLL6iU0BJBnVEX KtcgVvAPUxiSTGUX3P3b/RzZ+yHAuAmXS2OLr/CZWWUIPHTJpoxy71IUJSjFEFH0 AYADXM1CbsbiMTw1ChXdUDMqxQPY1Z8uTRyvLW/ZIrxaTBhnRMmUXYSJT80Xf2xh be86RpFiPvNNHLpYQKXn1n+NmPfmZ1SnPLpkZl1W13OeODbAV6Mvrhvyht/ut1B5 bpAi62lwj2DLG465hV5ZliI1+TWeuOew+JmlbRAn3Rwx7u0QmNycS+HdDp0JrfwS w2ItiweX8x18P9XG5lPQ3zmOiLFRgi+Hbqc01cK8aVJFdJrtORw= =n5Ur -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
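The affine 2.3.0 release accepted above deprecates right multiplication (vector * matrix) and changes the Affine constructor to take exactly six coefficients a..f. A minimal sketch of the two call styles, assuming only the behaviour shown in the diff (matrix * point stays supported; point * matrix still works but now emits DeprecationWarning and is slated to raise in 3.0):

    # Minimal sketch; assumes the affine 2.3.0 package from the upload
    # above is installed and uses only the API shown in the diff.
    import warnings
    from affine import Affine

    # Coefficients a, b, c, d, e, f of the augmented matrix
    # | x' |   | a b c | | x |
    # | y' | = | d e f | | y |
    # | 1  |   | 0 0 1 | | 1 |
    t = Affine(2.0, 0.0, 10.0,
               0.0, 2.0, 20.0)

    # Supported form: matrix * point applies the transform.
    assert t * (1.0, 1.0) == (12.0, 22.0)

    # Deprecated in 2.3.0: point * matrix computes the same result but
    # now warns (see the new __rmul__ and the pytest.warns test above).
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        assert (1.0, 1.0) * t == (12.0, 22.0)
    assert any(issubclass(w.category, DeprecationWarning) for w in caught)

Callers that still rely on the vector * matrix form can simply swap the operands, since both orders currently return the same transformed point.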
From ftpmaster at ftp-master.debian.org Thu Sep 5 08:01:10 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 07:01:10 +0000 Subject: osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.changes ACCEPTED into buster-backports, buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 07:05:24 +0200 Source: osm2pgsql Binary: osm2pgsql osm2pgsql-dbgsym Architecture: source amd64 Version: 1.0.0+ds-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: osm2pgsql - OpenStreetMap data to PostgreSQL converter Changes: osm2pgsql (1.0.0+ds-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. * Update branch in gbp.conf & Vcs-Git URL. . osm2pgsql (1.0.0+ds-1) unstable; urgency=medium . * New upstream release. * Drop spelling-errors.patch, applied upstream. * Move from experimental to unstable. . osm2pgsql (1.00.0~rc1+ds-1~exp1) experimental; urgency=medium . * New upstream release candidate. * Add patch to fix spelling errors. . osm2pgsql (0.96.0+ds-3) unstable; urgency=medium . * Bump Standards-Version to 4.4.0, no changes. * Update watch file to limit matches to archive path. * Define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H for PROJ 6.0.0 compatibility. * Update gbp.conf to use --source-only-changes by default. * Append -DNDEBUG to CXXFLAGS to remove buildpath from binaries. Checksums-Sha1: 85ab47d930741d558be88c6ad7093433c80506b8 2330 osm2pgsql_1.0.0+ds-1~bpo10+1.dsc d7a306ee67485907ed1defc597ac98131c7d9269 15112 osm2pgsql_1.0.0+ds-1~bpo10+1.debian.tar.xz a41c03e24666a01bae16a798342e69ac542153d1 8269608 osm2pgsql-dbgsym_1.0.0+ds-1~bpo10+1_amd64.deb eb543291a55c6b16f78e2a7dadd3a688e02834b7 8246 osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.buildinfo e179332a84a911754824e23611443d1c117ee9b8 388492 osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.deb Checksums-Sha256: 27d60e861f2bf9f3ea9f6c024b96b29c0ebfe422cb6d4aa38dd635774afb97f9 2330 osm2pgsql_1.0.0+ds-1~bpo10+1.dsc e3ac5d1a088024fa81f273f8ad44dd8ccf92f8ab05c782c2862e8bdf53c11486 15112 osm2pgsql_1.0.0+ds-1~bpo10+1.debian.tar.xz 63c00b175a82f99f6c40c51a3d1fa72b5c07ec54141950acb48fa84715c94a4f 8269608 osm2pgsql-dbgsym_1.0.0+ds-1~bpo10+1_amd64.deb 03be855774921ca94359ce1f068ba477ed96c8fe0d32b1330ac71c09970b971c 8246 osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.buildinfo c7cf0531752a5e21b2451124aea94fba993526421e9d75fe42de0954c7fb0f86 388492 osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.deb Files: 0389f4e2414a865dc781e1dfb2c5a722 2330 utils optional osm2pgsql_1.0.0+ds-1~bpo10+1.dsc 4c7c87dcff1f1b20017292af807a7087 15112 utils optional osm2pgsql_1.0.0+ds-1~bpo10+1.debian.tar.xz eeacefa11098addebb9a3cf5a6a99454 8269608 debug optional osm2pgsql-dbgsym_1.0.0+ds-1~bpo10+1_amd64.deb 63e2246f6842541819e5b603a7219db2 8246 utils optional osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.buildinfo f793a4358bc73927608b75d5e0e5d2aa 388492 utils optional osm2pgsql_1.0.0+ds-1~bpo10+1_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1vTc4ACgkQZ1DxCuiN SvEl4BAAjmnzt1nYoZS9LtOWWHqN11uiLdXYa5cdO6B08T+cGakds44rFrB/4sZu Gdwh3L3Cngxn7rWNxhlDOqkkwuNYJRO0TnDPOEYbsvcMBpgq9A74+pKmf/iiSFHS 1Y3hsSWJsa1q9ix4dMNvWQj2CLzHdWG8X/tXhj7ZQwKoHxQzUZWoUHuRAuhRgKay XZzp5UBfAshwcWBPwLNTJh1eK76AhsMauCveHBDSI9T2pcNNTG40JCU6TeRn/lZN WunJIagpjbnj0Gpk9NBIGwhPRxJfibaoBOBmd4gOijsBmil0Fnb16ZF7rNK/h6qE 4/2d5aHb4v3se4FNdEt5Jixcxjt45UP+lv2Wp/4xlfdCkkBavu+JEWRnMb327zaW mxPEsu5FTyJ+tzebkEwXY5ca2ZLs38PDSGV7zI7fXnIltQxjsFKXbAhPGZvYI7o2 
ZqFY8lywiu3n35jJeUOkRk+FYc5yOKR4An0EYXdwplh89ft9JadkN+t7DdVxDjAs W2tdzBjFh/UZJMy6i2F2O8RaryHtgA8dC6P6XFSjFV3Yb+yc58/lUhGKsWAcUeJW nK626jKRQlVhaACDsQXJEeIupzaTov6YO1l48aq771yoUL5G2FiARgqlLfbXZI9v q/McgLcpf5b6+rgvwucHNJKz+Te1iHLaWE51MLqitfa47DNQpWA= =lPdL -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Thu Sep 5 08:01:17 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 07:01:17 +0000 Subject: qgis_3.4.11+dfsg-1~bpo10+1_amd64.changes ACCEPTED into buster-backports, buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 04 Sep 2019 06:54:44 +0200 Source: qgis Binary: libqgis-3d3.4.11 libqgis-3d3.4.11-dbgsym libqgis-analysis3.4.11 libqgis-analysis3.4.11-dbgsym libqgis-app3.4.11 libqgis-app3.4.11-dbgsym libqgis-core3.4.11 libqgis-core3.4.11-dbgsym libqgis-customwidgets libqgis-customwidgets-dbgsym libqgis-dev libqgis-gui3.4.11 libqgis-gui3.4.11-dbgsym libqgis-native3.4.11 libqgis-native3.4.11-dbgsym libqgis-server3.4.11 libqgis-server3.4.11-dbgsym libqgisgrass7-3.4.11 libqgisgrass7-3.4.11-dbgsym libqgispython3.4.11 libqgispython3.4.11-dbgsym python3-qgis python3-qgis-common python3-qgis-dbgsym qgis qgis-api-doc qgis-common qgis-dbgsym qgis-plugin-grass qgis-plugin-grass-common qgis-plugin-grass-dbgsym qgis-provider-grass qgis-provider-grass-dbgsym qgis-providers qgis-providers-common qgis-providers-dbgsym qgis-server qgis-server-dbgsym Architecture: source amd64 all Version: 3.4.11+dfsg-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: libqgis-3d3.4.11 - QGIS - shared 3d library libqgis-analysis3.4.11 - QGIS - shared analysis library libqgis-app3.4.11 - QGIS - shared app library libqgis-core3.4.11 - QGIS - shared core library libqgis-customwidgets - QGIS custom widgets for Qt Designer libqgis-dev - QGIS - development files libqgis-gui3.4.11 - QGIS - shared gui library libqgis-native3.4.11 - QGIS - shared native gui library libqgis-server3.4.11 - QGIS - shared server library libqgisgrass7-3.4.11 - QGIS - shared grass library libqgispython3.4.11 - QGIS - shared Python library python3-qgis - Python bindings to QGIS python3-qgis-common - Python bindings to QGIS - architecture-independent files qgis - Geographic Information System (GIS) qgis-api-doc - QGIS API documentation qgis-common - QGIS - architecture-independent data qgis-plugin-grass - GRASS plugin for QGIS qgis-plugin-grass-common - GRASS plugin for QGIS - architecture-independent data qgis-provider-grass - GRASS provider for QGIS qgis-providers - collection of data providers to QGIS qgis-providers-common - collection of data providers to QGIS - architecture-independent f qgis-server - QGIS server providing various OGC services Closes: 935613 Changes: qgis (3.4.11+dfsg-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. . qgis (3.4.11+dfsg-1) unstable; urgency=medium . * Add Breaks/Replaces to fix upgrade from 2.18.18. (closes: #935613) * Update symbols for other architectures. * Move from experimental to unstable. . qgis (3.4.11+dfsg-1~exp1) experimental; urgency=medium . * New upstream release. * Merge upstream packaging changes. * Update symbols for amd64. * Add lintian override for spelling-error-in-binary false positive. 
Checksums-Sha1: 4d5a3cc808b859e04bc2ee2dd78ce86d91a201f3 4707 qgis_3.4.11+dfsg-1~bpo10+1.dsc d5357af86dd2876135e5c7f2d8435ad4241e6535 265956 qgis_3.4.11+dfsg-1~bpo10+1.debian.tar.xz 38b453bf095c68cb28c01e8c03950a695305a678 10435568 libqgis-3d3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 76ab737ac3381b22110076c3575dff0c13c586e8 2139920 libqgis-3d3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 9b712a58711999d7801d9e84e519e54e18c47d3b 50372464 libqgis-analysis3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 6123a71b2cc9a616283e80fd319ee0e4d3d2101d 2712076 libqgis-analysis3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb db506e0da029ffeb56fe7a95dab7192eb8edb104 113106504 libqgis-app3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 0317ec41b6cef218b4fc3b5c7ad16000b9b2a04b 4671640 libqgis-app3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb e08c5da910c98c67265355b571338c91a671f447 157519848 libqgis-core3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 534444c10c844681501d393f292cf805445ce438 6279836 libqgis-core3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb c0f3f662b908f974646b0f1f673cc8e184b8f95d 4976432 libqgis-customwidgets-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 2c2ca850eb343ac15ea189293159299873f49bc8 5400448 libqgis-customwidgets_3.4.11+dfsg-1~bpo10+1_amd64.deb 43b4dbb5ab5aab4802595a9afebf69ee2170f6bc 2928960 libqgis-dev_3.4.11+dfsg-1~bpo10+1_amd64.deb 71ff81d4a4e24a8b3d38d905b21953a4e1a96af9 148576112 libqgis-gui3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 7753b10c6fabb19d032a640ef99f572426c5089c 4799072 libqgis-gui3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 77fe247c823f09814a32624b1ce7eb7d64550264 603460 libqgis-native3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 124071fd859a068b3ecd0fad09023914e9fa693a 1999848 libqgis-native3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 3a1ff9bd34588f18ba853adf529f43bb49f86e72 6224828 libqgis-server3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb dad4cdffb977491e5a34fdf5af9e97f3461ee63e 2131564 libqgis-server3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb fb1cfe0cd3e7844d04357d6ab38a011a7baa1f30 4912808 libqgisgrass7-3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 68b668532e19f632e2cbd08cb219eface6202e7d 2181548 libqgisgrass7-3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb f1d7e6d80c0ddf4f9f4578495b2ce2d950fd8364 387836 libqgispython3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 67a6ecb264b45707bbc54f5fc4d765333b77d601 2001524 libqgispython3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb e8b85180eda7875f1d65baa64bf10e5afbed34f3 4309496 python3-qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb 331fb538431b5649ecf16a703412b03801806231 41664320 python3-qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb f4b00b569d7841f4c90599aace81ba93ff57fdcf 9113040 python3-qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb 292d54d70f9d2416659a9a0bd1b7524f091f31cf 994711724 qgis-api-doc_3.4.11+dfsg-1~bpo10+1_all.deb b4a658edf0b7b88c2cc87af1b1f877609a78b435 12007644 qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb d5ef94ee67fbf06b502bd4fb4bc0df7e264dc428 23618524 qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb b2b1c78d0e3df8549f45c8dc30fcf89c0c8e1058 2461356 qgis-plugin-grass-common_3.4.11+dfsg-1~bpo10+1_all.deb 42557cc9e971ac48f43cf406113fa4c5dd7bd833 11256296 qgis-plugin-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 7c60458d95603a76f50e74ac7ccd0e36d09e994f 2554152 qgis-plugin-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb 60bae41f9efa5a6196bd28478ca6ecdc8e60d640 1662816 qgis-provider-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 350d16db1a6971534ad7939729df9b82a0ac5b3e 2049664 qgis-provider-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb 9826f8a2e0d1abccb66e585e117d0e6139730827 2932804 
qgis-providers-common_3.4.11+dfsg-1~bpo10+1_all.deb 4fa9cbbeeb80a2414e03c27d66125e180d7c86a4 67615736 qgis-providers-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 186eb138c3c6fa89b1e8c5680094140f8de4c1b1 3839304 qgis-providers_3.4.11+dfsg-1~bpo10+1_amd64.deb 61c55942fd69e9a44382a69ff76bad5b5b8195d8 12704068 qgis-server-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb a86e6384cac92d61736bdcbc8d2a57504f498a6c 2462740 qgis-server_3.4.11+dfsg-1~bpo10+1_amd64.deb bf82e9c881b55b228bd2ebd9f64b002bb5851711 35574 qgis_3.4.11+dfsg-1~bpo10+1_amd64.buildinfo 3ec6f4a634dde535755c7cd6bf10e81653c5694d 6802312 qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb Checksums-Sha256: a18dd3c6b4e23e498cede7f26196b0719d77b6f206d9f33d2ea1b9858a9a5cbc 4707 qgis_3.4.11+dfsg-1~bpo10+1.dsc 2dab328113e15149eaab4bcf695e1e4382fce7ddb98f6a346818aaefff2aeeb3 265956 qgis_3.4.11+dfsg-1~bpo10+1.debian.tar.xz 27754c865f6b5f937cd4c7ed13418d9e7520b4a3a5a1f1b212371fac22405dfb 10435568 libqgis-3d3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb b6d2d44dc63c9108627ffccd52f8abae3896970f0851cb5d8bdffca30018c094 2139920 libqgis-3d3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 98f94cba4bacd0e78ab58405a10fd0438600e6a5d11ab04b55700588a3f56f86 50372464 libqgis-analysis3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 1159d70f40af46c9769eda6c3ee5f240ca2f3a66ef3ef783868ccf004eba3157 2712076 libqgis-analysis3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 6d488cc26513d6fefb08da53680e2f6d3d0bf206b933cc3009dc80c5ca5aaf69 113106504 libqgis-app3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb f14a2305e6262d8fab5e87a881ec310d023e55c635a3fbfd9e0e6b65b6635d8b 4671640 libqgis-app3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb eb5820c747aad5fcbbfc6303ec97898009776cf2efd7d41d2ce7293f911236dc 157519848 libqgis-core3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb b0e76640893ae95a45e633c3077a7bd6c3888e7fce747147c0cfbd829dec2b01 6279836 libqgis-core3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 802fb6b2f45add7b26d972880e6addd8ed612c14bf7ea2cde925d824cc19c861 4976432 libqgis-customwidgets-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 1a950398327a64032b5d830aac4591d8d439b1da81713286f5e14a358470a84b 5400448 libqgis-customwidgets_3.4.11+dfsg-1~bpo10+1_amd64.deb f5a7543e69b634d0c204916bc10d5e3ef2a9444e2310300484de47d89584978f 2928960 libqgis-dev_3.4.11+dfsg-1~bpo10+1_amd64.deb 330d1b0b686a71d56122e39d99f51f4f72b1e46ad505e928c94a08cfcf7ab899 148576112 libqgis-gui3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb f38bb894ab202c377e2be350c52cca81658d63024d4d12810ff62070b57531d7 4799072 libqgis-gui3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 2f7f2832b3ba744191570f580a3bc083dbd85abdfc0f9416181f0336f60a9806 603460 libqgis-native3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb efb2149878b37bf92ec2052857ef3395764d713076c4fcd68651e8499dfc221a 1999848 libqgis-native3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 23630aed5a6fe84d006b7b55361d3528a335cbc765cf107d8dfecbf20653a34f 6224828 libqgis-server3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 052cecfbd64416591ed895dd54d977d523f8ec1859839d3ebbb597e7529563f5 2131564 libqgis-server3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 6f9119de014363236669a6f524d7bc4c4a27747ad996ab4c2d65ff295900a9e8 4912808 libqgisgrass7-3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb b12d2ee4396d6d64562c8d206a63474c5a5c4cdf5f67df638ebaf62a8d56fb17 2181548 libqgisgrass7-3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 2664a3107ecd8c4e9ada892098d0c64b46a12ebdac6c3bde1eb1975069411f1e 387836 libqgispython3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb c41a2223ce94e8bb959a85bd5e6b932d2b638dfdef46c5ada2f08f0fac3e2a90 2001524 libqgispython3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 
90a47e0327a501bc3add7248b57a8ea0de87e5a6448dd8994f02fb4a20a259c9 4309496 python3-qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb b3153f633b70b3e5f7b16d1d2e9c9fac10cd8130cd5df6aa0fcbd4e285c0c11c 41664320 python3-qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb faafdec8e9b2a7ea30d8510c599bb110fc6602b66e3415c9185023b5649f28dc 9113040 python3-qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb ea1cdcf93fe953371d002251a58a6184ffd177916154367fb8032ea4ce61148a 994711724 qgis-api-doc_3.4.11+dfsg-1~bpo10+1_all.deb b83409cb87d7d2500a2d7a44c8a53a028cc18d0b8e5d4388aae75a8f393ddcf7 12007644 qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb ae360d59fa6b5fbe9cfc350da5f498ac628b0a526c4f1ad7057219e1a12747c5 23618524 qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb dc5e932def037562542a86a39d2f13fca812ec49ae9b0678347a88f502843dc6 2461356 qgis-plugin-grass-common_3.4.11+dfsg-1~bpo10+1_all.deb cfcb41157132bc5267514f9e4c8179c7097061ac6ec4ad3227fb0cd3406a68af 11256296 qgis-plugin-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 70d68a93941948f102856643aa5265446a8278090f0c83eb123e072205f8aa35 2554152 qgis-plugin-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb 8de82c79bdb56e0bd450acd15b9905c4c2ff08c56bc34dd45bf4cc7080cfdaa7 1662816 qgis-provider-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb a51c4bcc9ee01473392a3f8cf7350249dfcf3cd58ae7f42e1551df0a054bdd1d 2049664 qgis-provider-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb 75ca79c8b5f9818a354a1f225c86f154423d9721fd046ba081e5940efea6b8ae 2932804 qgis-providers-common_3.4.11+dfsg-1~bpo10+1_all.deb 77c7ab4221b42ccd4cba9cb0f14c434dda5d98103fb6ac560196cb9f30ef497d 67615736 qgis-providers-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 025f58c9001c65c17930782759945db9225bd351fa673816a09f61b810aa1dff 3839304 qgis-providers_3.4.11+dfsg-1~bpo10+1_amd64.deb 751c7c31057e1079a3f0352d1d73cd1a990a991dd66ae6989f81af1f83df0f2f 12704068 qgis-server-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 4cb8220b9662d617e94df95b01cc5df0743179e4ba95ed5f97e84e1008979bc9 2462740 qgis-server_3.4.11+dfsg-1~bpo10+1_amd64.deb 01fd9c2dfb4f8d397a54fd305ae4d45f7b8a25a18f51dd83b53bc645dbf74c5e 35574 qgis_3.4.11+dfsg-1~bpo10+1_amd64.buildinfo ac851b7070cb035986d2683a8da313c28c04c3b50f61184ce742ea4f67e37f05 6802312 qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb Files: 8a649f7dc0493d49d8fcfd2ad9bd29ac 4707 science optional qgis_3.4.11+dfsg-1~bpo10+1.dsc f1a1f277111e574151af30aa317633dc 265956 science optional qgis_3.4.11+dfsg-1~bpo10+1.debian.tar.xz abc68349ded0a175a8b9f09d5f705c80 10435568 debug optional libqgis-3d3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 6924ee8da14bfd849c6467720eea0a83 2139920 libs optional libqgis-3d3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 7480380ec56ae386221578474dc2a61e 50372464 debug optional libqgis-analysis3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 3334cf59b9aeaff7917af895973487ec 2712076 libs optional libqgis-analysis3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 7fc8c45d5a43981b7af47dacb6e6c506 113106504 debug optional libqgis-app3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 02e9b259b44b8e0f6f8d83f6875617a0 4671640 libs optional libqgis-app3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 52937b67704bb14534f900be51890e0e 157519848 debug optional libqgis-core3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 91838b52eee3a12c40c246455d8d8bd5 6279836 libs optional libqgis-core3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb e3a662e39eb0f3e9099fa4fd629f1caa 4976432 debug optional libqgis-customwidgets-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb d30e6e311ff34df8a3355a2da4c04369 5400448 science optional libqgis-customwidgets_3.4.11+dfsg-1~bpo10+1_amd64.deb ae0e89c5544032f645337d5c33e07801 2928960 libdevel 
optional libqgis-dev_3.4.11+dfsg-1~bpo10+1_amd64.deb b6a954a9d0a894836bb1531cbe8da62d 148576112 debug optional libqgis-gui3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb dead0388aad7520f86b6cb3d6b5f1d54 4799072 libs optional libqgis-gui3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 89c5f37e6025300a3283e33cd929c8ee 603460 debug optional libqgis-native3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 724f711b18a81a85c6ab8554c196b5fd 1999848 libs optional libqgis-native3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb 529911cf3b87801a4e825668487a271e 6224828 debug optional libqgis-server3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 3dd07f9e117368997ad24c90f4719db2 2131564 libs optional libqgis-server3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb b957a08dadb1a19b1a3570a4f82e10ea 4912808 debug optional libqgisgrass7-3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 23b2c3ed1feca596e48c1a3f674e7173 2181548 libs optional libqgisgrass7-3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb e88ddc16fb8ab44832f63caee5fa0849 387836 debug optional libqgispython3.4.11-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 24dc6567db6ff23524e3cef87115078e 2001524 libs optional libqgispython3.4.11_3.4.11+dfsg-1~bpo10+1_amd64.deb d3bfba0049d2a27da9a79ea8a8b5b625 4309496 python optional python3-qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb b2ee394af0c7bb0f694789dca6a0618c 41664320 debug optional python3-qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 9c73f61c17edbf08b8bcb472dc464b98 9113040 python optional python3-qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb 8cc41975d0b8d8504cdea3f6faf21d18 994711724 doc optional qgis-api-doc_3.4.11+dfsg-1~bpo10+1_all.deb 31bbe2cb4952fedf3a05c84b3ef104a8 12007644 science optional qgis-common_3.4.11+dfsg-1~bpo10+1_all.deb 70c596e7c261d519076967a97cdb40f7 23618524 debug optional qgis-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 164aa572e7bbf2b33f89397001e83daf 2461356 science optional qgis-plugin-grass-common_3.4.11+dfsg-1~bpo10+1_all.deb 37e9966578f5647877c4e04cc982b054 11256296 debug optional qgis-plugin-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 46e9df49ba53533648575e0cddff873b 2554152 science optional qgis-plugin-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb 868a431ca8045b94f96200c40dc527f0 1662816 debug optional qgis-provider-grass-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb b15cd2e9f7e4335c627faccc8776d192 2049664 science optional qgis-provider-grass_3.4.11+dfsg-1~bpo10+1_amd64.deb 387f7e4ac8bcbe5f20c1e7791e634d1c 2932804 science optional qgis-providers-common_3.4.11+dfsg-1~bpo10+1_all.deb 702fb486823b32f25301847ca42af283 67615736 debug optional qgis-providers-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb c0ec84fb2b04076450aa422f419f88bb 3839304 science optional qgis-providers_3.4.11+dfsg-1~bpo10+1_amd64.deb 90923000c1230f0e1f19577ff65fa90f 12704068 debug optional qgis-server-dbgsym_3.4.11+dfsg-1~bpo10+1_amd64.deb 6e9c17f2113bda495436e57e8ecfa9e6 2462740 science optional qgis-server_3.4.11+dfsg-1~bpo10+1_amd64.deb 9446880bda7a23e58514be8d96734cc7 35574 science optional qgis_3.4.11+dfsg-1~bpo10+1_amd64.buildinfo 2c40ab90627410062f78709f21022ffb 6802312 science optional qgis_3.4.11+dfsg-1~bpo10+1_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1vbggACgkQZ1DxCuiN SvFoSg//aXVsELIi4ZB07w5Y6aBsB8OQRBpmqW+rAViiY7uyWRdLJjT5d6WMO86i M1RAnNJKUJpgwYVQvI6Foz0jreUqlvsYt7jhid5ltEIFPpX2lGAvYn8uihL/MVa/ JV3JtQfzcc2Od5Gz3SOOiDboJa/0L9DIYSas5yqU0YSxHB0AMiIUgm1GPtGRFp7z oLLha8tcZGPRAZRa19iVDsmg8cI9sCLESzGdcIEwz24KTN+I2bszyT09YRayDvzV duLZ4DDIvmEciQgVqq2yLX6z+YCs9QdMK0px4ZtXHbPrEdovdcYJ5gFdvjKSD+CZ 
k7UeugwSUUhDw/gEbeZnsXJMi1LrLjtgx+BJ/cVI8Olu9RO+FS7cGfEaPhJepJXW 54MwVRW2LziI97ScRDL1RrOdk6gQSAhjjHWPyWknAaA05LDqyfFf8tInvhb7DsNR wZNsdg823+TQjRYCe6IVxCciSUfRMzW6HfPdMqwhiZncZW7toyHhs3rrgqxSgekf BSkj3XLoZFpDkQWiDl2fnGfEJhIctIfT5XAfzTYpMHuqRsz29WxI+KzOrnleaO0c XMH3UHoWU7xj1BGHy9VByuzKafigBcH1GT1Zv1FBchAILsbRZwUcilAtoj1nWU7k FYlxqEenKw8ghUNCbvbKfDTrUbCzayMpv5tMZvz1pPQgRSbm9bw= =uHyR -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Thu Sep 5 14:00:19 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Thu, 05 Sep 2019 13:00:19 +0000 Subject: Processed: 936293 937304 References: <20190905144222.36beefce@vip.jcf.pm> Message-ID: Processing commands for control at bugs.debian.org: > block 936293 with 938525 938425 936806 936659 937491 937301 937417 937058 939478 939480 939482 939483 Bug #936293 [src:cheetah] cheetah: Python2 removal in sid/bullseye 936293 was not blocked by any bugs. 936293 was blocking: 937907 Added blocking bug(s) of 936293: 939483, 939480, 939478, 937058, 937301, 937417, 938525, 937491, 939482, 938425, 936806, and 936659 > block 937304 with 937154 937883 937539 938629 936515 936516 936552 936655 937411 937539 937883 938618 938666 939479 939481 Bug #937304 [src:ply] ply: Python2 removal in sid/bullseye 937304 was not blocked by any bugs. 937304 was not blocking any bugs. Added blocking bug(s) of 937304: 938618, 937411, 938666, 936655, 937539, 936515, 939481, 936552, 936516, 939479, 937883, 938629, and 937154 > End of message, stopping processing here. Please contact me if you need assistance. -- 936293: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=936293 937304: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937304 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Thu Sep 5 17:29:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 16:29:05 +0000 Subject: [Git][debian-gis-team/pgrouting][pristine-tar] pristine-tar data for pgrouting_2.6.3.orig.tar.gz Message-ID: <5d7137d131219_577b3f91ce507b6c963729@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / pgrouting Commits: 8de87466 by Bas Couwenberg at 2019-09-05T16:08:56Z pristine-tar data for pgrouting_2.6.3.orig.tar.gz - - - - - 2 changed files: - + pgrouting_2.6.3.orig.tar.gz.delta - + pgrouting_2.6.3.orig.tar.gz.id Changes: ===================================== pgrouting_2.6.3.orig.tar.gz.delta ===================================== Binary files /dev/null and b/pgrouting_2.6.3.orig.tar.gz.delta differ ===================================== pgrouting_2.6.3.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +74fd24e0593b67e5fb6c61d55d2f73de6884527a View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/commit/8de8746656bcf65a98e52990e5d8b150ed38ecbf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/commit/8de8746656bcf65a98e52990e5d8b150ed38ecbf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Thu Sep 5 17:29:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 16:29:09 +0000 Subject: [Git][debian-gis-team/pgrouting][master] 4 commits: New upstream version 2.6.3 Message-ID: <5d7137d59799e_577b2ade6111031896389f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pgrouting Commits: 2e7fc262 by Bas Couwenberg at 2019-09-05T16:08:36Z New upstream version 2.6.3 - - - - - 7eeb4a2b by Bas Couwenberg at 2019-09-05T16:08:56Z Update upstream source from tag 'upstream/2.6.3' Update to upstream version '2.6.3' with Debian dir 28ff7c9dd436b92f3b9f27b0723ea39aefee2ce1 - - - - - 4af42a5d by Bas Couwenberg at 2019-09-05T16:09:25Z New upstream release. - - - - - 0c575398 by Bas Couwenberg at 2019-09-05T16:11:14Z Set distribution to unstable. - - - - - 13 changed files: - CMakeLists.txt - NEWS - README.md - cmake/FindPostgreSQL.cmake - debian/changelog - doc/queries/doc-pgr_version.queries - doc/src/pgRouting-concepts.rst - doc/src/release_notes.rst - pgtap/common/hasnt_v2_signatures.sql - sql/alpha_shape/alpha_shape.sql - + sql/sigs/pgrouting--2.6.3.sig - sql/trsp/trsp_V2.2.sql - test/common/doc-pgr_version.result Changes: ===================================== CMakeLists.txt ===================================== @@ -182,7 +182,7 @@ endif() set(PGROUTING_VERSION_MAJOR "2") set(PGROUTING_VERSION_MINOR "6") -set(PGROUTING_VERSION_PATCH "2") +set(PGROUTING_VERSION_PATCH "3") set(PGROUTING_VERSION_DEV "") set(PGROUTING_SHORT_VERSION "${PGROUTING_VERSION_MAJOR}.${PGROUTING_VERSION_MINOR}") ===================================== NEWS ===================================== @@ -1,4 +1,17 @@ +pgRouting 2.6.2 Release Notes +------------------------------------------------------------------------------- + +To see the issues closed by this release see the [Git closed milestone for 2.6.2 ](https://github.com/pgRouting/pgrouting/issues?utf8=%E2%9C%93&q=milestone%3A%22Release%202.6.3%22%20) on Github. 
+ +*Bug fixes* + +* [#1219 ](https://github.com/pgRouting/pgrouting/pull/1219)_ Implicit cast for via_path integer to text +* [#1193 ](https://github.com/pgRouting/pgrouting/pull/1193)_ Fixed pgr_pointsAsPolygon breaking when comparing strings in WHERE clause +* [#1185 ](https://github.com/pgRouting/pgrouting/pull/1185)_ Improve FindPostgreSQL.cmake + + + pgRouting 2.6.2 Release Notes ------------------------------------------------------------------------------- ===================================== README.md ===================================== @@ -99,7 +99,7 @@ Family of functions include: ## REQUIREMENTS -Building reqirements +Building requirements -------------------- * Perl * C and C++ compilers @@ -112,7 +112,7 @@ Building reqirements * Sphinx >= 1.2 -User's reqirements +User's requirements -------------------- * PostGIS >= 2.0 ===================================== cmake/FindPostgreSQL.cmake ===================================== @@ -38,7 +38,6 @@ else(POSTGRESQL_INCLUDE_DIR AND POSTGRESQL_LIBRARIES AND POSTGRESQL_EXECUTABLE) endif(NOT "${POSTGRESQL_BIN}" STREQUAL "") message(STATUS "POSTGRESQL_PG_CONFIG is " ${POSTGRESQL_PG_CONFIG}) - if(POSTGRESQL_PG_CONFIG) execute_process( COMMAND ${POSTGRESQL_PG_CONFIG} --bindir @@ -46,26 +45,25 @@ else(POSTGRESQL_INCLUDE_DIR AND POSTGRESQL_LIBRARIES AND POSTGRESQL_EXECUTABLE) OUTPUT_VARIABLE T_POSTGRESQL_BIN) endif(POSTGRESQL_PG_CONFIG) - - # Checking POSTGRESQL_EXECUTABLE in all the dir (*) - implies that + # search for POSTGRESQL_EXECUTABLE _only_ in the dir specified by pg_config find_program(POSTGRESQL_EXECUTABLE NAMES postgres PATHS ${T_POSTGRESQL_BIN} + NO_DEFAULT_PATH ) - message(STATUS "POSTGRESQL_EXECUTABLE is " ${POSTGRESQL_EXECUTABLE}) - - + # if not found continue search in the path and all the dirs listed here (questionable) + find_program(POSTGRESQL_EXECUTABLE NAMES postgres + PATHS + /usr/lib/postgresql/*/bin/ + ) +# # more elegant, equivalent way if we want to keep both of above: # find_program(POSTGRESQL_EXECUTABLE NAMES postgres +# HINTS +# ${T_POSTGRESQL_BIN} # PATHS # /usr/lib/postgresql/*/bin/ # ) -# message(STATUS "POSTGRESQL_EXECUTABLE is " ${POSTGRESQL_EXECUTABLE}) - -# find_program(POSTGRESQL_PG_CONFIG NAMES pg_config -# PATHS -# /usr/lib/postgresql/*/bin/ -# ) -# message(STATUS "POSTGRESQL_PG_CONFIG is " ${POSTGRESQL_PG_CONFIG}) + message(STATUS "POSTGRESQL_EXECUTABLE is " ${POSTGRESQL_EXECUTABLE}) if(POSTGRESQL_PG_CONFIG) execute_process( @@ -83,9 +81,12 @@ else(POSTGRESQL_INCLUDE_DIR AND POSTGRESQL_LIBRARIES AND POSTGRESQL_EXECUTABLE) OUTPUT_VARIABLE T_POSTGRESQL_INCLUDE_DIR) endif(POSTGRESQL_PG_CONFIG) + #as with POSTGRESQL_EXECUTABLE we should/could use the path specified by pg_config only + #instead of path and our own guesses find_path(POSTGRESQL_INCLUDE_DIR postgres.h + HINTS ${T_POSTGRESQL_INCLUDE_DIR} - + PATHS /usr/include/server /usr/include/pgsql/server /usr/local/include/pgsql/server ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +pgrouting (2.6.3-1) unstable; urgency=medium + + * Team upload. + * New upstream release. + + -- Bas Couwenberg Thu, 05 Sep 2019 18:10:48 +0200 + pgrouting (2.6.2-2) unstable; urgency=medium * Team upload. 
===================================== doc/queries/doc-pgr_version.queries ===================================== @@ -6,7 +6,7 @@ SET SELECT version FROM pgr_version(); version --------- - 2.6.2 + 2.6.3 (1 row) -- q2 ===================================== doc/src/pgRouting-concepts.rst ===================================== @@ -122,7 +122,7 @@ network. The general form of a route query is: .. code-block:: none - select pgr_dijkstra(`SELECT * FROM myroads', 1, 2) + select pgr_dijkstra('SELECT * FROM myroads', 1, 2) As you can see this is fairly straight forward and you can look and the specific algorithms for the details of the signatures and how to use them. ===================================== doc/src/release_notes.rst ===================================== @@ -18,6 +18,7 @@ To see the full list of changes check the list of `Git commits `_ on Github. + +.. rubric:: Bug fixes + +* `#1219 `__ Implicit cast for via_path integer to text +* `#1193 `__ Fixed pgr_pointsAsPolygon breaking when comparing strings in WHERE clause +* `#1185 `__ Improve FindPostgreSQL.cmake + + .. _changelog_2_6_2: pgRouting 2.6.2 Release Notes ===================================== pgtap/common/hasnt_v2_signatures.sql ===================================== @@ -30,7 +30,7 @@ SELECT hasnt_function('pgr_ksp',ARRAY['text', 'integer', 'integer', 'integer', ' SELECT hasnt_function('pgr_drivingdistance',ARRAY['text', 'integer', 'double precision', 'boolean', 'boolean']); SELECT hasnt_function('pgr_bdastar',ARRAY['text', 'integer', 'integer', 'boolean', 'boolean']); SELECT hasnt_function('pgr_bddijkstra',ARRAY['text', 'integer', 'integer', 'boolean', 'boolean']); -SELECT hasnt_function('pgr_tsp',ARRAY['(double precision[]', 'integer', 'integer']); +SELECT hasnt_function('pgr_tsp',ARRAY['double precision[]', 'integer', 'integer']); -- deprecated functions SELECT hasnt_function('pgr_kdijkstracost'); ===================================== sql/alpha_shape/alpha_shape.sql ===================================== @@ -59,7 +59,7 @@ CREATE OR REPLACE FUNCTION pgr_pointsAsPolygon(query varchar, alpha float8 DEFAU geoms := array[]::geometry[]; i := 1; - FOR vertex_result IN EXECUTE 'SELECT x, y FROM pgr_alphashape('''|| query || ''', ' || alpha || ')' + FOR vertex_result IN EXECUTE 'SELECT x, y FROM pgr_alphashape('''|| REPLACE(query, E'\'', E'\'\'') || ''', ' || alpha || ')' LOOP x[i] = vertex_result.x; y[i] = vertex_result.y; ===================================== sql/sigs/pgrouting--2.6.3.sig ===================================== @@ -0,0 +1,191 @@ +#VERSION pgrouting 2.6.3 +#TYPES +pgr_costresult +pgr_costresult3 +pgr_geomresult +#FUNCTIONS +pgr_alphashape(text,double precision) +pgr_analyzegraph(text,double precision,text,text,text,text,text) +pgr_analyzeoneway(text,text[],text[],text[],text[],boolean,text,text,text) +pgr_apspjohnson(text) +pgr_apspwarshall(text,boolean,boolean) +_pgr_array_reverse(anyarray) +pgr_articulationpoints(text) +pgr_astarcostmatrix(text,anyarray,boolean,integer,double precision,double precision) +pgr_astarcost(text,anyarray,anyarray,boolean,integer,double precision,double precision) +pgr_astarcost(text,anyarray,bigint,boolean,integer,double precision,double precision) +pgr_astarcost(text,bigint,anyarray,boolean,integer,double precision,double precision) +pgr_astarcost(text,bigint,bigint,boolean,integer,double precision,double precision) +pgr_astar(text,anyarray,anyarray,boolean,integer,double precision,double precision) +_pgr_astar(text,anyarray,anyarray,boolean,integer,double precision,double 
precision,boolean,boolean) +pgr_astar(text,anyarray,bigint,boolean,integer,double precision,double precision) +pgr_astar(text,bigint,anyarray,boolean,integer,double precision,double precision) +pgr_astar(text,bigint,bigint,boolean,integer,double precision,double precision) +pgr_astar(text,integer,integer,boolean,boolean) +pgr_bdastarcostmatrix(text,anyarray,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,anyarray,anyarray,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,anyarray,bigint,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,bigint,anyarray,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,bigint,bigint,boolean,integer,numeric,numeric) +_pgr_bdastar(text,anyarray,anyarray,boolean,integer,double precision,double precision,boolean) +pgr_bdastar(text,anyarray,anyarray,boolean,integer,numeric,numeric) +pgr_bdastar(text,anyarray,bigint,boolean,integer,numeric,numeric) +pgr_bdastar(text,bigint,anyarray,boolean,integer,numeric,numeric) +pgr_bdastar(text,bigint,bigint) +pgr_bdastar(text,bigint,bigint,boolean,integer,numeric,numeric) +pgr_bdastar(text,integer,integer,boolean,boolean) +pgr_bddijkstracostmatrix(text,anyarray,boolean) +pgr_bddijkstracost(text,anyarray,anyarray,boolean) +pgr_bddijkstracost(text,anyarray,bigint,boolean) +pgr_bddijkstracost(text,bigint,anyarray,boolean) +pgr_bddijkstracost(text,bigint,bigint,boolean) +pgr_bddijkstra(text,anyarray,anyarray,boolean) +_pgr_bddijkstra(text,anyarray,anyarray,boolean,boolean) +pgr_bddijkstra(text,anyarray,bigint,boolean) +pgr_bddijkstra(text,bigint,anyarray,boolean) +pgr_bddijkstra(text,bigint,bigint) +pgr_bddijkstra(text,bigint,bigint,boolean) +pgr_bddijkstra(text,integer,integer,boolean,boolean) +pgr_biconnectedcomponents(text) +pgr_boykovkolmogorov(text,anyarray,anyarray) +pgr_boykovkolmogorov(text,anyarray,bigint) +pgr_boykovkolmogorov(text,bigint,anyarray) +pgr_boykovkolmogorov(text,bigint,bigint) +pgr_bridges(text) +_pgr_checkverttab(text,text[],integer,text) +pgr_connectedcomponents(text) +pgr_contractgraph(text,bigint[],integer,bigint[],boolean) +_pgr_createindex(text,text,text,integer,text) +_pgr_createindex(text,text,text,text,integer,text) +pgr_createtopology(text,double precision,text,text,text,text,text,boolean) +pgr_createverticestable(text,text,text,text,text) +pgr_dijkstracostmatrix(text,anyarray,boolean) +pgr_dijkstracost(text,anyarray,anyarray,boolean) +pgr_dijkstracost(text,anyarray,bigint,boolean) +pgr_dijkstracost(text,bigint,anyarray,boolean) +pgr_dijkstracost(text,bigint,bigint,boolean) +pgr_dijkstra(text,anyarray,anyarray,boolean) +_pgr_dijkstra(text,anyarray,anyarray,boolean,boolean,boolean) +pgr_dijkstra(text,anyarray,bigint,boolean) +pgr_dijkstra(text,bigint,anyarray,boolean) +pgr_dijkstra(text,bigint,bigint) +pgr_dijkstra(text,bigint,bigint,boolean) +pgr_dijkstra(text,integer,integer,boolean,boolean) +pgr_dijkstravia(text,anyarray,boolean,boolean,boolean) +pgr_drivingdistance(text,anyarray,double precision,boolean,boolean) +pgr_drivingdistance(text,bigint,double precision,boolean) +pgr_drivingdistance(text,bigint,double precision,boolean,boolean) +pgr_edgedisjointpaths(text,anyarray,anyarray,boolean) +pgr_edgedisjointpaths(text,anyarray,bigint,boolean) +pgr_edgedisjointpaths(text,bigint,anyarray,boolean) +pgr_edgedisjointpaths(text,bigint,bigint,boolean) +pgr_edmondskarp(text,anyarray,anyarray) +pgr_edmondskarp(text,anyarray,bigint) +pgr_edmondskarp(text,bigint,anyarray) +pgr_edmondskarp(text,bigint,bigint) +_pgr_endpoint(geometry) +pgr_endpoint(geometry) 
+pgr_euclediantsp(text,bigint,bigint,double precision,integer,integer,integer,double precision,double precision,double precision,boolean) +pgr_flipedges(geometry[]) +pgr_floydwarshall(text,boolean) +pgr_getcolumnname(text,text) +_pgr_getcolumnname(text,text,integer,text) +_pgr_getcolumnname(text,text,text,integer,text) +_pgr_getcolumntype(text,text,integer,text) +_pgr_getcolumntype(text,text,text,integer,text) +_pgr_get_statement(text) +pgr_gettablename(text) +_pgr_gettablename(text,integer,text) +_pgr_gsoc_vrppdtw(text,integer,double precision,double precision,integer) +pgr_gsoc_vrppdtw(text,integer,integer) +pgr_iscolumnindexed(text,text) +_pgr_iscolumnindexed(text,text,integer,text) +_pgr_iscolumnindexed(text,text,text,integer,text) +_pgr_iscolumnintable(text,text) +pgr_iscolumnintable(text,text) +pgr_johnson(text,boolean) +pgr_kdijkstracost(text,integer,integer[],boolean,boolean) +pgr_kdijkstrapath(text,integer,integer[],boolean,boolean) +_pgr_ksp(text,bigint,bigint,integer,boolean,boolean) +pgr_ksp(text,bigint,bigint,integer,boolean,boolean) +pgr_ksp(text,integer,integer,integer,boolean) +pgr_labelgraph(text,text,text,text,text,text) +pgr_linegraphfull(text) +pgr_linegraph(text,boolean) +_pgr_makedistancematrix(text) +pgr_maxcardinalitymatch(text,boolean) +pgr_maxflowboykovkolmogorov(text,anyarray,anyarray) +pgr_maxflowboykovkolmogorov(text,anyarray,bigint) +pgr_maxflowboykovkolmogorov(text,bigint,anyarray) +pgr_maxflowboykovkolmogorov(text,bigint,bigint) +pgr_maxflowedmondskarp(text,anyarray,anyarray) +pgr_maxflowedmondskarp(text,anyarray,bigint) +pgr_maxflowedmondskarp(text,bigint,anyarray) +pgr_maxflowedmondskarp(text,bigint,bigint) +pgr_maxflowpushrelabel(text,anyarray,anyarray) +pgr_maxflowpushrelabel(text,anyarray,bigint) +pgr_maxflowpushrelabel(text,bigint,anyarray) +pgr_maxflowpushrelabel(text,bigint,bigint) +pgr_maxflow(text,anyarray,anyarray) +_pgr_maxflow(text,anyarray,anyarray,integer,boolean) +pgr_maxflow(text,anyarray,bigint) +pgr_maxflow(text,bigint,anyarray) +pgr_maxflow(text,bigint,bigint) +pgr_maximumcardinalitymatching(text,boolean) +_pgr_msg(integer,text,text) +pgr_nodenetwork(text,double precision,text,text,text,text,boolean) +_pgr_onerror(boolean,integer,text,text,text,text) +_pgr_parameter_check(text,text,boolean) +_pgr_pickdelivereuclidean(text,text,double precision,integer,integer) +_pgr_pickdeliver(text,text,text,double precision,integer,integer) +pgr_pointsaspolygon(character varying,double precision) +pgr_pointstodmatrix(geometry[],integer) +pgr_pointstovids(geometry[],text,double precision) +pgr_pointtoedgenode(text,geometry,double precision) +_pgr_pointtoid(geometry,double precision,text,integer) +pgr_pushrelabel(text,anyarray,anyarray) +pgr_pushrelabel(text,anyarray,bigint) +pgr_pushrelabel(text,bigint,anyarray) +pgr_pushrelabel(text,bigint,bigint) +_pgr_quote_ident(text) +pgr_quote_ident(text) +_pgr_startpoint(geometry) +pgr_startpoint(geometry) +pgr_strongcomponents(text) +pgr_texttopoints(text,integer) +_pgr_trsp(text,integer,double precision,integer,double precision,boolean,boolean,text) +pgr_trsp(text,integer,double precision,integer,double precision,boolean,boolean,text) +pgr_trsp(text,integer,integer,boolean,boolean,text) +_pgr_trsp(text,text,anyarray,anyarray,boolean) +_pgr_trsp(text,text,anyarray,bigint,boolean) +_pgr_trsp(text,text,bigint,anyarray,boolean) +_pgr_trsp(text,text,bigint,bigint,boolean) +pgr_trspviaedges(text,integer[],double precision[],boolean,boolean,text) +pgr_trspviavertices(text,anyarray,boolean,boolean,text) 
+_pgr_trspviavertices(text,integer[],boolean,boolean,text) +pgr_tsp(double precision[],integer,integer) +pgr_tsp(text,bigint,bigint,double precision,integer,integer,integer,double precision,double precision,double precision,boolean) +pgr_tsp(text,integer,integer) +_pgr_unnest_matrix(double precision[]) +pgr_version() +_pgr_versionless(text,text) +pgr_versionless(text,text) +pgr_vidstodmatrix(integer[],geometry[],text,double precision) +pgr_vidstodmatrix(text,integer[],boolean,boolean,boolean) +_pgr_vrponedepot(text,text,text,integer) +pgr_vrponedepot(text,text,text,integer) +pgr_withpointscostmatrix(text,text,anyarray,boolean,character) +pgr_withpointscost(text,text,anyarray,anyarray,boolean,character) +pgr_withpointscost(text,text,anyarray,bigint,boolean,character) +pgr_withpointscost(text,text,bigint,anyarray,boolean,character) +pgr_withpointscost(text,text,bigint,bigint,boolean,character) +pgr_withpointsdd(text,text,anyarray,double precision,boolean,character,boolean,boolean) +pgr_withpointsdd(text,text,bigint,double precision,boolean,character,boolean) +pgr_withpointsksp(text,text,bigint,bigint,integer,boolean,boolean,character,boolean) +pgr_withpoints(text,text,anyarray,anyarray,boolean,character,boolean) +_pgr_withpoints(text,text,anyarray,anyarray,boolean,character,boolean,boolean,boolean) +pgr_withpoints(text,text,anyarray,bigint,boolean,character,boolean) +pgr_withpoints(text,text,bigint,anyarray,boolean,character,boolean) +pgr_withpoints(text,text,bigint,bigint,boolean,character,boolean) +_pgr_withpointsvia(text,bigint[],double precision[],boolean) +_trsp(text,text,anyarray,anyarray,boolean) ===================================== sql/trsp/trsp_V2.2.sql ===================================== @@ -105,7 +105,7 @@ BEGIN $6 || $$ ) SELECT ROW_NUMBER() OVER() AS id, - _pgr_array_reverse(array_prepend(target_id, string_to_array(via_path, ',')::INTEGER[])) AS path, + _pgr_array_reverse(array_prepend(target_id, string_to_array(via_path::text, ',')::INTEGER[])) AS path, to_cost AS cost FROM old_restrictions; $$; ===================================== test/common/doc-pgr_version.result ===================================== @@ -6,7 +6,7 @@ SET SELECT version FROM pgr_version(); version --------- - 2.6.2 + 2.6.3 (1 row) -- q2 View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/compare/4aeabf7ead4308f668cbfa0c58945c4f334c1bb8...0c5753989be544f3e877d0a1aa788723674b3235 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/compare/4aeabf7ead4308f668cbfa0c58945c4f334c1bb8...0c5753989be544f3e877d0a1aa788723674b3235 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Thu Sep 5 17:29:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 16:29:15 +0000 Subject: [Git][debian-gis-team/pgrouting][upstream] New upstream version 2.6.3 Message-ID: <5d7137dbe2cca_577b2ade61bd0334964020@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / pgrouting Commits: 2e7fc262 by Bas Couwenberg at 2019-09-05T16:08:36Z New upstream version 2.6.3 - - - - - 12 changed files: - CMakeLists.txt - NEWS - README.md - cmake/FindPostgreSQL.cmake - doc/queries/doc-pgr_version.queries - doc/src/pgRouting-concepts.rst - doc/src/release_notes.rst - pgtap/common/hasnt_v2_signatures.sql - sql/alpha_shape/alpha_shape.sql - + sql/sigs/pgrouting--2.6.3.sig - sql/trsp/trsp_V2.2.sql - test/common/doc-pgr_version.result Changes: ===================================== CMakeLists.txt ===================================== @@ -182,7 +182,7 @@ endif() set(PGROUTING_VERSION_MAJOR "2") set(PGROUTING_VERSION_MINOR "6") -set(PGROUTING_VERSION_PATCH "2") +set(PGROUTING_VERSION_PATCH "3") set(PGROUTING_VERSION_DEV "") set(PGROUTING_SHORT_VERSION "${PGROUTING_VERSION_MAJOR}.${PGROUTING_VERSION_MINOR}") ===================================== NEWS ===================================== @@ -1,4 +1,17 @@ +pgRouting 2.6.2 Release Notes +------------------------------------------------------------------------------- + +To see the issues closed by this release see the [Git closed milestone for 2.6.2 ](https://github.com/pgRouting/pgrouting/issues?utf8=%E2%9C%93&q=milestone%3A%22Release%202.6.3%22%20) on Github. + +*Bug fixes* + +* [#1219 ](https://github.com/pgRouting/pgrouting/pull/1219)_ Implicit cast for via_path integer to text +* [#1193 ](https://github.com/pgRouting/pgrouting/pull/1193)_ Fixed pgr_pointsAsPolygon breaking when comparing strings in WHERE clause +* [#1185 ](https://github.com/pgRouting/pgrouting/pull/1185)_ Improve FindPostgreSQL.cmake + + + pgRouting 2.6.2 Release Notes ------------------------------------------------------------------------------- ===================================== README.md ===================================== @@ -99,7 +99,7 @@ Family of functions include: ## REQUIREMENTS -Building reqirements +Building requirements -------------------- * Perl * C and C++ compilers @@ -112,7 +112,7 @@ Building reqirements * Sphinx >= 1.2 -User's reqirements +User's requirements -------------------- * PostGIS >= 2.0 ===================================== cmake/FindPostgreSQL.cmake ===================================== @@ -38,7 +38,6 @@ else(POSTGRESQL_INCLUDE_DIR AND POSTGRESQL_LIBRARIES AND POSTGRESQL_EXECUTABLE) endif(NOT "${POSTGRESQL_BIN}" STREQUAL "") message(STATUS "POSTGRESQL_PG_CONFIG is " ${POSTGRESQL_PG_CONFIG}) - if(POSTGRESQL_PG_CONFIG) execute_process( COMMAND ${POSTGRESQL_PG_CONFIG} --bindir @@ -46,26 +45,25 @@ else(POSTGRESQL_INCLUDE_DIR AND POSTGRESQL_LIBRARIES AND POSTGRESQL_EXECUTABLE) OUTPUT_VARIABLE T_POSTGRESQL_BIN) endif(POSTGRESQL_PG_CONFIG) - - # Checking POSTGRESQL_EXECUTABLE in all the dir (*) - implies that + # search for POSTGRESQL_EXECUTABLE _only_ in the dir specified by pg_config find_program(POSTGRESQL_EXECUTABLE NAMES postgres PATHS ${T_POSTGRESQL_BIN} + NO_DEFAULT_PATH ) - message(STATUS "POSTGRESQL_EXECUTABLE is " ${POSTGRESQL_EXECUTABLE}) - - + # if not found continue search in the path and all the dirs listed here (questionable) + find_program(POSTGRESQL_EXECUTABLE NAMES postgres + PATHS + /usr/lib/postgresql/*/bin/ + ) +# # more elegant, equivalent way if we want to 
keep both of above: # find_program(POSTGRESQL_EXECUTABLE NAMES postgres +# HINTS +# ${T_POSTGRESQL_BIN} # PATHS # /usr/lib/postgresql/*/bin/ # ) -# message(STATUS "POSTGRESQL_EXECUTABLE is " ${POSTGRESQL_EXECUTABLE}) - -# find_program(POSTGRESQL_PG_CONFIG NAMES pg_config -# PATHS -# /usr/lib/postgresql/*/bin/ -# ) -# message(STATUS "POSTGRESQL_PG_CONFIG is " ${POSTGRESQL_PG_CONFIG}) + message(STATUS "POSTGRESQL_EXECUTABLE is " ${POSTGRESQL_EXECUTABLE}) if(POSTGRESQL_PG_CONFIG) execute_process( @@ -83,9 +81,12 @@ else(POSTGRESQL_INCLUDE_DIR AND POSTGRESQL_LIBRARIES AND POSTGRESQL_EXECUTABLE) OUTPUT_VARIABLE T_POSTGRESQL_INCLUDE_DIR) endif(POSTGRESQL_PG_CONFIG) + #as with POSTGRESQL_EXECUTABLE we should/could use the path specified by pg_config only + #instead of path and our own guesses find_path(POSTGRESQL_INCLUDE_DIR postgres.h + HINTS ${T_POSTGRESQL_INCLUDE_DIR} - + PATHS /usr/include/server /usr/include/pgsql/server /usr/local/include/pgsql/server ===================================== doc/queries/doc-pgr_version.queries ===================================== @@ -6,7 +6,7 @@ SET SELECT version FROM pgr_version(); version --------- - 2.6.2 + 2.6.3 (1 row) -- q2 ===================================== doc/src/pgRouting-concepts.rst ===================================== @@ -122,7 +122,7 @@ network. The general form of a route query is: .. code-block:: none - select pgr_dijkstra(`SELECT * FROM myroads', 1, 2) + select pgr_dijkstra('SELECT * FROM myroads', 1, 2) As you can see this is fairly straight forward and you can look and the specific algorithms for the details of the signatures and how to use them. ===================================== doc/src/release_notes.rst ===================================== @@ -18,6 +18,7 @@ To see the full list of changes check the list of `Git commits `_ on Github. + +.. rubric:: Bug fixes + +* `#1219 `__ Implicit cast for via_path integer to text +* `#1193 `__ Fixed pgr_pointsAsPolygon breaking when comparing strings in WHERE clause +* `#1185 `__ Improve FindPostgreSQL.cmake + + .. 
_changelog_2_6_2: pgRouting 2.6.2 Release Notes ===================================== pgtap/common/hasnt_v2_signatures.sql ===================================== @@ -30,7 +30,7 @@ SELECT hasnt_function('pgr_ksp',ARRAY['text', 'integer', 'integer', 'integer', ' SELECT hasnt_function('pgr_drivingdistance',ARRAY['text', 'integer', 'double precision', 'boolean', 'boolean']); SELECT hasnt_function('pgr_bdastar',ARRAY['text', 'integer', 'integer', 'boolean', 'boolean']); SELECT hasnt_function('pgr_bddijkstra',ARRAY['text', 'integer', 'integer', 'boolean', 'boolean']); -SELECT hasnt_function('pgr_tsp',ARRAY['(double precision[]', 'integer', 'integer']); +SELECT hasnt_function('pgr_tsp',ARRAY['double precision[]', 'integer', 'integer']); -- deprecated functions SELECT hasnt_function('pgr_kdijkstracost'); ===================================== sql/alpha_shape/alpha_shape.sql ===================================== @@ -59,7 +59,7 @@ CREATE OR REPLACE FUNCTION pgr_pointsAsPolygon(query varchar, alpha float8 DEFAU geoms := array[]::geometry[]; i := 1; - FOR vertex_result IN EXECUTE 'SELECT x, y FROM pgr_alphashape('''|| query || ''', ' || alpha || ')' + FOR vertex_result IN EXECUTE 'SELECT x, y FROM pgr_alphashape('''|| REPLACE(query, E'\'', E'\'\'') || ''', ' || alpha || ')' LOOP x[i] = vertex_result.x; y[i] = vertex_result.y; ===================================== sql/sigs/pgrouting--2.6.3.sig ===================================== @@ -0,0 +1,191 @@ +#VERSION pgrouting 2.6.3 +#TYPES +pgr_costresult +pgr_costresult3 +pgr_geomresult +#FUNCTIONS +pgr_alphashape(text,double precision) +pgr_analyzegraph(text,double precision,text,text,text,text,text) +pgr_analyzeoneway(text,text[],text[],text[],text[],boolean,text,text,text) +pgr_apspjohnson(text) +pgr_apspwarshall(text,boolean,boolean) +_pgr_array_reverse(anyarray) +pgr_articulationpoints(text) +pgr_astarcostmatrix(text,anyarray,boolean,integer,double precision,double precision) +pgr_astarcost(text,anyarray,anyarray,boolean,integer,double precision,double precision) +pgr_astarcost(text,anyarray,bigint,boolean,integer,double precision,double precision) +pgr_astarcost(text,bigint,anyarray,boolean,integer,double precision,double precision) +pgr_astarcost(text,bigint,bigint,boolean,integer,double precision,double precision) +pgr_astar(text,anyarray,anyarray,boolean,integer,double precision,double precision) +_pgr_astar(text,anyarray,anyarray,boolean,integer,double precision,double precision,boolean,boolean) +pgr_astar(text,anyarray,bigint,boolean,integer,double precision,double precision) +pgr_astar(text,bigint,anyarray,boolean,integer,double precision,double precision) +pgr_astar(text,bigint,bigint,boolean,integer,double precision,double precision) +pgr_astar(text,integer,integer,boolean,boolean) +pgr_bdastarcostmatrix(text,anyarray,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,anyarray,anyarray,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,anyarray,bigint,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,bigint,anyarray,boolean,integer,numeric,numeric) +pgr_bdastarcost(text,bigint,bigint,boolean,integer,numeric,numeric) +_pgr_bdastar(text,anyarray,anyarray,boolean,integer,double precision,double precision,boolean) +pgr_bdastar(text,anyarray,anyarray,boolean,integer,numeric,numeric) +pgr_bdastar(text,anyarray,bigint,boolean,integer,numeric,numeric) +pgr_bdastar(text,bigint,anyarray,boolean,integer,numeric,numeric) +pgr_bdastar(text,bigint,bigint) +pgr_bdastar(text,bigint,bigint,boolean,integer,numeric,numeric) 
+pgr_bdastar(text,integer,integer,boolean,boolean) +pgr_bddijkstracostmatrix(text,anyarray,boolean) +pgr_bddijkstracost(text,anyarray,anyarray,boolean) +pgr_bddijkstracost(text,anyarray,bigint,boolean) +pgr_bddijkstracost(text,bigint,anyarray,boolean) +pgr_bddijkstracost(text,bigint,bigint,boolean) +pgr_bddijkstra(text,anyarray,anyarray,boolean) +_pgr_bddijkstra(text,anyarray,anyarray,boolean,boolean) +pgr_bddijkstra(text,anyarray,bigint,boolean) +pgr_bddijkstra(text,bigint,anyarray,boolean) +pgr_bddijkstra(text,bigint,bigint) +pgr_bddijkstra(text,bigint,bigint,boolean) +pgr_bddijkstra(text,integer,integer,boolean,boolean) +pgr_biconnectedcomponents(text) +pgr_boykovkolmogorov(text,anyarray,anyarray) +pgr_boykovkolmogorov(text,anyarray,bigint) +pgr_boykovkolmogorov(text,bigint,anyarray) +pgr_boykovkolmogorov(text,bigint,bigint) +pgr_bridges(text) +_pgr_checkverttab(text,text[],integer,text) +pgr_connectedcomponents(text) +pgr_contractgraph(text,bigint[],integer,bigint[],boolean) +_pgr_createindex(text,text,text,integer,text) +_pgr_createindex(text,text,text,text,integer,text) +pgr_createtopology(text,double precision,text,text,text,text,text,boolean) +pgr_createverticestable(text,text,text,text,text) +pgr_dijkstracostmatrix(text,anyarray,boolean) +pgr_dijkstracost(text,anyarray,anyarray,boolean) +pgr_dijkstracost(text,anyarray,bigint,boolean) +pgr_dijkstracost(text,bigint,anyarray,boolean) +pgr_dijkstracost(text,bigint,bigint,boolean) +pgr_dijkstra(text,anyarray,anyarray,boolean) +_pgr_dijkstra(text,anyarray,anyarray,boolean,boolean,boolean) +pgr_dijkstra(text,anyarray,bigint,boolean) +pgr_dijkstra(text,bigint,anyarray,boolean) +pgr_dijkstra(text,bigint,bigint) +pgr_dijkstra(text,bigint,bigint,boolean) +pgr_dijkstra(text,integer,integer,boolean,boolean) +pgr_dijkstravia(text,anyarray,boolean,boolean,boolean) +pgr_drivingdistance(text,anyarray,double precision,boolean,boolean) +pgr_drivingdistance(text,bigint,double precision,boolean) +pgr_drivingdistance(text,bigint,double precision,boolean,boolean) +pgr_edgedisjointpaths(text,anyarray,anyarray,boolean) +pgr_edgedisjointpaths(text,anyarray,bigint,boolean) +pgr_edgedisjointpaths(text,bigint,anyarray,boolean) +pgr_edgedisjointpaths(text,bigint,bigint,boolean) +pgr_edmondskarp(text,anyarray,anyarray) +pgr_edmondskarp(text,anyarray,bigint) +pgr_edmondskarp(text,bigint,anyarray) +pgr_edmondskarp(text,bigint,bigint) +_pgr_endpoint(geometry) +pgr_endpoint(geometry) +pgr_euclediantsp(text,bigint,bigint,double precision,integer,integer,integer,double precision,double precision,double precision,boolean) +pgr_flipedges(geometry[]) +pgr_floydwarshall(text,boolean) +pgr_getcolumnname(text,text) +_pgr_getcolumnname(text,text,integer,text) +_pgr_getcolumnname(text,text,text,integer,text) +_pgr_getcolumntype(text,text,integer,text) +_pgr_getcolumntype(text,text,text,integer,text) +_pgr_get_statement(text) +pgr_gettablename(text) +_pgr_gettablename(text,integer,text) +_pgr_gsoc_vrppdtw(text,integer,double precision,double precision,integer) +pgr_gsoc_vrppdtw(text,integer,integer) +pgr_iscolumnindexed(text,text) +_pgr_iscolumnindexed(text,text,integer,text) +_pgr_iscolumnindexed(text,text,text,integer,text) +_pgr_iscolumnintable(text,text) +pgr_iscolumnintable(text,text) +pgr_johnson(text,boolean) +pgr_kdijkstracost(text,integer,integer[],boolean,boolean) +pgr_kdijkstrapath(text,integer,integer[],boolean,boolean) +_pgr_ksp(text,bigint,bigint,integer,boolean,boolean) +pgr_ksp(text,bigint,bigint,integer,boolean,boolean) 
+pgr_ksp(text,integer,integer,integer,boolean) +pgr_labelgraph(text,text,text,text,text,text) +pgr_linegraphfull(text) +pgr_linegraph(text,boolean) +_pgr_makedistancematrix(text) +pgr_maxcardinalitymatch(text,boolean) +pgr_maxflowboykovkolmogorov(text,anyarray,anyarray) +pgr_maxflowboykovkolmogorov(text,anyarray,bigint) +pgr_maxflowboykovkolmogorov(text,bigint,anyarray) +pgr_maxflowboykovkolmogorov(text,bigint,bigint) +pgr_maxflowedmondskarp(text,anyarray,anyarray) +pgr_maxflowedmondskarp(text,anyarray,bigint) +pgr_maxflowedmondskarp(text,bigint,anyarray) +pgr_maxflowedmondskarp(text,bigint,bigint) +pgr_maxflowpushrelabel(text,anyarray,anyarray) +pgr_maxflowpushrelabel(text,anyarray,bigint) +pgr_maxflowpushrelabel(text,bigint,anyarray) +pgr_maxflowpushrelabel(text,bigint,bigint) +pgr_maxflow(text,anyarray,anyarray) +_pgr_maxflow(text,anyarray,anyarray,integer,boolean) +pgr_maxflow(text,anyarray,bigint) +pgr_maxflow(text,bigint,anyarray) +pgr_maxflow(text,bigint,bigint) +pgr_maximumcardinalitymatching(text,boolean) +_pgr_msg(integer,text,text) +pgr_nodenetwork(text,double precision,text,text,text,text,boolean) +_pgr_onerror(boolean,integer,text,text,text,text) +_pgr_parameter_check(text,text,boolean) +_pgr_pickdelivereuclidean(text,text,double precision,integer,integer) +_pgr_pickdeliver(text,text,text,double precision,integer,integer) +pgr_pointsaspolygon(character varying,double precision) +pgr_pointstodmatrix(geometry[],integer) +pgr_pointstovids(geometry[],text,double precision) +pgr_pointtoedgenode(text,geometry,double precision) +_pgr_pointtoid(geometry,double precision,text,integer) +pgr_pushrelabel(text,anyarray,anyarray) +pgr_pushrelabel(text,anyarray,bigint) +pgr_pushrelabel(text,bigint,anyarray) +pgr_pushrelabel(text,bigint,bigint) +_pgr_quote_ident(text) +pgr_quote_ident(text) +_pgr_startpoint(geometry) +pgr_startpoint(geometry) +pgr_strongcomponents(text) +pgr_texttopoints(text,integer) +_pgr_trsp(text,integer,double precision,integer,double precision,boolean,boolean,text) +pgr_trsp(text,integer,double precision,integer,double precision,boolean,boolean,text) +pgr_trsp(text,integer,integer,boolean,boolean,text) +_pgr_trsp(text,text,anyarray,anyarray,boolean) +_pgr_trsp(text,text,anyarray,bigint,boolean) +_pgr_trsp(text,text,bigint,anyarray,boolean) +_pgr_trsp(text,text,bigint,bigint,boolean) +pgr_trspviaedges(text,integer[],double precision[],boolean,boolean,text) +pgr_trspviavertices(text,anyarray,boolean,boolean,text) +_pgr_trspviavertices(text,integer[],boolean,boolean,text) +pgr_tsp(double precision[],integer,integer) +pgr_tsp(text,bigint,bigint,double precision,integer,integer,integer,double precision,double precision,double precision,boolean) +pgr_tsp(text,integer,integer) +_pgr_unnest_matrix(double precision[]) +pgr_version() +_pgr_versionless(text,text) +pgr_versionless(text,text) +pgr_vidstodmatrix(integer[],geometry[],text,double precision) +pgr_vidstodmatrix(text,integer[],boolean,boolean,boolean) +_pgr_vrponedepot(text,text,text,integer) +pgr_vrponedepot(text,text,text,integer) +pgr_withpointscostmatrix(text,text,anyarray,boolean,character) +pgr_withpointscost(text,text,anyarray,anyarray,boolean,character) +pgr_withpointscost(text,text,anyarray,bigint,boolean,character) +pgr_withpointscost(text,text,bigint,anyarray,boolean,character) +pgr_withpointscost(text,text,bigint,bigint,boolean,character) +pgr_withpointsdd(text,text,anyarray,double precision,boolean,character,boolean,boolean) +pgr_withpointsdd(text,text,bigint,double precision,boolean,character,boolean) 
+pgr_withpointsksp(text,text,bigint,bigint,integer,boolean,boolean,character,boolean) +pgr_withpoints(text,text,anyarray,anyarray,boolean,character,boolean) +_pgr_withpoints(text,text,anyarray,anyarray,boolean,character,boolean,boolean,boolean) +pgr_withpoints(text,text,anyarray,bigint,boolean,character,boolean) +pgr_withpoints(text,text,bigint,anyarray,boolean,character,boolean) +pgr_withpoints(text,text,bigint,bigint,boolean,character,boolean) +_pgr_withpointsvia(text,bigint[],double precision[],boolean) +_trsp(text,text,anyarray,anyarray,boolean) ===================================== sql/trsp/trsp_V2.2.sql ===================================== @@ -105,7 +105,7 @@ BEGIN $6 || $$ ) SELECT ROW_NUMBER() OVER() AS id, - _pgr_array_reverse(array_prepend(target_id, string_to_array(via_path, ',')::INTEGER[])) AS path, + _pgr_array_reverse(array_prepend(target_id, string_to_array(via_path::text, ',')::INTEGER[])) AS path, to_cost AS cost FROM old_restrictions; $$; ===================================== test/common/doc-pgr_version.result ===================================== @@ -6,7 +6,7 @@ SET SELECT version FROM pgr_version(); version --------- - 2.6.2 + 2.6.3 (1 row) -- q2 View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/commit/2e7fc2622263acb485c06ff55be78580e4d68dea -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/commit/2e7fc2622263acb485c06ff55be78580e4d68dea You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 17:29:18 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 16:29:18 +0000 Subject: [Git][debian-gis-team/pgrouting] Pushed new tag debian/2.6.3-1 Message-ID: <5d7137de17a4a_577b2ade61bd0334964220@godard.mail> Bas Couwenberg pushed new tag debian/2.6.3-1 at Debian GIS Project / pgrouting -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/tree/debian/2.6.3-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 5 17:29:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 05 Sep 2019 16:29:19 +0000 Subject: [Git][debian-gis-team/pgrouting] Pushed new tag upstream/2.6.3 Message-ID: <5d7137dfd0dc_577b3f91ce1f2f949644a9@godard.mail> Bas Couwenberg pushed new tag upstream/2.6.3 at Debian GIS Project / pgrouting -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/tree/upstream/2.6.3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Thu Sep 5 17:39:32 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 16:39:32 +0000 Subject: Processing of pgrouting_2.6.3-1_source.changes Message-ID: pgrouting_2.6.3-1_source.changes uploaded successfully to localhost along with the files: pgrouting_2.6.3-1.dsc pgrouting_2.6.3.orig.tar.gz pgrouting_2.6.3-1.debian.tar.xz pgrouting_2.6.3-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Thu Sep 5 17:58:12 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 05 Sep 2019 16:58:12 +0000 Subject: pgrouting_2.6.3-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 05 Sep 2019 18:10:48 +0200 Source: pgrouting Architecture: source Version: 2.6.3-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: pgrouting (2.6.3-1) unstable; urgency=medium . * Team upload. * New upstream release. Checksums-Sha1: cc67bee23a99f9e5e2c84f7be8e84dd68f078bed 2344 pgrouting_2.6.3-1.dsc 6adb11cc49a47baaa4599909c46d0c73393aaa91 6066698 pgrouting_2.6.3.orig.tar.gz 872f25a28825477a8c53b640b6ee21dba0cf596f 16264 pgrouting_2.6.3-1.debian.tar.xz c42ec4baca55137a3653c4ed331ec266cee0e534 13529 pgrouting_2.6.3-1_amd64.buildinfo Checksums-Sha256: 2e943b7a155d754bc57e6fb65b28d6a793254df902716dd09133a4abc4a05b2b 2344 pgrouting_2.6.3-1.dsc 7ebef19dc698d4e85b85274f6949e77b26fe5a2b79335589bc3fbdfca977eb0f 6066698 pgrouting_2.6.3.orig.tar.gz 40aaef7cd1bf7d1bcf7452ee1ac22e105ae5fd2306e65de9272f951652d8b8e6 16264 pgrouting_2.6.3-1.debian.tar.xz 5229a305d3078fb4f249264dba1193f15e08ca1e88be2dae45cb223d0fd441a8 13529 pgrouting_2.6.3-1_amd64.buildinfo Files: ad33e090ff5734043703f393c0870a15 2344 misc optional pgrouting_2.6.3-1.dsc 16f8537a553d2953ef50726764dd7d66 6066698 misc optional pgrouting_2.6.3.orig.tar.gz d918f87db4133764c7384d306d159a3b 16264 misc optional pgrouting_2.6.3-1.debian.tar.xz 5195581238a6729b68e07527e92959e3 13529 misc optional pgrouting_2.6.3-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1xN4MACgkQZ1DxCuiN SvEj1BAAn1MLa0iUo2zpMn8+FqQf10Jq6UA2R8HU/Bk5bOENPrAx30DpZcVJEi1u gNxy3/Z+Ee8EO9KdD51QTYoyqAIRfaUYRs42HLgDpiuJYtXaQH88fM5PSbE6RNYg c2XXFeLSxjWDNZzFDyP9hElDXqH0mxkB5UXxdU2xF06CsvQSWXd5j7sE0s0aQQhH bKYxwBEdY6JI4OjFh4DQ1xYUWLRhEpMttOKOoQs+AsWfo7gxd3ztwfMV290yXiPW tnE6UUg3uzuzO+EImAIyEC/uxCuFjOrW+Fp6VPAns0M2T1MEkdane68EmUtfR/7O QO60hZXROTQ+7GAc9QZLNkLVgnyH8qTJlzTKcIktqP5nOGODvqic710RY5KYoS9c /y7Riy35HaIz9ESzp83xsIp+J3t5I/T4kF/IjlH4PRRWshTWbF8o9+inRJqeWh4P O7ddoHDtT51FmUJTjjG1H2dN5wF3mjSx9YkdGN/fcNuYULDyBGEq3yQ7FCMBGrz8 O/OKmlfHvpS0uSnqYT43hppKh+sQUpzxpWyd1usQM86QKipCVGOhHaGw3dHHwPXX imdRCNPXJignhhGmdqjNXkjRZr+WrcugyGEDhfeegLDWcrfEbgtIkCGXvMGck12d Chqka7KLdDTjyB0xmG+PedLIZTstNfkGgZS+eKRfmyzPlgbbyvM= =TXbd -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From debian-bts-link at lists.debian.org Thu Sep 5 20:27:24 2019 From: debian-bts-link at lists.debian.org (debian-bts-link at lists.debian.org) Date: Thu, 05 Sep 2019 19:27:24 +0000 Subject: [bts-link] source package src:totalopenstation Message-ID: <156771164427.9023.14031625188256993248.btslink@sonntag.debian.org> # # bts-link upstream status pull for source package src:totalopenstation # see http://lists.debian.org/debian-devel-announce/2006/05/msg00001.html # https://bts-link-team.pages.debian.net/bts-link/ # user debian-bts-link at lists.debian.org # remote status report for #938680 (http://bugs.debian.org/938680) # Bug title: totalopenstation: Python2 removal in sid/bullseye # * https://github.com/steko/totalopenstation/issues/32 # * remote status changed: (?) -> open usertags 938680 + status-open thanks From noreply at release.debian.org Fri Sep 6 05:39:24 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 06 Sep 2019 04:39:24 +0000 Subject: pycsw 2.4.1+dfsg-1 MIGRATED to testing Message-ID: FYI: The status of the pycsw source package in Debian's testing distribution has changed. Previous version: 2.4.0+dfsg-2 Current version: 2.4.1+dfsg-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Fri Sep 6 05:39:22 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 06 Sep 2019 04:39:22 +0000 Subject: mkgmap 0.0.0+svn4289-1 MIGRATED to testing Message-ID: FYI: The status of the mkgmap source package in Debian's testing distribution has changed. Previous version: 0.0.0+svn4287-1 Current version: 0.0.0+svn4289-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Fri Sep 6 06:15:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 06 Sep 2019 05:15:44 +0000 Subject: [Git][debian-gis-team/rasterio][pristine-tar] pristine-tar data for rasterio_1.0.27.orig.tar.gz Message-ID: <5d71eb803dcdc_577b3f91ce507b6c10770ec@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / rasterio Commits: 47d5ef83 by Bas Couwenberg at 2019-09-06T05:01:53Z pristine-tar data for rasterio_1.0.27.orig.tar.gz - - - - - 2 changed files: - + rasterio_1.0.27.orig.tar.gz.delta - + rasterio_1.0.27.orig.tar.gz.id Changes: ===================================== rasterio_1.0.27.orig.tar.gz.delta ===================================== Binary files /dev/null and b/rasterio_1.0.27.orig.tar.gz.delta differ ===================================== rasterio_1.0.27.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +f6154a4087bb0c6f975e31a90f4ea68d7c16562e View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/47d5ef830798c4aa6a30216e45771eb89edb42a6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/47d5ef830798c4aa6a30216e45771eb89edb42a6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 6 06:15:55 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 06 Sep 2019 05:15:55 +0000 Subject: [Git][debian-gis-team/rasterio][master] 4 commits: New upstream version 1.0.27 Message-ID: <5d71eb8b32bd_577b2ade61bd033410772be@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / rasterio Commits: d6d54f4f by Bas Couwenberg at 2019-09-06T05:01:06Z New upstream version 1.0.27 - - - - - d2485d13 by Bas Couwenberg at 2019-09-06T05:01:54Z Update upstream source from tag 'upstream/1.0.27' Update to upstream version '1.0.27' with Debian dir 43433957a3b2867f99b0023d181964d3be55b00c - - - - - d732dd85 by Bas Couwenberg at 2019-09-06T05:02:28Z New upstream release. - - - - - bf42bcb4 by Bas Couwenberg at 2019-09-06T05:03:38Z Set distribution to unstable. - - - - - 25 changed files: - CHANGES.txt - debian/changelog - rasterio/__init__.py - rasterio/_base.pyx - rasterio/_features.pyx - rasterio/_io.pyx - rasterio/_warp.pyx - rasterio/compat.py - rasterio/crs.py - rasterio/features.py - rasterio/rio/stack.py - rasterio/transform.py - rasterio/vrt.py - rasterio/windows.py - tests/conftest.py - + tests/data/float32.tif - tests/test_blocks.py - tests/test_colormap.py - tests/test_dataset.py - tests/test_plot.py - tests/test_profile.py - tests/test_rio_info.py - tests/test_rio_merge.py - tests/test_warp.py - tests/test_warpedvrt.py Changes: ===================================== CHANGES.txt ===================================== @@ -1,6 +1,29 @@ Changes ======= +1.0.27 (2019-09-05) +------------------- + +- Resolve #1744 by adding a `dtype` keyword argument to the WarpedVRT + constructor. It allows a user to specify the working data type for the warp + operation and output. +- All cases of deprecated affine right multiplication have been changed to be + forward compatible with affine 3.0. The rasterio tests now pass without + warnings. +- The coordinate transformer used in _base._transform() is now properly + deleted, fixing the memory leak reported in #1713. +- An unavoidable warning about 4-channel colormap entries in + DatasetWriterBase.write_colormap() has been removed. +- All deprecated imports of abstract base classes for collections have been + corrected, eliminating the warnings reported in #1742 and #1764. +- DatasetWriterBase no longer requires that GeoTIFF block sizes be smaller than + the raster size (#1760). Block sizes are however checked to ensure that they + are multiples of 16. +- DatasetBase.is_tiled has been made more reliable, fixing #1376. +- Tests have been added to demonstrate that image corruption when writing + block-wise to an image with extra large block sizes (#520) is no longer an + issue. + 1.0.26 (2019-08-26) ------------------- ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +rasterio (1.0.27-1) unstable; urgency=medium + + * Team upload. + * New upstream release. + + -- Bas Couwenberg Fri, 06 Sep 2019 07:03:26 +0200 + rasterio (1.0.26-1) unstable; urgency=medium * Team upload. 
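The affine note above amounts to keeping the transform on the left-hand side of the multiplication, which is exactly what the features.py and _features.pyx hunks below switch to. A minimal sketch of the two forms, using an illustrative transform rather than anything from the rasterio test data:

    from affine import Affine

    transform = Affine.translation(100.0, 200.0) * Affine.scale(10.0, -10.0)
    col, row = 3, 5
    # Forward compatible with affine 3.0: the matrix stays on the left.
    x, y = transform * (col, row)
    # The deprecated form placed the coordinate pair on the left:
    #   x, y = (col, row) * transform
    print(x, y)  # 130.0 150.0

The same left-multiplied form is what geometry_window() now uses to convert pixel bounds back to the CRS domain.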
===================================== rasterio/__init__.py ===================================== @@ -42,7 +42,7 @@ import rasterio.path __all__ = ['band', 'open', 'pad', 'Env'] -__version__ = "1.0.26" +__version__ = "1.0.27" __gdal_version__ = gdal_version() # Rasterio attaches NullHandler to the 'rasterio' logger and its ===================================== rasterio/_base.pyx ===================================== @@ -879,7 +879,15 @@ cdef class DatasetBase(object): def is_tiled(self): if len(self.block_shapes) == 0: return False - return self.block_shapes[0][1] < self.width and self.block_shapes[0][1] <= 1024 + else: + blockysize, blockxsize = self.block_shapes[0] + if blockxsize % 16 or blockysize % 16: + return False + # Perfectly square is a special case/ + if blockxsize == blockysize == self.height == self.width: + return True + else: + return blockxsize < self.width or blockxsize > self.width property profile: """Basic metadata and creation options of this dataset. @@ -1269,17 +1277,6 @@ def _transform(src_crs, dst_crs, xs, ys, zs): transform = exc_wrap_pointer(transform) exc_wrap_int(OCTTransform(transform, n, x, y, z)) - except CPLE_BaseError as exc: - log.debug("{}".format(exc)) - - except: - CPLFree(x) - CPLFree(y) - CPLFree(z) - _safe_osr_release(src) - _safe_osr_release(dst) - - try: res_xs = [0]*n res_ys = [0]*n for i in range(n): @@ -1297,6 +1294,7 @@ def _transform(src_crs, dst_crs, xs, ys, zs): CPLFree(x) CPLFree(y) CPLFree(z) + OCTDestroyCoordinateTransformation(transform) _safe_osr_release(src) _safe_osr_release(dst) ===================================== rasterio/_features.pyx ===================================== @@ -367,7 +367,7 @@ def _bounds(geometry, north_up=True, transform=None): else: if transform is not None: xyz = list(_explode(geometry['coordinates'])) - xyz_px = [point * transform for point in xyz] + xyz_px = [transform * point for point in xyz] xyz = tuple(zip(*xyz_px)) return min(xyz[0]), max(xyz[1]), max(xyz[0]), min(xyz[1]) else: ===================================== rasterio/_io.pyx ===================================== @@ -25,7 +25,7 @@ from rasterio.enums import ColorInterp, MaskFlags, Resampling from rasterio.errors import ( CRSError, DriverRegistrationError, RasterioIOError, NotGeoreferencedWarning, NodataShadowWarning, WindowError, - UnsupportedOperation, OverviewCreationError + UnsupportedOperation, OverviewCreationError, RasterBlockError ) from rasterio.sample import sample_gen from rasterio.transform import Affine @@ -1077,7 +1077,14 @@ cdef class DatasetWriterBase(DatasetReaderBase): # Process dataset opening options. # "tiled" affects the meaning of blocksize, so we need it # before iterating. - tiled = bool(kwargs.get('tiled', False)) + tiled = kwargs.pop("tiled", False) or kwargs.pop("TILED", False) + + if tiled: + blockxsize = kwargs.get("blockxsize", None) + blockysize = kwargs.get("blockysize", None) + if (blockxsize and blockxsize % 16) or (blockysize and blockysize % 16): + raise RasterBlockError("The height and width of dataset blocks must be multiples of 16") + kwargs["tiled"] = "TRUE" for k, v in kwargs.items(): # Skip items that are definitely *not* valid driver @@ -1087,11 +1094,8 @@ cdef class DatasetWriterBase(DatasetReaderBase): k, v = k.upper(), str(v).upper() - # Guard against block size that exceed image size. 
- if k == 'BLOCKXSIZE' and tiled and int(v) > width: - raise ValueError("blockxsize exceeds raster width.") - if k == 'BLOCKYSIZE' and tiled and int(v) > height: - raise ValueError("blockysize exceeds raster height.") + if k in ['BLOCKXSIZE', 'BLOCKYSIZE'] and not tiled: + continue key_b = k.encode('utf-8') val_b = v.encode('utf-8') @@ -1505,12 +1509,8 @@ cdef class DatasetWriterBase(DatasetReaderBase): vals = range(256) for i, rgba in colormap.items(): - if len(rgba) == 4 and self.driver in ('GTiff'): - warnings.warn( - "This format doesn't support alpha in colormap entries. " - "The value will be ignored.") - elif len(rgba) == 3: + if len(rgba) == 3: rgba = tuple(rgba) + (255,) if i not in vals: ===================================== rasterio/_warp.pyx ===================================== @@ -112,7 +112,7 @@ def _transform_geom( cdef GDALWarpOptions * create_warp_options( GDALResampleAlg resampling, object src_nodata, object dst_nodata, int src_count, - object dst_alpha, object src_alpha, int warp_mem_limit, const char **options) except NULL: + object dst_alpha, object src_alpha, int warp_mem_limit, GDALDataType working_data_type, const char **options) except NULL: """Return a pointer to a GDALWarpOptions composed from input params This is used in _reproject() and the WarpedVRT constructor. It sets @@ -145,6 +145,7 @@ cdef GDALWarpOptions * create_warp_options( warp_extras = CSLMerge(warp_extras, options) + psWOptions.eWorkingDataType = working_data_type psWOptions.eResampleAlg = resampling if warp_mem_limit > 0: @@ -212,6 +213,7 @@ def _reproject( init_dest_nodata=True, num_threads=1, warp_mem_limit=0, + working_data_type=0, **kwargs): """ Reproject a source raster to a destination raster. @@ -315,7 +317,7 @@ def _reproject( if not in_dtype_range(dst_nodata, destination.dtype): raise ValueError("dst_nodata must be in valid range for " "destination dtype") - + def format_transform(in_transform): if not in_transform: return in_transform @@ -461,7 +463,7 @@ def _reproject( psWOptions = create_warp_options( resampling, src_nodata, - dst_nodata, src_count, dst_alpha, src_alpha, warp_mem_limit, + dst_nodata, src_count, dst_alpha, src_alpha, warp_mem_limit, working_data_type, warp_extras) psWOptions.pfnTransformer = pfnTransformer @@ -607,7 +609,7 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): dst_width=None, width=None, dst_height=None, height=None, src_transform=None, dst_transform=None, transform=None, init_dest_nodata=True, src_alpha=0, add_alpha=False, - warp_mem_limit=0, **warp_extras): + warp_mem_limit=0, dtype=None, **warp_extras): """Make a virtual warped dataset Parameters @@ -655,6 +657,8 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): warp_mem_limit : int, optional The warp operation's memory limit in MB. The default (0) means 64 MB with GDAL 2.2. + dtype : str, optional + The working data type for warp operation and output. warp_extras : dict GDAL extra warp options. See http://www.gdal.org/structGDALWarpOptions.html. 
@@ -735,6 +739,7 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): self.name = "WarpedVRT({})".format(src_dataset.name) self.resampling = resampling self.tolerance = tolerance + self.working_dtype = dtype self.src_nodata = self.src_dataset.nodata if src_nodata is DEFAULT_NODATA_FLAG else src_nodata self.dst_nodata = self.src_nodata if nodata is DEFAULT_NODATA_FLAG else nodata @@ -839,7 +844,8 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): psWOptions = create_warp_options( c_resampling, self.src_nodata, self.dst_nodata, src_dataset.count, dst_alpha, - src_alpha_band, warp_mem_limit, c_warp_extras) + src_alpha_band, warp_mem_limit, dtypes.dtype_rev[self.working_dtype], + c_warp_extras) if psWOptions == NULL: raise RuntimeError("Warp options are NULL") ===================================== rasterio/compat.py ===================================== @@ -12,7 +12,7 @@ if sys.version_info[0] >= 3: # pragma: no cover import configparser from urllib.parse import urlparse from collections import UserDict - from collections.abc import Mapping + from collections.abc import Iterable, Mapping from inspect import getfullargspec as getargspec else: # pragma: no cover string_types = basestring, @@ -23,4 +23,4 @@ else: # pragma: no cover from urlparse import urlparse from UserDict import UserDict from inspect import getargspec - from collections import Mapping + from collections import Iterable, Mapping ===================================== rasterio/crs.py ===================================== @@ -10,7 +10,6 @@ used. """ -import collections import json import pickle ===================================== rasterio/features.py ===================================== @@ -419,8 +419,8 @@ def geometry_window(dataset, shapes, pad_x=0, pad_y=0, north_up=True, right = min(dataset.shape[1], right) bottom = min(dataset.shape[0], bottom) # convert the bounds back to the CRS domain - left, top = (left, top) * dataset.transform - right, bottom = (right, bottom) * dataset.transform + left, top = dataset.transform * (left, top) + right, bottom = dataset.transform * (right, bottom) window = dataset.window(left, bottom, right, top) window_floored = window.round_offsets(op='floor', pixel_precision=pixel_precision) ===================================== rasterio/rio/stack.py ===================================== @@ -1,14 +1,13 @@ """$ rio stack""" -import collections import logging import click from cligj import format_opt import rasterio -from rasterio.compat import zip_longest +from rasterio.compat import Iterable, zip_longest from rasterio.rio import options from rasterio.rio.helpers import resolve_inout @@ -104,7 +103,7 @@ def stack(ctx, files, output, driver, bidx, photometric, overwrite, data = src.read(index) dst.write(data, dst_idx) dst_idx += 1 - elif isinstance(index, collections.Iterable): + elif isinstance(index, Iterable): data = src.read(index) dst.write(data, range(dst_idx, dst_idx + len(index))) dst_idx += len(index) ===================================== rasterio/transform.py ===================================== @@ -2,11 +2,12 @@ from __future__ import division -import collections import math from affine import Affine +from rasterio.compat import Iterable + IDENTITY = Affine.identity() GDAL_IDENTITY = IDENTITY.to_gdal() @@ -153,10 +154,10 @@ def xy(transform, rows, cols, offset='center'): single_col = False single_row = False - if not isinstance(cols, collections.Iterable): + if not isinstance(cols, Iterable): cols = [cols] single_col = True - if not isinstance(rows, collections.Iterable): + if not 
isinstance(rows, Iterable): rows = [rows] single_row = True @@ -221,10 +222,10 @@ def rowcol(transform, xs, ys, op=math.floor, precision=None): single_x = False single_y = False - if not isinstance(xs, collections.Iterable): + if not isinstance(xs, Iterable): xs = [xs] single_x = True - if not isinstance(ys, collections.Iterable): + if not isinstance(ys, Iterable): ys = [ys] single_y = True ===================================== rasterio/vrt.py ===================================== @@ -21,6 +21,57 @@ class WarpedVRT(WarpedVRTReaderBase, WindowMethodsMixin, This class is backed by an in-memory GDAL VRTWarpedDataset VRT file. + Parameters + ---------- + src_dataset : dataset object + The warp source. + src_crs : CRS or str, optional + Overrides the coordinate reference system of `src_dataset`. + src_transfrom : Affine, optional + Overrides the transform of `src_dataset`. + src_nodata : float, optional + Overrides the nodata value of `src_dataset`, which is the + default. + crs : CRS or str, optional + The coordinate reference system at the end of the warp + operation. Default: the crs of `src_dataset`. dst_crs is + a deprecated alias for this parameter. + transform : Affine, optional + The transform for the virtual dataset. Default: will be + computed from the attributes of `src_dataset`. dst_transform + is a deprecated alias for this parameter. + height, width: int, optional + The dimensions of the virtual dataset. Defaults: will be + computed from the attributes of `src_dataset`. dst_height + and dst_width are deprecated alias for these parameters. + nodata : float, optional + Nodata value for the virtual dataset. Default: the nodata + value of `src_dataset` or 0.0. dst_nodata is a deprecated + alias for this parameter. + resampling : Resampling, optional + Warp resampling algorithm. Default: `Resampling.nearest`. + tolerance : float, optional + The maximum error tolerance in input pixels when + approximating the warp transformation. Default: 0.125, + or one-eigth of a pixel. + src_alpha : int, optional + Index of a source band to use as an alpha band for warping. + add_alpha : bool, optional + Whether to add an alpha masking band to the virtual dataset. + Default: False. This option will cause deletion of the VRT + nodata value. + init_dest_nodata : bool, optional + Whether or not to initialize output to `nodata`. Default: + True. + warp_mem_limit : int, optional + The warp operation's memory limit in MB. The default (0) + means 64 MB with GDAL 2.2. + dtype : str, optional + The working data type for warp operation and output. + warp_extras : dict + GDAL extra warp options. See + http://www.gdal.org/structGDALWarpOptions.html. + Attributes ---------- src_dataset : dataset @@ -39,6 +90,8 @@ class WarpedVRT(WarpedVRTReaderBase, WindowMethodsMixin, The nodata value used to initialize the destination; it will remain in all areas not covered by the reprojected source. Defaults to the value of src_nodata, or 0 (gdal default). + working_dtype : str, optional + The working data type for warp operation and output. warp_extras : dict GDAL extra warp options. See http://www.gdal.org/structGDALWarpOptions.html. 
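Taken together, the parameters documented above can be exercised as in the following minimal sketch, which assumes the small float32 GeoTIFF (tests/data/float32.tif) added in this release and a web-mercator target CRS chosen purely for illustration:

    import rasterio
    from rasterio.enums import Resampling
    from rasterio.vrt import WarpedVRT

    with rasterio.open("tests/data/float32.tif") as src:
        with WarpedVRT(src, crs="EPSG:3857", resampling=Resampling.nearest,
                       dtype="float32") as vrt:
            band = vrt.read(1)
            # working_dtype mirrors the new dtype keyword argument.
            print(vrt.crs, vrt.working_dtype, band.dtype)

With dtype="float32" the warp keeps the source data type rather than falling back to the float64 default described in #1744; the new tests in test_warpedvrt.py below pin down the same behaviour.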
===================================== rasterio/windows.py ===================================== @@ -27,6 +27,7 @@ import attr from affine import Affine import numpy as np +from rasterio.compat import Iterable from rasterio.errors import WindowError from rasterio.transform import rowcol, guard_transform @@ -114,7 +115,7 @@ def iter_args(function): """ @functools.wraps(function) def wrapper(*args, **kwargs): - if len(args) == 1 and isinstance(args[0], collections.Iterable): + if len(args) == 1 and isinstance(args[0], Iterable): return function(*args[0]) else: return function(*args) ===================================== tests/conftest.py ===================================== @@ -27,8 +27,8 @@ if sys.version_info > (3,): reduce = functools.reduce test_files = [os.path.join(os.path.dirname(__file__), p) for p in [ - 'data/RGB.byte.tif', 'data/float.tif', 'data/float_nan.tif', - 'data/shade.tif', 'data/RGBA.byte.tif']] + 'data/RGB.byte.tif', 'data/float.tif', 'data/float32.tif', + 'data/float_nan.tif', 'data/shade.tif', 'data/RGBA.byte.tif']] def pytest_cmdline_main(config): @@ -601,6 +601,10 @@ requires_gdal22 = pytest.mark.skipif( not gdal_version.at_least('2.2'), reason="Requires GDAL 2.2.x") +requires_gdal23 = pytest.mark.skipif( + not gdal_version.at_least('2.3'), + reason="Requires GDAL ~= 2.3") + requires_gdal_lt_3 = pytest.mark.skipif( gdal_version.__lt__('3.0'), reason="Requires GDAL 1.x/2.x") ===================================== tests/data/float32.tif ===================================== Binary files /dev/null and b/tests/data/float32.tif differ ===================================== tests/test_blocks.py ===================================== @@ -11,6 +11,7 @@ import pytest import rasterio from rasterio import windows from rasterio.errors import RasterBlockError +from rasterio.profiles import default_gtiff_profile from .conftest import requires_gdal2 @@ -51,6 +52,7 @@ class WindowTest(unittest.TestCase): rasterio.windows.evaluate(((None, -10), (None, -10)), 100, 90), windows.Window.from_slices((0, 90), (0, 80))) + def test_window_index(): idx = rasterio.windows.window_index(((0, 4), (1, 12))) assert len(idx) == 2 @@ -191,3 +193,29 @@ def test_block_window_tiff(path_rgb_byte_tif): with rasterio.open(path_rgb_byte_tif) as src: for (i, j), w in src.block_windows(): assert src.block_window(1, i, j) == w + + + at pytest.mark.parametrize("blocksize", [16, 32, 256, 1024]) +def test_block_windows_bigger_blocksize(tmpdir, blocksize): + """Ensure that block sizes greater than raster size are ok""" + tempfile = str(tmpdir.join("test.tif")) + profile = default_gtiff_profile.copy() + profile.update(height=16, width=16, count=1, blockxsize=blocksize, blockysize=blocksize) + with rasterio.open(tempfile, "w", **profile) as dst: + assert dst.is_tiled + for ij, window in dst.block_windows(): + dst.write(np.ones((1, 1), dtype="uint8"), 1, window=window) + + with rasterio.open(tempfile) as dst: + assert list(dst.block_windows()) == [((0, 0), windows.Window(0, 0, 16, 16))] + assert (dst.read(1) == 1).all() + + + at pytest.mark.parametrize("blocksizes", [{"blockxsize": 33, "blockysize": 32}, {"blockxsize": 32, "blockysize": 33}]) +def test_odd_blocksize_error(tmpdir, blocksizes): + """For a tiled TIFF block sizes must be multiples of 16""" + tempfile = str(tmpdir.join("test.tif")) + profile = default_gtiff_profile.copy() + profile.update(height=64, width=64, count=1, **blocksizes) + with pytest.raises(RasterBlockError): + rasterio.open(tempfile, "w", **profile) ===================================== 
tests/test_colormap.py ===================================== @@ -5,17 +5,12 @@ import sys import rasterio -logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) - - def test_write_colormap_warn(tmpdir, recwarn): with rasterio.open('tests/data/shade.tif') as src: profile = src.meta tiffname = str(tmpdir.join('foo.tif')) with rasterio.open(tiffname, 'w', **profile) as dst: dst.write_colormap(1, {0: (255, 0, 0, 255), 255: (0, 0, 0, 0)}) - w = recwarn.pop(UserWarning) - assert "The value will be ignored" in str(w.message) def test_write_colormap(tmpdir): ===================================== tests/test_dataset.py ===================================== @@ -47,23 +47,16 @@ def test_untiled_dataset_blocksize(tmpdir): """Blocksize is not relevant to untiled datasets (see #1689)""" tmpfile = str(tmpdir.join("test.tif")) with rasterio.open( - tmpfile, "w", driver="GTiff", count=1, height=13, width=13, dtype="uint8", crs="epsg:3857", - transform=Affine.identity(), blockxsize=256, blockysize=256) as dataset: + tmpfile, "w", driver="GTiff", count=1, height=13, width=23, dtype="uint8", crs="epsg:3857", + transform=Affine.identity(), blockxsize=64, blockysize=64) as dataset: pass with rasterio.open(tmpfile) as dataset: assert not dataset.profile["tiled"] - assert dataset.shape == (13, 13) + assert dataset.shape == (13, 23) + assert dataset.block_shapes == [(13, 23)] -def test_tiled_dataset_blocksize_guard(tmpdir): - """Tiled datasets with dimensions less than blocksize are not permitted""" - tmpfile = str(tmpdir.join("test.tif")) - with pytest.raises(ValueError): - rasterio.open( - tmpfile, "w", driver="GTiff", count=1, height=13, width=13, dtype="uint8", crs="epsg:3857", - transform=Affine.identity(), tiled=True, blockxsize=256, blockysize=256) - def test_dataset_readonly_attributes(path_rgb_byte_tif): """Attempts to set read-only attributes fail with DatasetAttributeError""" with pytest.raises(DatasetAttributeError): ===================================== tests/test_plot.py ===================================== @@ -58,6 +58,7 @@ def test_show_cmyk_interp(tmpdir): meta = src.meta meta['photometric'] = 'cmyk' meta['count'] = 4 + del meta["nodata"] tiffname = str(tmpdir.join('foo.tif')) with rasterio.open(tiffname, 'w', **meta) as dst: assert dst.colorinterp == ( ===================================== tests/test_profile.py ===================================== @@ -62,24 +62,6 @@ def test_open_with_profile(tmpdir): assert not dst.closed -def test_blockxsize_guard(tmpdir): - """blockxsize can't be greater than image width.""" - tiffname = str(tmpdir.join('foo.tif')) - with pytest.raises(ValueError): - profile = default_gtiff_profile.copy() - profile.update(count=1, height=256, width=128) - rasterio.open(tiffname, 'w', **profile) - - -def test_blockysize_guard(tmpdir): - """blockysize can't be greater than image height.""" - tiffname = str(tmpdir.join('foo.tif')) - with pytest.raises(ValueError): - profile = default_gtiff_profile.copy() - profile.update(count=1, width=256, height=128) - rasterio.open(tiffname, 'w', **profile) - - def test_profile_overlay(path_rgb_byte_tif): with rasterio.open(path_rgb_byte_tif) as src: kwds = src.profile ===================================== tests/test_rio_info.py ===================================== @@ -8,7 +8,7 @@ import rasterio from rasterio.rio.main import main_group from rasterio.env import GDALVersion -from .conftest import requires_gdal21 +from .conftest import requires_gdal21, requires_gdal23 with rasterio.Env() as env: @@ -455,7 +455,7 @@ def 
test_info_no_credentials(tmpdir, monkeypatch): assert result.exit_code == 0 - at requires_gdal21(reason="S3 raster access requires GDAL 2.1+") + at requires_gdal23(reason="Unsigned S3 requests require GDAL ~= 2.3") @pytest.mark.network def test_info_aws_unsigned(): """Unsigned access to public dataset works (see #1637)""" ===================================== tests/test_rio_merge.py ===================================== @@ -105,9 +105,9 @@ def test_merge_with_colormap(test_data_dir_1): inputs = [str(x) for x in test_data_dir_1.listdir()] inputs.sort() - # Add a colormap to the first input prior merge - with rasterio.open(inputs[0], 'r+') as src: - src.write_colormap(1, {0: (255, 0, 0, 255), 255: (0, 0, 0, 0)}) + for inputname in inputs: + with rasterio.open(inputname, 'r+') as src: + src.write_colormap(1, {0: (255, 0, 0, 255), 255: (0, 0, 0, 255)}) runner = CliRunner() result = runner.invoke(main_group, ['merge'] + inputs + [outputname]) @@ -139,6 +139,7 @@ def test_merge_with_nodata(test_data_dir_1): assert np.all(data == expected) + at pytest.mark.filterwarnings("ignore:Input file's nodata value") def test_merge_error(test_data_dir_1): """A nodata value outside the valid range results in an error""" outputname = str(test_data_dir_1.join('merged.tif')) ===================================== tests/test_warp.py ===================================== @@ -1368,7 +1368,7 @@ def test_reproject_dst_nodata(): resampling=Resampling.nearest, ) - assert (out > 0).sum() == 438113 + assert (out[~np.isnan(out)] > 0.0).sum() == 438113 assert out[0, 0] != 0 assert np.isnan(out[0, 0]) ===================================== tests/test_warpedvrt.py ===================================== @@ -392,3 +392,44 @@ def test_invalid_add_alpha(): with rasterio.open('tests/data/RGBA.byte.tif') as src: with pytest.raises(WarpOptionsError): WarpedVRT(src, add_alpha=True) + + +def test_warpedvrt_float32_preserve(data): + """WarpedVRT preserves float32 dtype of source""" + with rasterio.open("tests/data/float32.tif") as src: + with WarpedVRT(src, src_crs="EPSG:4326") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) + + +def test_warpedvrt_float32_override(data): + """Override GDAL defaults for working data type""" + float32file = str(data.join("float32.tif")) + with rasterio.open(float32file, "r+") as dst: + dst.nodata = -3.4028230607370965e+38 + + with rasterio.open(float32file) as src: + with WarpedVRT(src, src_crs="EPSG:4326", dtype="float32") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) + + +def test_warpedvrt_float32_overridei_nodata(data): + """Override GDAL defaults for working data type""" + float32file = str(data.join("float32.tif")) + with rasterio.open(float32file, "r+") as dst: + dst.nodata = -3.4028230607370965e+38 + + with rasterio.open(float32file) as src: + with WarpedVRT(src, src_crs="EPSG:4326", nodata=0.0001, dtype="float32") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) + + + at pytest.mark.xfail(reason="GDAL's output defaults to float64") +def test_warpedvrt_issue1744(data): + """Reproduce the bug reported in 1744""" + float32file = str(data.join("float32.tif")) + with rasterio.open(float32file, "r+") as dst: + dst.nodata = -3.4028230607370965e+38 + + with rasterio.open(float32file) as src: + with WarpedVRT(src, src_crs="EPSG:4326") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/compare/34e47f757905ec19a24e8a834a57a3486b5c9c0d...bf42bcb4d332d58d7ae92c3d9336ac0e26d37ced -- View it on GitLab: 
https://salsa.debian.org/debian-gis-team/rasterio/compare/34e47f757905ec19a24e8a834a57a3486b5c9c0d...bf42bcb4d332d58d7ae92c3d9336ac0e26d37ced You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 6 06:16:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 06 Sep 2019 05:16:06 +0000 Subject: [Git][debian-gis-team/rasterio] Pushed new tag debian/1.0.27-1 Message-ID: <5d71eb96a7957_577b3f91ce507b6c1077644@godard.mail> Bas Couwenberg pushed new tag debian/1.0.27-1 at Debian GIS Project / rasterio -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/tree/debian/1.0.27-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 6 06:16:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 06 Sep 2019 05:16:06 +0000 Subject: [Git][debian-gis-team/rasterio][upstream] New upstream version 1.0.27 Message-ID: <5d71eb962ba10_577b2ade611103181077462@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / rasterio Commits: d6d54f4f by Bas Couwenberg at 2019-09-06T05:01:06Z New upstream version 1.0.27 - - - - - 24 changed files: - CHANGES.txt - rasterio/__init__.py - rasterio/_base.pyx - rasterio/_features.pyx - rasterio/_io.pyx - rasterio/_warp.pyx - rasterio/compat.py - rasterio/crs.py - rasterio/features.py - rasterio/rio/stack.py - rasterio/transform.py - rasterio/vrt.py - rasterio/windows.py - tests/conftest.py - + tests/data/float32.tif - tests/test_blocks.py - tests/test_colormap.py - tests/test_dataset.py - tests/test_plot.py - tests/test_profile.py - tests/test_rio_info.py - tests/test_rio_merge.py - tests/test_warp.py - tests/test_warpedvrt.py Changes: ===================================== CHANGES.txt ===================================== @@ -1,6 +1,29 @@ Changes ======= +1.0.27 (2019-09-05) +------------------- + +- Resolve #1744 by adding a `dtype` keyword argument to the WarpedVRT + constructor. It allows a user to specify the working data type for the warp + operation and output. +- All cases of deprecated affine right multiplication have been changed to be + forward compatible with affine 3.0. The rasterio tests now pass without + warnings. +- The coordinate transformer used in _base._transform() is now properly + deleted, fixing the memory leak reported in #1713. +- An unavoidable warning about 4-channel colormap entries in + DatasetWriterBase.write_colormap() has been removed. +- All deprecated imports of abstract base classes for collections have been + corrected, eliminating the warnings reported in #1742 and #1764. +- DatasetWriterBase no longer requires that GeoTIFF block sizes be smaller than + the raster size (#1760). Block sizes are however checked to ensure that they + are multiples of 16. +- DatasetBase.is_tiled has been made more reliable, fixing #1376. +- Tests have been added to demonstrate that image corruption when writing + block-wise to an image with extra large block sizes (#520) is no longer an + issue. 
+ 1.0.26 (2019-08-26) ------------------- ===================================== rasterio/__init__.py ===================================== @@ -42,7 +42,7 @@ import rasterio.path __all__ = ['band', 'open', 'pad', 'Env'] -__version__ = "1.0.26" +__version__ = "1.0.27" __gdal_version__ = gdal_version() # Rasterio attaches NullHandler to the 'rasterio' logger and its ===================================== rasterio/_base.pyx ===================================== @@ -879,7 +879,15 @@ cdef class DatasetBase(object): def is_tiled(self): if len(self.block_shapes) == 0: return False - return self.block_shapes[0][1] < self.width and self.block_shapes[0][1] <= 1024 + else: + blockysize, blockxsize = self.block_shapes[0] + if blockxsize % 16 or blockysize % 16: + return False + # Perfectly square is a special case/ + if blockxsize == blockysize == self.height == self.width: + return True + else: + return blockxsize < self.width or blockxsize > self.width property profile: """Basic metadata and creation options of this dataset. @@ -1269,17 +1277,6 @@ def _transform(src_crs, dst_crs, xs, ys, zs): transform = exc_wrap_pointer(transform) exc_wrap_int(OCTTransform(transform, n, x, y, z)) - except CPLE_BaseError as exc: - log.debug("{}".format(exc)) - - except: - CPLFree(x) - CPLFree(y) - CPLFree(z) - _safe_osr_release(src) - _safe_osr_release(dst) - - try: res_xs = [0]*n res_ys = [0]*n for i in range(n): @@ -1297,6 +1294,7 @@ def _transform(src_crs, dst_crs, xs, ys, zs): CPLFree(x) CPLFree(y) CPLFree(z) + OCTDestroyCoordinateTransformation(transform) _safe_osr_release(src) _safe_osr_release(dst) ===================================== rasterio/_features.pyx ===================================== @@ -367,7 +367,7 @@ def _bounds(geometry, north_up=True, transform=None): else: if transform is not None: xyz = list(_explode(geometry['coordinates'])) - xyz_px = [point * transform for point in xyz] + xyz_px = [transform * point for point in xyz] xyz = tuple(zip(*xyz_px)) return min(xyz[0]), max(xyz[1]), max(xyz[0]), min(xyz[1]) else: ===================================== rasterio/_io.pyx ===================================== @@ -25,7 +25,7 @@ from rasterio.enums import ColorInterp, MaskFlags, Resampling from rasterio.errors import ( CRSError, DriverRegistrationError, RasterioIOError, NotGeoreferencedWarning, NodataShadowWarning, WindowError, - UnsupportedOperation, OverviewCreationError + UnsupportedOperation, OverviewCreationError, RasterBlockError ) from rasterio.sample import sample_gen from rasterio.transform import Affine @@ -1077,7 +1077,14 @@ cdef class DatasetWriterBase(DatasetReaderBase): # Process dataset opening options. # "tiled" affects the meaning of blocksize, so we need it # before iterating. - tiled = bool(kwargs.get('tiled', False)) + tiled = kwargs.pop("tiled", False) or kwargs.pop("TILED", False) + + if tiled: + blockxsize = kwargs.get("blockxsize", None) + blockysize = kwargs.get("blockysize", None) + if (blockxsize and blockxsize % 16) or (blockysize and blockysize % 16): + raise RasterBlockError("The height and width of dataset blocks must be multiples of 16") + kwargs["tiled"] = "TRUE" for k, v in kwargs.items(): # Skip items that are definitely *not* valid driver @@ -1087,11 +1094,8 @@ cdef class DatasetWriterBase(DatasetReaderBase): k, v = k.upper(), str(v).upper() - # Guard against block size that exceed image size. 
- if k == 'BLOCKXSIZE' and tiled and int(v) > width: - raise ValueError("blockxsize exceeds raster width.") - if k == 'BLOCKYSIZE' and tiled and int(v) > height: - raise ValueError("blockysize exceeds raster height.") + if k in ['BLOCKXSIZE', 'BLOCKYSIZE'] and not tiled: + continue key_b = k.encode('utf-8') val_b = v.encode('utf-8') @@ -1505,12 +1509,8 @@ cdef class DatasetWriterBase(DatasetReaderBase): vals = range(256) for i, rgba in colormap.items(): - if len(rgba) == 4 and self.driver in ('GTiff'): - warnings.warn( - "This format doesn't support alpha in colormap entries. " - "The value will be ignored.") - elif len(rgba) == 3: + if len(rgba) == 3: rgba = tuple(rgba) + (255,) if i not in vals: ===================================== rasterio/_warp.pyx ===================================== @@ -112,7 +112,7 @@ def _transform_geom( cdef GDALWarpOptions * create_warp_options( GDALResampleAlg resampling, object src_nodata, object dst_nodata, int src_count, - object dst_alpha, object src_alpha, int warp_mem_limit, const char **options) except NULL: + object dst_alpha, object src_alpha, int warp_mem_limit, GDALDataType working_data_type, const char **options) except NULL: """Return a pointer to a GDALWarpOptions composed from input params This is used in _reproject() and the WarpedVRT constructor. It sets @@ -145,6 +145,7 @@ cdef GDALWarpOptions * create_warp_options( warp_extras = CSLMerge(warp_extras, options) + psWOptions.eWorkingDataType = working_data_type psWOptions.eResampleAlg = resampling if warp_mem_limit > 0: @@ -212,6 +213,7 @@ def _reproject( init_dest_nodata=True, num_threads=1, warp_mem_limit=0, + working_data_type=0, **kwargs): """ Reproject a source raster to a destination raster. @@ -315,7 +317,7 @@ def _reproject( if not in_dtype_range(dst_nodata, destination.dtype): raise ValueError("dst_nodata must be in valid range for " "destination dtype") - + def format_transform(in_transform): if not in_transform: return in_transform @@ -461,7 +463,7 @@ def _reproject( psWOptions = create_warp_options( resampling, src_nodata, - dst_nodata, src_count, dst_alpha, src_alpha, warp_mem_limit, + dst_nodata, src_count, dst_alpha, src_alpha, warp_mem_limit, working_data_type, warp_extras) psWOptions.pfnTransformer = pfnTransformer @@ -607,7 +609,7 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): dst_width=None, width=None, dst_height=None, height=None, src_transform=None, dst_transform=None, transform=None, init_dest_nodata=True, src_alpha=0, add_alpha=False, - warp_mem_limit=0, **warp_extras): + warp_mem_limit=0, dtype=None, **warp_extras): """Make a virtual warped dataset Parameters @@ -655,6 +657,8 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): warp_mem_limit : int, optional The warp operation's memory limit in MB. The default (0) means 64 MB with GDAL 2.2. + dtype : str, optional + The working data type for warp operation and output. warp_extras : dict GDAL extra warp options. See http://www.gdal.org/structGDALWarpOptions.html. 
@@ -735,6 +739,7 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): self.name = "WarpedVRT({})".format(src_dataset.name) self.resampling = resampling self.tolerance = tolerance + self.working_dtype = dtype self.src_nodata = self.src_dataset.nodata if src_nodata is DEFAULT_NODATA_FLAG else src_nodata self.dst_nodata = self.src_nodata if nodata is DEFAULT_NODATA_FLAG else nodata @@ -839,7 +844,8 @@ cdef class WarpedVRTReaderBase(DatasetReaderBase): psWOptions = create_warp_options( c_resampling, self.src_nodata, self.dst_nodata, src_dataset.count, dst_alpha, - src_alpha_band, warp_mem_limit, c_warp_extras) + src_alpha_band, warp_mem_limit, dtypes.dtype_rev[self.working_dtype], + c_warp_extras) if psWOptions == NULL: raise RuntimeError("Warp options are NULL") ===================================== rasterio/compat.py ===================================== @@ -12,7 +12,7 @@ if sys.version_info[0] >= 3: # pragma: no cover import configparser from urllib.parse import urlparse from collections import UserDict - from collections.abc import Mapping + from collections.abc import Iterable, Mapping from inspect import getfullargspec as getargspec else: # pragma: no cover string_types = basestring, @@ -23,4 +23,4 @@ else: # pragma: no cover from urlparse import urlparse from UserDict import UserDict from inspect import getargspec - from collections import Mapping + from collections import Iterable, Mapping ===================================== rasterio/crs.py ===================================== @@ -10,7 +10,6 @@ used. """ -import collections import json import pickle ===================================== rasterio/features.py ===================================== @@ -419,8 +419,8 @@ def geometry_window(dataset, shapes, pad_x=0, pad_y=0, north_up=True, right = min(dataset.shape[1], right) bottom = min(dataset.shape[0], bottom) # convert the bounds back to the CRS domain - left, top = (left, top) * dataset.transform - right, bottom = (right, bottom) * dataset.transform + left, top = dataset.transform * (left, top) + right, bottom = dataset.transform * (right, bottom) window = dataset.window(left, bottom, right, top) window_floored = window.round_offsets(op='floor', pixel_precision=pixel_precision) ===================================== rasterio/rio/stack.py ===================================== @@ -1,14 +1,13 @@ """$ rio stack""" -import collections import logging import click from cligj import format_opt import rasterio -from rasterio.compat import zip_longest +from rasterio.compat import Iterable, zip_longest from rasterio.rio import options from rasterio.rio.helpers import resolve_inout @@ -104,7 +103,7 @@ def stack(ctx, files, output, driver, bidx, photometric, overwrite, data = src.read(index) dst.write(data, dst_idx) dst_idx += 1 - elif isinstance(index, collections.Iterable): + elif isinstance(index, Iterable): data = src.read(index) dst.write(data, range(dst_idx, dst_idx + len(index))) dst_idx += len(index) ===================================== rasterio/transform.py ===================================== @@ -2,11 +2,12 @@ from __future__ import division -import collections import math from affine import Affine +from rasterio.compat import Iterable + IDENTITY = Affine.identity() GDAL_IDENTITY = IDENTITY.to_gdal() @@ -153,10 +154,10 @@ def xy(transform, rows, cols, offset='center'): single_col = False single_row = False - if not isinstance(cols, collections.Iterable): + if not isinstance(cols, Iterable): cols = [cols] single_col = True - if not isinstance(rows, collections.Iterable): + if not 
isinstance(rows, Iterable): rows = [rows] single_row = True @@ -221,10 +222,10 @@ def rowcol(transform, xs, ys, op=math.floor, precision=None): single_x = False single_y = False - if not isinstance(xs, collections.Iterable): + if not isinstance(xs, Iterable): xs = [xs] single_x = True - if not isinstance(ys, collections.Iterable): + if not isinstance(ys, Iterable): ys = [ys] single_y = True ===================================== rasterio/vrt.py ===================================== @@ -21,6 +21,57 @@ class WarpedVRT(WarpedVRTReaderBase, WindowMethodsMixin, This class is backed by an in-memory GDAL VRTWarpedDataset VRT file. + Parameters + ---------- + src_dataset : dataset object + The warp source. + src_crs : CRS or str, optional + Overrides the coordinate reference system of `src_dataset`. + src_transfrom : Affine, optional + Overrides the transform of `src_dataset`. + src_nodata : float, optional + Overrides the nodata value of `src_dataset`, which is the + default. + crs : CRS or str, optional + The coordinate reference system at the end of the warp + operation. Default: the crs of `src_dataset`. dst_crs is + a deprecated alias for this parameter. + transform : Affine, optional + The transform for the virtual dataset. Default: will be + computed from the attributes of `src_dataset`. dst_transform + is a deprecated alias for this parameter. + height, width: int, optional + The dimensions of the virtual dataset. Defaults: will be + computed from the attributes of `src_dataset`. dst_height + and dst_width are deprecated alias for these parameters. + nodata : float, optional + Nodata value for the virtual dataset. Default: the nodata + value of `src_dataset` or 0.0. dst_nodata is a deprecated + alias for this parameter. + resampling : Resampling, optional + Warp resampling algorithm. Default: `Resampling.nearest`. + tolerance : float, optional + The maximum error tolerance in input pixels when + approximating the warp transformation. Default: 0.125, + or one-eigth of a pixel. + src_alpha : int, optional + Index of a source band to use as an alpha band for warping. + add_alpha : bool, optional + Whether to add an alpha masking band to the virtual dataset. + Default: False. This option will cause deletion of the VRT + nodata value. + init_dest_nodata : bool, optional + Whether or not to initialize output to `nodata`. Default: + True. + warp_mem_limit : int, optional + The warp operation's memory limit in MB. The default (0) + means 64 MB with GDAL 2.2. + dtype : str, optional + The working data type for warp operation and output. + warp_extras : dict + GDAL extra warp options. See + http://www.gdal.org/structGDALWarpOptions.html. + Attributes ---------- src_dataset : dataset @@ -39,6 +90,8 @@ class WarpedVRT(WarpedVRTReaderBase, WindowMethodsMixin, The nodata value used to initialize the destination; it will remain in all areas not covered by the reprojected source. Defaults to the value of src_nodata, or 0 (gdal default). + working_dtype : str, optional + The working data type for warp operation and output. warp_extras : dict GDAL extra warp options. See http://www.gdal.org/structGDALWarpOptions.html. 
===================================== rasterio/windows.py ===================================== @@ -27,6 +27,7 @@ import attr from affine import Affine import numpy as np +from rasterio.compat import Iterable from rasterio.errors import WindowError from rasterio.transform import rowcol, guard_transform @@ -114,7 +115,7 @@ def iter_args(function): """ @functools.wraps(function) def wrapper(*args, **kwargs): - if len(args) == 1 and isinstance(args[0], collections.Iterable): + if len(args) == 1 and isinstance(args[0], Iterable): return function(*args[0]) else: return function(*args) ===================================== tests/conftest.py ===================================== @@ -27,8 +27,8 @@ if sys.version_info > (3,): reduce = functools.reduce test_files = [os.path.join(os.path.dirname(__file__), p) for p in [ - 'data/RGB.byte.tif', 'data/float.tif', 'data/float_nan.tif', - 'data/shade.tif', 'data/RGBA.byte.tif']] + 'data/RGB.byte.tif', 'data/float.tif', 'data/float32.tif', + 'data/float_nan.tif', 'data/shade.tif', 'data/RGBA.byte.tif']] def pytest_cmdline_main(config): @@ -601,6 +601,10 @@ requires_gdal22 = pytest.mark.skipif( not gdal_version.at_least('2.2'), reason="Requires GDAL 2.2.x") +requires_gdal23 = pytest.mark.skipif( + not gdal_version.at_least('2.3'), + reason="Requires GDAL ~= 2.3") + requires_gdal_lt_3 = pytest.mark.skipif( gdal_version.__lt__('3.0'), reason="Requires GDAL 1.x/2.x") ===================================== tests/data/float32.tif ===================================== Binary files /dev/null and b/tests/data/float32.tif differ ===================================== tests/test_blocks.py ===================================== @@ -11,6 +11,7 @@ import pytest import rasterio from rasterio import windows from rasterio.errors import RasterBlockError +from rasterio.profiles import default_gtiff_profile from .conftest import requires_gdal2 @@ -51,6 +52,7 @@ class WindowTest(unittest.TestCase): rasterio.windows.evaluate(((None, -10), (None, -10)), 100, 90), windows.Window.from_slices((0, 90), (0, 80))) + def test_window_index(): idx = rasterio.windows.window_index(((0, 4), (1, 12))) assert len(idx) == 2 @@ -191,3 +193,29 @@ def test_block_window_tiff(path_rgb_byte_tif): with rasterio.open(path_rgb_byte_tif) as src: for (i, j), w in src.block_windows(): assert src.block_window(1, i, j) == w + + + at pytest.mark.parametrize("blocksize", [16, 32, 256, 1024]) +def test_block_windows_bigger_blocksize(tmpdir, blocksize): + """Ensure that block sizes greater than raster size are ok""" + tempfile = str(tmpdir.join("test.tif")) + profile = default_gtiff_profile.copy() + profile.update(height=16, width=16, count=1, blockxsize=blocksize, blockysize=blocksize) + with rasterio.open(tempfile, "w", **profile) as dst: + assert dst.is_tiled + for ij, window in dst.block_windows(): + dst.write(np.ones((1, 1), dtype="uint8"), 1, window=window) + + with rasterio.open(tempfile) as dst: + assert list(dst.block_windows()) == [((0, 0), windows.Window(0, 0, 16, 16))] + assert (dst.read(1) == 1).all() + + + at pytest.mark.parametrize("blocksizes", [{"blockxsize": 33, "blockysize": 32}, {"blockxsize": 32, "blockysize": 33}]) +def test_odd_blocksize_error(tmpdir, blocksizes): + """For a tiled TIFF block sizes must be multiples of 16""" + tempfile = str(tmpdir.join("test.tif")) + profile = default_gtiff_profile.copy() + profile.update(height=64, width=64, count=1, **blocksizes) + with pytest.raises(RasterBlockError): + rasterio.open(tempfile, "w", **profile) ===================================== 
tests/test_colormap.py ===================================== @@ -5,17 +5,12 @@ import sys import rasterio -logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) - - def test_write_colormap_warn(tmpdir, recwarn): with rasterio.open('tests/data/shade.tif') as src: profile = src.meta tiffname = str(tmpdir.join('foo.tif')) with rasterio.open(tiffname, 'w', **profile) as dst: dst.write_colormap(1, {0: (255, 0, 0, 255), 255: (0, 0, 0, 0)}) - w = recwarn.pop(UserWarning) - assert "The value will be ignored" in str(w.message) def test_write_colormap(tmpdir): ===================================== tests/test_dataset.py ===================================== @@ -47,23 +47,16 @@ def test_untiled_dataset_blocksize(tmpdir): """Blocksize is not relevant to untiled datasets (see #1689)""" tmpfile = str(tmpdir.join("test.tif")) with rasterio.open( - tmpfile, "w", driver="GTiff", count=1, height=13, width=13, dtype="uint8", crs="epsg:3857", - transform=Affine.identity(), blockxsize=256, blockysize=256) as dataset: + tmpfile, "w", driver="GTiff", count=1, height=13, width=23, dtype="uint8", crs="epsg:3857", + transform=Affine.identity(), blockxsize=64, blockysize=64) as dataset: pass with rasterio.open(tmpfile) as dataset: assert not dataset.profile["tiled"] - assert dataset.shape == (13, 13) + assert dataset.shape == (13, 23) + assert dataset.block_shapes == [(13, 23)] -def test_tiled_dataset_blocksize_guard(tmpdir): - """Tiled datasets with dimensions less than blocksize are not permitted""" - tmpfile = str(tmpdir.join("test.tif")) - with pytest.raises(ValueError): - rasterio.open( - tmpfile, "w", driver="GTiff", count=1, height=13, width=13, dtype="uint8", crs="epsg:3857", - transform=Affine.identity(), tiled=True, blockxsize=256, blockysize=256) - def test_dataset_readonly_attributes(path_rgb_byte_tif): """Attempts to set read-only attributes fail with DatasetAttributeError""" with pytest.raises(DatasetAttributeError): ===================================== tests/test_plot.py ===================================== @@ -58,6 +58,7 @@ def test_show_cmyk_interp(tmpdir): meta = src.meta meta['photometric'] = 'cmyk' meta['count'] = 4 + del meta["nodata"] tiffname = str(tmpdir.join('foo.tif')) with rasterio.open(tiffname, 'w', **meta) as dst: assert dst.colorinterp == ( ===================================== tests/test_profile.py ===================================== @@ -62,24 +62,6 @@ def test_open_with_profile(tmpdir): assert not dst.closed -def test_blockxsize_guard(tmpdir): - """blockxsize can't be greater than image width.""" - tiffname = str(tmpdir.join('foo.tif')) - with pytest.raises(ValueError): - profile = default_gtiff_profile.copy() - profile.update(count=1, height=256, width=128) - rasterio.open(tiffname, 'w', **profile) - - -def test_blockysize_guard(tmpdir): - """blockysize can't be greater than image height.""" - tiffname = str(tmpdir.join('foo.tif')) - with pytest.raises(ValueError): - profile = default_gtiff_profile.copy() - profile.update(count=1, width=256, height=128) - rasterio.open(tiffname, 'w', **profile) - - def test_profile_overlay(path_rgb_byte_tif): with rasterio.open(path_rgb_byte_tif) as src: kwds = src.profile ===================================== tests/test_rio_info.py ===================================== @@ -8,7 +8,7 @@ import rasterio from rasterio.rio.main import main_group from rasterio.env import GDALVersion -from .conftest import requires_gdal21 +from .conftest import requires_gdal21, requires_gdal23 with rasterio.Env() as env: @@ -455,7 +455,7 @@ def 
test_info_no_credentials(tmpdir, monkeypatch): assert result.exit_code == 0 - at requires_gdal21(reason="S3 raster access requires GDAL 2.1+") + at requires_gdal23(reason="Unsigned S3 requests require GDAL ~= 2.3") @pytest.mark.network def test_info_aws_unsigned(): """Unsigned access to public dataset works (see #1637)""" ===================================== tests/test_rio_merge.py ===================================== @@ -105,9 +105,9 @@ def test_merge_with_colormap(test_data_dir_1): inputs = [str(x) for x in test_data_dir_1.listdir()] inputs.sort() - # Add a colormap to the first input prior merge - with rasterio.open(inputs[0], 'r+') as src: - src.write_colormap(1, {0: (255, 0, 0, 255), 255: (0, 0, 0, 0)}) + for inputname in inputs: + with rasterio.open(inputname, 'r+') as src: + src.write_colormap(1, {0: (255, 0, 0, 255), 255: (0, 0, 0, 255)}) runner = CliRunner() result = runner.invoke(main_group, ['merge'] + inputs + [outputname]) @@ -139,6 +139,7 @@ def test_merge_with_nodata(test_data_dir_1): assert np.all(data == expected) + at pytest.mark.filterwarnings("ignore:Input file's nodata value") def test_merge_error(test_data_dir_1): """A nodata value outside the valid range results in an error""" outputname = str(test_data_dir_1.join('merged.tif')) ===================================== tests/test_warp.py ===================================== @@ -1368,7 +1368,7 @@ def test_reproject_dst_nodata(): resampling=Resampling.nearest, ) - assert (out > 0).sum() == 438113 + assert (out[~np.isnan(out)] > 0.0).sum() == 438113 assert out[0, 0] != 0 assert np.isnan(out[0, 0]) ===================================== tests/test_warpedvrt.py ===================================== @@ -392,3 +392,44 @@ def test_invalid_add_alpha(): with rasterio.open('tests/data/RGBA.byte.tif') as src: with pytest.raises(WarpOptionsError): WarpedVRT(src, add_alpha=True) + + +def test_warpedvrt_float32_preserve(data): + """WarpedVRT preserves float32 dtype of source""" + with rasterio.open("tests/data/float32.tif") as src: + with WarpedVRT(src, src_crs="EPSG:4326") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) + + +def test_warpedvrt_float32_override(data): + """Override GDAL defaults for working data type""" + float32file = str(data.join("float32.tif")) + with rasterio.open(float32file, "r+") as dst: + dst.nodata = -3.4028230607370965e+38 + + with rasterio.open(float32file) as src: + with WarpedVRT(src, src_crs="EPSG:4326", dtype="float32") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) + + +def test_warpedvrt_float32_overridei_nodata(data): + """Override GDAL defaults for working data type""" + float32file = str(data.join("float32.tif")) + with rasterio.open(float32file, "r+") as dst: + dst.nodata = -3.4028230607370965e+38 + + with rasterio.open(float32file) as src: + with WarpedVRT(src, src_crs="EPSG:4326", nodata=0.0001, dtype="float32") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) + + + at pytest.mark.xfail(reason="GDAL's output defaults to float64") +def test_warpedvrt_issue1744(data): + """Reproduce the bug reported in 1744""" + float32file = str(data.join("float32.tif")) + with rasterio.open(float32file, "r+") as dst: + dst.nodata = -3.4028230607370965e+38 + + with rasterio.open(float32file) as src: + with WarpedVRT(src, src_crs="EPSG:4326") as vrt: + assert src.dtypes == vrt.dtypes == ("float32",) View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/d6d54f4f34edde67757051b5d9c643264b186824 -- View it on GitLab: 
https://salsa.debian.org/debian-gis-team/rasterio/commit/d6d54f4f34edde67757051b5d9c643264b186824 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 6 06:16:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 06 Sep 2019 05:16:08 +0000 Subject: [Git][debian-gis-team/rasterio] Pushed new tag upstream/1.0.27 Message-ID: <5d71eb98ee8d5_577b2ade6111031810778b2@godard.mail> Bas Couwenberg pushed new tag upstream/1.0.27 at Debian GIS Project / rasterio -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/tree/upstream/1.0.27 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 6 06:28:08 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 06 Sep 2019 05:28:08 +0000 Subject: Processing of rasterio_1.0.27-1_source.changes Message-ID: rasterio_1.0.27-1_source.changes uploaded successfully to localhost along with the files: rasterio_1.0.27-1.dsc rasterio_1.0.27.orig.tar.gz rasterio_1.0.27-1.debian.tar.xz rasterio_1.0.27-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 6 06:34:56 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 06 Sep 2019 05:34:56 +0000 Subject: rasterio_1.0.27-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 06 Sep 2019 07:03:26 +0200 Source: rasterio Architecture: source Version: 1.0.27-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: rasterio (1.0.27-1) unstable; urgency=medium . * Team upload. * New upstream release. 
Checksums-Sha1: b3d303e7a9bb1b5f335d1b0dc6278277258ea83e 2309 rasterio_1.0.27-1.dsc a6a77095096aa355adf0a877fd0d85298eb3e86b 15903979 rasterio_1.0.27.orig.tar.gz d3425c5a8f0d2daa0433296e26ae4f308f04063c 8296 rasterio_1.0.27-1.debian.tar.xz 6b01bb87bf6b97c8c3318ebade41a4a2152a8a49 13789 rasterio_1.0.27-1_amd64.buildinfo Checksums-Sha256: 02a68589764506a969fb6a469cafc59e175125f26b64eba6a4e8ee4f261ee506 2309 rasterio_1.0.27-1.dsc 0d954dee2afbcf72ff038466d8b9e814b1e1bf05886b62a9d2378dbf504ec246 15903979 rasterio_1.0.27.orig.tar.gz c3f02f02d212a75d841338e073baeb83d7b05fc49b3a7666957e01dd44980f3e 8296 rasterio_1.0.27-1.debian.tar.xz 09ba5b77b1d1f3aa1ca733b1558c28999873bcdc224cddb362f3d66d770f279a 13789 rasterio_1.0.27-1_amd64.buildinfo Files: 6fd1b1d3c5d73e00f16d4b5b1682a9c9 2309 python optional rasterio_1.0.27-1.dsc 57ae1ae19e9a7ecc7dddac3806477f38 15903979 python optional rasterio_1.0.27.orig.tar.gz a725f7fc3500d8632598943c06dd760a 8296 python optional rasterio_1.0.27-1.debian.tar.xz d82d39e58c9b8b5ded67523b1ccc9be3 13789 python optional rasterio_1.0.27-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1x62EACgkQZ1DxCuiN SvH8Ow/9E+7QTvbYRZrxyPJikq5Ymt6MRRgYxDlX+G7ZIJEI4edd0dSP5zp8w8Gz Z3AmhCZ7/iAyNmtBR4qlWdmLpt/YSL7eTWKmf4cYXPGgojcebiWNFsXZ/jacmvGr KRHoT5tdWQRUm4sAO4TIUALPS2QFxyCT1+9Bc6sS4E4bstPi+8wyUt2VN9qtUwr2 /zmSk8+iVOvkEIC2lblKawOTWyKdBtgd1bMLu3B+LnM7zDVToS1VnuLZpO++1ymv UjCqUdaaf3AHfFSgE0wsPrBNs5CINiO7smOWyQPoOMdzWV3j4b44XAJRnrhyHw43 h7cZTaMCLCq5NrfGnwAoLAYrsXVr1/bNjHVmtWiYRU9gLrz+CgDK6DmS2+HkWKcM ZRZJwVw6wGdBr988waUidiq9zMK7BXhZfw2GC/zsc0VkeRnNP9e+DlZYFGdZ2wFG sURWSegZlpH5gx0s/ESVXsn5ZF7Yqqra+2Jn84rnJZmn2QLlMTj3QRoLagBQHVun gyxzDocWO5WhQYwGEPMh9piqpjjCfRpj+6QIAVsrDnhGX/84xFV++RsuG5MtS6pB q59ica03KWPyzAQwoXqWCVzneqSqbAIWQqW0a9nPFsLbt0i+SGBwv56VkfQCGCIz /cku2rUkaOLQzKxn5YH9kFZ0/w4taOOAxvnrzIjCcCVOO+6wEkE= =pNGw -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From landa.martin at gmail.com Fri Sep 6 11:07:55 2019 From: landa.martin at gmail.com (Martin Landa) Date: Fri, 6 Sep 2019 12:07:55 +0200 Subject: [Git][debian-gis-team/grass][ubuntugis/bionic] 27 commits: Define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H for PROJ 6.0.0 compatibility. In-Reply-To: References: <5d6914731125c_5e012abec02571885120af@godard.mail> <0f117043-f938-0c8d-d5f8-5c281f7778fb@xs4all.nl> Message-ID: Hi, so 31. 8. 2019 v 13:21 odesílatel Sebastiaan Couwenberg napsal: > That seems like a bug in GRASS, using svn to download code from GitHub > makes no sense. Use git clone with shallow options, e.g.: > > git clone --depth 1 https://github.com/OSGeo/grass-addons yes, it will still clone all directories, the reason for `svn export` is to fetch specific directory only. du -s -c -h grass-addons/* 4.0K grass-addons/contributors.csv 26M grass-addons/grass6 73M grass-addons/grass7 4.0K grass-addons/README 4.0K grass-addons/README.md 15M grass-addons/roadmap 4.0K grass-addons/SUBMITTING 8.0K grass-addons/SVN_HOWTO.txt 448K grass-addons/tools 113M total > There is also a spelling error: reporsitory -> repository: > > """Download source code from a official GitHub reporsitory > > Since download_source_code_svn() requires subversion, both git & > subversion should be recommended by the package, and included in the > documentation (REQUIREMENTS.html). 
Thanks, fixed in https://github.com/OSGeo/grass/commit/b206fcd70ead34b9bef38cd50e44232f0b24bc84 Martin -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From landa.martin at gmail.com Fri Sep 6 11:09:19 2019 From: landa.martin at gmail.com (Martin Landa) Date: Fri, 6 Sep 2019 12:09:19 +0200 Subject: [Git][debian-gis-team/grass][ubuntugis/bionic] 27 commits: Define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H for PROJ 6.0.0 compatibility. In-Reply-To: References: <5d6914731125c_5e012abec02571885120af@godard.mail> <0f117043-f938-0c8d-d5f8-5c281f7778fb@xs4all.nl> Message-ID: Hi, On Sat, 31 Aug 2019 at 10:07, Martin Landa wrote: > there is a possible issue with the g.extension dependency. In [1] svn has > been replaced by git. Currently g.extension is using svn export [2] to > avoid cloning the whole grass-addons repository. I hope a better solution will be found soon, but at this point it's not a blocker for the 7.8.0 release from my POV. Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From owner at bugs.debian.org Fri Sep 6 13:21:16 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Fri, 06 Sep 2019 12:21:16 +0000 Subject: Processed: tagging 935550, found 939502 in 0.3.4-8, tagging 935563, tagging 935353, tagging 939489 ... References: <1567772206-3941-bts-anbe@debian.org> Message-ID: Processing commands for control at bugs.debian.org: > tags 935550 + sid bullseye Bug #935550 [src:yubioath-desktop] yubioath-desktop: Qt4 removal from Bullseye Added tag(s) sid and bullseye. > found 939502 0.3.4-8 Bug #939502 [src:haskell-attoparsec-enumerator] haskell-attoparsec-enumerator: Removal notice: unused Haskell library Marked as found in versions haskell-attoparsec-enumerator/0.3.4-8. > tags 935563 + sid bullseye Bug #935563 {Done: Ole Streicher } [python3-sunpy] astropy 3.2.1 breaks sunpy Added tag(s) sid and bullseye. > tags 935353 + sid bullseye Bug #935353 [src:vistrails] vistrails: Qt4 removal from Bullseye Added tag(s) sid and bullseye. > tags 939489 + sid bullseye Bug #939489 [bumblebee-nvidia] bumblebee fails to disable discrete graphics card after upgrade nvidia driver to 430.40-2 Added tag(s) sid and bullseye. > tags 937448 + experimental Bug #937448 [src:pygobject] pygobject: Python2 removal in sid/bullseye Added tag(s) experimental. > tags 931943 + sid bullseye Bug #931943 {Done: Gert Wollny } [src:vtk6] vtk6: FTBFS with PROJ 6 Added tag(s) sid and bullseye. > tags 939303 + sid bullseye Bug #939303 [src:matplotlib2] matplotlib2: FTBFS unsatisfied b-d on python-pyshp Added tag(s) sid and bullseye. > tags 925794 + experimental Bug #925794 {Done: Bernd Zeimetz } [src:open-vm-tools] open-vm-tools: ftbfs with GCC-9 Added tag(s) experimental. > tags 935341 + sid bullseye Bug #935341 [src:backintime] backintime-qt4: Qt4 removal from Bullseye Added tag(s) sid and bullseye. > tags 935346 + sid bullseye Bug #935346 [src:puddletag] puddletag: Qt4 removal from Bullseye Added tag(s) sid and bullseye. > tags 935349 + sid bullseye Bug #935349 [src:python-pyface] python-pyface: Qt4 removal from Bullseye Added tag(s) bullseye and sid. > tags 935352 + sid bullseye Bug #935352 [src:trimage] trimage: Qt4 removal from Bullseye Added tag(s) bullseye and sid. > tags 935354 + sid bullseye Bug #935354 [src:vitables] vitables: Qt4 removal from Bullseye Added tag(s) sid and bullseye. > tags 939399 + sid Bug #939399 {Done: Bas Couwenberg } [src:libgeotiff] libgeotiff FTBFS in bullseye (possiblly armhf specific), test discrepancies.
Added tag(s) sid. > tags 939492 + sid bullseye Bug #939492 [src:nbsphinx] nbsphinx: testsuite failures with new pandoc Added tag(s) bullseye and sid. > found 935211 0.36.0-1 Bug #935211 [python-acme] python-acme: Please port to Python 3 and/or drop Python 2 package Marked as found in versions python-acme/0.36.0-1. > tags 875050 + sid Bug #875050 {Done: Moritz Muehlenhoff } [src:mustang-plug] [mustang-plug] Future Qt4 removal from Buster Added tag(s) sid. > found 939529 0.3.1-4 Bug #939529 [src:gtg] gtg: should this package be removed? Marked as found in versions gtg/0.3.1-4. > tags 939529 + sid bullseye Bug #939529 [src:gtg] gtg: should this package be removed? Added tag(s) sid and bullseye. > found 939531 1.2.0-3 Bug #939531 [src:medit] medit: should this package be removed? Marked as found in versions medit/1.2.0-3. > tags 939531 + sid bullseye Bug #939531 [src:medit] medit: should this package be removed? Added tag(s) bullseye and sid. > tags 934079 + sid Bug #934079 {Done: Bastien Roucariès } [acorn] FTBFS: build depends on older node-unicode-* Added tag(s) sid. > tags 935345 + sid bullseye Bug #935345 [src:microbegps] microbegps: Qt4 removal from Bullseye Added tag(s) sid and bullseye. > tags 936016 - buster Bug #936016 {Done: "Alf Gaida" } [meteo-qt] meteo-qt crashes immediately Removed tag(s) buster. > thanks Stopping processing here. Please contact me if you need assistance. -- 875050: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=875050 925794: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=925794 931943: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=931943 934079: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=934079 935211: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935211 935341: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935341 935345: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935345 935346: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935346 935349: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935349 935352: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935352 935353: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935353 935354: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935354 935550: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935550 935563: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=935563 936016: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=936016 937448: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937448 939303: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939303 939399: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939399 939489: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939489 939492: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939492 939502: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939502 939529: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939529 939531: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939531 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From noreply at release.debian.org Sat Sep 7 05:39:06 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Sat, 07 Sep 2019 04:39:06 +0000 Subject: otb is marked for autoremoval from testing Message-ID: otb 6.6.1+dfsg-2 is marked for autoremoval from testing on 2019-09-18 It (build-)depends on packages with these RC bugs: 875075: openscenegraph: [openscenegraph] Future Qt4 removal from Buster 935086: insighttoolkit4: FTBFS with GCC-9: use of undeclared identifier '__builtin_is_constant_evaluated' From noreply at release.debian.org Sat Sep 7 
05:39:06 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Sat, 07 Sep 2019 04:39:06 +0000 Subject: osgearth is marked for autoremoval from testing Message-ID: osgearth 2.10.2+dfsg-1 is marked for autoremoval from testing on 2019-09-18 It (build-)depends on packages with these RC bugs: 875075: openscenegraph: [openscenegraph] Future Qt4 removal from Buster From noreply at release.debian.org Sat Sep 7 05:39:20 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 07 Sep 2019 04:39:20 +0000 Subject: mapproxy 1.12.0-1 MIGRATED to testing Message-ID: FYI: The status of the mapproxy source package in Debian's testing distribution has changed. Previous version: 1.11.1-2 Current version: 1.12.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sat Sep 7 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 07 Sep 2019 04:39:23 +0000 Subject: pyspectral 0.9.0+ds-1 MIGRATED to testing Message-ID: FYI: The status of the pyspectral source package in Debian's testing distribution has changed. Previous version: 0.8.9+ds-2 Current version: 0.9.0+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sat Sep 7 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 07 Sep 2019 04:39:23 +0000 Subject: python-hdf4 0.10.1-2 MIGRATED to testing Message-ID: FYI: The status of the python-hdf4 source package in Debian's testing distribution has changed. Previous version: 0.10.1-1 Current version: 0.10.1-2 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sat Sep 7 05:39:26 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 07 Sep 2019 04:39:26 +0000 Subject: trollimage 1.9.0-2 MIGRATED to testing Message-ID: FYI: The status of the trollimage source package in Debian's testing distribution has changed. Previous version: 1.9.0-1 Current version: 1.9.0-2 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
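Regarding the g.extension download question discussed in the grass thread above: since `git clone --depth 1` still checks out every addon directory, one subversion-free alternative is to download the tarball that GitHub publishes for a branch and unpack only the wanted subdirectory. The Python sketch below is illustrative only; the archive URL layout, the 'master' branch name and the 'grass7/raster' target directory are assumptions, and it is not the change that was actually committed to GRASS.

    # Sketch: fetch a single directory of a GitHub repository without svn.
    # Assumes GitHub's per-branch tarball URL layout; adjust repo/branch/subdir.
    import tarfile
    import urllib.request

    def fetch_subdir(repo="OSGeo/grass-addons", branch="master",
                     subdir="grass7/raster", dest="."):
        url = "https://github.com/{}/archive/{}.tar.gz".format(repo, branch)
        tmpfile, _ = urllib.request.urlretrieve(url)
        with tarfile.open(tmpfile) as tar:
            # GitHub prefixes every member with "<repo name>-<branch>/".
            prefix = "{}-{}/{}/".format(repo.split("/")[1], branch, subdir)
            wanted = [m for m in tar.getmembers() if m.name.startswith(prefix)]
            tar.extractall(path=dest, members=wanted)

    if __name__ == "__main__":
        fetch_subdir()

Note that the whole branch tarball is still transferred; the saving is in what is written to disk and in dropping the subversion dependency, not in download size.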
From gitlab at salsa.debian.org Sat Sep 7 07:23:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 06:23:17 +0000 Subject: [Git][debian-gis-team/python-pdal][pristine-tar] pristine-tar data for python-pdal_2.2.0+ds.orig.tar.xz Message-ID: <5d734cd52d681_577b3f91ce34cb10124477a@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pdal Commits: fdb616bb by Bas Couwenberg at 2019-09-07T05:56:39Z pristine-tar data for python-pdal_2.2.0+ds.orig.tar.xz - - - - - 2 changed files: - + python-pdal_2.2.0+ds.orig.tar.xz.delta - + python-pdal_2.2.0+ds.orig.tar.xz.id Changes: ===================================== python-pdal_2.2.0+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/python-pdal_2.2.0+ds.orig.tar.xz.delta differ ===================================== python-pdal_2.2.0+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +503ed3a0d5025e64a2d9ed7f577d1c3994f17749 View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/fdb616bbe454a5d593dac056ca3483a04a2d813f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/fdb616bbe454a5d593dac056ca3483a04a2d813f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 07:23:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 06:23:29 +0000 Subject: [Git][debian-gis-team/python-pdal][master] 6 commits: New upstream version 2.2.0+ds Message-ID: <5d734ce132ee7_577b2ade5d6a104c12449c7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pdal Commits: 5a313af5 by Bas Couwenberg at 2019-09-07T05:56:39Z New upstream version 2.2.0+ds - - - - - 46ef7a14 by Bas Couwenberg at 2019-09-07T05:56:40Z Update upstream source from tag 'upstream/2.2.0+ds' Update to upstream version '2.2.0+ds' with Debian dir ec93c3c38b898180cce6bc59e00c280b5e4a9b65 - - - - - 0de8c9a2 by Bas Couwenberg at 2019-09-07T05:56:53Z New upstream release. - - - - - 6b978bfb by Bas Couwenberg at 2019-09-07T05:59:13Z Update copyright years for copyright holders. - - - - - a2687e19 by Bas Couwenberg at 2019-09-07T06:13:10Z Refresh patches. - - - - - a8f729e0 by Bas Couwenberg at 2019-09-07T06:13:22Z Set distribution to unstable. - - - - - 15 changed files: - PKG-INFO - README.rst - VERSION.txt - debian/changelog - debian/copyright - debian/patches/clean-target.patch - + pdal/PyArray.cpp - pdal/PyArray.hpp - pdal/PyPipeline.cpp - pdal/PyPipeline.hpp - pdal/__init__.py - pdal/libpdalpython.cpp - pdal/libpdalpython.pyx - setup.py - test/test_pipeline.py Changes: ===================================== PKG-INFO ===================================== @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: PDAL -Version: 2.1.8 +Version: 2.2.0 Summary: Point cloud data processing Home-page: http://pdal.io Author: Howard Butler @@ -60,6 +60,9 @@ Description: =================================================================== .. image:: https://travis-ci.org/PDAL/python.svg?branch=master :target: https://travis-ci.org/PDAL/python + .. 
image:: https://ci.appveyor.com/api/projects/status/of4kecyahpo8892d + :target: https://ci.appveyor.com/project/hobu/python/ + Requirements ================================================================================ @@ -93,5 +96,5 @@ Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Scientific/Engineering :: GIS -Requires: Python (>=2.7) +Requires: Python (>=3.0) Requires: Numpy ===================================== README.rst ===================================== @@ -50,6 +50,9 @@ sorts it by the ``X`` dimension: .. image:: https://travis-ci.org/PDAL/python.svg?branch=master :target: https://travis-ci.org/PDAL/python +.. image:: https://ci.appveyor.com/api/projects/status/of4kecyahpo8892d + :target: https://ci.appveyor.com/project/hobu/python/ + Requirements ================================================================================ ===================================== VERSION.txt ===================================== @@ -1 +1 @@ -2.1.8 \ No newline at end of file +2.2.0 \ No newline at end of file ===================================== debian/changelog ===================================== @@ -1,3 +1,11 @@ +python-pdal (2.2.0+ds-1) unstable; urgency=medium + + * New upstream release. + * Update copyright years for copyright holders. + * Refresh patches. + + -- Bas Couwenberg Sat, 07 Sep 2019 08:13:12 +0200 + python-pdal (2.1.8+ds-3) unstable; urgency=medium * Add filenamemangle to distinguish it from pdal releases. ===================================== debian/copyright ===================================== @@ -7,9 +7,8 @@ Comment: The upstream sources are repacked to excluded the .egg-info Files-Excluded: PDAL.egg-info/* Files: * -Copyright: 2015, Hobu, Inc. - 2016, 2018, Howard Butler - 2011, Michael P. Gerlek +Copyright: 2015, 2019, Hobu, Inc. + 2016, 2018, Howard Butler License: BSD-3-Clause Files: setup.py ===================================== debian/patches/clean-target.patch ===================================== @@ -1,16 +1,15 @@ -Description: Don't append library in clean target. +Description: Fix clean target. Author: Bas Couwenberg -Forwarded: https://github.com/PDAL/python/pull/24 -Applied-Upstream: https://github.com/PDAL/python/commit/b8b192925814828cabdb4e527697b77c30edd943 +Forwarded: https://github.com/PDAL/python/pull/32 --- a/setup.py +++ b/setup.py -@@ -157,7 +157,7 @@ if DEBUG: +@@ -156,7 +156,7 @@ if DEBUG: + if os.name != 'nt': extra_compile_args += ['-g','-O0'] - # readers.numpy doesn't exist until PDAL 1.8 --if PDALVERSION >= Version('1.8'): -+if PDALVERSION is not None and PDALVERSION >= Version('1.8'): - libraries.append('pdal_plugin_reader_numpy') +-if PDALVERSION < Version('2.0.0'): ++if PDALVERSION is not None and PDALVERSION < Version('2.0.0'): + raise Exception("PDAL version '%s' is not compatible with PDAL Python library version '%s'"%(PDALVERSION, module_version)) + - if os.name in ['nt']: ===================================== pdal/PyArray.cpp ===================================== @@ -0,0 +1,339 @@ +/****************************************************************************** +* Copyright (c) 2019, Hobu Inc. (info at hobu.co) +* +* All rights reserved. 
+* +* Redistribution and use in source and binary forms, with or without +* modification, are permitted provided that the following +* conditions are met: +* +* * Redistributions of source code must retain the above copyright +* notice, this list of conditions and the following disclaimer. +* * Redistributions in binary form must reproduce the above copyright +* notice, this list of conditions and the following disclaimer in +* the documentation and/or other materials provided +* with the distribution. +* * Neither the name of Hobu, Inc. or Flaxen Geo Consulting nor the +* names of its contributors may be used to endorse or promote +* products derived from this software without specific prior +* written permission. +* +* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS +* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT +* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS +* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE +* COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, +* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, +* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS +* OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED +* AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT +* OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY +* OF SUCH DAMAGE. +****************************************************************************/ + +#include "PyArray.hpp" +#include + +#include + +namespace pdal +{ +namespace python +{ + +namespace +{ + +Dimension::Type pdalType(int t) +{ + using namespace Dimension; + + switch (t) + { + case NPY_FLOAT32: + return Type::Float; + case NPY_FLOAT64: + return Type::Double; + case NPY_INT8: + return Type::Signed8; + case NPY_INT16: + return Type::Signed16; + case NPY_INT32: + return Type::Signed32; + case NPY_INT64: + return Type::Signed64; + case NPY_UINT8: + return Type::Unsigned8; + case NPY_UINT16: + return Type::Unsigned16; + case NPY_UINT32: + return Type::Unsigned32; + case NPY_UINT64: + return Type::Unsigned64; + default: + return Type::None; + } + assert(0); + + return Type::None; +} + +std::string toString(PyObject *pname) +{ + PyObject* r = PyObject_Str(pname); + if (!r) + throw pdal_error("couldn't make string representation value"); + Py_ssize_t size; + return std::string(PyUnicode_AsUTF8AndSize(r, &size)); +} + +} // unnamed namespace + +Array::Array() : m_array(nullptr) +{ + if (_import_array() < 0) + throw pdal_error("Could not import numpy.core.multiarray."); +} + +Array::Array(PyArrayObject* array) : m_array(array), m_rowMajor(true) +{ + if (_import_array() < 0) + throw pdal_error("Could not import numpy.core.multiarray."); + + Py_XINCREF(array); + + PyArray_Descr *dtype = PyArray_DTYPE(m_array); + npy_intp ndims = PyArray_NDIM(m_array); + npy_intp *shape = PyArray_SHAPE(m_array); + int numFields = (dtype->fields == Py_None) ? 
+ 0 : + static_cast(PyDict_Size(dtype->fields)); + + int xyz = 0; + if (numFields == 0) + { + if (ndims != 3) + throw pdal_error("Array without fields must have 3 dimensions."); + m_fields.push_back({"Intensity", pdalType(dtype->type_num), 0}); + } + else + { + PyObject *names_dict = dtype->fields; + PyObject *names = PyDict_Keys(names_dict); + PyObject *values = PyDict_Values(names_dict); + if (!names || !values) + throw pdal_error("Bad field specification in numpy array."); + + for (int i = 0; i < numFields; ++i) + { + std::string name = toString(PyList_GetItem(names, i)); + if (name == "X") + xyz |= 1; + else if (name == "Y") + xyz |= 2; + else if (name == "Z") + xyz |= 4; + PyObject *tup = PyList_GetItem(values, i); + + // Get offset. + size_t offset = PyLong_AsLong(PySequence_Fast_GET_ITEM(tup, 1)); + + // Get type. + PyArray_Descr *descriptor = + (PyArray_Descr *)PySequence_Fast_GET_ITEM(tup, 0); + Dimension::Type type = pdalType(descriptor->type_num); + if (type == Dimension::Type::None) + throw pdal_error("Incompatible type for field '" + name + "'."); + + m_fields.push_back({name, type, offset}); + } + + if (xyz != 0 && xyz != 7) + throw pdal_error("Array fields must contain all or none " + "of X, Y and Z"); + if (xyz == 0 && ndims != 3) + throw pdal_error("Array without named X/Y/Z fields " + "must have three dimensions."); + } + if (xyz == 0) + m_shape = { (size_t)shape[0], (size_t)shape[1], (size_t)shape[2] }; + m_rowMajor = !(PyArray_FLAGS(m_array) & NPY_ARRAY_F_CONTIGUOUS); +} + +Array::~Array() +{ + if (m_array) + Py_XDECREF((PyObject *)m_array); +} + + +void Array::update(PointViewPtr view) +{ + if (m_array) + Py_XDECREF((PyObject *)m_array); + m_array = nullptr; // Just in case of an exception. + + Dimension::IdList dims = view->dims(); + npy_intp size = view->size(); + + PyObject *dtype_dict = (PyObject*)buildNumpyDescription(view); + if (!dtype_dict) + throw pdal_error("Unable to build numpy dtype " + "description dictionary"); + + PyArray_Descr *dtype = nullptr; + if (PyArray_DescrConverter(dtype_dict, &dtype) == NPY_FAIL) + throw pdal_error("Unable to build numpy dtype"); + Py_XDECREF(dtype_dict); + + // This is a 1 x size array. + m_array = (PyArrayObject *)PyArray_NewFromDescr(&PyArray_Type, dtype, + 1, &size, 0, nullptr, NPY_ARRAY_CARRAY, nullptr); + + // copy the data + DimTypeList types = view->dimTypes(); + for (PointId idx = 0; idx < view->size(); idx++) + { + char *p = (char *)PyArray_GETPTR1(m_array, idx); + view->getPackedPoint(types, idx, p); + } +} + + +//ABELL - Who's responsible for incrementing the ref count? 
+PyArrayObject *Array::getPythonArray() const +{ + return m_array; +} + +PyObject* Array::buildNumpyDescription(PointViewPtr view) const +{ + // Build up a numpy dtype dictionary + // + // {'formats': ['f8', 'f8', 'f8', 'u2', 'u1', 'u1', 'u1', 'u1', 'u1', + // 'f4', 'u1', 'u2', 'f8', 'u2', 'u2', 'u2'], + // 'names': ['X', 'Y', 'Z', 'Intensity', 'ReturnNumber', + // 'NumberOfReturns', 'ScanDirectionFlag', 'EdgeOfFlightLine', + // 'Classification', 'ScanAngleRank', 'UserData', + // 'PointSourceId', 'GpsTime', 'Red', 'Green', 'Blue']} + // + + Dimension::IdList dims = view->dims(); + + PyObject* dict = PyDict_New(); + PyObject* sizes = PyList_New(dims.size()); + PyObject* formats = PyList_New(dims.size()); + PyObject* titles = PyList_New(dims.size()); + + for (size_t i = 0; i < dims.size(); ++i) + { + Dimension::Id id = dims[i]; + Dimension::Type t = view->dimType(id); + npy_intp stride = view->dimSize(id); + + std::string name = view->dimName(id); + + std::string kind("i"); + Dimension::BaseType b = Dimension::base(t); + if (b == Dimension::BaseType::Unsigned) + kind = "u"; + else if (b == Dimension::BaseType::Signed) + kind = "i"; + else if (b == Dimension::BaseType::Floating) + kind = "f"; + else + throw pdal_error("Unable to map kind '" + kind + + "' to PDAL dimension type"); + + std::stringstream oss; + oss << kind << stride; + PyObject* pySize = PyLong_FromLong(stride); + PyObject* pyTitle = PyUnicode_FromString(name.c_str()); + PyObject* pyFormat = PyUnicode_FromString(oss.str().c_str()); + + PyList_SetItem(sizes, i, pySize); + PyList_SetItem(titles, i, pyTitle); + PyList_SetItem(formats, i, pyFormat); + } + + PyDict_SetItemString(dict, "names", titles); + PyDict_SetItemString(dict, "formats", formats); + + return dict; +} + +bool Array::rowMajor() const +{ + return m_rowMajor; +} + +Array::Shape Array::shape() const +{ + return m_shape; +} + +const Array::Fields& Array::fields() const +{ + return m_fields; +} + +ArrayIter& Array::iterator() +{ + ArrayIter *it = new ArrayIter(*this); + m_iterators.push_back(std::unique_ptr(it)); + return *it; +} + +ArrayIter::ArrayIter(Array& array) +{ + m_iter = NpyIter_New(array.getPythonArray(), + NPY_ITER_EXTERNAL_LOOP | NPY_ITER_READONLY | NPY_ITER_REFS_OK, + NPY_KEEPORDER, NPY_NO_CASTING, NULL); + if (!m_iter) + throw pdal_error("Unable to create numpy iterator."); + + char *itererr; + m_iterNext = NpyIter_GetIterNext(m_iter, &itererr); + if (!m_iterNext) + { + NpyIter_Deallocate(m_iter); + throw pdal_error(std::string("Unable to create numpy iterator: ") + + itererr); + } + m_data = NpyIter_GetDataPtrArray(m_iter); + m_stride = NpyIter_GetInnerStrideArray(m_iter); + m_size = NpyIter_GetInnerLoopSizePtr(m_iter); + m_done = false; +} + +ArrayIter::~ArrayIter() +{ + NpyIter_Deallocate(m_iter); +} + +ArrayIter& ArrayIter::operator++() +{ + if (m_done) + return *this; + + if (--(*m_size)) + *m_data += *m_stride; + else if (!m_iterNext(m_iter)) + m_done = true; + return *this; +} + +ArrayIter::operator bool () const +{ + return !m_done; +} + +char * ArrayIter::operator * () const +{ + return *m_data; +} + +} // namespace python +} // namespace pdal + ===================================== pdal/PyArray.hpp ===================================== @@ -1,5 +1,5 @@ /****************************************************************************** -* Copyright (c) 2011, Michael P. Gerlek (mpg at flaxen.com) +* Copyright (c) 2019, Hobu Inc. (info at hobu.co) * * All rights reserved. 
* @@ -13,7 +13,7 @@ * notice, this list of conditions and the following disclaimer in * the documentation and/or other materials provided * with the distribution. -* * Neither the name of Hobu, Inc. or Flaxen Geo Consulting nor the +* * Neither the name of Hobu, Inc. nor the * names of its contributors may be used to endorse or promote * products derived from this software without specific prior * written permission. @@ -34,204 +34,69 @@ #pragma once -#include - -#include - -#pragma warning(disable: 4127) // conditional expression is constant - - -#include -#undef toupper -#undef tolower -#undef isspace +#include -#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION -#include - -// forward declare PyObject so we don't need the python headers everywhere -// see: http://mail.python.org/pipermail/python-dev/2003-August/037601.html -#ifndef PyObject_HEAD -struct _object; -typedef _object PyObject; -#endif +#include +#include namespace pdal { namespace python { +class ArrayIter; class PDAL_DLL Array { public: + using Shape = std::array; + using Fields = std::vector; + + // Create an array for reading data from PDAL. + Array(); + + // Create an array for writing data to PDAL. + Array(PyArrayObject* array); - Array() : m_py_array(0), m_own_array(true) - { -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - } - - Array(PyObject* array) : m_py_array(array), m_own_array(false) - { -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - if (!PyArray_Check(array)) - throw pdal::pdal_error("pdal::python::Array constructor object is not a numpy array"); - Py_XINCREF(array); - - } - - ~Array() - { - cleanup(); - } - - - inline void update(PointViewPtr view) - { - typedef std::unique_ptr> DataPtr; - cleanup(); - int nd = 1; - Dimension::IdList dims = view->dims(); - npy_intp mydims = view->size(); - npy_intp* ndims = &mydims; - std::vector strides(dims.size()); - - DataPtr pdata( new std::vector(view->pointSize()* view->size(), 0)); - - PyArray_Descr *dtype = nullptr; - PyObject * dtype_dict = (PyObject*)buildNumpyDescription(view); - if (!dtype_dict) - throw pdal_error("Unable to build numpy dtype description dictionary"); - - int did_convert = PyArray_DescrConverter(dtype_dict, &dtype); - if (did_convert == NPY_FAIL) - throw pdal_error("Unable to build numpy dtype"); - Py_XDECREF(dtype_dict); - -#ifdef NPY_ARRAY_CARRAY - int flags = NPY_ARRAY_CARRAY; -#else - int flags = NPY_CARRAY; -#endif - uint8_t* sp = pdata.get()->data(); - PyObject * pyArray = PyArray_NewFromDescr(&PyArray_Type, - dtype, - nd, - ndims, - 0, - sp, - flags, - NULL); - - // copy the data - uint8_t* p(sp); - DimTypeList types = view->dimTypes(); - for (PointId idx = 0; idx < view->size(); idx++) - { - p = sp + (view->pointSize() * idx); - view->getPackedPoint(types, idx, (char*)p); - } - - m_py_array = pyArray; - m_data_array = std::move(pdata); - } - - - inline PyObject* getPythonArray() const - { - return m_py_array; - } + ~Array(); + void update(PointViewPtr view); + PyArrayObject *getPythonArray() const; + bool rowMajor() const; + Shape shape() const; + const Fields& fields() const; + ArrayIter& iterator(); private: + inline PyObject* buildNumpyDescription(PointViewPtr view) const; - inline void cleanup() - { - PyObject* p = (PyObject*)(m_py_array); - if (m_own_array) - { - m_data_array.reset(); - } - - Py_XDECREF(p); - } - - inline PyObject* buildNumpyDescription(PointViewPtr view) const - { - - // Build up a numpy dtype dictionary - // - // {'formats': ['f8', 
'f8', 'f8', 'u2', 'u1', 'u1', 'u1', 'u1', 'u1', 'f4', 'u1', 'u2', 'f8', 'u2', 'u2', 'u2'], - // 'names': ['X', 'Y', 'Z', 'Intensity', 'ReturnNumber', 'NumberOfReturns', - // 'ScanDirectionFlag', 'EdgeOfFlightLine', 'Classification', - // 'ScanAngleRank', 'UserData', 'PointSourceId', 'GpsTime', 'Red', 'Green', - // 'Blue']} - // - - std::stringstream oss; - Dimension::IdList dims = view->dims(); - - PyObject* dict = PyDict_New(); - PyObject* sizes = PyList_New(dims.size()); - PyObject* formats = PyList_New(dims.size()); - PyObject* titles = PyList_New(dims.size()); - - for (Dimension::IdList::size_type i=0; i < dims.size(); ++i) - { - Dimension::Id id = (dims[i]); - Dimension::Type t = view->dimType(id); - npy_intp stride = view->dimSize(id); - - std::string name = view->dimName(id); - - std::string kind("i"); - Dimension::BaseType b = Dimension::base(t); - if (b == Dimension::BaseType::Unsigned) - kind = "u"; - else if (b == Dimension::BaseType::Signed) - kind = "i"; - else if (b == Dimension::BaseType::Floating) - kind = "f"; - else - { - std::stringstream o; - oss << "unable to map kind '" << kind <<"' to PDAL dimension type"; - throw pdal::pdal_error(o.str()); - } - - oss << kind << stride; - PyObject* pySize = PyLong_FromLong(stride); - PyObject* pyTitle = PyUnicode_FromString(name.c_str()); - PyObject* pyFormat = PyUnicode_FromString(oss.str().c_str()); - - PyList_SetItem(sizes, i, pySize); - PyList_SetItem(titles, i, pyTitle); - PyList_SetItem(formats, i, pyFormat); - - oss.str(""); - } - - PyDict_SetItemString(dict, "names", titles); - PyDict_SetItemString(dict, "formats", formats); - - // PyObject* obj = PyUnicode_AsASCIIString(PyObject_Str(dict)); - // const char* s = PyBytes_AsString(obj); - // std::string output(s); - // std::cout << "array: " << output << std::endl; - return dict; - } - - - - - PyObject* m_py_array; - std::unique_ptr > m_data_array; - bool m_own_array; + PyArrayObject* m_array; Array& operator=(Array const& rhs); + Fields m_fields; + bool m_rowMajor; + Shape m_shape; + std::vector> m_iterators; +}; + +class ArrayIter +{ +public: + ArrayIter(const ArrayIter&) = delete; + + ArrayIter(Array& array); + ~ArrayIter(); + + ArrayIter& operator++(); + operator bool () const; + char *operator * () const; + +private: + NpyIter *m_iter; + NpyIter_IterNextFunc *m_iterNext; + char **m_data; + npy_intp *m_size; + npy_intp *m_stride; + bool m_done; }; } // namespace python ===================================== pdal/PyPipeline.cpp ===================================== @@ -33,124 +33,125 @@ ****************************************************************************/ #include "PyPipeline.hpp" -#ifdef PDAL_HAVE_LIBXML2 -#include -#endif #ifndef _WIN32 #include #endif #include -#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION #include -#include "PyArray.hpp" #include #include -#include -#include - -namespace libpdalpython -{ -using namespace pdal::python; +#include "PyArray.hpp" -Pipeline::Pipeline(std::string const& json, std::vector arrays) +namespace pdal +{ +namespace python { +// Create a pipeline for writing data to PDAL +Pipeline::Pipeline(std::string const& json, std::vector arrays) : + m_executor(new PipelineExecutor(json)) +{ #ifndef _WIN32 + // See comment in alternate constructor below. 
::dlopen("libpdal_base.so", RTLD_NOLOAD | RTLD_GLOBAL); - ::dlopen("libpdal_plugin_reader_numpy.so", RTLD_NOLOAD | RTLD_GLOBAL); #endif -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - - m_executor = std::shared_ptr(new pdal::PipelineExecutor(json)); + if (_import_array() < 0) + throw pdal_error("Could not impory numpy.core.multiarray."); - pdal::PipelineManager& manager = m_executor->getManager(); + PipelineManager& manager = m_executor->getManager(); std::stringstream strm(json); manager.readPipeline(strm); + std::vector roots = manager.roots(); + if (roots.size() != 1) + throw pdal_error("Filter pipeline must contain a single root stage."); - pdal::Stage *r = manager.getStage(); - if (!r) - throw pdal::pdal_error("pipeline had no stages!"); - -#if PDAL_VERSION_MAJOR > 1 || PDAL_VERSION_MINOR >=8 - int counter = 1; - for (auto array: arrays) + for (auto array : arrays) { // Create numpy reader for each array - pdal::Options options; - std::stringstream tag; - tag << "readers_numpy" << counter; - pdal::StageCreationOptions opts { "", "readers.numpy", nullptr, options, tag.str()}; - pdal::Stage& reader = manager.makeReader(opts); - - pdal::NumpyReader* np_reader = dynamic_cast(&reader); - if (!np_reader) - throw pdal::pdal_error("couldn't cast reader!"); - + // Options + + Options options; + options.add("order", array->rowMajor() ? + MemoryViewReader::Order::RowMajor : + MemoryViewReader::Order::ColumnMajor); + options.add("shape", MemoryViewReader::Shape(array->shape())); + + Stage& s = manager.makeReader("", "readers.memoryview", options); + MemoryViewReader& r = dynamic_cast(s); + for (auto f : array->fields()) + r.pushField(f); + + ArrayIter& iter = array->iterator(); + auto incrementer = [&iter](PointId id) -> char * + { + if (! iter) + return nullptr; + + char *c = *iter; + ++iter; + return c; + }; + + r.setIncrementer(incrementer); PyObject* parray = (PyObject*)array->getPythonArray(); if (!parray) - throw pdal::pdal_error("array was none!"); - - np_reader->setArray(parray); - - r->setInput(reader); - counter++; + throw pdal_error("array was none!"); + roots[0]->setInput(r); } -#endif manager.validateStageOptions(); } -Pipeline::Pipeline(std::string const& json) +// Create a pipeline for reading data from PDAL +Pipeline::Pipeline(std::string const& json) : + m_executor(new PipelineExecutor(json)) { // Make the symbols in pdal_base global so that they're accessible // to PDAL plugins. Python dlopen's this extension with RTLD_LOCAL, // which means that without this, symbols in libpdal_base aren't available // for resolution of symbols on future runtime linking. This is an issue - // on Apline and other Linux variants that doesn't use UNIQUE symbols - // for C++ template statics. only + // on Alpine and other Linux variants that don't use UNIQUE symbols + // for C++ template statics only. Without this, you end up with multiple + // copies of template statics. 
#ifndef _WIN32 ::dlopen("libpdal_base.so", RTLD_NOLOAD | RTLD_GLOBAL); #endif -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - - m_executor = std::shared_ptr(new pdal::PipelineExecutor(json)); + if (_import_array() < 0) + throw pdal_error("Could not impory numpy.core.multiarray."); } Pipeline::~Pipeline() -{ -} +{} + void Pipeline::setLogLevel(int level) { m_executor->setLogLevel(level); } + int Pipeline::getLogLevel() const { return static_cast(m_executor->getLogLevel()); } + int64_t Pipeline::execute() { - - int64_t count = m_executor->execute(); - return count; + return m_executor->execute(); } bool Pipeline::validate() { - return m_executor->validate(); + auto res = m_executor->validate(); + return res; } std::vector Pipeline::getArrays() const @@ -160,16 +161,18 @@ std::vector Pipeline::getArrays() const if (!m_executor->executed()) throw python_error("call execute() before fetching arrays"); - const pdal::PointViewSet& pvset = m_executor->getManagerConst().views(); + const PointViewSet& pvset = m_executor->getManagerConst().views(); for (auto i: pvset) { //ABELL - Leak? - Array *array = new pdal::python::Array; + Array *array = new python::Array; array->update(i); output.push_back(array); } return output; } -} //namespace libpdalpython + +} // namespace python +} // namespace pdal ===================================== pdal/PyPipeline.hpp ===================================== @@ -43,20 +43,12 @@ #include #include -#undef toupper -#undef tolower -#undef isspace - namespace pdal { namespace python { - class Array; -} -} -namespace libpdalpython -{ +class Array; class python_error : public std::runtime_error { @@ -65,10 +57,12 @@ public: {} }; -class Pipeline { +class Pipeline +{ public: Pipeline(std::string const& json); - Pipeline(std::string const& json, std::vector arrays); + Pipeline(std::string const& json, + std::vector arrays); ~Pipeline(); int64_t execute(); @@ -98,4 +92,5 @@ private: std::shared_ptr m_executor; }; -} +} // namespace python +} // namespace pdal ===================================== pdal/__init__.py ===================================== @@ -1,4 +1,4 @@ -__version__='2.1.8' +__version__='2.2.0' from .pipeline import Pipeline from .array import Array ===================================== pdal/libpdalpython.cpp ===================================== The diff for this file was not included because it is too large. 
===================================== pdal/libpdalpython.pyx ===================================== @@ -23,7 +23,6 @@ cdef extern from "pdal/pdal_config.hpp" namespace "pdal::Config": def getVersionString(): return versionString() - def getVersionMajor(): return versionMajor() def getVersionMinor(): @@ -39,10 +38,10 @@ def getPluginInstallPath(): cdef extern from "PyArray.hpp" namespace "pdal::python": cdef cppclass Array: - Array(object) except + - void* getPythonArray() except+ + Array(np.ndarray) except + + void *getPythonArray() except+ -cdef extern from "PyPipeline.hpp" namespace "libpdalpython": +cdef extern from "PyPipeline.hpp" namespace "pdal::python": cdef cppclass Pipeline: Pipeline(const char* ) except + Pipeline(const char*, vector[Array*]& ) except + @@ -56,11 +55,9 @@ cdef extern from "PyPipeline.hpp" namespace "libpdalpython": int getLogLevel() void setLogLevel(int) - - cdef class PyArray: cdef Array *thisptr - def __cinit__(self, object array): + def __cinit__(self, np.ndarray array): self.thisptr = new Array(array) def __dealloc__(self): del self.thisptr @@ -109,24 +106,14 @@ cdef class PyPipeline: cdef Array* a if arrays is not None: + print("Looping arrays\n") for array in arrays: a = new Array(array) c_arrays.push_back(a) - if PY_MAJOR_VERSION >= 3: - if arrays: - self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) - else: - self.thisptr = new Pipeline(json.encode('UTF-8')) + self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) else: - if arrays: - self.thisptr = new Pipeline(json, c_arrays) - else: - self.thisptr = new Pipeline(json) -# if arrays: -# self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) -# else: -# self.thisptr = new Pipeline(json.encode('UTF-8')) + self.thisptr = new Pipeline(json.encode('UTF-8')) def __dealloc__(self): del self.thisptr @@ -158,6 +145,7 @@ cdef class PyPipeline: return json.loads(j) property arrays: + def __get__(self): v = self.thisptr.getArrays() output = [] @@ -171,6 +159,7 @@ cdef class PyPipeline: inc(it) return output + def execute(self): if not self.thisptr: raise Exception("C++ Pipeline object not constructed!") ===================================== setup.py ===================================== @@ -156,9 +156,9 @@ if DEBUG: if os.name != 'nt': extra_compile_args += ['-g','-O0'] -# readers.numpy doesn't exist until PDAL 1.8 -if PDALVERSION >= Version('1.8'): - libraries.append('pdal_plugin_reader_numpy') +if PDALVERSION < Version('2.0.0'): + raise Exception("PDAL version '%s' is not compatible with PDAL Python library version '%s'"%(PDALVERSION, module_version)) + if os.name in ['nt']: if os.environ.get('OSGEO4W_ROOT'): @@ -168,8 +168,6 @@ if os.name in ['nt']: library_dirs = ['%s\Library\lib' % prefix] libraries = ['pdalcpp','pdal_util','ws2_32'] - if PDALVERSION >= Version('1.8'): - libraries.append('libpdal_plugin_reader_numpy') extra_compile_args = ['/DNOMINMAX',] @@ -182,7 +180,7 @@ if 'linux' in sys.platform or 'linux2' in sys.platform or 'darwin' in sys.platfo -sources=['pdal/libpdalpython'+ext, "pdal/PyPipeline.cpp" ] +sources=['pdal/libpdalpython'+ext, "pdal/PyPipeline.cpp", "pdal/PyArray.cpp" ] extensions = [DistutilsExtension("*", sources, include_dirs=include_dirs, @@ -192,12 +190,12 @@ extensions = [DistutilsExtension("*", extra_link_args=extra_link_args,)] if USE_CYTHON and "clean" not in sys.argv: from Cython.Build import cythonize - extensions= cythonize(extensions, language="c++") + extensions= cythonize(extensions, compiler_directives={'language_level':3}) setup_args = dict( name = 
'PDAL', version = str(module_version), - requires = ['Python (>=2.7)', 'Numpy'], + requires = ['Python (>=3.0)', 'Numpy'], description = 'Point cloud data processing', license = 'BSD', keywords = 'point cloud spatial', ===================================== test/test_pipeline.py ===================================== @@ -31,14 +31,14 @@ class PDALTest(unittest.TestCase): return output class TestPipeline(PDALTest): -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_construction(self): """Can we construct a PDAL pipeline""" json = self.fetch_json('sort.json') r = pdal.Pipeline(json) -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_execution(self): @@ -48,13 +48,13 @@ class TestPipeline(PDALTest): r.validate() r.execute() self.assertGreater(len(r.pipeline), 200) -# + def test_validate(self): """Do we complain with bad pipelines""" r = pdal.Pipeline(bad_json) with self.assertRaises(RuntimeError): r.validate() -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_array(self): @@ -65,11 +65,11 @@ class TestPipeline(PDALTest): r.execute() arrays = r.arrays self.assertEqual(len(arrays), 1) -# + a = arrays[0] self.assertAlmostEqual(a[0][0], 635619.85, 7) self.assertAlmostEqual(a[1064][2], 456.92, 7) -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_metadata(self): @@ -82,8 +82,8 @@ class TestPipeline(PDALTest): import json j = json.loads(metadata) self.assertEqual(j["metadata"]["readers.las"][0]["count"], 1065) -# -# + + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_no_execute(self): @@ -93,17 +93,17 @@ class TestPipeline(PDALTest): with self.assertRaises(RuntimeError): r.arrays # - @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'reproject.json')), - "missing test data") - def test_logging(self): - """Can we fetch log output""" - json = self.fetch_json('reproject.json') - r = pdal.Pipeline(json) - r.loglevel = 8 - r.validate() - count = r.execute() - self.assertEqual(count, 789) - self.assertEqual(r.log.split()[0], '(pypipeline') +# @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'reproject.json')), +# "missing test data") +# def test_logging(self): +# """Can we fetch log output""" +# json = self.fetch_json('reproject.json') +# r = pdal.Pipeline(json) +# r.loglevel = 8 +# r.validate() +# count = r.execute() +# self.assertEqual(count, 789) +# self.assertEqual(r.log.split()[0], '(pypipeline') # @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") @@ -114,7 +114,7 @@ class TestPipeline(PDALTest): r.validate() r.execute() self.assertEqual(r.schema['schema']['dimensions'][0]['name'], 'X') -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'chip.json')), "missing test data") def test_merged_arrays(self): @@ -125,16 +125,17 @@ class TestPipeline(PDALTest): r.execute() arrays = r.arrays self.assertEqual(len(arrays), 43) -# + + class TestArrayLoad(PDALTest): @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'perlin.npy')), "missing test data") def test_merged_arrays(self): - """Can we load data from a a list of arrays to PDAL""" + """Can we load data from a list of arrays to PDAL""" if Version(pdal.info.version) < Version('1.8'): return True - data = np.load(os.path.join(DATADIRECTORY, 
'perlin.npy')) + data = np.load(os.path.join(DATADIRECTORY, 'test3d.npy')) arrays = [data, data, data] @@ -143,7 +144,7 @@ class TestArrayLoad(PDALTest): "pipeline":[ { "type":"filters.range", - "limits":"Intensity[0:0.10]" + "limits":"Intensity[100:300)" } ] }""" @@ -154,9 +155,9 @@ class TestArrayLoad(PDALTest): arrays = p.arrays self.assertEqual(len(arrays), 3) - data = arrays[0] - self.assertEqual(len(data), 1836) - self.assertEqual(sum([len(i) for i in arrays]), 3*1836) + for data in arrays: + self.assertEqual(len(data), 12) + self.assertEqual(data['Intensity'].sum(), 1926) class TestDimensions(PDALTest): def test_fetch_dimensions(self): View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/compare/ab8e6273e083cb527782fda0eecc2dae1ea37df4...a8f729e0e08edef82226ede11044a9bb95daf8c3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/compare/ab8e6273e083cb527782fda0eecc2dae1ea37df4...a8f729e0e08edef82226ede11044a9bb95daf8c3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 07:23:37 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 06:23:37 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag debian/2.2.0+ds-1 Message-ID: <5d734ce973f9_577b3f91ce34cb10124513e@godard.mail> Bas Couwenberg pushed new tag debian/2.2.0+ds-1 at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/debian/2.2.0+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 07:23:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 06:23:38 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag upstream/2.2.0+ds Message-ID: <5d734ceae6726_577b2ade5d6a104c124535@godard.mail> Bas Couwenberg pushed new tag upstream/2.2.0+ds at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/upstream/2.2.0+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 07:23:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 06:23:38 +0000 Subject: [Git][debian-gis-team/python-pdal][upstream] New upstream version 2.2.0+ds Message-ID: <5d734ceae7c87_577b3f91ce324a981245413@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pdal Commits: 5a313af5 by Bas Couwenberg at 2019-09-07T05:56:39Z New upstream version 2.2.0+ds - - - - - 12 changed files: - PKG-INFO - README.rst - VERSION.txt - + pdal/PyArray.cpp - pdal/PyArray.hpp - pdal/PyPipeline.cpp - pdal/PyPipeline.hpp - pdal/__init__.py - pdal/libpdalpython.cpp - pdal/libpdalpython.pyx - setup.py - test/test_pipeline.py Changes: ===================================== PKG-INFO ===================================== @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: PDAL -Version: 2.1.8 +Version: 2.2.0 Summary: Point cloud data processing Home-page: http://pdal.io Author: Howard Butler @@ -60,6 +60,9 @@ Description: =================================================================== .. 
image:: https://travis-ci.org/PDAL/python.svg?branch=master :target: https://travis-ci.org/PDAL/python + .. image:: https://ci.appveyor.com/api/projects/status/of4kecyahpo8892d + :target: https://ci.appveyor.com/project/hobu/python/ + Requirements ================================================================================ @@ -93,5 +96,5 @@ Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Topic :: Scientific/Engineering :: GIS -Requires: Python (>=2.7) +Requires: Python (>=3.0) Requires: Numpy ===================================== README.rst ===================================== @@ -50,6 +50,9 @@ sorts it by the ``X`` dimension: .. image:: https://travis-ci.org/PDAL/python.svg?branch=master :target: https://travis-ci.org/PDAL/python +.. image:: https://ci.appveyor.com/api/projects/status/of4kecyahpo8892d + :target: https://ci.appveyor.com/project/hobu/python/ + Requirements ================================================================================ ===================================== VERSION.txt ===================================== @@ -1 +1 @@ -2.1.8 \ No newline at end of file +2.2.0 \ No newline at end of file ===================================== pdal/PyArray.cpp ===================================== @@ -0,0 +1,339 @@ +/****************************************************************************** +* Copyright (c) 2019, Hobu Inc. (info at hobu.co) +* +* All rights reserved. +* +* Redistribution and use in source and binary forms, with or without +* modification, are permitted provided that the following +* conditions are met: +* +* * Redistributions of source code must retain the above copyright +* notice, this list of conditions and the following disclaimer. +* * Redistributions in binary form must reproduce the above copyright +* notice, this list of conditions and the following disclaimer in +* the documentation and/or other materials provided +* with the distribution. +* * Neither the name of Hobu, Inc. or Flaxen Geo Consulting nor the +* names of its contributors may be used to endorse or promote +* products derived from this software without specific prior +* written permission. +* +* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS +* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT +* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS +* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE +* COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, +* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, +* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS +* OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED +* AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT +* OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY +* OF SUCH DAMAGE. 
+****************************************************************************/ + +#include "PyArray.hpp" +#include + +#include + +namespace pdal +{ +namespace python +{ + +namespace +{ + +Dimension::Type pdalType(int t) +{ + using namespace Dimension; + + switch (t) + { + case NPY_FLOAT32: + return Type::Float; + case NPY_FLOAT64: + return Type::Double; + case NPY_INT8: + return Type::Signed8; + case NPY_INT16: + return Type::Signed16; + case NPY_INT32: + return Type::Signed32; + case NPY_INT64: + return Type::Signed64; + case NPY_UINT8: + return Type::Unsigned8; + case NPY_UINT16: + return Type::Unsigned16; + case NPY_UINT32: + return Type::Unsigned32; + case NPY_UINT64: + return Type::Unsigned64; + default: + return Type::None; + } + assert(0); + + return Type::None; +} + +std::string toString(PyObject *pname) +{ + PyObject* r = PyObject_Str(pname); + if (!r) + throw pdal_error("couldn't make string representation value"); + Py_ssize_t size; + return std::string(PyUnicode_AsUTF8AndSize(r, &size)); +} + +} // unnamed namespace + +Array::Array() : m_array(nullptr) +{ + if (_import_array() < 0) + throw pdal_error("Could not import numpy.core.multiarray."); +} + +Array::Array(PyArrayObject* array) : m_array(array), m_rowMajor(true) +{ + if (_import_array() < 0) + throw pdal_error("Could not import numpy.core.multiarray."); + + Py_XINCREF(array); + + PyArray_Descr *dtype = PyArray_DTYPE(m_array); + npy_intp ndims = PyArray_NDIM(m_array); + npy_intp *shape = PyArray_SHAPE(m_array); + int numFields = (dtype->fields == Py_None) ? + 0 : + static_cast(PyDict_Size(dtype->fields)); + + int xyz = 0; + if (numFields == 0) + { + if (ndims != 3) + throw pdal_error("Array without fields must have 3 dimensions."); + m_fields.push_back({"Intensity", pdalType(dtype->type_num), 0}); + } + else + { + PyObject *names_dict = dtype->fields; + PyObject *names = PyDict_Keys(names_dict); + PyObject *values = PyDict_Values(names_dict); + if (!names || !values) + throw pdal_error("Bad field specification in numpy array."); + + for (int i = 0; i < numFields; ++i) + { + std::string name = toString(PyList_GetItem(names, i)); + if (name == "X") + xyz |= 1; + else if (name == "Y") + xyz |= 2; + else if (name == "Z") + xyz |= 4; + PyObject *tup = PyList_GetItem(values, i); + + // Get offset. + size_t offset = PyLong_AsLong(PySequence_Fast_GET_ITEM(tup, 1)); + + // Get type. + PyArray_Descr *descriptor = + (PyArray_Descr *)PySequence_Fast_GET_ITEM(tup, 0); + Dimension::Type type = pdalType(descriptor->type_num); + if (type == Dimension::Type::None) + throw pdal_error("Incompatible type for field '" + name + "'."); + + m_fields.push_back({name, type, offset}); + } + + if (xyz != 0 && xyz != 7) + throw pdal_error("Array fields must contain all or none " + "of X, Y and Z"); + if (xyz == 0 && ndims != 3) + throw pdal_error("Array without named X/Y/Z fields " + "must have three dimensions."); + } + if (xyz == 0) + m_shape = { (size_t)shape[0], (size_t)shape[1], (size_t)shape[2] }; + m_rowMajor = !(PyArray_FLAGS(m_array) & NPY_ARRAY_F_CONTIGUOUS); +} + +Array::~Array() +{ + if (m_array) + Py_XDECREF((PyObject *)m_array); +} + + +void Array::update(PointViewPtr view) +{ + if (m_array) + Py_XDECREF((PyObject *)m_array); + m_array = nullptr; // Just in case of an exception. 
+ + Dimension::IdList dims = view->dims(); + npy_intp size = view->size(); + + PyObject *dtype_dict = (PyObject*)buildNumpyDescription(view); + if (!dtype_dict) + throw pdal_error("Unable to build numpy dtype " + "description dictionary"); + + PyArray_Descr *dtype = nullptr; + if (PyArray_DescrConverter(dtype_dict, &dtype) == NPY_FAIL) + throw pdal_error("Unable to build numpy dtype"); + Py_XDECREF(dtype_dict); + + // This is a 1 x size array. + m_array = (PyArrayObject *)PyArray_NewFromDescr(&PyArray_Type, dtype, + 1, &size, 0, nullptr, NPY_ARRAY_CARRAY, nullptr); + + // copy the data + DimTypeList types = view->dimTypes(); + for (PointId idx = 0; idx < view->size(); idx++) + { + char *p = (char *)PyArray_GETPTR1(m_array, idx); + view->getPackedPoint(types, idx, p); + } +} + + +//ABELL - Who's responsible for incrementing the ref count? +PyArrayObject *Array::getPythonArray() const +{ + return m_array; +} + +PyObject* Array::buildNumpyDescription(PointViewPtr view) const +{ + // Build up a numpy dtype dictionary + // + // {'formats': ['f8', 'f8', 'f8', 'u2', 'u1', 'u1', 'u1', 'u1', 'u1', + // 'f4', 'u1', 'u2', 'f8', 'u2', 'u2', 'u2'], + // 'names': ['X', 'Y', 'Z', 'Intensity', 'ReturnNumber', + // 'NumberOfReturns', 'ScanDirectionFlag', 'EdgeOfFlightLine', + // 'Classification', 'ScanAngleRank', 'UserData', + // 'PointSourceId', 'GpsTime', 'Red', 'Green', 'Blue']} + // + + Dimension::IdList dims = view->dims(); + + PyObject* dict = PyDict_New(); + PyObject* sizes = PyList_New(dims.size()); + PyObject* formats = PyList_New(dims.size()); + PyObject* titles = PyList_New(dims.size()); + + for (size_t i = 0; i < dims.size(); ++i) + { + Dimension::Id id = dims[i]; + Dimension::Type t = view->dimType(id); + npy_intp stride = view->dimSize(id); + + std::string name = view->dimName(id); + + std::string kind("i"); + Dimension::BaseType b = Dimension::base(t); + if (b == Dimension::BaseType::Unsigned) + kind = "u"; + else if (b == Dimension::BaseType::Signed) + kind = "i"; + else if (b == Dimension::BaseType::Floating) + kind = "f"; + else + throw pdal_error("Unable to map kind '" + kind + + "' to PDAL dimension type"); + + std::stringstream oss; + oss << kind << stride; + PyObject* pySize = PyLong_FromLong(stride); + PyObject* pyTitle = PyUnicode_FromString(name.c_str()); + PyObject* pyFormat = PyUnicode_FromString(oss.str().c_str()); + + PyList_SetItem(sizes, i, pySize); + PyList_SetItem(titles, i, pyTitle); + PyList_SetItem(formats, i, pyFormat); + } + + PyDict_SetItemString(dict, "names", titles); + PyDict_SetItemString(dict, "formats", formats); + + return dict; +} + +bool Array::rowMajor() const +{ + return m_rowMajor; +} + +Array::Shape Array::shape() const +{ + return m_shape; +} + +const Array::Fields& Array::fields() const +{ + return m_fields; +} + +ArrayIter& Array::iterator() +{ + ArrayIter *it = new ArrayIter(*this); + m_iterators.push_back(std::unique_ptr(it)); + return *it; +} + +ArrayIter::ArrayIter(Array& array) +{ + m_iter = NpyIter_New(array.getPythonArray(), + NPY_ITER_EXTERNAL_LOOP | NPY_ITER_READONLY | NPY_ITER_REFS_OK, + NPY_KEEPORDER, NPY_NO_CASTING, NULL); + if (!m_iter) + throw pdal_error("Unable to create numpy iterator."); + + char *itererr; + m_iterNext = NpyIter_GetIterNext(m_iter, &itererr); + if (!m_iterNext) + { + NpyIter_Deallocate(m_iter); + throw pdal_error(std::string("Unable to create numpy iterator: ") + + itererr); + } + m_data = NpyIter_GetDataPtrArray(m_iter); + m_stride = NpyIter_GetInnerStrideArray(m_iter); + m_size = NpyIter_GetInnerLoopSizePtr(m_iter); 
+ m_done = false; +} + +ArrayIter::~ArrayIter() +{ + NpyIter_Deallocate(m_iter); +} + +ArrayIter& ArrayIter::operator++() +{ + if (m_done) + return *this; + + if (--(*m_size)) + *m_data += *m_stride; + else if (!m_iterNext(m_iter)) + m_done = true; + return *this; +} + +ArrayIter::operator bool () const +{ + return !m_done; +} + +char * ArrayIter::operator * () const +{ + return *m_data; +} + +} // namespace python +} // namespace pdal + ===================================== pdal/PyArray.hpp ===================================== @@ -1,5 +1,5 @@ /****************************************************************************** -* Copyright (c) 2011, Michael P. Gerlek (mpg at flaxen.com) +* Copyright (c) 2019, Hobu Inc. (info at hobu.co) * * All rights reserved. * @@ -13,7 +13,7 @@ * notice, this list of conditions and the following disclaimer in * the documentation and/or other materials provided * with the distribution. -* * Neither the name of Hobu, Inc. or Flaxen Geo Consulting nor the +* * Neither the name of Hobu, Inc. nor the * names of its contributors may be used to endorse or promote * products derived from this software without specific prior * written permission. @@ -34,204 +34,69 @@ #pragma once -#include - -#include - -#pragma warning(disable: 4127) // conditional expression is constant - - -#include -#undef toupper -#undef tolower -#undef isspace +#include -#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION -#include - -// forward declare PyObject so we don't need the python headers everywhere -// see: http://mail.python.org/pipermail/python-dev/2003-August/037601.html -#ifndef PyObject_HEAD -struct _object; -typedef _object PyObject; -#endif +#include +#include namespace pdal { namespace python { +class ArrayIter; class PDAL_DLL Array { public: + using Shape = std::array; + using Fields = std::vector; + + // Create an array for reading data from PDAL. + Array(); + + // Create an array for writing data to PDAL. 
+ Array(PyArrayObject* array); - Array() : m_py_array(0), m_own_array(true) - { -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - } - - Array(PyObject* array) : m_py_array(array), m_own_array(false) - { -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - if (!PyArray_Check(array)) - throw pdal::pdal_error("pdal::python::Array constructor object is not a numpy array"); - Py_XINCREF(array); - - } - - ~Array() - { - cleanup(); - } - - - inline void update(PointViewPtr view) - { - typedef std::unique_ptr> DataPtr; - cleanup(); - int nd = 1; - Dimension::IdList dims = view->dims(); - npy_intp mydims = view->size(); - npy_intp* ndims = &mydims; - std::vector strides(dims.size()); - - DataPtr pdata( new std::vector(view->pointSize()* view->size(), 0)); - - PyArray_Descr *dtype = nullptr; - PyObject * dtype_dict = (PyObject*)buildNumpyDescription(view); - if (!dtype_dict) - throw pdal_error("Unable to build numpy dtype description dictionary"); - - int did_convert = PyArray_DescrConverter(dtype_dict, &dtype); - if (did_convert == NPY_FAIL) - throw pdal_error("Unable to build numpy dtype"); - Py_XDECREF(dtype_dict); - -#ifdef NPY_ARRAY_CARRAY - int flags = NPY_ARRAY_CARRAY; -#else - int flags = NPY_CARRAY; -#endif - uint8_t* sp = pdata.get()->data(); - PyObject * pyArray = PyArray_NewFromDescr(&PyArray_Type, - dtype, - nd, - ndims, - 0, - sp, - flags, - NULL); - - // copy the data - uint8_t* p(sp); - DimTypeList types = view->dimTypes(); - for (PointId idx = 0; idx < view->size(); idx++) - { - p = sp + (view->pointSize() * idx); - view->getPackedPoint(types, idx, (char*)p); - } - - m_py_array = pyArray; - m_data_array = std::move(pdata); - } - - - inline PyObject* getPythonArray() const - { - return m_py_array; - } + ~Array(); + void update(PointViewPtr view); + PyArrayObject *getPythonArray() const; + bool rowMajor() const; + Shape shape() const; + const Fields& fields() const; + ArrayIter& iterator(); private: + inline PyObject* buildNumpyDescription(PointViewPtr view) const; - inline void cleanup() - { - PyObject* p = (PyObject*)(m_py_array); - if (m_own_array) - { - m_data_array.reset(); - } - - Py_XDECREF(p); - } - - inline PyObject* buildNumpyDescription(PointViewPtr view) const - { - - // Build up a numpy dtype dictionary - // - // {'formats': ['f8', 'f8', 'f8', 'u2', 'u1', 'u1', 'u1', 'u1', 'u1', 'f4', 'u1', 'u2', 'f8', 'u2', 'u2', 'u2'], - // 'names': ['X', 'Y', 'Z', 'Intensity', 'ReturnNumber', 'NumberOfReturns', - // 'ScanDirectionFlag', 'EdgeOfFlightLine', 'Classification', - // 'ScanAngleRank', 'UserData', 'PointSourceId', 'GpsTime', 'Red', 'Green', - // 'Blue']} - // - - std::stringstream oss; - Dimension::IdList dims = view->dims(); - - PyObject* dict = PyDict_New(); - PyObject* sizes = PyList_New(dims.size()); - PyObject* formats = PyList_New(dims.size()); - PyObject* titles = PyList_New(dims.size()); - - for (Dimension::IdList::size_type i=0; i < dims.size(); ++i) - { - Dimension::Id id = (dims[i]); - Dimension::Type t = view->dimType(id); - npy_intp stride = view->dimSize(id); - - std::string name = view->dimName(id); - - std::string kind("i"); - Dimension::BaseType b = Dimension::base(t); - if (b == Dimension::BaseType::Unsigned) - kind = "u"; - else if (b == Dimension::BaseType::Signed) - kind = "i"; - else if (b == Dimension::BaseType::Floating) - kind = "f"; - else - { - std::stringstream o; - oss << "unable to map kind '" << kind <<"' to PDAL dimension type"; - throw pdal::pdal_error(o.str()); - } - - 
oss << kind << stride; - PyObject* pySize = PyLong_FromLong(stride); - PyObject* pyTitle = PyUnicode_FromString(name.c_str()); - PyObject* pyFormat = PyUnicode_FromString(oss.str().c_str()); - - PyList_SetItem(sizes, i, pySize); - PyList_SetItem(titles, i, pyTitle); - PyList_SetItem(formats, i, pyFormat); - - oss.str(""); - } - - PyDict_SetItemString(dict, "names", titles); - PyDict_SetItemString(dict, "formats", formats); - - // PyObject* obj = PyUnicode_AsASCIIString(PyObject_Str(dict)); - // const char* s = PyBytes_AsString(obj); - // std::string output(s); - // std::cout << "array: " << output << std::endl; - return dict; - } - - - - - PyObject* m_py_array; - std::unique_ptr > m_data_array; - bool m_own_array; + PyArrayObject* m_array; Array& operator=(Array const& rhs); + Fields m_fields; + bool m_rowMajor; + Shape m_shape; + std::vector> m_iterators; +}; + +class ArrayIter +{ +public: + ArrayIter(const ArrayIter&) = delete; + + ArrayIter(Array& array); + ~ArrayIter(); + + ArrayIter& operator++(); + operator bool () const; + char *operator * () const; + +private: + NpyIter *m_iter; + NpyIter_IterNextFunc *m_iterNext; + char **m_data; + npy_intp *m_size; + npy_intp *m_stride; + bool m_done; }; } // namespace python ===================================== pdal/PyPipeline.cpp ===================================== @@ -33,124 +33,125 @@ ****************************************************************************/ #include "PyPipeline.hpp" -#ifdef PDAL_HAVE_LIBXML2 -#include -#endif #ifndef _WIN32 #include #endif #include -#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION #include -#include "PyArray.hpp" #include #include -#include -#include - -namespace libpdalpython -{ -using namespace pdal::python; +#include "PyArray.hpp" -Pipeline::Pipeline(std::string const& json, std::vector arrays) +namespace pdal +{ +namespace python { +// Create a pipeline for writing data to PDAL +Pipeline::Pipeline(std::string const& json, std::vector arrays) : + m_executor(new PipelineExecutor(json)) +{ #ifndef _WIN32 + // See comment in alternate constructor below. ::dlopen("libpdal_base.so", RTLD_NOLOAD | RTLD_GLOBAL); - ::dlopen("libpdal_plugin_reader_numpy.so", RTLD_NOLOAD | RTLD_GLOBAL); #endif -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - - m_executor = std::shared_ptr(new pdal::PipelineExecutor(json)); + if (_import_array() < 0) + throw pdal_error("Could not impory numpy.core.multiarray."); - pdal::PipelineManager& manager = m_executor->getManager(); + PipelineManager& manager = m_executor->getManager(); std::stringstream strm(json); manager.readPipeline(strm); + std::vector roots = manager.roots(); + if (roots.size() != 1) + throw pdal_error("Filter pipeline must contain a single root stage."); - pdal::Stage *r = manager.getStage(); - if (!r) - throw pdal::pdal_error("pipeline had no stages!"); - -#if PDAL_VERSION_MAJOR > 1 || PDAL_VERSION_MINOR >=8 - int counter = 1; - for (auto array: arrays) + for (auto array : arrays) { // Create numpy reader for each array - pdal::Options options; - std::stringstream tag; - tag << "readers_numpy" << counter; - pdal::StageCreationOptions opts { "", "readers.numpy", nullptr, options, tag.str()}; - pdal::Stage& reader = manager.makeReader(opts); - - pdal::NumpyReader* np_reader = dynamic_cast(&reader); - if (!np_reader) - throw pdal::pdal_error("couldn't cast reader!"); - + // Options + + Options options; + options.add("order", array->rowMajor() ? 
+ MemoryViewReader::Order::RowMajor : + MemoryViewReader::Order::ColumnMajor); + options.add("shape", MemoryViewReader::Shape(array->shape())); + + Stage& s = manager.makeReader("", "readers.memoryview", options); + MemoryViewReader& r = dynamic_cast(s); + for (auto f : array->fields()) + r.pushField(f); + + ArrayIter& iter = array->iterator(); + auto incrementer = [&iter](PointId id) -> char * + { + if (! iter) + return nullptr; + + char *c = *iter; + ++iter; + return c; + }; + + r.setIncrementer(incrementer); PyObject* parray = (PyObject*)array->getPythonArray(); if (!parray) - throw pdal::pdal_error("array was none!"); - - np_reader->setArray(parray); - - r->setInput(reader); - counter++; + throw pdal_error("array was none!"); + roots[0]->setInput(r); } -#endif manager.validateStageOptions(); } -Pipeline::Pipeline(std::string const& json) +// Create a pipeline for reading data from PDAL +Pipeline::Pipeline(std::string const& json) : + m_executor(new PipelineExecutor(json)) { // Make the symbols in pdal_base global so that they're accessible // to PDAL plugins. Python dlopen's this extension with RTLD_LOCAL, // which means that without this, symbols in libpdal_base aren't available // for resolution of symbols on future runtime linking. This is an issue - // on Apline and other Linux variants that doesn't use UNIQUE symbols - // for C++ template statics. only + // on Alpine and other Linux variants that don't use UNIQUE symbols + // for C++ template statics only. Without this, you end up with multiple + // copies of template statics. #ifndef _WIN32 ::dlopen("libpdal_base.so", RTLD_NOLOAD | RTLD_GLOBAL); #endif -#undef NUMPY_IMPORT_ARRAY_RETVAL -#define NUMPY_IMPORT_ARRAY_RETVAL - import_array(); - - m_executor = std::shared_ptr(new pdal::PipelineExecutor(json)); + if (_import_array() < 0) + throw pdal_error("Could not impory numpy.core.multiarray."); } Pipeline::~Pipeline() -{ -} +{} + void Pipeline::setLogLevel(int level) { m_executor->setLogLevel(level); } + int Pipeline::getLogLevel() const { return static_cast(m_executor->getLogLevel()); } + int64_t Pipeline::execute() { - - int64_t count = m_executor->execute(); - return count; + return m_executor->execute(); } bool Pipeline::validate() { - return m_executor->validate(); + auto res = m_executor->validate(); + return res; } std::vector Pipeline::getArrays() const @@ -160,16 +161,18 @@ std::vector Pipeline::getArrays() const if (!m_executor->executed()) throw python_error("call execute() before fetching arrays"); - const pdal::PointViewSet& pvset = m_executor->getManagerConst().views(); + const PointViewSet& pvset = m_executor->getManagerConst().views(); for (auto i: pvset) { //ABELL - Leak? 
- Array *array = new pdal::python::Array; + Array *array = new python::Array; array->update(i); output.push_back(array); } return output; } -} //namespace libpdalpython + +} // namespace python +} // namespace pdal ===================================== pdal/PyPipeline.hpp ===================================== @@ -43,20 +43,12 @@ #include #include -#undef toupper -#undef tolower -#undef isspace - namespace pdal { namespace python { - class Array; -} -} -namespace libpdalpython -{ +class Array; class python_error : public std::runtime_error { @@ -65,10 +57,12 @@ public: {} }; -class Pipeline { +class Pipeline +{ public: Pipeline(std::string const& json); - Pipeline(std::string const& json, std::vector arrays); + Pipeline(std::string const& json, + std::vector arrays); ~Pipeline(); int64_t execute(); @@ -98,4 +92,5 @@ private: std::shared_ptr m_executor; }; -} +} // namespace python +} // namespace pdal ===================================== pdal/__init__.py ===================================== @@ -1,4 +1,4 @@ -__version__='2.1.8' +__version__='2.2.0' from .pipeline import Pipeline from .array import Array ===================================== pdal/libpdalpython.cpp ===================================== The diff for this file was not included because it is too large. ===================================== pdal/libpdalpython.pyx ===================================== @@ -23,7 +23,6 @@ cdef extern from "pdal/pdal_config.hpp" namespace "pdal::Config": def getVersionString(): return versionString() - def getVersionMajor(): return versionMajor() def getVersionMinor(): @@ -39,10 +38,10 @@ def getPluginInstallPath(): cdef extern from "PyArray.hpp" namespace "pdal::python": cdef cppclass Array: - Array(object) except + - void* getPythonArray() except+ + Array(np.ndarray) except + + void *getPythonArray() except+ -cdef extern from "PyPipeline.hpp" namespace "libpdalpython": +cdef extern from "PyPipeline.hpp" namespace "pdal::python": cdef cppclass Pipeline: Pipeline(const char* ) except + Pipeline(const char*, vector[Array*]& ) except + @@ -56,11 +55,9 @@ cdef extern from "PyPipeline.hpp" namespace "libpdalpython": int getLogLevel() void setLogLevel(int) - - cdef class PyArray: cdef Array *thisptr - def __cinit__(self, object array): + def __cinit__(self, np.ndarray array): self.thisptr = new Array(array) def __dealloc__(self): del self.thisptr @@ -109,24 +106,14 @@ cdef class PyPipeline: cdef Array* a if arrays is not None: + print("Looping arrays\n") for array in arrays: a = new Array(array) c_arrays.push_back(a) - if PY_MAJOR_VERSION >= 3: - if arrays: - self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) - else: - self.thisptr = new Pipeline(json.encode('UTF-8')) + self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) else: - if arrays: - self.thisptr = new Pipeline(json, c_arrays) - else: - self.thisptr = new Pipeline(json) -# if arrays: -# self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) -# else: -# self.thisptr = new Pipeline(json.encode('UTF-8')) + self.thisptr = new Pipeline(json.encode('UTF-8')) def __dealloc__(self): del self.thisptr @@ -158,6 +145,7 @@ cdef class PyPipeline: return json.loads(j) property arrays: + def __get__(self): v = self.thisptr.getArrays() output = [] @@ -171,6 +159,7 @@ cdef class PyPipeline: inc(it) return output + def execute(self): if not self.thisptr: raise Exception("C++ Pipeline object not constructed!") ===================================== setup.py ===================================== @@ -156,9 +156,9 @@ if DEBUG: if os.name != 'nt': 
extra_compile_args += ['-g','-O0'] -# readers.numpy doesn't exist until PDAL 1.8 -if PDALVERSION >= Version('1.8'): - libraries.append('pdal_plugin_reader_numpy') +if PDALVERSION < Version('2.0.0'): + raise Exception("PDAL version '%s' is not compatible with PDAL Python library version '%s'"%(PDALVERSION, module_version)) + if os.name in ['nt']: if os.environ.get('OSGEO4W_ROOT'): @@ -168,8 +168,6 @@ if os.name in ['nt']: library_dirs = ['%s\Library\lib' % prefix] libraries = ['pdalcpp','pdal_util','ws2_32'] - if PDALVERSION >= Version('1.8'): - libraries.append('libpdal_plugin_reader_numpy') extra_compile_args = ['/DNOMINMAX',] @@ -182,7 +180,7 @@ if 'linux' in sys.platform or 'linux2' in sys.platform or 'darwin' in sys.platfo -sources=['pdal/libpdalpython'+ext, "pdal/PyPipeline.cpp" ] +sources=['pdal/libpdalpython'+ext, "pdal/PyPipeline.cpp", "pdal/PyArray.cpp" ] extensions = [DistutilsExtension("*", sources, include_dirs=include_dirs, @@ -192,12 +190,12 @@ extensions = [DistutilsExtension("*", extra_link_args=extra_link_args,)] if USE_CYTHON and "clean" not in sys.argv: from Cython.Build import cythonize - extensions= cythonize(extensions, language="c++") + extensions= cythonize(extensions, compiler_directives={'language_level':3}) setup_args = dict( name = 'PDAL', version = str(module_version), - requires = ['Python (>=2.7)', 'Numpy'], + requires = ['Python (>=3.0)', 'Numpy'], description = 'Point cloud data processing', license = 'BSD', keywords = 'point cloud spatial', ===================================== test/test_pipeline.py ===================================== @@ -31,14 +31,14 @@ class PDALTest(unittest.TestCase): return output class TestPipeline(PDALTest): -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_construction(self): """Can we construct a PDAL pipeline""" json = self.fetch_json('sort.json') r = pdal.Pipeline(json) -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_execution(self): @@ -48,13 +48,13 @@ class TestPipeline(PDALTest): r.validate() r.execute() self.assertGreater(len(r.pipeline), 200) -# + def test_validate(self): """Do we complain with bad pipelines""" r = pdal.Pipeline(bad_json) with self.assertRaises(RuntimeError): r.validate() -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_array(self): @@ -65,11 +65,11 @@ class TestPipeline(PDALTest): r.execute() arrays = r.arrays self.assertEqual(len(arrays), 1) -# + a = arrays[0] self.assertAlmostEqual(a[0][0], 635619.85, 7) self.assertAlmostEqual(a[1064][2], 456.92, 7) -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_metadata(self): @@ -82,8 +82,8 @@ class TestPipeline(PDALTest): import json j = json.loads(metadata) self.assertEqual(j["metadata"]["readers.las"][0]["count"], 1065) -# -# + + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") def test_no_execute(self): @@ -93,17 +93,17 @@ class TestPipeline(PDALTest): with self.assertRaises(RuntimeError): r.arrays # - @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'reproject.json')), - "missing test data") - def test_logging(self): - """Can we fetch log output""" - json = self.fetch_json('reproject.json') - r = pdal.Pipeline(json) - r.loglevel = 8 - r.validate() - count = r.execute() - self.assertEqual(count, 789) - self.assertEqual(r.log.split()[0], 
'(pypipeline') +# @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'reproject.json')), +# "missing test data") +# def test_logging(self): +# """Can we fetch log output""" +# json = self.fetch_json('reproject.json') +# r = pdal.Pipeline(json) +# r.loglevel = 8 +# r.validate() +# count = r.execute() +# self.assertEqual(count, 789) +# self.assertEqual(r.log.split()[0], '(pypipeline') # @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')), "missing test data") @@ -114,7 +114,7 @@ class TestPipeline(PDALTest): r.validate() r.execute() self.assertEqual(r.schema['schema']['dimensions'][0]['name'], 'X') -# + @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'chip.json')), "missing test data") def test_merged_arrays(self): @@ -125,16 +125,17 @@ class TestPipeline(PDALTest): r.execute() arrays = r.arrays self.assertEqual(len(arrays), 43) -# + + class TestArrayLoad(PDALTest): @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'perlin.npy')), "missing test data") def test_merged_arrays(self): - """Can we load data from a a list of arrays to PDAL""" + """Can we load data from a list of arrays to PDAL""" if Version(pdal.info.version) < Version('1.8'): return True - data = np.load(os.path.join(DATADIRECTORY, 'perlin.npy')) + data = np.load(os.path.join(DATADIRECTORY, 'test3d.npy')) arrays = [data, data, data] @@ -143,7 +144,7 @@ class TestArrayLoad(PDALTest): "pipeline":[ { "type":"filters.range", - "limits":"Intensity[0:0.10]" + "limits":"Intensity[100:300)" } ] }""" @@ -154,9 +155,9 @@ class TestArrayLoad(PDALTest): arrays = p.arrays self.assertEqual(len(arrays), 3) - data = arrays[0] - self.assertEqual(len(data), 1836) - self.assertEqual(sum([len(i) for i in arrays]), 3*1836) + for data in arrays: + self.assertEqual(len(data), 12) + self.assertEqual(data['Intensity'].sum(), 1926) class TestDimensions(PDALTest): def test_fetch_dimensions(self): View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/5a313af5bc7eccbdc49f0cf3b1143c91229940fb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/5a313af5bc7eccbdc49f0cf3b1143c91229940fb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 07:30:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 06:30:25 +0000 Subject: Processing of python-pdal_2.2.0+ds-1_source.changes Message-ID: python-pdal_2.2.0+ds-1_source.changes uploaded successfully to localhost along with the files: python-pdal_2.2.0+ds-1.dsc python-pdal_2.2.0+ds.orig.tar.xz python-pdal_2.2.0+ds-1.debian.tar.xz python-pdal_2.2.0+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 7 07:34:33 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 06:34:33 +0000 Subject: python-pdal_2.2.0+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 08:13:12 +0200 Source: python-pdal Architecture: source Version: 2.2.0+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pdal (2.2.0+ds-1) unstable; urgency=medium . * New upstream release. * Update copyright years for copyright holders. * Refresh patches. 
Checksums-Sha1: 03796dc03c570317947084292c200a83dede46ee 2103 python-pdal_2.2.0+ds-1.dsc 439a9e53f566e8789115dadc4042f5589c00fb9d 53668 python-pdal_2.2.0+ds.orig.tar.xz e19353a1c4d382c77b9e36a0484af6d1e067d6f7 4464 python-pdal_2.2.0+ds-1.debian.tar.xz 6ec6b2cd7cc25cfbed75777abcdfab105d208aa6 13690 python-pdal_2.2.0+ds-1_amd64.buildinfo Checksums-Sha256: 1d5b36ddbf0eb1dcd7fdd301c0e67e327c1f36a74b70b4705c08c0776b081002 2103 python-pdal_2.2.0+ds-1.dsc 9429822f802c83d4998d1698de921cfa1790e0e4765d3c1cd942c3113e0d3697 53668 python-pdal_2.2.0+ds.orig.tar.xz 3eb833cd8f9dca707e9547cc6c2e7435fc4790f0c688418de97e17de8cfe5dcb 4464 python-pdal_2.2.0+ds-1.debian.tar.xz e92bd75096f11836db9bb23b60b70d36852f39f63ac7dcce59967729298e389d 13690 python-pdal_2.2.0+ds-1_amd64.buildinfo Files: b6aba0994bfa7d7f530333eae30965da 2103 science optional python-pdal_2.2.0+ds-1.dsc 667f7fd122525e6c701dc30fb88c10fd 53668 science optional python-pdal_2.2.0+ds.orig.tar.xz ca93ab34030e7f8ad8f5da4ff550273c 4464 science optional python-pdal_2.2.0+ds-1.debian.tar.xz 18f9d6f81a6fd9f0677659bde2988c35 13690 science optional python-pdal_2.2.0+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zTGEACgkQZ1DxCuiN SvFIxg//SBbV+B/lpjShDG/uO6235qRafRgpY1QEPf7k/gImaoIl+8aBtRCiEDNs wgekORjr5RIhwNftT051l4C5tvmtX0rNTvLY1LzWNjdJoUiH+r8gAEWCrxMEVa6N y2vjxjmyJ/QkorwUA3Pk5ABrBvcM0cvFpgH3IEmBHp5HOnp4OX/Hp/piVvuBnMB5 7biX7RzmJOK67B4zFUuB1JF6XleDJjRuq+s9UsaKM6HZEGJTxFaKm3urSKUPrr6W fmYbkXApUt6zsTAR8srUgUe7ZTlcEfKv/Yh01kZDV7su3veuLWLdNOUL0gz3YSZi y5axNXI4RHihCWWp3JYIvOCQrdpRsavLi/h/Tgj8rrGyYJkhLUV72TVsM4giysSZ +EBHOVSBmUwlUhqdlvVYPD7RLQeg2bEpHFozuqtKzZQGrtWylPHLBnjwJheZC2iL rHzxdl8ouxt2w+rzJSVsegb2M6TgnB11nLzJgf6OZs43UPNp50qQAjE2m9WPWLUV tTewWpNulGUwfJcAdd4SpBtdCBPuzb6BXwuw7M3U96VfAc+CSlSucnXJqGIQ1AMG nvHxiRrPY4hYBQJZ1LnRZ2KJVynU8BTso+Da7o8YLN4TI1ZVw7oq327CzTWhLeID MX48olLOjjkDt7Hs7d2WHjHxAiFQwNTtHoaSFcHPOhCX1H3Qtg0= =anux -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 7 08:12:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 07:12:53 +0000 Subject: [Git][debian-gis-team/gmt][pristine-tar] pristine-tar data for gmt_6.0.0~rc4+dfsg.orig.tar.xz Message-ID: <5d7358752fcc8_577b3f91b5240c4412472b@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / gmt Commits: 29b83e0b by Bas Couwenberg at 2019-09-07T06:32:00Z pristine-tar data for gmt_6.0.0~rc4+dfsg.orig.tar.xz - - - - - 2 changed files: - + gmt_6.0.0~rc4+dfsg.orig.tar.xz.delta - + gmt_6.0.0~rc4+dfsg.orig.tar.xz.id Changes: ===================================== gmt_6.0.0~rc4+dfsg.orig.tar.xz.delta ===================================== Binary files /dev/null and b/gmt_6.0.0~rc4+dfsg.orig.tar.xz.delta differ ===================================== gmt_6.0.0~rc4+dfsg.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +08cf30089ae5f2ba388527ef13d0c739b5b99d15 View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/commit/29b83e0b0497a8722768a01ae0c72959dd50487e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/commit/29b83e0b0497a8722768a01ae0c72959dd50487e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 7 08:12:55 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 07:12:55 +0000 Subject: [Git][debian-gis-team/gmt] Pushed new tag debian/6.0.0_rc4+dfsg-1_exp1 Message-ID: <5d7358775dea7_577b2ade5d6a104c1247414@godard.mail> Bas Couwenberg pushed new tag debian/6.0.0_rc4+dfsg-1_exp1 at Debian GIS Project / gmt -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/tree/debian/6.0.0_rc4+dfsg-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 08:12:56 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 07:12:56 +0000 Subject: [Git][debian-gis-team/gmt] Pushed new tag upstream/6.0.0_rc4+dfsg Message-ID: <5d735878aaa7c_577b3f91ce34cb10124767b@godard.mail> Bas Couwenberg pushed new tag upstream/6.0.0_rc4+dfsg at Debian GIS Project / gmt -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/tree/upstream/6.0.0_rc4+dfsg You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 08:13:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 07:13:02 +0000 Subject: [Git][debian-gis-team/gmt][experimental] 7 commits: New upstream version 6.0.0~rc4+dfsg Message-ID: <5d73587e8163_577b3f91d44f1208124784c@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / gmt Commits: 8f891627 by Bas Couwenberg at 2019-09-07T06:30:48Z New upstream version 6.0.0~rc4+dfsg - - - - - fc63cce4 by Bas Couwenberg at 2019-09-07T06:32:00Z Update upstream source from tag 'upstream/6.0.0_rc4+dfsg' Update to upstream version '6.0.0~rc4+dfsg' with Debian dir 00d97c51c6ea544aacf1331c37fdf877ed726fba - - - - - dd37e584 by Bas Couwenberg at 2019-09-07T06:32:21Z New upstream release candidate. - - - - - ec7edcb1 by Bas Couwenberg at 2019-09-07T06:35:29Z Refresh patches. - - - - - 1138b90e by Bas Couwenberg at 2019-09-07T06:45:38Z Drop python-sphinx from build dependencies. - - - - - c588de67 by Bas Couwenberg at 2019-09-07T06:53:12Z Update symbols for 6.0.0~rc4. - - - - - 667323e5 by Bas Couwenberg at 2019-09-07T06:53:12Z Set distribution to experimental. - - - - - 30 changed files: - BUILDING.md - CMakeLists.txt - CONTRIBUTING.md - INSTALL.md - README.md - cmake/ConfigDefault.cmake - cmake/ConfigUserTemplate.cmake - cmake/dist/CMakeLists.txt - cmake/dist/add_exes_cpack.txt - cmake/dist/startup_macosx.sh.in - cmake/modules/ConfigCMake.cmake - debian/changelog - debian/control - debian/libgmt6.symbols - debian/patches/manpage-section.patch - doc/CMakeLists.txt - doc/examples/CMakeLists.txt - doc/examples/anim04/anim_04.ps - doc/examples/anim08/anim_08.ps - doc/examples/do_examples.bat - doc/examples/do_examples.sh - doc/examples/do_view.sh - doc/examples/ex01/example_01.bat → doc/examples/ex01/ex01.bat - doc/examples/ex01/example_01.ps → doc/examples/ex01/ex01.ps - + doc/examples/ex01/ex01.sh - − doc/examples/ex01/example_01.sh - doc/examples/ex02/example_02.bat → doc/examples/ex02/ex02.bat - + doc/examples/ex02/ex02.ps - + doc/examples/ex02/ex02.sh - − doc/examples/ex02/example_02.ps The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/compare/88ba6038457f3c3d8541398a345b21b552173ca2...667323e595b74b4bd978017820246344aad45018 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/compare/88ba6038457f3c3d8541398a345b21b552173ca2...667323e595b74b4bd978017820246344aad45018 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 08:13:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 07:13:04 +0000 Subject: [Git][debian-gis-team/gmt][upstream] New upstream version 6.0.0~rc4+dfsg Message-ID: <5d73588019145_577b3f91b5240c441248076@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / gmt Commits: 8f891627 by Bas Couwenberg at 2019-09-07T06:30:48Z New upstream version 6.0.0~rc4+dfsg - - - - - 30 changed files: - BUILDING.md - CMakeLists.txt - CONTRIBUTING.md - INSTALL.md - README.md - cmake/ConfigDefault.cmake - cmake/ConfigUserTemplate.cmake - cmake/dist/CMakeLists.txt - cmake/dist/add_exes_cpack.txt - cmake/dist/startup_macosx.sh.in - cmake/modules/ConfigCMake.cmake - doc/CMakeLists.txt - doc/examples/CMakeLists.txt - doc/examples/anim04/anim_04.ps - doc/examples/anim08/anim_08.ps - doc/examples/do_examples.bat - doc/examples/do_examples.sh - doc/examples/do_view.sh - doc/examples/ex01/example_01.bat → doc/examples/ex01/ex01.bat - doc/examples/ex01/example_01.ps → doc/examples/ex01/ex01.ps - + doc/examples/ex01/ex01.sh - − doc/examples/ex01/example_01.sh - doc/examples/ex02/example_02.bat → doc/examples/ex02/ex02.bat - + doc/examples/ex02/ex02.ps - + doc/examples/ex02/ex02.sh - − doc/examples/ex02/example_02.ps - − doc/examples/ex02/example_02.sh - doc/examples/ex03/example_03.bat → doc/examples/ex03/ex03.bat - doc/examples/ex03/example_03.ps → doc/examples/ex03/ex03.ps - + doc/examples/ex03/ex03.sh The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/commit/8f891627c28c4beaab247b37ad6d5f3efa4924ed -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/commit/8f891627c28c4beaab247b37ad6d5f3efa4924ed You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 08:20:30 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 07:20:30 +0000 Subject: Processing of gmt_6.0.0~rc4+dfsg-1~exp1_source.changes Message-ID: gmt_6.0.0~rc4+dfsg-1~exp1_source.changes uploaded successfully to localhost along with the files: gmt_6.0.0~rc4+dfsg-1~exp1.dsc gmt_6.0.0~rc4+dfsg.orig.tar.xz gmt_6.0.0~rc4+dfsg-1~exp1.debian.tar.xz gmt_6.0.0~rc4+dfsg-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Sat Sep 7 09:03:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:03:57 +0000 Subject: [Git][debian-gis-team/grass][master] 23 commits: Update branch in gbp.conf & Vcs-Git URL. Message-ID: <5d73646d81bfe_577b2ade634ff080126537c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / grass Commits: 9326c313 by Bas Couwenberg at 2019-08-04T18:36:48Z Update branch in gbp.conf & Vcs-Git URL. - - - - - f0063b0d by Bas Couwenberg at 2019-08-04T18:37:23Z Update watch file for GRASS 7.8 releases. 
- - - - - 8412bce5 by Bas Couwenberg at 2019-08-14T17:10:47Z Update uversion mangle for _ in version. - - - - - bb4ffb14 by Bas Couwenberg at 2019-08-14T17:13:50Z New upstream version 7.8.0~rc1 - - - - - 40eeb45a by Bas Couwenberg at 2019-08-14T17:17:42Z Update upstream source from tag 'upstream/7.8.0_rc1' Update to upstream version '7.8.0~rc1' with Debian dir d1149572be90c967f5a2ee19c5aa1de1d2cdf9a6 - - - - - d429d50f by Bas Couwenberg at 2019-08-14T17:20:42Z New upstream release candidate. - - - - - 1f5ffdcd by Bas Couwenberg at 2019-08-14T17:49:23Z Switch to Python 3. - - - - - 0fe3b2e3 by Bas Couwenberg at 2019-08-14T17:49:23Z Update upstream metadata for move to GitHub. - - - - - acfa5256 by Bas Couwenberg at 2019-08-14T17:49:23Z Update README filename in docs file. - - - - - 3a8fa795 by Bas Couwenberg at 2019-08-14T17:49:23Z Add python3-six to (build) dependencies. - - - - - 3b77aa48 by Bas Couwenberg at 2019-08-14T17:49:23Z Replace subversion Recommends with git for g.extension. - - - - - 08dc025d by Bas Couwenberg at 2019-08-14T17:49:23Z Update copyright file. Changes: - Update copyright years for Supreet Singh - Add Shubham Sharma & Marcus D. Hanwell to copyright holders - - - - - 08b50b34 by Bas Couwenberg at 2019-08-14T17:51:55Z Drop patches applied upstream, refresh remaining patches. - - - - - 8e9a8712 by Bas Couwenberg at 2019-08-14T17:51:55Z Set distribution to experimental. - - - - - 24b2c735 by Bas Couwenberg at 2019-08-31T08:28:25Z Recommend both git & subversion for g.extension. - - - - - c3cfceda by Bas Couwenberg at 2019-09-04T18:52:57Z Drop unused override for spelling-error-in-binary. - - - - - 6f91099d by Bas Couwenberg at 2019-09-07T07:16:26Z Merge branch 'experimental' - - - - - 1056d313 by Bas Couwenberg at 2019-09-07T07:18:00Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit 9326c3139fdd481f409d31d7d6507d35ac3cc9bf. - - - - - 13c26cbe by Bas Couwenberg at 2019-09-07T07:18:39Z New upstream version 7.8.0 - - - - - be66e91f by Bas Couwenberg at 2019-09-07T07:22:22Z Update upstream source from tag 'upstream/7.8.0' Update to upstream version '7.8.0' with Debian dir c72a35fc9d0791107d1fbbecf28b0ec79262beb5 - - - - - 35a1b150 by Bas Couwenberg at 2019-09-07T07:28:33Z New upstream release. - - - - - fe05f027 by Bas Couwenberg at 2019-09-07T07:31:12Z Don't define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H, proj.h is used. - - - - - 75a299f4 by Bas Couwenberg at 2019-09-07T07:36:38Z Set distribution to unstable. - - - - - 30 changed files: - AUTHORS - + ChangeLog_7.8.0.gz - Dockerfile - INSTALL - Makefile - README → README.md - REQUIREMENTS.html - Vagrantfile - config.guess - config.sub - configure - configure.in - contributors.csv - db/db.columns/db.columns.html - db/db.connect/db.connect.html - db/db.copy/db.copy.html - db/db.createdb/db.createdb.html - db/db.databases/db.databases.html - db/db.describe/db.describe.html - db/db.describe/testsuite/test_dbdescribe.py - db/db.drivers/db.drivers.html - db/db.dropdb/db.dropdb.html - db/db.execute/db.execute.html - db/db.login/db.login.html - db/db.select/db.select.html - db/db.tables/db.tables.html - db/drivers/dbf/grass-dbf.html - db/drivers/mysql/grass-mesql.html - db/drivers/mysql/grass-mysql.html - db/drivers/odbc/grass-odbc.html The diff was not included because it is too large. 
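Two of the commits listed above, "Switch to Python 3." and "Add python3-six to (build) dependencies.", concern the Python 3 migration of GRASS. For illustration only (this is a generic sketch, not code from the GRASS tree, and the values are hypothetical), the six layer provided by the new dependency is typically used like this:

    # Generic example of the 'six' compatibility helpers.
    import six

    def is_text(value):
        # str on Python 3, str/unicode on Python 2
        return isinstance(value, six.string_types)

    settings = {"GISBASE": "/usr/lib/grass78"}
    for key, value in six.iteritems(settings):
        print(key, value, is_text(value))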
View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/compare/28498784fc4acb75e241ff5e2bc2d4260afd18d2...75a299f404c6ce005a2389c7e1a553fdaa3095b6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/compare/28498784fc4acb75e241ff5e2bc2d4260afd18d2...75a299f404c6ce005a2389c7e1a553fdaa3095b6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 09:03:58 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:03:58 +0000 Subject: [Git][debian-gis-team/grass][pristine-tar] pristine-tar data for grass_7.8.0.orig.tar.gz Message-ID: <5d73646e52d05_577b2ade634ff08012655f7@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / grass Commits: 411b4ed7 by Bas Couwenberg at 2019-09-07T07:22:21Z pristine-tar data for grass_7.8.0.orig.tar.gz - - - - - 2 changed files: - + grass_7.8.0.orig.tar.gz.delta - + grass_7.8.0.orig.tar.gz.id Changes: ===================================== grass_7.8.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/grass_7.8.0.orig.tar.gz.delta differ ===================================== grass_7.8.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +6127d584499bb0c6293cbb5f0efb55c36fafec08 View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/411b4ed79017081f78837e47034e0474cdc80fa1 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/411b4ed79017081f78837e47034e0474cdc80fa1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 09:03:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:03:59 +0000 Subject: [Git][debian-gis-team/grass][upstream] New upstream version 7.8.0 Message-ID: <5d73646f50509_577b2ade634ff080126579b@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / grass Commits: 13c26cbe by Bas Couwenberg at 2019-09-07T07:18:39Z New upstream version 7.8.0 - - - - - 30 changed files: - + ChangeLog_7.8.0.gz - − ChangeLog_7_8_0RC1.gz - REQUIREMENTS.html - Vagrantfile - − doc/.howto_release.md.swp - doc/howto_release.md - docker/Dockerfile_alpine - docker/Dockerfile_alpine_wxgui - gui/wxpython/animation/dialogs.py - gui/wxpython/animation/provider.py - gui/wxpython/animation/utils.py - gui/wxpython/core/workspace.py - gui/wxpython/gcp/manager.py - gui/wxpython/gmodeler/dialogs.py - gui/wxpython/gmodeler/model.py - gui/wxpython/gui_core/preferences.py - gui/wxpython/gui_core/vselect.py - gui/wxpython/gui_core/widgets.py - gui/wxpython/gui_core/wxlibplot.py - gui/wxpython/image2target/ii2t_manager.py - gui/wxpython/mapdisp/main.py - gui/wxpython/photo2image/ip2i_manager.py - gui/wxpython/psmap/dialogs.py - gui/wxpython/psmap/instructions.py - gui/wxpython/vdigit/dialogs.py - include/Make/Docs.make - include/VERSION - lib/gis/gisinit.c - lib/gis/parser_help.c - lib/gis/parser_json.c The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/13c26cbedeb1caa3886051b0cc574bb3de3a0a05 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/13c26cbedeb1caa3886051b0cc574bb3de3a0a05 You're receiving this email because of your account on salsa.debian.org. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 09:04:18 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:04:18 +0000 Subject: [Git][debian-gis-team/grass] Pushed new tag debian/7.8.0-1 Message-ID: <5d7364829d8a5_577b2ade634ff08012659b@godard.mail> Bas Couwenberg pushed new tag debian/7.8.0-1 at Debian GIS Project / grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/tree/debian/7.8.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 09:04:18 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:04:18 +0000 Subject: [Git][debian-gis-team/grass] Pushed new tag upstream/7.8.0 Message-ID: <5d736482bac42_577b2ade63474868126611f@godard.mail> Bas Couwenberg pushed new tag upstream/7.8.0 at Debian GIS Project / grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/tree/upstream/7.8.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 09:10:56 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 08:10:56 +0000 Subject: Processing of grass_7.8.0-1_source.changes Message-ID: grass_7.8.0-1_source.changes uploaded successfully to localhost along with the files: grass_7.8.0-1.dsc grass_7.8.0.orig.tar.gz grass_7.8.0-1.debian.tar.xz grass_7.8.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Sat Sep 7 09:24:01 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:24:01 +0000 Subject: [Git][debian-gis-team/gdal-grass][master] 5 commits: Update packaging for GRASS 7.8.0. Message-ID: <5d7369211c668_577b2ade6340325812698fe@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / gdal-grass Commits: 41615f22 by Bas Couwenberg at 2019-09-07T07:34:51Z Update packaging for GRASS 7.8.0. - - - - - d7bc319f by Bas Couwenberg at 2019-09-07T08:09:46Z Update PIE hardening conditional, trusty is EOL. - - - - - 155552bd by Bas Couwenberg at 2019-09-07T08:10:19Z Bump Standards-Version to 4.4.0, no changes. - - - - - 4716c438 by Bas Couwenberg at 2019-09-07T08:11:36Z Update watch file to check version directories. - - - - - c200aa69 by Bas Couwenberg at 2019-09-07T08:12:05Z Set distribution to unstable. - - - - - 5 changed files: - debian/changelog - debian/control - debian/patches/rpath - debian/rules - debian/watch Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,12 @@ +libgdal-grass (2.4.2-2) unstable; urgency=medium + + * Update packaging for GRASS 7.8.0. + * Update PIE hardening conditional, trusty is EOL. + * Bump Standards-Version to 4.4.0, no changes. + * Update watch file to check version directories. + + -- Bas Couwenberg Sat, 07 Sep 2019 10:11:51 +0200 + libgdal-grass (2.4.2-1) unstable; urgency=medium * New upstream release. 
===================================== debian/control ===================================== @@ -7,13 +7,13 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-autoreconf, d-shlibs, - grass (>= 7.6.1), - grass-dev (>= 7.6.1), + grass (>= 7.8.0), + grass-dev (>= 7.8.0), libgdal-dev (>= 2.4.2), libpq-dev, lsb-release, pkg-config -Standards-Version: 4.3.0 +Standards-Version: 4.4.0 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gdal-grass Vcs-Git: https://salsa.debian.org/debian-gis-team/gdal-grass.git Homepage: http://www.gdal.org/ ===================================== debian/patches/rpath ===================================== @@ -9,11 +9,11 @@ Forwarded: not-needed $(GLIBNAME): grass57dataset.o - $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -+ $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -Wl,-rpath,/usr/lib/grass76/lib ++ $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -Wl,-rpath,/usr/lib/grass78/lib $(OLIBNAME): ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o - $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -+ $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -Wl,-rpath,/usr/lib/grass76/lib ++ $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -Wl,-rpath,/usr/lib/grass78/lib %.o: %.cpp $(CXX) $(CXXFLAGS) $(CPPFLAGS) $(CFLAGS) -c -o $@ $< ===================================== debian/rules ===================================== @@ -12,7 +12,7 @@ VENDOR_DERIVES_FROM_UBUNTU ?= $(shell dpkg-vendor --derives-from Ubuntu && echo DISTRIBUTION_RELEASE := $(shell lsb_release -cs) ifeq ($(VENDOR_DERIVES_FROM_UBUNTU),yes) - ifneq (,$(filter $(DISTRIBUTION_RELEASE),trusty xenial bionic)) + ifneq (,$(filter $(DISTRIBUTION_RELEASE),xenial bionic)) export DEB_BUILD_MAINT_OPTIONS=hardening=+all,-pie endif export DEB_LDFLAGS_MAINT_APPEND=-Wl,--no-as-needed ===================================== debian/watch ===================================== @@ -2,5 +2,5 @@ version=3 opts=\ dversionmangle=s/\+(debian|dfsg|ds|deb)\d*$//,\ uversionmangle=s/(\d)[_\.\-\+]?((RC|rc|pre|dev|beta|alpha)\d*)$/$1~$2/;s/RC/rc/ \ -https://trac.osgeo.org/gdal/wiki/DownloadSource \ +https://download.osgeo.org/gdal/(\d+\.\d+\.\d+)/ \ (?:|.*/)gdal(?:[_\-]v?|)(\d\S*)\.(?:tar\.xz|txz|tar\.bz2|tbz2|tar\.gz|tgz) View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/f6d6709db13a818ab1f4a2349fab17899a38b419...c200aa699c82a1534ef2689f899949d7ea7efa39 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/f6d6709db13a818ab1f4a2349fab17899a38b419...c200aa699c82a1534ef2689f899949d7ea7efa39 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 09:24:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:24:05 +0000 Subject: [Git][debian-gis-team/gdal-grass] Pushed new tag debian/2.4.2-2 Message-ID: <5d7369256c4c4_577b2ade63462f3c12700e6@godard.mail> Bas Couwenberg pushed new tag debian/2.4.2-2 at Debian GIS Project / gdal-grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/tree/debian/2.4.2-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 09:30:57 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 08:30:57 +0000 Subject: Processing of libgdal-grass_2.4.2-2_source.changes Message-ID: libgdal-grass_2.4.2-2_source.changes uploaded successfully to localhost along with the files: libgdal-grass_2.4.2-2.dsc libgdal-grass_2.4.2-2.debian.tar.xz libgdal-grass_2.4.2-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Sat Sep 7 09:38:55 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:38:55 +0000 Subject: [Git][debian-gis-team/gdal-grass][experimental] 2 commits: Update packaging for GRASS 7.8.0. Message-ID: <5d736c9f185d8_577b2ade634821481271066@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / gdal-grass Commits: 2a4b1724 by Bas Couwenberg at 2019-09-07T07:33:52Z Update packaging for GRASS 7.8.0. - - - - - 8d42e1a0 by Bas Couwenberg at 2019-09-07T08:26:26Z Set distribution to experimental. - - - - - 3 changed files: - debian/changelog - debian/control - debian/patches/rpath Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libgdal-grass (3.0.1-1~exp2) experimental; urgency=medium + + * Update packaging for GRASS 7.8.0. + + -- Bas Couwenberg Sat, 07 Sep 2019 10:26:07 +0200 + libgdal-grass (3.0.1-1~exp1) experimental; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -7,8 +7,8 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-autoreconf, d-shlibs, - grass (>= 7.6.1), - grass-dev (>= 7.6.1), + grass (>= 7.8.0), + grass-dev (>= 7.8.0), libgdal-dev (>= 3.0.1), libpq-dev, lsb-release, ===================================== debian/patches/rpath ===================================== @@ -9,11 +9,11 @@ Forwarded: not-needed $(GLIBNAME): grass57dataset.o - $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -+ $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -Wl,-rpath,/usr/lib/grass76/lib ++ $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -Wl,-rpath,/usr/lib/grass78/lib $(OLIBNAME): ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o - $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -+ $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -Wl,-rpath,/usr/lib/grass76/lib ++ $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -Wl,-rpath,/usr/lib/grass78/lib %.o: %.cpp $(CXX) $(CXXFLAGS) $(CPPFLAGS) $(CFLAGS) -c -o $@ $< View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/c80e71b8ee81842d9da58cf47740483efb324039...8d42e1a07ec9413c0046821a25d0b225e92b30fa -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/c80e71b8ee81842d9da58cf47740483efb324039...8d42e1a07ec9413c0046821a25d0b225e92b30fa You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 7 09:39:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 08:39:00 +0000 Subject: [Git][debian-gis-team/gdal-grass] Pushed new tag debian/3.0.1-1_exp2 Message-ID: <5d736ca4dd35_577b2ade6344cd901271278@godard.mail> Bas Couwenberg pushed new tag debian/3.0.1-1_exp2 at Debian GIS Project / gdal-grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/tree/debian/3.0.1-1_exp2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 09:45:58 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 08:45:58 +0000 Subject: Processing of libgdal-grass_3.0.1-1~exp2_source.changes Message-ID: libgdal-grass_3.0.1-1~exp2_source.changes uploaded successfully to localhost along with the files: libgdal-grass_3.0.1-1~exp2.dsc libgdal-grass_3.0.1-1~exp2.debian.tar.xz libgdal-grass_3.0.1-1~exp2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Sat Sep 7 11:11:40 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Sat, 07 Sep 2019 10:11:40 +0000 Subject: [Git][debian-gis-team/grass][ubuntugis/bionic] 12 commits: Recommend both git & subversion for g.extension. Message-ID: <5d73825c60bf5_577b2ade63462f3c128289@godard.mail> Martin Landa pushed to branch ubuntugis/bionic at Debian GIS Project / grass Commits: 24b2c735 by Bas Couwenberg at 2019-08-31T08:28:25Z Recommend both git & subversion for g.extension. - - - - - aed1bbb2 by Bas Couwenberg at 2019-09-04T17:45:29Z Drop unused override for spelling-error-in-binary. - - - - - 28498784 by Bas Couwenberg at 2019-09-04T17:45:40Z Set distribution to unstable. - - - - - c3cfceda by Bas Couwenberg at 2019-09-04T18:52:57Z Drop unused override for spelling-error-in-binary. - - - - - 6f91099d by Bas Couwenberg at 2019-09-07T07:16:26Z Merge branch 'experimental' - - - - - 1056d313 by Bas Couwenberg at 2019-09-07T07:18:00Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit 9326c3139fdd481f409d31d7d6507d35ac3cc9bf. - - - - - 13c26cbe by Bas Couwenberg at 2019-09-07T07:18:39Z New upstream version 7.8.0 - - - - - be66e91f by Bas Couwenberg at 2019-09-07T07:22:22Z Update upstream source from tag 'upstream/7.8.0' Update to upstream version '7.8.0' with Debian dir c72a35fc9d0791107d1fbbecf28b0ec79262beb5 - - - - - 35a1b150 by Bas Couwenberg at 2019-09-07T07:28:33Z New upstream release. - - - - - fe05f027 by Bas Couwenberg at 2019-09-07T07:31:12Z Don't define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H, proj.h is used. - - - - - 75a299f4 by Bas Couwenberg at 2019-09-07T07:36:38Z Set distribution to unstable. 
- - - - - 75d1e3ec by Martin Landa at 2019-09-07T09:44:19Z Rebuild 7.8.0 for bionic - - - - - 30 changed files: - + ChangeLog_7.8.0.gz - − ChangeLog_7_8_0RC1.gz - REQUIREMENTS.html - Vagrantfile - debian/changelog - debian/control - debian/grass-core.lintian-overrides - debian/rules - − doc/.howto_release.md.swp - doc/howto_release.md - docker/Dockerfile_alpine - docker/Dockerfile_alpine_wxgui - gui/wxpython/animation/dialogs.py - gui/wxpython/animation/provider.py - gui/wxpython/animation/utils.py - gui/wxpython/core/workspace.py - gui/wxpython/gcp/manager.py - gui/wxpython/gmodeler/dialogs.py - gui/wxpython/gmodeler/model.py - gui/wxpython/gui_core/preferences.py - gui/wxpython/gui_core/vselect.py - gui/wxpython/gui_core/widgets.py - gui/wxpython/gui_core/wxlibplot.py - gui/wxpython/image2target/ii2t_manager.py - gui/wxpython/mapdisp/main.py - gui/wxpython/photo2image/ip2i_manager.py - gui/wxpython/psmap/dialogs.py - gui/wxpython/psmap/instructions.py - gui/wxpython/vdigit/dialogs.py - include/Make/Docs.make The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/compare/20c5d12119031a2d206c20d2ecdbf41898a879a4...75d1e3ece58cf9bef7bad90e980132ed327cfd5b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/compare/20c5d12119031a2d206c20d2ecdbf41898a879a4...75d1e3ece58cf9bef7bad90e980132ed327cfd5b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 11:11:44 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Sat, 07 Sep 2019 10:11:44 +0000 Subject: [Git][debian-gis-team/grass] Pushed new tag ubuntugis/7.8.0-1.bionic1 Message-ID: <5d73826034a63_577b2ade6344cd9012830e7@godard.mail> Martin Landa pushed new tag ubuntugis/7.8.0-1.bionic1 at Debian GIS Project / grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/tree/ubuntugis/7.8.0-1.bionic1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 11:54:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 10:54:41 +0000 Subject: [Git][debian-gis-team/qgis][master] 3 commits: Update packaging for GRASS 7.8.0. Message-ID: <5d738c716e4bd_577b2ade63462f3c1286237@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / qgis Commits: ad75e62c by Bas Couwenberg at 2019-09-07T07:28:18Z Update packaging for GRASS 7.8.0. - - - - - f00f865d by Bas Couwenberg at 2019-09-07T08:08:12Z Update symbols for other architectures. - - - - - 7da12bde by Bas Couwenberg at 2019-09-07T08:08:27Z Set distribution to unstable. - - - - - 10 changed files: - debian/changelog - debian/control - debian/libqgis-analysis3.4.11.symbols - debian/libqgis-app3.4.11.symbols - debian/libqgis-core3.4.11.symbols - debian/libqgis-gui3.4.11.symbols - debian/libqgis-server3.4.11.symbols - + debian/patches/grass78.patch - debian/patches/series - debian/qgis.sh Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +qgis (3.4.11+dfsg-2) unstable; urgency=medium + + * Update packaging for GRASS 7.8.0. + * Update symbols for other architectures. + + -- Bas Couwenberg Sat, 07 Sep 2019 10:08:14 +0200 + qgis (3.4.11+dfsg-1) unstable; urgency=medium * Add Breaks/Replaces to fix upgrade from 2.18.18. 
===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: bison, dh-python, flex, gdal-bin, - grass-dev (>= 7.6.1), + grass-dev (>= 7.8.0), libexpat1-dev, libfcgi-dev, libgdal-dev (>= 1.11), @@ -244,7 +244,7 @@ Description: QGIS custom widgets for Qt Designer Package: libqgis-dev Architecture: any Section: libdevel -Depends: grass-dev (>= 7.6.0), +Depends: grass-dev (>= 7.8.0), libexpat1-dev, libgdal-dev (>= 1.11), libgeos-dev (>= 3.0.0), @@ -301,7 +301,7 @@ Architecture: any Depends: qgis (= ${binary:Version}), qgis-plugin-grass-common (= ${source:Version}), qgis-provider-grass (= ${binary:Version}), - grass-core (>= 7.6.0), + grass-core (>= 7.8.0), ${grass:Depends}, ${shlibs:Depends}, ${misc:Depends} ===================================== debian/libqgis-analysis3.4.11.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 i386 powerpc ppc64 +# SymbolsHelper-Confirmed: 3.4.11 armel armhf i386 m68k powerpc ppc64 libqgis_analysis.so.3.4.11 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev _ZN10QByteArrayD1Ev at Base 3.4.5 @@ -16,7 +16,7 @@ libqgis_analysis.so.3.4.11 #PACKAGE# #MINVER# _ZN12QgsExceptionD0Ev at Base 3.4.5 _ZN12QgsExceptionD1Ev at Base 3.4.5 _ZN12QgsExceptionD2Ev at Base 3.4.5 - (arch=i386 mips mipsel powerpc)_ZN12QgsRectangle17combineExtentWithERKS_ at Base 3.4.11 + (arch=armel armhf i386 m68k mips mipsel powerpc)_ZN12QgsRectangle17combineExtentWithERKS_ at Base 3.4.11 _ZN12QtConcurrent16ThreadEngineBase14threadFunctionEv at Base 3.4.5 _ZN12QtConcurrent16ThreadEngineBase17shouldStartThreadEv at Base 3.4.5 _ZN12QtConcurrent16ThreadEngineBase20shouldThrottleThreadEv at Base 3.4.5 ===================================== debian/libqgis-app3.4.11.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 i386 powerpc ppc64 ppc64el +# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 armel i386 powerpc ppc64 ppc64el libqgis_app.so.3.4.11 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev _ZN10QgsOptions10addSVGPathEv at Base 3.4.5 @@ -4140,10 +4140,6 @@ libqgis_app.so.3.4.11 #PACKAGE# #MINVER# (optional=templinst)_ZNKSt5ctypeIcE8do_widenEc at Base 2.18.17 (optional=templinst)_ZNSt10unique_ptrI20QgsExpressionContextSt14default_deleteIS0_EED1Ev at Base 3.4.5 (optional=templinst)_ZNSt10unique_ptrI20QgsExpressionContextSt14default_deleteIS0_EED2Ev at Base 3.4.5 - (optional=templinst|arch=armel)_ZNSt12__shared_ptrI21QgsGeometryCheckErrorLN9__gnu_cxx12_Lock_policyE1EEC1ERKS3_ at Base 3.4.5 - (optional=templinst|arch=armel)_ZNSt12__shared_ptrI21QgsGeometryCheckErrorLN9__gnu_cxx12_Lock_policyE1EEC2ERKS3_ at Base 3.4.5 - (optional=templinst|arch=armel)_ZNSt12__shared_ptrI27QgsSingleGeometryCheckErrorLN9__gnu_cxx12_Lock_policyE1EEC1ERKS3_ at Base 3.4.5 - (optional=templinst|arch=armel)_ZNSt12__shared_ptrI27QgsSingleGeometryCheckErrorLN9__gnu_cxx12_Lock_policyE1EEC2ERKS3_ at Base 3.4.5 (optional=templinst)_ZNSt14_Function_base13_Base_managerIPFP17QgsFormAnnotationvEE10_M_managerERSt9_Any_dataRKS6_St18_Manager_operation at Base 3.4.5 (optional=templinst)_ZNSt14_Function_base13_Base_managerIPFP18QgsLayoutItem3DMapP9QgsLayoutEE10_M_managerERSt9_Any_dataRKS8_St18_Manager_operation at Base 3.4.5 (optional=templinst|arch=armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE1EEC1I23QgsGeometryCheckContextSt14default_deleteIS4_EEEOSt10unique_ptrIT_T0_E at Base 3.4.5 ===================================== debian/libqgis-core3.4.11.symbols 
===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 amd64 i386 powerpc ppc64 ppc64el +# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 armel armhf i386 m68k mips64el mipsel powerpc ppc64 ppc64el libqgis_core.so.3.4.11 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev GEOPROJ4 at Base 2.0.1 @@ -2593,9 +2593,9 @@ libqgis_core.so.3.4.11 #PACKAGE# #MINVER# _ZN15QgsPointLocator23onAttributeValueChangedExiRK8QVariant at Base 3.4.5 _ZN15QgsPointLocator4initEi at Base 2.8.0 (arch=!amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x)_ZN15QgsPointLocator5MatchC1ENS_4TypeEP14QgsVectorLayerxdRK10QgsPointXYiPS4_ at Base 3.4.5 - (arch=!arm64 !mips64el)_ZN15QgsPointLocator5MatchC1EOS0_ at Base 3.4.6 + (arch=!arm64 !m68k !mips64el !mipsel)_ZN15QgsPointLocator5MatchC1EOS0_ at Base 3.4.6 (arch=!amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x)_ZN15QgsPointLocator5MatchC2ENS_4TypeEP14QgsVectorLayerxdRK10QgsPointXYiPS4_ at Base 3.4.5 - (arch=!arm64 !mips64el)_ZN15QgsPointLocator5MatchC2EOS0_ at Base 3.4.6 + (arch=!arm64 !m68k !mips64el !mipsel)_ZN15QgsPointLocator5MatchC2EOS0_ at Base 3.4.6 _ZN15QgsPointLocator9setExtentEPK12QgsRectangle at Base 2.14.0 _ZN15QgsPointLocatorC1EP14QgsVectorLayerRK28QgsCoordinateReferenceSystemRK29QgsCoordinateTransformContextPK12QgsRectangle at Base 3.4.5 _ZN15QgsPointLocatorC2EP14QgsVectorLayerRK28QgsCoordinateReferenceSystemRK29QgsCoordinateTransformContextPK12QgsRectangle at Base 3.4.5 @@ -3672,7 +3672,7 @@ libqgis_core.so.3.4.11 #PACKAGE# #MINVER# _ZN17QgsImageOperation23ShadeFromArrayOperationclERjii at Base 2.8.0 _ZN17QgsImageOperation23nonTransparentImageRectERK6QImage5QSizeb at Base 2.14.0 _ZN17QgsImageOperation24adjustBrightnessContrastER6QImageid at Base 2.8.0 - (optional=templinst|arch=!alpha !arm64 !armel !armhf !hppa !mips64el !powerpc !powerpcspe !ppc64 !ppc64el !s390x)_ZN17QgsImageOperation26runBlockOperationInThreadsINS_21GaussianBlurOperationEEEvR6QImageRT_NS_22LineOperationDirectionE at Base 2.8.1 + (optional=templinst|arch=!alpha !arm64 !armel !armhf !hppa !powerpc !powerpcspe !ppc64 !ppc64el !s390x)_ZN17QgsImageOperation26runBlockOperationInThreadsINS_21GaussianBlurOperationEEEvR6QImageRT_NS_22LineOperationDirectionE at Base 2.8.1 (optional=templinst)_ZN17QgsImageOperation26runBlockOperationInThreadsINS_30ProcessBlockUsingLineOperationINS_17FlipLineOperationEEEEEvR6QImageRT_NS_22LineOperationDirectionE at Base 2.8.0 (optional=templinst)_ZN17QgsImageOperation26runBlockOperationInThreadsINS_30ProcessBlockUsingLineOperationINS_22StackBlurLineOperationEEEEEvR6QImageRT_NS_22LineOperationDirectionE at Base 2.8.0 _ZN17QgsImageOperation27HueSaturationPixelOperationclERjii at Base 2.8.0 @@ -9547,8 +9547,8 @@ libqgis_core.so.3.4.11 #PACKAGE# #MINVER# _ZN6QgsGmlD1Ev at Base 2.0.1 _ZN6QgsGmlD2Ev at Base 2.0.1 (optional=templinst)_ZN7QObject7connectIM14QgsVectorLayerFvvEM17QgsLayerTreeModelFvvEEEN11QMetaObject10ConnectionEPKN9QtPrivate15FunctionPointerIT_E6ObjectESB_PKNSA_IT0_E6ObjectESG_N2Qt14ConnectionTypeE at Base 3.4.5 - (optional=templinst|arch=amd64)_ZN7QObject7connectIM16QgsLayerTreeNodeFvPS1_iiEM17QgsLayerTreeModelFvS2_iiEEEN11QMetaObject10ConnectionEPKN9QtPrivate15FunctionPointerIT_E6ObjectESC_PKNSB_IT0_E6ObjectESH_N2Qt14ConnectionTypeE at Base 3.4.5 - (optional=templinst|arch=!i386 !powerpc)_ZN7QObject7connectIM17QgsLayerTreeLayerFvvEM17QgsLayerTreeModelFvvEEEN11QMetaObject10ConnectionEPKN9QtPrivate15FunctionPointerIT_E6ObjectESB_PKNSA_IT0_E6ObjectESG_N2Qt14ConnectionTypeE at Base 3.4.5 + (optional=templinst|arch=amd64 
arm64)_ZN7QObject7connectIM16QgsLayerTreeNodeFvPS1_iiEM17QgsLayerTreeModelFvS2_iiEEEN11QMetaObject10ConnectionEPKN9QtPrivate15FunctionPointerIT_E6ObjectESC_PKNSB_IT0_E6ObjectESH_N2Qt14ConnectionTypeE at Base 3.4.5 + (optional=templinst|arch=!armel !armhf !i386 !m68k !mipsel !powerpc)_ZN7QObject7connectIM17QgsLayerTreeLayerFvvEM17QgsLayerTreeModelFvvEEEN11QMetaObject10ConnectionEPKN9QtPrivate15FunctionPointerIT_E6ObjectESB_PKNSA_IT0_E6ObjectESG_N2Qt14ConnectionTypeE at Base 3.4.5 _ZN7QString10fromLatin1ERK10QByteArray at Base 3.4.5 _ZN7QString7prependEPKc at Base 3.4.5 _ZN7QString8fromUtf8EPKci at Base 3.4.5 ===================================== debian/libqgis-gui3.4.11.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 i386 powerpc ppc64 ppc64el +# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 armel armhf i386 m68k mipsel powerpc ppc64 ppc64el libqgis_gui.so.3.4.11 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev _Z17createDatabaseURIRK7QStringS1_S1_S_S1_S_S_b at Base 3.4.5 @@ -174,7 +174,7 @@ libqgis_gui.so.3.4.11 #PACKAGE# #MINVER# _ZN11QgsPropertyD1Ev at Base 3.4.5 _ZN11QgsPropertyD2Ev at Base 3.4.5 (optional=templinst)_ZN11QgsSettings9enumValueIN12QgsUnitTypes10LayoutUnitEEET_RK7QStringRKS3_NS_7SectionE at Base 3.4.11 - (optional=templinst|arch=powerpc)_ZN11QgsSettings9enumValueIN18QgsMapToolIdentify12IdentifyModeEEET_RK7QStringRKS3_NS_7SectionE at Base 3.4.11 + (optional=templinst|arch=mipsel powerpc)_ZN11QgsSettings9enumValueIN18QgsMapToolIdentify12IdentifyModeEEET_RK7QStringRKS3_NS_7SectionE at Base 3.4.11 (optional=templinst)_ZN11QgsSettings9enumValueIN6QgsGui18ProjectCrsBehaviorEEET_RK7QStringRKS3_NS_7SectionE at Base 3.4.11 _ZN11QgsWkbTypes11isMultiTypeENS_4TypeE at Base 3.4.5 _ZN11QgsWkbTypes12geometryTypeENS_4TypeE at Base 3.4.5 @@ -1206,7 +1206,7 @@ libqgis_gui.so.3.4.11 #PACKAGE# #MINVER# _ZN15QgsVertexMarkerD0Ev at Base 2.18.17 _ZN15QgsVertexMarkerD1Ev at Base 2.18.17 _ZN15QgsVertexMarkerD2Ev at Base 2.18.17 - _ZN16QTableWidgetItem10setToolTipERK7QString at Base 3.4.5 + (arch=!mipsel)_ZN16QTableWidgetItem10setToolTipERK7QString at Base 3.4.11 _ZN16QTableWidgetItem7setTextERK7QString at Base 3.4.5 _ZN16QgsAttributeForm10WidgetInfoD1Ev at Base 3.4.5 _ZN16QgsAttributeForm10WidgetInfoD2Ev at Base 3.4.5 @@ -8789,7 +8789,7 @@ libqgis_gui.so.3.4.11 #PACKAGE# #MINVER# _ZNK14QgsVScrollArea10metaObjectEv at Base 3.4.5 _ZNK15CharacterWidget10metaObjectEv at Base 2.0.1 _ZNK15CharacterWidget8sizeHintEv at Base 2.0.1 - (arch=i386 powerpc)_ZNK15QTreeWidgetItem10foregroundEi at Base 3.4.11 + (arch=armel armhf i386 m68k mipsel powerpc)_ZNK15QTreeWidgetItem10foregroundEi at Base 3.4.11 _ZNK15QgsAuthCertInfo10metaObjectEv at Base 2.14.0 _ZNK15QgsDateTimeEdit10metaObjectEv at Base 2.6.0 _ZNK15QgsDateTimeEdit8dateTimeEv at Base 2.6.0 @@ -8807,7 +8807,7 @@ libqgis_gui.so.3.4.11 #PACKAGE# #MINVER# _ZNK15QgsSymbolButton8sizeHintEv at Base 3.4.5 _ZNK15QgsSymbolButton9mapCanvasEv at Base 3.4.5 _ZNK15QgsVertexMarker12boundingRectEv at Base 2.0.1 - (arch=i386 powerpc)_ZNK16QTableWidgetItem4textEv at Base 3.4.11 + (arch=armel armhf i386 m68k mipsel powerpc)_ZNK16QTableWidgetItem4textEv at Base 3.4.11 _ZNK16QgsAttributeForm10metaObjectEv at Base 2.4.0 _ZNK16QgsAttributeForm15aggregateFilterEv at Base 3.4.5 _ZNK16QgsAttributeForm15fieldIsEditableERK14QgsVectorLayerix at Base 3.4.5 @@ -10046,7 +10046,7 @@ libqgis_gui.so.3.4.11 #PACKAGE# #MINVER# _ZNK44QgsProcessingParameterWidgetFactoryInterface29modelerExpressionFormatStringEv at Base 3.4.5 
(optional=templinst|arch=powerpcspe)_ZNK5QListIP7QObjectE7indexOfERKS1_i at Base 2.14.15 _ZNK6QgsGui10metaObjectEv at Base 3.4.9 - (arch=!i386 !powerpc !ppc64 !s390x)_ZNK7QPointF7toPointEv at Base 3.4.6 + (arch=!armel !armhf !i386 !mipsel !powerpc !ppc64 !s390x)_ZNK7QPointF7toPointEv at Base 3.4.6 _ZNK7QgsDial10metaObjectEv at Base 2.4.0 _ZNK7QgsDial12variantValueEv at Base 2.8.3 _ZNK9QgsDialog10metaObjectEv at Base 2.0.1 ===================================== debian/libqgis-server3.4.11.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 amd64 i386 powerpc ppc64 +# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 i386 m68k mips64el powerpc ppc64 ppc64el libqgis_server.so.3.4.11 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev _ZN10QByteArrayD1Ev at Base 3.4.5 @@ -455,11 +455,11 @@ libqgis_server.so.3.4.11 #PACKAGE# #MINVER# (optional=templinst|arch=armel)_ZNSt16_Sp_counted_baseILN9__gnu_cxx12_Lock_policyE1EE10_M_releaseEv at Base 3.4.5 (optional=templinst|arch=!armel)_ZNSt16_Sp_counted_baseILN9__gnu_cxx12_Lock_policyE2EE10_M_releaseEv at Base 3.4.5 (optional=templinst)_ZNSt8_Rb_treeI7QStringSt4pairIKS0_S0_ESt10_Select1stIS3_ESt4lessIS0_ESaIS3_EE24_M_get_insert_unique_posERS2_ at Base 3.4.5 - (optional=templinst|arch=amd64)_ZNSt8_Rb_treeI7QStringSt4pairIKS0_S0_ESt10_Select1stIS3_ESt4lessIS0_ESaIS3_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS3_ERS2_ at Base 3.4.6 + (optional=templinst|arch=amd64 arm64 m68k mips64el ppc64el)_ZNSt8_Rb_treeI7QStringSt4pairIKS0_S0_ESt10_Select1stIS3_ESt4lessIS0_ESaIS3_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS3_ERS2_ at Base 3.4.6 (optional=templinst)_ZNSt8_Rb_treeI7QStringSt4pairIKS0_S0_ESt10_Select1stIS3_ESt4lessIS0_ESaIS3_EE8_M_eraseEPSt13_Rb_tree_nodeIS3_E at Base 3.4.5 (optional=templinst)_ZNSt8_Rb_treeIN18QgsServerParameter4NameESt4pairIKS1_S0_ESt10_Select1stIS4_ESt4lessIS1_ESaIS4_EE22_M_emplace_hint_uniqueIJS2_IS1_S0_EEEESt17_Rb_tree_iteratorIS4_ESt23_Rb_tree_const_iteratorIS4_EDpOT_ at Base 3.4.5 (optional=templinst)_ZNSt8_Rb_treeIN18QgsServerParameter4NameESt4pairIKS1_S0_ESt10_Select1stIS4_ESt4lessIS1_ESaIS4_EE24_M_get_insert_unique_posERS3_ at Base 3.4.5 - (optional=templinst|arch=amd64)_ZNSt8_Rb_treeIN18QgsServerParameter4NameESt4pairIKS1_S0_ESt10_Select1stIS4_ESt4lessIS1_ESaIS4_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS4_ERS3_ at Base 3.4.6 + (optional=templinst|arch=amd64 arm64 m68k mips64el ppc64el)_ZNSt8_Rb_treeIN18QgsServerParameter4NameESt4pairIKS1_S0_ESt10_Select1stIS4_ESt4lessIS1_ESaIS4_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS4_ERS3_ at Base 3.4.6 (optional=templinst)_ZNSt8_Rb_treeIN18QgsServerParameter4NameESt4pairIKS1_S0_ESt10_Select1stIS4_ESt4lessIS1_ESaIS4_EE8_M_eraseEPSt13_Rb_tree_nodeIS4_E at Base 3.4.5 _ZTI10QgsService at Base 3.4.5 _ZTI12QgsException at Base 3.4.5 ===================================== debian/patches/grass78.patch ===================================== @@ -0,0 +1,62 @@ +Description: Add support for GRASS 7.8. 
+Author: Bas Couwenberg + +--- a/cmake/FindGRASS.cmake ++++ b/cmake/FindGRASS.cmake +@@ -160,14 +160,14 @@ ENDIF (WIN32) + IF (UNIX) + IF (GRASS_FIND_VERSION EQUAL 7) + IF (CMAKE_SYSTEM_NAME STREQUAL "FreeBSD") +- FOREACH (VERSION_MINOR 0 1 2 3 4 5 6) +- FOREACH (VERSION_BUILD 0 1 2 3 4 5 6) ++ FOREACH (VERSION_MINOR 0 1 2 3 4 5 6 8) ++ FOREACH (VERSION_BUILD 0 1 2 3 4 5 6 8) + LIST (APPEND GRASS_PATHS /usr/local/grass-${GRASS_FIND_VERSION}.${VERSION_MINOR}.${VERSION_BUILD}) + ENDFOREACH (VERSION_BUILD) + ENDFOREACH(VERSION_MINOR) + ELSE (CMAKE_SYSTEM_NAME STREQUAL "FreeBSD") + FOREACH (PATH /usr/lib64 /usr/lib) +- FOREACH (VERSION grass76, grass74, grass72, grass70) ++ FOREACH (VERSION grass78, grass76, grass74, grass72, grass70) + LIST(APPEND GRASS_PATHS "${PATH}/${VERSION}") + ENDFOREACH (VERSION) + ENDFOREACH (PATH) +@@ -178,6 +178,7 @@ ENDIF (UNIX) + IF (APPLE) + IF (GRASS_FIND_VERSION EQUAL 7) + LIST(APPEND GRASS_PATHS ++ /Applications/GRASS-7.8.app/Contents/MacOS + /Applications/GRASS-7.6.app/Contents/MacOS + /Applications/GRASS-7.4.app/Contents/MacOS + /Applications/GRASS-7.2.app/Contents/MacOS +--- a/python/plugins/processing/algs/grass7/Grass7Utils.py ++++ b/python/plugins/processing/algs/grass7/Grass7Utils.py +@@ -165,8 +165,8 @@ class Grass7Utils: + ] + else: + cmdList = [ +- "grass76", "grass74", "grass72", "grass70", "grass", +- "grass76.sh", "grass74.sh", "grass72.sh", "grass70.sh", "grass.sh" ++ "grass78", "grass76", "grass74", "grass72", "grass70", "grass", ++ "grass78.sh", "grass76.sh", "grass74.sh", "grass72.sh", "grass70.sh", "grass.sh" + ] + + # For MS-Windows there is a difference between GRASS Path and GRASS binary +@@ -230,7 +230,7 @@ class Grass7Utils: + elif isMac(): + # For MacOSX, we scan some well-known directories + # Start with QGIS bundle +- for version in ['', '7', '76', '74', '72', '70']: ++ for version in ['', '7', '78', '76', '74', '72', '70']: + testfolder = os.path.join(str(QgsApplication.prefixPath()), + 'grass{}'.format(version)) + if os.path.isdir(testfolder): +@@ -565,7 +565,7 @@ class Grass7Utils: + return 'https://grass.osgeo.org/grass{}/manuals/'.format(version) + else: + # GRASS not available! +- return 'https://grass.osgeo.org/grass76/manuals/' ++ return 'https://grass.osgeo.org/grass78/manuals/' + + @staticmethod + def getSupportedOutputRasterExtensions(): ===================================== debian/patches/series ===================================== @@ -1,3 +1,4 @@ python-env.patch developersmap-use-debian-package.patch exclude-elvensword-resources.patch +grass78.patch ===================================== debian/qgis.sh ===================================== @@ -2,9 +2,9 @@ if dpkg -s qgis-plugin-grass >/dev/null 2>&1; then if [ "$LD_LIBRARY_PATH" = "" ]; then - LD_LIBRARY_PATH=/usr/lib/grass76/lib + LD_LIBRARY_PATH=/usr/lib/grass78/lib else - LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/grass76/lib + LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/grass78/lib fi fi View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/cd8c6be397c1bbd6fd61804c8f75e5ade460439f...7da12bde8ec22bf8a8ffaba54b5d72ec5267313b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/cd8c6be397c1bbd6fd61804c8f75e5ade460439f...7da12bde8ec22bf8a8ffaba54b5d72ec5267313b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 7 11:54:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 10:54:44 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag debian/3.4.11+dfsg-2 Message-ID: <5d738c7415bdc_577b2ade6175ad68128647f@godard.mail> Bas Couwenberg pushed new tag debian/3.4.11+dfsg-2 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/debian/3.4.11+dfsg-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 12:01:08 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 11:01:08 +0000 Subject: Processing of qgis_3.4.11+dfsg-2_source.changes Message-ID: qgis_3.4.11+dfsg-2_source.changes uploaded successfully to localhost along with the files: qgis_3.4.11+dfsg-2.dsc qgis_3.4.11+dfsg-2.debian.tar.xz qgis_3.4.11+dfsg-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 7 13:36:05 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 12:36:05 +0000 Subject: gmt_6.0.0~rc4+dfsg-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 08:35:33 +0200 Source: gmt Architecture: source Version: 6.0.0~rc4+dfsg-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: gmt (6.0.0~rc4+dfsg-1~exp1) experimental; urgency=medium . * New upstream release candidate. * Refresh patches. * Drop python-sphinx from build dependencies. * Update symbols for 6.0.0~rc4. 
Checksums-Sha1: 7daecd81d1825367ad9312056fbc77602cf49aab 2610 gmt_6.0.0~rc4+dfsg-1~exp1.dsc 5805c74af2417ace420e2cbcd644177f66e50e78 33125236 gmt_6.0.0~rc4+dfsg.orig.tar.xz 7f13685985a011d9f5983f49b8f529e8a3561565 33784 gmt_6.0.0~rc4+dfsg-1~exp1.debian.tar.xz da6286fb946b616edb740f7a50d8ddae2999d9a0 23289 gmt_6.0.0~rc4+dfsg-1~exp1_amd64.buildinfo Checksums-Sha256: 4fa604e4a7d898f437bb4f46277d8a33542e8ecf3605ae14646b8dc989561dfd 2610 gmt_6.0.0~rc4+dfsg-1~exp1.dsc 929a2a506e0d0cba12700246c3b3aaf0d70f6d14fb62f042627ae199ffc2f130 33125236 gmt_6.0.0~rc4+dfsg.orig.tar.xz f116b8bd9bf654b8c4ab87c0464ef099729fd9ecb14f4d5a061efadf5f898e56 33784 gmt_6.0.0~rc4+dfsg-1~exp1.debian.tar.xz 2e4f4be8065219d785275bc9db6c8e17f818a4ee52fa8d1834f0cd355df6fde8 23289 gmt_6.0.0~rc4+dfsg-1~exp1_amd64.buildinfo Files: 7cbadfc28b934bbbf29113859234e21f 2610 science optional gmt_6.0.0~rc4+dfsg-1~exp1.dsc b39047255afb945745de44c7ad63af35 33125236 science optional gmt_6.0.0~rc4+dfsg.orig.tar.xz cada2f75564d25365aee708b2e878c28 33784 science optional gmt_6.0.0~rc4+dfsg-1~exp1.debian.tar.xz b09fae48bc274fdc3d6a14f11266750e 23289 science optional gmt_6.0.0~rc4+dfsg-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zWA4ACgkQZ1DxCuiN SvEoihAAsWnmP1rcU6btfMcO5MCKsOCn6K1EBw4/HdaMEbdZHrNpTPc5xUEbv3tk uXL/but5fAFGKfInAZDP4AyiB83ipFGYK5fdd3yRizmq6dpGoJvAyGTY7m3J9MCv aBetfEZ2eaLmPnURHxF+eVrIUPIHduFTKuTNWPcoKEmXsAjq5goS0Hn9GGT4FeuK YWKXD0hB8AlojonWWpVUxjU2UD5Eu9Wf7Oc46VCD40tkiylr25NK7haGFI9ATrPt IoV/4hx1lIeL9I/0tkVVyMqnF+gkboBPE/GEMiHoxq8hBOvfEt17qTzDRl/KrSz+ rD3eiIexSxfczxQD37Kvv9tM5hnvBW3jaXQVA3KsUQ6MUn66KQ0+bQGZX6rudDOt /t81rwbiX5tMby54nnVGzf1jMzvsWjOHDV/a6fvfqJNJpaMwa4ztpB4tsPWRP7H/ xTusl0I8qsxgoAZnnSUCShob9ukJrLtNnrq3CU2X8pOf9DZAL0o/DHgl8eCE7xwR i7Gn8rI7c8QUkGWF+OzDk+Tn2I5K/zLWVAi++P+7Gr9lOuK37S5aLb4PWOHvrHTe hRqgpVxc0+jSfbinIZjSd3YdLwqWuQmW3ltpSOwsgSVPGxZ4x7BJt29GUDDY+Pk4 BoWIjjuLiubg8BboX2fm+rIm5usjX648lkBP1er/d1/UxaV+9R0= =jNkX -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sat Sep 7 13:37:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 12:37:03 +0000 Subject: grass_7.8.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 09:31:16 +0200 Source: grass Architecture: source Version: 7.8.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: grass (7.8.0-1) unstable; urgency=medium . * New upstream release. * Recommend both git & subversion for g.extension. * Drop unused override for spelling-error-in-binary. * Don't define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H, proj.h is used. * Move from experimental to unstable. 
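The Checksums-Sha1 and Checksums-Sha256 fields in these ACCEPTED mails are plain SHA-1 and SHA-256 digests of the uploaded files. A short hashlib sketch for re-checking a downloaded file against them (the file name in the comment is only an example taken from these uploads):

    # Recompute a digest and compare it with the .changes entry.
    import hashlib

    def file_digest(path, algo="sha256", chunk_size=1 << 20):
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk_size), b""):
                h.update(block)
        return h.hexdigest()

    # e.g. file_digest("grass_7.8.0.orig.tar.gz", "sha1") should equal the
    # Checksums-Sha1 entry listed for that file.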
Checksums-Sha1: fd13c342671f97deeca20b56e4ea7727105cd36c 2809 grass_7.8.0-1.dsc c6046668718f5c2515c8cba6252896a5b9b01703 61549835 grass_7.8.0.orig.tar.gz 44eb314dca6feb2b16e38a9ca7d6fb2b5461fdf1 33448 grass_7.8.0-1.debian.tar.xz 2f5a6072a972630050641b3a957a32ede5d59415 21833 grass_7.8.0-1_amd64.buildinfo Checksums-Sha256: fa24141f219dd8d2462f22dfe5838ab7c8afc0ecd15bff2e3c177a301cb5658c 2809 grass_7.8.0-1.dsc 4b1192294e959ffd962282344e4ff325c4472f73abe605e246a1da3beda7ccfa 61549835 grass_7.8.0.orig.tar.gz 0974e84dfe11c2336fc0bd62cce1f305a67b6cda9fbb73eeac83213551d56306 33448 grass_7.8.0-1.debian.tar.xz fae7f10700f2ea6d31639382eca20c47b9f670bebedeb344c87a52269541dea4 21833 grass_7.8.0-1_amd64.buildinfo Files: 0b3d42edaa2aa7374dc387610651e4a9 2809 science optional grass_7.8.0-1.dsc e9b9d3bbbfe3ef9055ea1398b2222de4 61549835 science optional grass_7.8.0.orig.tar.gz 1f3aaefec5fea8464ea2e07ee0784fa4 33448 science optional grass_7.8.0-1.debian.tar.xz 6a81b5717b78c680ea32f2654feaecbc 21833 science optional grass_7.8.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zZDMACgkQZ1DxCuiN SvGbXQ//WNSmYzel7n07ISRgWIA7ddUyobUp9D5EmCZIFIl8uSubtF+X2AIbf7Pe eqni1l/A7Kxoc29h3B8M6owxfiaMZG2K8B3yC/UucMxZRTNeYEOBfA95v4TfS+4i xsxVt6bi4AmLzFHyQBRHtj5PSp59fzQl6YnyCnDXQLgA8p7FBSOcRYXjbflfuOfu fE6u6gN65SRjHKVw/WYQs9EFXi9brWLzaPkV3cIcnz3MN2sQ0Fh7AO2gggxNNocs cLB0RUuqNNn3QALAaTSXvarDNpcJqx9pDK4R8b9UFkwT+AOt8KHFLjDJXrI1exIK 4hS0l0TYKsSeJOaW983WrT/stEQT2C1Oav/xTquHJMG1ty2pK8ci30zoGMiBiCkO 3VKuBUrrGVtryxi1qRzWASHU9ZQrK2l59GJ5p+19sneDVpQp6NFMg3Ezo8mx2FO/ aorHIaLeBHq13ymUNCf1eO0aRWlJ0euPFrU1l5wi0vLNTwzOmF1c6OVd/HOd6VpC 0G3AJTG16MgzAOzyNbhUI+SUPcPBiX95acH+9y5S97l21vWpGz/hiuM5aS49KVK4 d3/ZLd6e3P+mgD/l3G+Gp7UuznFV/pW7TOFfuk4PzOclvCcbg88M+DGx6/QGZYN+ oAdTNBDKvhpeA5FwP8XiSQeELhtORqihN5G6hYTHwDQGnB31PnY= =5ph0 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sat Sep 7 13:37:56 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 12:37:56 +0000 Subject: libgdal-grass_2.4.2-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 10:11:51 +0200 Source: libgdal-grass Architecture: source Version: 2.4.2-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: libgdal-grass (2.4.2-2) unstable; urgency=medium . * Update packaging for GRASS 7.8.0. * Update PIE hardening conditional, trusty is EOL. * Bump Standards-Version to 4.4.0, no changes. * Update watch file to check version directories. 
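The "Update watch file to check version directories" entry above goes together with the uversionmangle rule shown in the libgdal-grass debian/watch diff earlier, which rewrites upstream pre-release suffixes into Debian tilde versions (for example it turns a "6.0.0RC4" style suffix into "6.0.0~rc4"). A Python rendering of that substitution, for illustration only; uscan applies the Perl expressions directly:

    import re

    def uversionmangle(version):
        # s/(\d)[_\.\-\+]?((RC|rc|pre|dev|beta|alpha)\d*)$/$1~$2/;s/RC/rc/
        version = re.sub(r'(\d)[_\.\-\+]?((RC|rc|pre|dev|beta|alpha)\d*)$',
                         r'\1~\2', version)
        return re.sub(r'RC', 'rc', version, count=1)

    assert uversionmangle("6.0.0RC4") == "6.0.0~rc4"
    assert uversionmangle("7.8.0_RC1") == "7.8.0~rc1"
    assert uversionmangle("2.4.2") == "2.4.2"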
Checksums-Sha1: aa8ba3b1d00cf164d4136c9061ee6e94916d4ed0 2112 libgdal-grass_2.4.2-2.dsc c29324d1aa8582209428d98549ec64e5747f08f4 8232 libgdal-grass_2.4.2-2.debian.tar.xz c6ef433548a0de3d8ab42cb8d623cfeb30940c05 15718 libgdal-grass_2.4.2-2_amd64.buildinfo Checksums-Sha256: 08435df08b667abc9834e22f5472da13390db231b6efa1ac49fcb0d1c9dbbb35 2112 libgdal-grass_2.4.2-2.dsc 71b54fa39c5d547a3d21fd4ea30bd80cccc6a8f86ab39b9f1f5882d17f48919e 8232 libgdal-grass_2.4.2-2.debian.tar.xz 2786d5634f1f45d4907b532ac5ad60768b1d11dd217177d47e628121b93da419 15718 libgdal-grass_2.4.2-2_amd64.buildinfo Files: a05952dba785a2f546f94dec39a58a03 2112 science optional libgdal-grass_2.4.2-2.dsc 652d563409cf52c62fd05be16e647801 8232 science optional libgdal-grass_2.4.2-2.debian.tar.xz 4bff1c9d05c651feab3d863384394751 15718 science optional libgdal-grass_2.4.2-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zaQIACgkQZ1DxCuiN SvGfhQ//abJQamnQnmkZYrKgC/xdy+Jx8ph3KzJVpAETsEbye5YOYDB4rgB5JKSG VKQBQY3DfJZntQv/lyGSBtJhJT3pAe0oVwO2wpFj1npeCgy7wsaqJzLqb8PfQELx +GWfHBfcMJ4XprDbDXe2gkG1ckabf26iZOeNbyfIqUG/VY2+Zycia6dFJ+HeXeIA uVp0NE6PGe5Y7V3jlODlS/Ky7JiMI+Dn98fd7n1i7nMAjjl0LmFpuWiMrwoIYjSS UvhnnjqCMSex9HxNnNCOSAXDii2uPoQLE92+PuKMARei4Stkh4fzTFpFE08rgITb t1smdFZGXvaBejdo6JHEJn51y2RqwXG58zs9WahGXmhXQRK1pEY4gSf/BSjv+f1O T3WCi5uyyyn2gvi9JH5lIdMsQ9I6lxiX36lhk43Me19BV555ofmhOjToiLuncS0V J0YscPYF6aH+sFr1mpI+S1XGAyTGVzmv1le2ZLKxTHZe6teaKTpczPticPtPsVLn +50Q2J5Jv0bOdn1uYII244JClFRbGNpLbHxPqqDLhN0VDb0LnJbOqxVUY1/o9MOY tbiJy4ImJWpKIp364PxGnxxXawXyDiX+dpy7kc/FzN2p65cFQz2L4Ivy7PNn8DXV zy+u0F2QK52Ysnd904r1FOhqGDgYdKs+QYcQAbCAjnK0FYFKLek= =aDGs -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sat Sep 7 13:38:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 12:38:03 +0000 Subject: libgdal-grass_3.0.1-1~exp2_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 10:26:07 +0200 Source: libgdal-grass Architecture: source Version: 3.0.1-1~exp2 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: libgdal-grass (3.0.1-1~exp2) experimental; urgency=medium . * Update packaging for GRASS 7.8.0. 
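The "Update packaging for GRASS 7.8.0" item largely means bumping the GRASS build dependencies and pointing the GDAL plugins at the grass78 library directory (cf. the rpath patch shown in the gdal-grass commits later in this digest). As a small sanity-check sketch, assuming the rebuilt libgdal-grass package is installed, the embedded run path can be inspected directly:

    readelf -d /usr/lib/gdalplugins/gdal_GRASS.so | grep -Ei 'rpath|runpath'
    readelf -d /usr/lib/gdalplugins/ogr_GRASS.so  | grep -Ei 'rpath|runpath'
    # both should now report /usr/lib/grass78/lib rather than /usr/lib/grass76/lib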
Checksums-Sha1: 0921458a36744a24f5863c68d1a9c5ab16a77fda 2148 libgdal-grass_3.0.1-1~exp2.dsc 1e5dc94bb75811bcd67cdac7e09cb757870a1154 8236 libgdal-grass_3.0.1-1~exp2.debian.tar.xz d26df7ff28041bf21d70ce7eba26083326004f4e 15865 libgdal-grass_3.0.1-1~exp2_amd64.buildinfo Checksums-Sha256: 721d32aa4ef623f6a278c3628ea8452bf584fb06d48f10a954c278156d1f6dc1 2148 libgdal-grass_3.0.1-1~exp2.dsc d50e5a4be28ac57bdcc8a72415f4d700c4bc4924fa7e48474b59715eef87c99d 8236 libgdal-grass_3.0.1-1~exp2.debian.tar.xz 75cbd84decfdab40a93c2fbc162bbc67a6cac43bf263eacd019c74f5e1877a41 15865 libgdal-grass_3.0.1-1~exp2_amd64.buildinfo Files: 82d2539cbe7c33abad4e09161792e93c 2148 science optional libgdal-grass_3.0.1-1~exp2.dsc d74d85096ef43a2b02d6fc1f2bccdcdc 8236 science optional libgdal-grass_3.0.1-1~exp2.debian.tar.xz 826c17f7b4dc3ea3d9f08ff557885be9 15865 science optional libgdal-grass_3.0.1-1~exp2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zbIwACgkQZ1DxCuiN SvG/Sw//eUh+mM/0lb0X0L/yCIrKfB2rctNjQdQck2zIY34ndRcptGVatjDQJNqx tJTxfUCpK+L4i1SadQKDU94wZTzodx0neM+HBce+Hx1QkSAXgPvA7DPyYMBoMLLz zcMtTVpXU0x7Hsvmymmp+I1Tk3ebv0z99Jy0uPFP+AXXN4JXAN9H3QaklHAyBwxa 0voqN1tbyMSTAP4ocXKFs6bGhn8elL3xK7sU0I8xklINTHAZjANR+al42eBq3mWg 67UeEovuOw8+t9p68H4dcvKZtSKpCjaMO3T6pnVYHsfcChHEDD5jgSJp2GG1Cv7g 5R06iHBBXHuL5d+zAZyAOduu35o/tRwxaoTx8fZLf1SAl+sga69L3HjyX2H96Rd/ zICTYJN1mT8B3CsiHu5Ly8havET8Cd/tI4jgOYHicHSg/Ug93TsPjHzFPvgpgJjL HQ/K5ycAngiSdNTHkxmfRFJ0JM1rKPMvHHXUuXC6d3WAkdVGHNB6ysdHAMGSgbBD 6VVASs2z64Gg61hbiZyuq1EBohJ3D+JyUeYhhk7ZFy+dfviaoq8XGTGrPqtpmHLQ al8GpAv4CfnWOiErvkmM3X1BnZjQyJZTFS6bskXXHiz8ZvHcfEhKrGiWDec2tH+W gb9L+Y0n2r3InMcLt3DRIMysxKvm0tJa8fWiAa9Z12sWIczRx4c= =SEK8 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sat Sep 7 13:45:13 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 12:45:13 +0000 Subject: qgis_3.4.11+dfsg-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 10:08:14 +0200 Source: qgis Architecture: source Version: 3.4.11+dfsg-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: qgis (3.4.11+dfsg-2) unstable; urgency=medium . * Update packaging for GRASS 7.8.0. * Update symbols for other architectures. 
Checksums-Sha1: 696385554710fc96a7510a871cd237595a4b2268 4655 qgis_3.4.11+dfsg-2.dsc 4c1914b4ecfb96025878a4434fc64b1ff23142dd 267052 qgis_3.4.11+dfsg-2.debian.tar.xz 1ee73130b15cf1fd9dd2768a535cd86d8965658f 34848 qgis_3.4.11+dfsg-2_amd64.buildinfo Checksums-Sha256: 358d3bb25786c10e9b56ea6b92f6ac208bbde80fee7fc767d45b4914832f2f9e 4655 qgis_3.4.11+dfsg-2.dsc 7a2eaccfcccc98aa4600ffdedfc0de0a814aeab1aad92551e8336256d907788c 267052 qgis_3.4.11+dfsg-2.debian.tar.xz 22bc37b772ff6c37691adb921a02ab9800bb613534497229b98f35872120de6c 34848 qgis_3.4.11+dfsg-2_amd64.buildinfo Files: 4d1d1e7ad2d3058c60fb2b7efc8b7f18 4655 science optional qgis_3.4.11+dfsg-2.dsc c3884e796dffdb6b8a3b1f10aff5d32f 267052 science optional qgis_3.4.11+dfsg-2.debian.tar.xz 1d51caf0aea9da2fe4dc37cce6170aad 34848 science optional qgis_3.4.11+dfsg-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zjDEACgkQZ1DxCuiN SvF0bhAAnlfaSHoZD+WZpJNJVvArpjTtJgx1ujkqpunS02nAhX8xkPMxxj9eMRhF IztesDDUBR17YTSf3mYfsre6u8a70jXwIgH+qms0+8uMfHI9uiJ3f30wriTwDT/q qcr03xrSCyOLyc/C+SXGpdCoB+tH4GtFL+zKwagPIh+MPhxdZLVTDgBMTXbh5G2W TRxMOBn3Okex7jAWR4Q+gPXzj1FRVTh5ZhGFpmpecZUu/MynCRclPjbeLlrmYIfF 7oJS2qUgzX6qOjuWKETCnE8LUh/rMRpTMvfZVYSi36BqhSowTCft3hUUz8FPfgAz eNaVavroNcxVbVkEi80AfbeTCjzQQEwIYzRelngZMSdX++0PEU+Lzmelx0Fu85HP Zuk2NZ5RUvdq4nd3yX3R0E1xeDazjkvbeQ6eKB70iyqyWIi6H3LEN9zxT5kyO6ou vELqrlICuvCiNRJ2a7R6yB2iGdKSuxYLZz4LVymJ770UekXg0KJRx7Xkc6xkwaH6 ijAuWhcseq3xlhxs0SrJwZe/1No8SLAxe84yDa4qWD5NRngGFZ1jOGV954PyRzrF dQFSNMgXxC3W/Qx4e4J5jr4jI21+8qGXLYqiZ6QKVfyCF+jHXJ4ftLf+W8f5URae D1tZc9eveltFiMvhdtTwtvsuM2tiUblKwi4m/j5HkNFopCmaxcA= =2Qoh -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 7 14:38:40 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 13:38:40 +0000 Subject: [Git][debian-gis-team/python-pdal][master] Mark clean-target.patch as Applied-Upstream. Message-ID: <5d73b2e0ae26e_577b3f91d8552b58130194a@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pdal Commits: a93eaff0 by Bas Couwenberg at 2019-09-07T13:38:24Z Mark clean-target.patch as Applied-Upstream. - - - - - 1 changed file: - debian/patches/clean-target.patch Changes: ===================================== debian/patches/clean-target.patch ===================================== @@ -1,6 +1,7 @@ Description: Fix clean target. Author: Bas Couwenberg Forwarded: https://github.com/PDAL/python/pull/32 +Applied-Upstream: https://github.com/PDAL/python/commit/66870b6ff39ca268ea6c2410cba1c8e188f58bfe --- a/setup.py +++ b/setup.py View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/a93eaff0181e8f5f3642f7545c0d649c788bbe5c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/a93eaff0181e8f5f3642f7545c0d649c788bbe5c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 7 15:10:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 14:10:02 +0000 Subject: [Git][debian-gis-team/python-pdal][master] 5 commits: New upstream version 2.2.1+ds Message-ID: <5d73ba3a26c98_577b2ade63462f3c1307451@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pdal Commits: 4ba8cc6f by Bas Couwenberg at 2019-09-07T14:01:03Z New upstream version 2.2.1+ds - - - - - e07f8df5 by Bas Couwenberg at 2019-09-07T14:01:04Z Update upstream source from tag 'upstream/2.2.1+ds' Update to upstream version '2.2.1+ds' with Debian dir 73aac861e25b009f9aa70fdcf9afcdef29336618 - - - - - b8161147 by Bas Couwenberg at 2019-09-07T14:01:23Z New upstream release. - - - - - 3894365d by Bas Couwenberg at 2019-09-07T14:02:15Z Drop clean-target.patch, applied upstream. - - - - - 7f39bf60 by Bas Couwenberg at 2019-09-07T14:02:49Z Set distribution to unstable. - - - - - 7 changed files: - PKG-INFO - VERSION.txt - debian/changelog - − debian/patches/clean-target.patch - − debian/patches/series - pdal/__init__.py - setup.py Changes: ===================================== PKG-INFO ===================================== @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: PDAL -Version: 2.2.0 +Version: 2.2.1 Summary: Point cloud data processing Home-page: http://pdal.io Author: Howard Butler ===================================== VERSION.txt ===================================== @@ -1 +1 @@ -2.2.0 \ No newline at end of file +2.2.1 \ No newline at end of file ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +python-pdal (2.2.1+ds-1) unstable; urgency=medium + + * New upstream release. + * Drop clean-target.patch, applied upstream. + + -- Bas Couwenberg Sat, 07 Sep 2019 16:02:27 +0200 + python-pdal (2.2.0+ds-1) unstable; urgency=medium * New upstream release. ===================================== debian/patches/clean-target.patch deleted ===================================== @@ -1,16 +0,0 @@ -Description: Fix clean target. 
-Author: Bas Couwenberg -Forwarded: https://github.com/PDAL/python/pull/32 -Applied-Upstream: https://github.com/PDAL/python/commit/66870b6ff39ca268ea6c2410cba1c8e188f58bfe - ---- a/setup.py -+++ b/setup.py -@@ -156,7 +156,7 @@ if DEBUG: - if os.name != 'nt': - extra_compile_args += ['-g','-O0'] - --if PDALVERSION < Version('2.0.0'): -+if PDALVERSION is not None and PDALVERSION < Version('2.0.0'): - raise Exception("PDAL version '%s' is not compatible with PDAL Python library version '%s'"%(PDALVERSION, module_version)) - - ===================================== debian/patches/series deleted ===================================== @@ -1 +0,0 @@ -clean-target.patch ===================================== pdal/__init__.py ===================================== @@ -1,4 +1,4 @@ -__version__='2.2.0' +__version__='2.2.1' from .pipeline import Pipeline from .array import Array ===================================== setup.py ===================================== @@ -156,7 +156,7 @@ if DEBUG: if os.name != 'nt': extra_compile_args += ['-g','-O0'] -if PDALVERSION < Version('2.0.0'): +if PDALVERSION is not None and PDALVERSION < Version('2.0.0'): raise Exception("PDAL version '%s' is not compatible with PDAL Python library version '%s'"%(PDALVERSION, module_version)) View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/compare/a93eaff0181e8f5f3642f7545c0d649c788bbe5c...7f39bf60d1a62491f62b104706eb7e08131ea801 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/compare/a93eaff0181e8f5f3642f7545c0d649c788bbe5c...7f39bf60d1a62491f62b104706eb7e08131ea801 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 15:10:03 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 14:10:03 +0000 Subject: [Git][debian-gis-team/python-pdal][pristine-tar] pristine-tar data for python-pdal_2.2.1+ds.orig.tar.xz Message-ID: <5d73ba3bc476d_577b2ade63462f3c13076bc@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pdal Commits: 13c60977 by Bas Couwenberg at 2019-09-07T14:01:04Z pristine-tar data for python-pdal_2.2.1+ds.orig.tar.xz - - - - - 2 changed files: - + python-pdal_2.2.1+ds.orig.tar.xz.delta - + python-pdal_2.2.1+ds.orig.tar.xz.id Changes: ===================================== python-pdal_2.2.1+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/python-pdal_2.2.1+ds.orig.tar.xz.delta differ ===================================== python-pdal_2.2.1+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +c8f776237038971d29f3be340f76ab1941bcf6da View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/13c60977afdea1ba22eee04d5a9580bc71e5b6fb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/13c60977afdea1ba22eee04d5a9580bc71e5b6fb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 7 15:10:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 14:10:05 +0000 Subject: [Git][debian-gis-team/python-pdal][upstream] New upstream version 2.2.1+ds Message-ID: <5d73ba3d53947_577b2ade6112fa601307849@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pdal Commits: 4ba8cc6f by Bas Couwenberg at 2019-09-07T14:01:03Z New upstream version 2.2.1+ds - - - - - 4 changed files: - PKG-INFO - VERSION.txt - pdal/__init__.py - setup.py Changes: ===================================== PKG-INFO ===================================== @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: PDAL -Version: 2.2.0 +Version: 2.2.1 Summary: Point cloud data processing Home-page: http://pdal.io Author: Howard Butler ===================================== VERSION.txt ===================================== @@ -1 +1 @@ -2.2.0 \ No newline at end of file +2.2.1 \ No newline at end of file ===================================== pdal/__init__.py ===================================== @@ -1,4 +1,4 @@ -__version__='2.2.0' +__version__='2.2.1' from .pipeline import Pipeline from .array import Array ===================================== setup.py ===================================== @@ -156,7 +156,7 @@ if DEBUG: if os.name != 'nt': extra_compile_args += ['-g','-O0'] -if PDALVERSION < Version('2.0.0'): +if PDALVERSION is not None and PDALVERSION < Version('2.0.0'): raise Exception("PDAL version '%s' is not compatible with PDAL Python library version '%s'"%(PDALVERSION, module_version)) View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/4ba8cc6f7f46eee6541a76dc6dc19e0a913bced2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/4ba8cc6f7f46eee6541a76dc6dc19e0a913bced2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 15:10:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 14:10:09 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag debian/2.2.1+ds-1 Message-ID: <5d73ba41cb51f_577b3f91d43b7c0c1308089@godard.mail> Bas Couwenberg pushed new tag debian/2.2.1+ds-1 at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/debian/2.2.1+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 15:10:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 07 Sep 2019 14:10:10 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag upstream/2.2.1+ds Message-ID: <5d73ba42dd5be_577b3f91d8552b581308247@godard.mail> Bas Couwenberg pushed new tag upstream/2.2.1+ds at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/upstream/2.2.1+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sat Sep 7 15:15:46 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 14:15:46 +0000 Subject: Processing of python-pdal_2.2.1+ds-1_source.changes Message-ID: python-pdal_2.2.1+ds-1_source.changes uploaded successfully to localhost along with the files: python-pdal_2.2.1+ds-1.dsc python-pdal_2.2.1+ds.orig.tar.xz python-pdal_2.2.1+ds-1.debian.tar.xz python-pdal_2.2.1+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 7 15:52:26 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 07 Sep 2019 14:52:26 +0000 Subject: python-pdal_2.2.1+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 07 Sep 2019 16:02:27 +0200 Source: python-pdal Architecture: source Version: 2.2.1+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pdal (2.2.1+ds-1) unstable; urgency=medium . * New upstream release. * Drop clean-target.patch, applied upstream. Checksums-Sha1: 232ce7ffdb8b1d94babbfe397cbeda63b01af370 2103 python-pdal_2.2.1+ds-1.dsc 13041dcef39fd14333d3633298805e8ed755f0a1 53676 python-pdal_2.2.1+ds.orig.tar.xz 75e4130fa5b99761efb64122a47273c3a469a493 4212 python-pdal_2.2.1+ds-1.debian.tar.xz 1bc6fdac7c88e067fe34336da7acf8a81c9fbe5c 13690 python-pdal_2.2.1+ds-1_amd64.buildinfo Checksums-Sha256: a285ed9c6c804c4caf05c10fd7e7f1a6d3049a9250ffd4c66eb892ed65ce3d82 2103 python-pdal_2.2.1+ds-1.dsc 92945b3526446c920fa185991ca16797b684d5dc440fa526885d3bdddb9653a3 53676 python-pdal_2.2.1+ds.orig.tar.xz fe344a5e6019d5775111506dcdba7f44c4747ca2c71d0c233aed24e5f3f75c7c 4212 python-pdal_2.2.1+ds-1.debian.tar.xz 4d3d97ffe5a3cf33e057e1a3fad323b253a85f125e5c1a317e244a611c0144e2 13690 python-pdal_2.2.1+ds-1_amd64.buildinfo Files: 26a78924d39c73f21a45af8bcd47e48b 2103 science optional python-pdal_2.2.1+ds-1.dsc 8cd508f9023f09e99d4ee0106c1be502 53676 science optional python-pdal_2.2.1+ds.orig.tar.xz c35384314012aa4b469f21eb76b10645 4212 science optional python-pdal_2.2.1+ds-1.debian.tar.xz 8f2b65a3b1b0fcdbca385363b252b966 13690 science optional python-pdal_2.2.1+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1zuiYACgkQZ1DxCuiN SvHLwBAAq3FK0XE2EVMuzdkBr62BJR81fpxgL4v4WfdahXGEqfBHfubrwYoOJycs AhZDYU+U8V07k78vConWWgo29Q6QAQ/XyPIjuoMkxINKFtnUd0qL3Ai/KLYrRcnh gyiieKD7O2FUmmZAVACPpnsQUIM0bv5oN3ACxTO9aTVMoMKRHJbX4+W2kBuumq/L nGQIXW6wn6atPpx8zlZJ9jcWqOe1hhE9F01NjGKgujzWD+/kJ7PP1RoyetcPex1T Kx7TqzwDEMq7NIALhyYoNj8C9NLy0J/8/jOe/L6ZhGG8r6JAqXo7u9cqRmWUKXif xlPPQEL1IUYTcO4o/xdIEOjTLKvi7lNEM6eSpd3ZCcDLXMIsgeZNhPzyEkCaDC/Q ctP2UMj2CyBDQcrZuC9JjJmCo68r16Mc5ejrtcafIEacXLXZPcnRkJ0dkj8o8MOr A2UdsGqaFk2XPfcYN0LXotsVX6Z1if7Uh28x0VX7O4zsADW3wtYqSGHmLi2xQ4rV aT/gbbHlj5iq6I/vUJ1maVk/DOqK1jbeyk8gUU3s0QXmuJ9lbxz6X9rx3StwKu5g tYOCS+4bgljVvHp/ZtsSD6JoEXsuPlLpg+CYse807Ja/+H70DjfKoYQviMXSn1O3 B3NgeSS9RmkoLSCGdal0kfBqlemNi2r0/WNZ/jrCrUFCd9vz5qI= =g/9K -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 7 20:29:43 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Sat, 07 Sep 2019 19:29:43 +0000 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] 20 commits: Remove package name from lintian overrides. 
Message-ID: <5d74052730a5d_577b2ade6175ad68132954d@godard.mail> Martin Landa pushed to branch ubuntu/bionic at Debian GIS Project / gdal-grass Commits: cd760af2 by Bas Couwenberg at 2019-03-15T14:20:33Z Remove package name from lintian overrides. - - - - - c41f0348 by Bas Couwenberg at 2019-03-22T15:42:16Z Update branch in gbp.conf & Vcs-Git URL. - - - - - a742d768 by Bas Couwenberg at 2019-03-22T15:42:35Z New upstream version 2.4.1 - - - - - 4eceb1e3 by Bas Couwenberg at 2019-03-22T15:42:38Z Merge tag 'upstream/2.4.1' into experimental Upstream version 2.4.1 - - - - - 45c7a1c1 by Bas Couwenberg at 2019-03-22T15:43:20Z New upstream release. - - - - - 993df1fc by Bas Couwenberg at 2019-03-22T15:46:32Z Update packaging for GRASS 7.6.1. - - - - - 899b855d by Bas Couwenberg at 2019-03-22T15:46:47Z Set distribution to experimental. - - - - - e1a9e7fa by Bas Couwenberg at 2019-07-07T11:34:21Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit c41f034886e362e286ca1c5e7c929b26ec834513. - - - - - 19b1d3b4 by Bas Couwenberg at 2019-07-07T11:38:37Z Update gbp.conf to use --source-only-changes by default. - - - - - e487aadd by Bas Couwenberg at 2019-07-07T12:24:14Z Update upstream branch in gbp.conf. - - - - - c87f88ca by Bas Couwenberg at 2019-07-07T12:24:29Z New upstream version 2.4.2 - - - - - e7df145f by Bas Couwenberg at 2019-07-07T12:24:30Z Update upstream source from tag 'upstream/2.4.2' Update to upstream version '2.4.2' with Debian dir 0abf92f8cf04b47fae703f4459676b5fa6a3b03c - - - - - 0769904c by Bas Couwenberg at 2019-07-07T12:24:52Z New upstream release. - - - - - f6d6709d by Bas Couwenberg at 2019-07-07T12:25:40Z Set distribution to unstable. - - - - - 41615f22 by Bas Couwenberg at 2019-09-07T07:34:51Z Update packaging for GRASS 7.8.0. - - - - - d7bc319f by Bas Couwenberg at 2019-09-07T08:09:46Z Update PIE hardening conditional, trusty is EOL. - - - - - 155552bd by Bas Couwenberg at 2019-09-07T08:10:19Z Bump Standards-Version to 4.4.0, no changes. - - - - - 4716c438 by Bas Couwenberg at 2019-09-07T08:11:36Z Update watch file to check version directories. - - - - - c200aa69 by Bas Couwenberg at 2019-09-07T08:12:05Z Set distribution to unstable. - - - - - eade4a7b by Martin Landa at 2019-09-07T19:26:57Z Rebuild 2.4.2 for bionic - - - - - 8 changed files: - VERSION - debian/changelog - debian/control - debian/gbp.conf - debian/lintian-overrides - debian/patches/rpath - debian/rules - debian/watch Changes: ===================================== VERSION ===================================== @@ -1 +1 @@ -2.4.0 +2.4.2 ===================================== debian/changelog ===================================== @@ -1,3 +1,34 @@ +libgdal-grass (2.4.2-2~bionic1) bionic; urgency=medium + + * Rebuild for bionic. + + -- Martin Landa Sat, 07 Sep 2019 21:26:06 +0200 + +libgdal-grass (2.4.2-2) unstable; urgency=medium + + * Update packaging for GRASS 7.8.0. + * Update PIE hardening conditional, trusty is EOL. + * Bump Standards-Version to 4.4.0, no changes. + * Update watch file to check version directories. + + -- Bas Couwenberg Sat, 07 Sep 2019 10:11:51 +0200 + +libgdal-grass (2.4.2-1) unstable; urgency=medium + + * New upstream release. + * Update gbp.conf to use --source-only-changes by default. + * Move from experimental to unstable. + + -- Bas Couwenberg Sun, 07 Jul 2019 14:25:23 +0200 + +libgdal-grass (2.4.1-1~exp1) experimental; urgency=medium + + * New upstream release. + * Remove package name from lintian overrides. + * Update packaging for GRASS 7.6.1. 
+ + -- Bas Couwenberg Fri, 22 Mar 2019 16:46:36 +0100 + libgdal-grass (2.4.0-3~bionic1) bionic; urgency=medium * Rebuild for bionic. ===================================== debian/control ===================================== @@ -7,13 +7,13 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-autoreconf, d-shlibs, - grass (>= 7.6.1), - grass-dev (>= 7.6.1), - libgdal-dev (>= 2.4.0), + grass (>= 7.8.0), + grass-dev (>= 7.8.0), + libgdal-dev (>= 2.4.2), libpq-dev, lsb-release, pkg-config -Standards-Version: 4.3.0 +Standards-Version: 4.4.0 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gdal-grass Vcs-Git: https://salsa.debian.org/debian-gis-team/gdal-grass.git -b ubuntu/bionic Homepage: http://www.gdal.org/ ===================================== debian/gbp.conf ===================================== @@ -2,7 +2,7 @@ # The default name for the upstream branch is "upstream". # Change it if the name is different (for instance, "master"). -upstream-branch = upstream +upstream-branch = upstream-2.4 # The default name for the Debian branch is "master". # Change it if the name is different (for instance, "debian/unstable"). @@ -14,3 +14,6 @@ upstream-tag = upstream/%(version)s # Always use pristine-tar. pristine-tar = True + +[buildpackage] +pbuilder-options = --source-only-changes ===================================== debian/lintian-overrides ===================================== @@ -1,4 +1,4 @@ # The run path has been added to get GRASS internal library -libgdal-grass: binary-or-shlib-defines-rpath usr/lib/gdalplugins/gdal_GRASS.so /usr/lib/grass*/lib -libgdal-grass: binary-or-shlib-defines-rpath usr/lib/gdalplugins/ogr_GRASS.so /usr/lib/grass*/lib +binary-or-shlib-defines-rpath usr/lib/gdalplugins/gdal_GRASS.so /usr/lib/grass*/lib +binary-or-shlib-defines-rpath usr/lib/gdalplugins/ogr_GRASS.so /usr/lib/grass*/lib ===================================== debian/patches/rpath ===================================== @@ -9,11 +9,11 @@ Forwarded: not-needed $(GLIBNAME): grass57dataset.o - $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -+ $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -Wl,-rpath,/usr/lib/grass76/lib ++ $(LD_SHARED) $(LDFLAGS) grass57dataset.o $(LIBS) -o $(GLIBNAME) -Wl,-rpath,/usr/lib/grass78/lib $(OLIBNAME): ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o - $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -+ $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -Wl,-rpath,/usr/lib/grass76/lib ++ $(LD_SHARED) $(LDFLAGS) ogrgrassdriver.o ogrgrassdatasource.o ogrgrasslayer.o $(LIBS) -o $(OLIBNAME) -Wl,-rpath,/usr/lib/grass78/lib %.o: %.cpp $(CXX) $(CXXFLAGS) $(CPPFLAGS) $(CFLAGS) -c -o $@ $< ===================================== debian/rules ===================================== @@ -12,7 +12,7 @@ VENDOR_DERIVES_FROM_UBUNTU ?= $(shell dpkg-vendor --derives-from Ubuntu && echo DISTRIBUTION_RELEASE := $(shell lsb_release -cs) ifeq ($(VENDOR_DERIVES_FROM_UBUNTU),yes) - ifneq (,$(filter $(DISTRIBUTION_RELEASE),trusty xenial bionic)) + ifneq (,$(filter $(DISTRIBUTION_RELEASE),xenial bionic)) export DEB_BUILD_MAINT_OPTIONS=hardening=+all,-pie endif export DEB_LDFLAGS_MAINT_APPEND=-Wl,--no-as-needed ===================================== debian/watch ===================================== @@ -2,5 +2,5 @@ version=3 opts=\ dversionmangle=s/\+(debian|dfsg|ds|deb)\d*$//,\ uversionmangle=s/(\d)[_\.\-\+]?((RC|rc|pre|dev|beta|alpha)\d*)$/$1~$2/;s/RC/rc/ \ 
-https://trac.osgeo.org/gdal/wiki/DownloadSource \ +https://download.osgeo.org/gdal/(\d+\.\d+\.\d+)/ \ (?:|.*/)gdal(?:[_\-]v?|)(\d\S*)\.(?:tar\.xz|txz|tar\.bz2|tbz2|tar\.gz|tgz) View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/2d7bb9538f754a4f7a7c03fc7dfbcda1467107c1...eade4a7b75cd005b2756219f8edfdc8a6bcfc3ca -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/2d7bb9538f754a4f7a7c03fc7dfbcda1467107c1...eade4a7b75cd005b2756219f8edfdc8a6bcfc3ca You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 20:29:52 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Sat, 07 Sep 2019 19:29:52 +0000 Subject: [Git][debian-gis-team/gdal-grass] Pushed new tag ubuntu/2.4.2-1.bionic1 Message-ID: <5d74053036b7d_577b2ade63462f3c1329723@godard.mail> Martin Landa pushed new tag ubuntu/2.4.2-1.bionic1 at Debian GIS Project / gdal-grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/tree/ubuntu/2.4.2-1.bionic1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 7 20:35:07 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Sat, 07 Sep 2019 19:35:07 +0000 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic Message-ID: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> Martin Landa pushed to branch ubuntu/bionic at Debian GIS Project / gdal-grass Commits: 61453b3e by Martin Landa at 2019-09-07T19:32:57Z Rebuild 2.4.2 for bionic - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,4 +1,4 @@ -libgdal-grass (2.4.2-2~bionic1) bionic; urgency=medium +libgdal-grass (2.4.2-1~bionic1) bionic; urgency=medium * Rebuild for bionic. View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/commit/61453b3e5bfe8f1d40f6d85f835733ea93146c06 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/commit/61453b3e5bfe8f1d40f6d85f835733ea93146c06 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastic at xs4all.nl Sat Sep 7 20:45:02 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sat, 7 Sep 2019 21:45:02 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> Message-ID: <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> On 9/7/19 9:35 PM, Martin Landa wrote: > -libgdal-grass (2.4.2-2~bionic1) bionic; urgency=medium > +libgdal-grass (2.4.2-1~bionic1) bionic; urgency=medium Why this change? The previous version was more correct, your build is a backport of 2.4.2-2 in Debian. Your update contains many more changes than just a rebuild of 2.4.2-1~bionic0 which was the previous version in ubuntugis-unstable. When creating a backport, just append the appropriate suffix to the version you're backporting, like your initial change. 
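The suffix rule described above can also be checked mechanically with dpkg's version comparison; as a small sketch using the versions under discussion (illustration only):

    # ~bionic sorts below the Debian revision it derives from, but above older backports
    dpkg --compare-versions '2.4.2-2~bionic1' lt '2.4.2-2'         && echo 'backport is superseded by the Debian revision'
    dpkg --compare-versions '2.4.2-2~bionic1' gt '2.4.2-1~bionic0' && echo 'and upgrades the previous backport'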
See also: https://debian-gis-team.pages.debian.net/policy/packaging.html#git-backports Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From landa.martin at gmail.com Sat Sep 7 20:47:22 2019 From: landa.martin at gmail.com (Martin Landa) Date: Sat, 7 Sep 2019 21:47:22 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> Message-ID: Hi, so 7. 9. 2019 v 21:45 odesílatel Sebastiaan Couwenberg napsal: > Why this change? > > The previous version was more correct, your build is a backport of > 2.4.2-2 in Debian. > > Your update contains many more changes than just a rebuild of > 2.4.2-1~bionic0 which was the previous version in ubuntugis-unstable. your are right, I should backport 2.4.2-1 (which is in ubuntugis-unstable) and not 2.4.2-2. I will fix it. Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From sebastic at xs4all.nl Sat Sep 7 20:52:50 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sat, 7 Sep 2019 21:52:50 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> Message-ID: <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> On 9/7/19 9:47 PM, Martin Landa wrote: > so 7. 9. 2019 v 21:45 odesílatel Sebastiaan Couwenberg napsal: >> Why this change? >> >> The previous version was more correct, your build is a backport of >> 2.4.2-2 in Debian. >> >> Your update contains many more changes than just a rebuild of >> 2.4.2-1~bionic0 which was the previous version in ubuntugis-unstable. > > your are right, I should backport 2.4.2-1 (which is in > ubuntugis-unstable) and not 2.4.2-2. I will fix it. No, you shouldn't. 2.4.2-2 contains the changes for GRASS 7.8, those are the ones you need for the grass update you did earlier. The same goes for qgis 3.4.11+dfsg-2. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From sebastic at xs4all.nl Sat Sep 7 20:53:50 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sat, 7 Sep 2019 21:53:50 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> Message-ID: <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> On 9/7/19 9:52 PM, Sebastiaan Couwenberg wrote: > On 9/7/19 9:47 PM, Martin Landa wrote: >> so 7. 9. 2019 v 21:45 odesílatel Sebastiaan Couwenberg napsal: >>> Why this change? >>> >>> The previous version was more correct, your build is a backport of >>> 2.4.2-2 in Debian. >>> >>> Your update contains many more changes than just a rebuild of >>> 2.4.2-1~bionic0 which was the previous version in ubuntugis-unstable. >> >> your are right, I should backport 2.4.2-1 (which is in >> ubuntugis-unstable) and not 2.4.2-2. I will fix it. > > No, you shouldn't. > > 2.4.2-2 contains the changes for GRASS 7.8, those are the ones you need > for the grass update you did earlier. The same goes for qgis 3.4.11+dfsg-2. 
What you should do is revert this change (61453b3e5bfe8f1d40f6d85f835733ea93146c06), the prior one was correct. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From landa.martin at gmail.com Sat Sep 7 21:03:06 2019 From: landa.martin at gmail.com (Martin Landa) Date: Sat, 7 Sep 2019 22:03:06 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: Hi, so 7. 9. 2019 v 21:53 odesílatel Sebastiaan Couwenberg napsal: > What you should do is revert this change > (61453b3e5bfe8f1d40f6d85f835733ea93146c06), the prior one was correct. yes, but than libgdal-grass will have different patch version (2) compared to gdal in unstable (1). I would like to avoid building whole gdal for ubuntugis if possible. Martin -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From sebastic at xs4all.nl Sat Sep 7 21:09:23 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sat, 7 Sep 2019 22:09:23 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: On 9/7/19 10:03 PM, Martin Landa wrote: > Hi, > > so 7. 9. 2019 v 21:53 odesílatel Sebastiaan Couwenberg > napsal: >> What you should do is revert this change >> (61453b3e5bfe8f1d40f6d85f835733ea93146c06), the prior one was correct. > > yes, but than libgdal-grass will have different patch version (2) > compared to gdal in unstable (1). I would like to avoid building whole > gdal for ubuntugis if possible. gdal (2.4.2+dfsg-1~bionic0) is in ubuntugis-unstable for bionic, so no problem. https://launchpad.net/~ubuntugis/+archive/ubuntu/ubuntugis-unstable/+packages?field.name_filter=gdal&field.status_filter=published&field.series_filter=bionic Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From landa.martin at gmail.com Sat Sep 7 21:13:47 2019 From: landa.martin at gmail.com (Martin Landa) Date: Sat, 7 Sep 2019 22:13:47 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: Hi, so 7. 9. 2019 v 22:09 odesílatel Sebastiaan Couwenberg napsal: > gdal (2.4.2+dfsg-1~bionic0) is in ubuntugis-unstable for bionic, so no > problem. > > https://launchpad.net/~ubuntugis/+archive/ubuntu/ubuntugis-unstable/+packages?field.name_filter=gdal&field.status_filter=published&field.series_filter=bionic exactly. I wanted to point out that after revering 61453b3e5bfe8f1d40f6d85f835733ea93146c06 I will upload to ubuntugis-unstable libgdal-grass 2.4.2-*2*~bionic0 which cause some inconsistency in patch versions. 
Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From landa.martin at gmail.com Sat Sep 7 21:15:19 2019 From: landa.martin at gmail.com (Martin Landa) Date: Sat, 7 Sep 2019 22:15:19 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: Hi, so 7. 9. 2019 v 22:13 odesílatel Martin Landa napsal: related question to qgis packaging. debian/3.4.11+dfsg-2 still points to 7.6.1. Should I merge master into ubuntu/bionic or wait for debian/3.4.11+dfsg-3? Thanks. Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From morph at debian.org Sun Sep 8 04:37:52 2019 From: morph at debian.org (Sandro Tosi) Date: Sat, 07 Sep 2019 23:37:52 -0400 Subject: Bug#939716: debian-gis: replace/remove all python2 modules dependencies Message-ID: <156791387219.24928.15343062850338890522.reportbug@zion.matrix.int> Source: debian-gis Severity: important Hello, There's an on-going effort to remove Python 2 from Bullseye, https://wiki.debian.org/Python/2Removal, so it would be useful if you could remove all python2 modules dependencies from your tasks and/or migrate them to their relative python3 modules. This will help by removing reverse dependencies of those packages, easying their removal from the distribution. Priority set to important as it's (possibly) preventing part of the progess on the py2removal effort. Thanks, Sandro -- System Information: Debian Release: 10.0 APT prefers unstable-debug APT policy: (500, 'unstable-debug'), (500, 'unstable'), (1, 'experimental-debug'), (1, 'experimental') Architecture: amd64 (x86_64) Foreign Architectures: i386 Kernel: Linux 4.19.0-5-amd64 (SMP w/8 CPU cores) Kernel taint flags: TAINT_PROPRIETARY_MODULE, TAINT_OOT_MODULE, TAINT_UNSIGNED_MODULE Locale: LANG=en_US.UTF-8, LC_CTYPE=en_US.UTF-8 (charmap=UTF-8), LANGUAGE= (charmap=UTF-8) Shell: /bin/sh linked to /bin/dash Init: systemd (via /run/systemd/system) LSM: AppArmor: enabled From noreply at release.debian.org Sun Sep 8 05:39:13 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sun, 08 Sep 2019 04:39:13 +0000 Subject: pgrouting 2.6.3-1 MIGRATED to testing Message-ID: FYI: The status of the pgrouting source package in Debian's testing distribution has changed. Previous version: 2.6.2-2 Current version: 2.6.3-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sun Sep 8 05:39:11 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sun, 08 Sep 2019 04:39:11 +0000 Subject: glymur 0.8.18+ds-1 MIGRATED to testing Message-ID: FYI: The status of the glymur source package in Debian's testing distribution has changed. Previous version: 0.8.18-1 Current version: 0.8.18+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
From gitlab at salsa.debian.org Sun Sep 8 06:42:12 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 05:42:12 +0000 Subject: [Git][blends-team/gis][master] Close bug in changelog. Message-ID: <5d7494b42359f_3a62aedf69d952c105399@godard.mail> Bas Couwenberg pushed to branch master at Debian Blends Team / gis Commits: 5a590183 by Bas Couwenberg at 2019-09-08T05:42:04Z Close bug in changelog. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -23,6 +23,7 @@ debian-gis (0.0.18) UNRELEASED; urgency=medium * tasks/workstation: - Drop liblas-bin - Drop libgeo-proj4-perl + (closes: #939716) -- Bas Couwenberg Fri, 12 Jul 2019 09:01:43 +0200 View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/5a590183580dca060623103e564f84b0e7744013 -- View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/5a590183580dca060623103e564f84b0e7744013 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastic at xs4all.nl Sun Sep 8 06:46:16 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sun, 8 Sep 2019 07:46:16 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: On 9/7/19 10:13 PM, Martin Landa wrote: > so 7. 9. 2019 v 22:09 odesílatel Sebastiaan Couwenberg napsal: >> gdal (2.4.2+dfsg-1~bionic0) is in ubuntugis-unstable for bionic, so no >> problem. >> >> https://launchpad.net/~ubuntugis/+archive/ubuntu/ubuntugis-unstable/+packages?field.name_filter=gdal&field.status_filter=published&field.series_filter=bionic > > exactly. I wanted to point out that after revering > 61453b3e5bfe8f1d40f6d85f835733ea93146c06 I will upload to > ubuntugis-unstable > > libgdal-grass 2.4.2-*2*~bionic0 That's the debian revision after the minus. > which cause some inconsistency in patch versions. Ma 2.4.1 would be the previous patch version. I still don't see any inconsistency. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From owner at bugs.debian.org Sun Sep 8 06:51:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 08 Sep 2019 05:51:03 +0000 Subject: Processed: Re: Bug#939716: debian-gis: replace/remove all python2 modules dependencies References: <65c7c406-9373-25a6-5657-cd2bac6ffaed@xs4all.nl> <156791387219.24928.15343062850338890522.reportbug@zion.matrix.int> Message-ID: Processing control commands: > tags -1 pending Bug #939716 [src:debian-gis] debian-gis: replace/remove all python2 modules dependencies Added tag(s) pending. 
> severity -1 normal Bug #939716 [src:debian-gis] debian-gis: replace/remove all python2 modules dependencies Severity set to 'normal' from 'important' -- 939716: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939716 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From sebastic at xs4all.nl Sun Sep 8 06:40:25 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sun, 8 Sep 2019 07:40:25 +0200 Subject: Bug#939716: debian-gis: replace/remove all python2 modules dependencies In-Reply-To: <156791387219.24928.15343062850338890522.reportbug@zion.matrix.int> References: <156791387219.24928.15343062850338890522.reportbug@zion.matrix.int> <156791387219.24928.15343062850338890522.reportbug@zion.matrix.int> Message-ID: <65c7c406-9373-25a6-5657-cd2bac6ffaed@xs4all.nl> Control: tags -1 pending Control: severity -1 normal On 9/8/19 5:37 AM, Sandro Tosi wrote: > There's an on-going effort to remove Python 2 from Bullseye, > https://wiki.debian.org/Python/2Removal, so it would be useful if you could > remove all python2 modules dependencies from your tasks and/or migrate them to > their relative python3 modules. Already fixed in git for all packages that have a python3 variant, only python-cf & python-geohash don't have one yet. > This will help by removing reverse dependencies of those packages, easying their > removal from the distribution. The metapackages don't have hard dependencies, so they don't block removal. > Priority set to important as it's (possibly) preventing part of the progess on > the py2removal effort. Severity decreased because the metapackages don't prevent removal of python2 packages. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From sebastic at xs4all.nl Sun Sep 8 06:53:49 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sun, 8 Sep 2019 07:53:49 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: On 9/7/19 10:15 PM, Martin Landa wrote: > so 7. 9. 2019 v 22:13 odesílatel Martin Landa napsal: > related question to qgis packaging. debian/3.4.11+dfsg-2 still points > to 7.6.1. Where do you see it pointing to 7.6.1? If you look at the diff since the previous revision, you see all references to 7.6.1 being changed: https://salsa.debian.org/debian-gis-team/qgis/compare/debian%2F3.4.11+dfsg-1...debian%2F3.4.11+dfsg-2 Specifically in this commit: https://salsa.debian.org/debian-gis-team/qgis/commit/ad75e62c005b51e4e0d82de2ca9b58e87a13a5c5 Did you resolve the merge conflict incorrectly perhaps? > Should I merge master into ubuntu/bionic or wait for > debian/3.4.11+dfsg-3? Thanks. Ma You should merge debian/3.4.11+dfsg-2 into ubuntu/bionic and fix the merge conflicts correctly. Compare the branch with the tag (with git diff) and you should only see the different branch in gbp.conf & Vcs-Git URL, plus your changelog entries for UbuntuGIS. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From gitlab at salsa.debian.org Sun Sep 8 07:36:42 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:36:42 +0000 Subject: [Git][blends-team/gis][master] 3 commits: Drop python-geohash & python-cf from devel task. 
Message-ID: <5d74a17a7bcec_3a62aedf69a92a01102be@godard.mail> Bas Couwenberg pushed to branch master at Debian Blends Team / gis Commits: 75a7430e by Bas Couwenberg at 2019-09-08T06:25:47Z Drop python-geohash & python-cf from devel task. - - - - - 930ae8f3 by Bas Couwenberg at 2019-09-08T06:31:16Z Update control file after `make dist`. - - - - - fc42da9e by Bas Couwenberg at 2019-09-08T06:31:16Z Set distribution to unstable. - - - - - 3 changed files: - debian/changelog - debian/control - tasks/devel Changes: ===================================== debian/changelog ===================================== @@ -1,10 +1,12 @@ -debian-gis (0.0.18) UNRELEASED; urgency=medium +debian-gis (0.0.18) unstable; urgency=medium * tasks/devel: - Drop liblas-dev, liblas-c-dev & python-liblas - Drop libgeo-proj4-perl - Drop python-modestmaps - Drop libterralib-dev + - Drop python-geohash + - Drop python-cf * tasks/gps: - Drop gpx2shp * tasks/osm: @@ -25,7 +27,7 @@ debian-gis (0.0.18) UNRELEASED; urgency=medium - Drop libgeo-proj4-perl (closes: #939716) - -- Bas Couwenberg Fri, 12 Jul 2019 09:01:43 +0200 + -- Bas Couwenberg Sun, 08 Sep 2019 08:26:06 +0200 debian-gis (0.0.17) unstable; urgency=medium ===================================== debian/control ===================================== @@ -71,7 +71,6 @@ Recommends: grass-dev, libgeo-google-mapobject-perl, libgeo-googleearth-pluggable-perl, libgeo-gpx-perl, - libgeo-proj4-perl, libgeo-shapelib-perl, libgeographic-dev, libgeoip-dev, @@ -92,8 +91,7 @@ Recommends: grass-dev, libhdf5-serial-dev, libhe5-hdfeos-dev, libkml-dev, - liblas-c-dev, - liblas-dev, + liblaszip-dev, libmapbox-geometry-dev, libmapbox-variant-dev, libmapbox-wagyu-dev, @@ -104,6 +102,7 @@ Recommends: grass-dev, libnetcdf-cxx-legacy-dev, libnetcdf-dev, libnetcdff-dev, + libogdi-dev, libopencv-calib3d-dev, libopencv-contrib-dev, libopencv-core-dev, @@ -129,16 +128,11 @@ Recommends: grass-dev, libshp-dev, libspatialindex-dev, libspatialite-dev, - libterralib-dev, libudunits2-dev, paraview-dev, pktools-dev, - python-geohash, - python-liblas, - python-modestmaps, python3-cligj, python3-geojson, - python3-mapbox-vector-tile, python3-pysal, python3-rtree, ruby-hdfeos5, @@ -148,16 +142,14 @@ Suggests: libcv-dev, libgeoapi-java, libhighgui-dev, libkmlframework-java, - liblaszip-dev, libmapcache-dev, - libogdi-dev, libopencv-gpu-dev, libossimgui-dev, libossimplanet-dev, libossimplanetqt-dev, libwms-dev, - python-cf, - python3-cartopy + python3-cartopy, + python3-mapbox-vector-tile Description: Geographic Information Systems (GIS) development This task sets up your system for GIS development. 
@@ -172,17 +164,16 @@ Recommends: foxtrotgps, gpsd-clients, gpsman, gpstrans, - gpx2shp, marble, mtkbabel, navit, navit-gui-gtk, - obdgpslogger, qmapshack Suggests: gpscorrelate, gpscorrelate-gui, gpxsee, navit-graphics-gtk-drawing-area, + obdgpslogger, openbmap-logger, viking Description: GPS related programs @@ -197,7 +188,6 @@ Depends: ${misc:Depends}, Recommends: gir1.2-osmgpsmap-1.0, gpsprune, gpxviewer, - imposm, jmapviewer, josm, libgeo-coder-osm-perl, @@ -208,7 +198,6 @@ Recommends: gir1.2-osmgpsmap-1.0, libjs-proj4, libosmpbf-java, maptool, - merkaartor, mkgmap, mkgmap-splitter, mkgmapgui, @@ -223,16 +212,12 @@ Recommends: gir1.2-osmgpsmap-1.0, osmium-tool, osmosis, osmpbf-bin, - python-imposm-parser, python3-osmapi, python3-overpass, python3-overpy, python3-pyosmium, - rel2gpx, routino, sosi2osm, - tilecache, - tilestache, viking Suggests: alacarte-map-server, cascadenik, @@ -240,7 +225,6 @@ Suggests: alacarte-map-server, gpscorrelate, gpscorrelate-gui, libjs-openlayers3, - libosm-gary68-perl, libreadosm-dev, node-carto, node-kosmtik, @@ -275,8 +259,6 @@ Recommends: dans-gdal-scripts, ossim-core, otb-bin, otb-bin-qt, - python-mipp, - python-mpop, python3-epr, python3-gdal, python3-geotiepoints, @@ -288,6 +270,7 @@ Recommends: dans-gdal-scripts, python3-pyorbital, python3-pyresample, python3-pyspectral, + python3-satpy, sentinelsat Suggests: adore-doris, doris, @@ -315,7 +298,6 @@ Suggests: adore-doris, polsarpro, pyaps, python3-bufr, - python3-satpy, roipac, snaphu, varres @@ -337,12 +319,12 @@ Recommends: r-cran-deldir, r-cran-maps, r-cran-raster, r-cran-sp, - r-cran-spatstat, - r-cran-spdep + r-cran-spatstat Suggests: hyantesite, r-cran-gpclib, r-cran-gstat, r-cran-pbsmapping, + r-cran-spdep, r-cran-tripack Description: Statistics with geographical data Set of Debian packages which are useful for doing statistics @@ -358,21 +340,19 @@ Recommends: cgi-mapserver, mapcache-tools, mapproxy, mapserver-bin, + php-mapscript, + python3-mapscript, python3-owslib, pywps, qgis-server, routino-www, - tilecache, - tilestache, twms Suggests: mapcache-cgi, musmap, node-kosmtik, php-geos, - php-mapscript, postgresql-11-pgrouting, pycsw, - python3-mapscript, tilemill, tinyows Description: Present geographic information via web map server @@ -403,10 +383,8 @@ Recommends: avce00, libfreexl1, libgdal-dev, libgdal-grass, - libgeo-point-perl, libgeographic-dev, libjts-java, - liblas-bin, libshp-dev, mapcode, mapnik-utils, @@ -428,6 +406,7 @@ Recommends: avce00, python3-geolinks, python3-geopandas, python3-mapnik, + python3-pdal, python3-pyproj, python3-pysal, python3-rasterio, @@ -444,7 +423,7 @@ Recommends: avce00, Suggests: capaware, googleearth-package, grass-doc, - libgeo-proj4-perl, + libgeo-point-perl, libgeo-shapelib-perl, librewms, libsqlite3-mod-rasterlite2, @@ -458,7 +437,6 @@ Suggests: capaware, python3-cftime, python3-epr, python3-netcdf4, - python3-pdal, python3-pyshp, roadmap, roadmap-gtk2, ===================================== tasks/devel ===================================== @@ -96,10 +96,6 @@ Suggests: libgeoapi-java Depends: ruby-hdfeos5, ruby-netcdf -Depends: python-geohash - -Suggests: python-cf - Suggests: libossimgui-dev, libossimplanet-dev, libossimplanetqt-dev, libwms-dev Suggests: python3-cartopy View it on GitLab: https://salsa.debian.org/blends-team/gis/compare/5a590183580dca060623103e564f84b0e7744013...fc42da9e596d66d3a9f021d0c80c73d3de92305d -- View it on GitLab: 
https://salsa.debian.org/blends-team/gis/compare/5a590183580dca060623103e564f84b0e7744013...fc42da9e596d66d3a9f021d0c80c73d3de92305d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 07:36:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:36:45 +0000 Subject: [Git][blends-team/gis] Pushed new tag 0.0.18 Message-ID: <5d74a17d5250a_3a63f8b1ac743d81105be@godard.mail> Bas Couwenberg pushed new tag 0.0.18 at Debian Blends Team / gis -- View it on GitLab: https://salsa.debian.org/blends-team/gis/tree/0.0.18 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 07:37:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:37:09 +0000 Subject: [Git][debian-gis-team/routino][master] 7 commits: New upstream version 3.3 Message-ID: <5d74a195d380d_3a63f8b1ac743d8110833@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / routino Commits: 01598e6d by Bas Couwenberg at 2019-09-08T05:46:40Z New upstream version 3.3 - - - - - c977c333 by Bas Couwenberg at 2019-09-08T05:46:48Z Update upstream source from tag 'upstream/3.3' Update to upstream version '3.3' with Debian dir 36ee86f4866d885ea58f137cb641f5e97fee9a66 - - - - - ec8b9638 by Bas Couwenberg at 2019-09-08T05:54:06Z New upstream release. - - - - - 18897626 by Bas Couwenberg at 2019-09-08T06:18:44Z Update copyright years for Andrew M. Bishop. - - - - - 63f79562 by Bas Couwenberg at 2019-09-08T06:18:56Z Refresh patches. - - - - - 4c4189ba by Bas Couwenberg at 2019-09-08T06:30:07Z Add patch to fix FTBFS due to lack of python subdirectory. - - - - - c335b24e by Bas Couwenberg at 2019-09-08T06:30:07Z Set distribution to unstable. - - - - - 30 changed files: - ChangeLog - Makefile - Makefile.conf - debian/changelog - debian/copyright - debian/patches/hardening - debian/patches/install_path - debian/patches/maploader - debian/patches/mapprops - + debian/patches/python.patch - debian/patches/rename_router - debian/patches/series - doc/INSTALL.txt - doc/NEWS.txt - doc/README.txt - doc/TAGGING.txt - doc/html/installation.html - doc/html/readme.html - doc/html/tagging.html - extras/README.txt - extras/errorlog/summarise-log.pl - extras/find-fixme/Makefile - extras/find-fixme/fixme-dumper.c - extras/find-fixme/web/www/fixme.leaflet.js - extras/find-fixme/web/www/fixme.openlayers.js - + extras/find-fixme/web/www/fixme.openlayers2.js - extras/statistics/Makefile - extras/statistics/dumper.c - extras/statistics/update.sh - extras/tagmodifier/Makefile The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/3e4ac791b5476a8a2cb4de86dabea318b3ab8495...c335b24e0489627e291822986b47b52706166fcd -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/3e4ac791b5476a8a2cb4de86dabea318b3ab8495...c335b24e0489627e291822986b47b52706166fcd You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 8 07:37:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:37:17 +0000 Subject: [Git][debian-gis-team/routino] Pushed new tag debian/3.3-1 Message-ID: <5d74a19d598ad_3a62aedf4f4222c1110ae@godard.mail> Bas Couwenberg pushed new tag debian/3.3-1 at Debian GIS Project / routino -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/tree/debian/3.3-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 07:37:18 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:37:18 +0000 Subject: [Git][debian-gis-team/routino] Pushed new tag upstream/3.3 Message-ID: <5d74a19e3b6d7_3a62aedf6f143ac1112a2@godard.mail> Bas Couwenberg pushed new tag upstream/3.3 at Debian GIS Project / routino -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/tree/upstream/3.3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 07:42:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:42:00 +0000 Subject: [Git][blends-team/gis][master] Drop python3-mapbox-vector-tile from devel task. Message-ID: <5d74a2b884bbd_3a62aedf6f143ac11152d@godard.mail> Bas Couwenberg pushed to branch master at Debian Blends Team / gis Commits: 166189d0 by Bas Couwenberg at 2019-09-08T06:41:53Z Drop python3-mapbox-vector-tile from devel task. - - - - - 2 changed files: - debian/changelog - tasks/devel Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +debian-gis (0.0.19) UNRELEASED; urgency=medium + + * tasks/devel: + - Drop python3-mapbox-vector-tile + + -- Bas Couwenberg Sun, 08 Sep 2019 08:41:12 +0200 + debian-gis (0.0.18) unstable; urgency=medium * tasks/devel: ===================================== tasks/devel ===================================== @@ -53,7 +53,6 @@ Depends: grass-dev, \ libgeo-shapelib-perl, \ python3-cligj, \ python3-geojson, \ - python3-mapbox-vector-tile, \ python3-rtree, \ python3-pysal X-Comment: Packages maintained by the Debian GIS Project View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/166189d00c4628392b37bf958cce384ea4807126 -- View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/166189d00c4628392b37bf958cce384ea4807126 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 07:44:33 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 06:44:33 +0000 Subject: [Git][blends-team/gis][master] Drop libgeo-point-perl from workstation task. Message-ID: <5d74a351e034b_3a62aedf69bc24c11184e@godard.mail> Bas Couwenberg pushed to branch master at Debian Blends Team / gis Commits: 2fbaa47a by Bas Couwenberg at 2019-09-08T06:44:27Z Drop libgeo-point-perl from workstation task. 
- - - - - 2 changed files: - debian/changelog - tasks/workstation Changes: ===================================== debian/changelog ===================================== @@ -2,6 +2,8 @@ debian-gis (0.0.19) UNRELEASED; urgency=medium * tasks/devel: - Drop python3-mapbox-vector-tile + * tasks/workstation: + - Drop libgeo-point-perl -- Bas Couwenberg Sun, 08 Sep 2019 08:41:12 +0200 ===================================== tasks/workstation ===================================== @@ -80,8 +80,6 @@ Depends: h5utils, hdf5-tools, hdf4-tools Depends: libjts-java -Depends: libgeo-point-perl - Suggests: libgeo-shapelib-perl Depends: geotiff-bin View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/2fbaa47a042fb72b02a8eff87ceefd52fd08c990 -- View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/2fbaa47a042fb72b02a8eff87ceefd52fd08c990 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 8 07:46:59 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 06:46:59 +0000 Subject: Processing of routino_3.3-1_source.changes Message-ID: routino_3.3-1_source.changes uploaded successfully to localhost along with the files: routino_3.3-1.dsc routino_3.3.orig.tar.gz routino_3.3-1.debian.tar.xz routino_3.3-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 8 07:46:59 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 06:46:59 +0000 Subject: Processing of debian-gis_0.0.18_source.changes Message-ID: debian-gis_0.0.18_source.changes uploaded successfully to localhost along with the files: debian-gis_0.0.18.dsc debian-gis_0.0.18.tar.xz debian-gis_0.0.18_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 8 07:48:49 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 06:48:49 +0000 Subject: debian-gis_0.0.18_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 08 Sep 2019 08:26:06 +0200 Source: debian-gis Architecture: source Version: 0.0.18 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 939716 Changes: debian-gis (0.0.18) unstable; urgency=medium . 
* tasks/devel: - Drop liblas-dev, liblas-c-dev & python-liblas - Drop libgeo-proj4-perl - Drop python-modestmaps - Drop libterralib-dev - Drop python-geohash - Drop python-cf * tasks/gps: - Drop gpx2shp * tasks/osm: - Drop rel2gpx - Drop libosm-gary68-perl - Drop tilestache - Drop tilecache - Drop imposm & python-imposm - Drop merkaartor * tasks/remotesensing: - Drop python-mipp - Drop python-mpop * tasks/web: - Drop tilestache - Drop tilecache * tasks/workstation: - Drop liblas-bin - Drop libgeo-proj4-perl (closes: #939716) Checksums-Sha1: b9ba3a77388d58b2cf0ad6b63bf1085283314136 2280 debian-gis_0.0.18.dsc 2ef06fc0486503a19b2fb8d21e24db845735a02d 141240 debian-gis_0.0.18.tar.xz 15178bd7c9cc175ae46e7c3d45abd381c76c8cc4 9262 debian-gis_0.0.18_amd64.buildinfo Checksums-Sha256: e52ec18ff22b97a11a606aa10395d7c59047701468b053522ead4c3d0226dc7a 2280 debian-gis_0.0.18.dsc d4556e64b0c21bcc2371580eb7167ca3d53940a70b5d250cc3d7a0475a88a52a 141240 debian-gis_0.0.18.tar.xz 692b0cdbd3aac4fd4258b774705d0f3cef9573a69258ae530528075492bafcde 9262 debian-gis_0.0.18_amd64.buildinfo Files: 8faa9426089bc36ecdcb6c6799822306 2280 misc optional debian-gis_0.0.18.dsc fbdf7f89c7b7e83296a21f7071e7fc11 141240 misc optional debian-gis_0.0.18.tar.xz d904d6bd8aea577ca91e42ad05526b1c 9262 misc optional debian-gis_0.0.18_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl10oVoACgkQZ1DxCuiN SvHBDw//UHsqbVJ0yt0wMz6/ZqutMLW/6F7Ir6qybEn4B0cYTTz0/+X3B9yBNbos 6sMfSK4s5DorXmm8VJuOLva/S2VCLjIaCdmrA1VLR9JRW/ba4zZTxCHd3ezeCwrt pNwNb2ozS20dE+iC4RwZeyN0dDCPn+aTm6UKICWPHewBRYn4Dw8nh1bYsFuUMJt2 FgypZcbZjGRxjl1bcrd5RKxYJdpBB3RGIEjmBU9eMIAwSamc62t5kVfT50j/cGbE JIC8mnE1flE78H8dXcZ3PNY2PrrJ83RWz34jLh8Gr1c62LioWSqWiUH9R6fcEIXM lGdGiQ2dCFm6mf8WT69AFIijMRqscWZdSSH85aG4CfsOv9tjdDQ8h3WQKX6/BOkc 2iyOw1/yz9LhEWVq/XAvuu5qY4Gg3kQDSeEAQxQqWRVEDkfEKfNT9r9a2ntA3tyD UzIOGX+zG0h17w39KPcAiklygEGx1RLWlR7T9gIlU2lqodd9lnXJ0NfxfVEo1889 5FV1N4pD5TH1bjU5m7UAXj1BmDiycU6oRn/HPdOoV+cCaDd6p5PVbUeWL7saesxd AAsp9OoGqkeCORbfm2bGVCw9CDiJO5tmhtawNUDAXfpQ4wD983h0ZDnqg0KGwvxC t3OaYKNpdErmvZEkVI/rK7r5rO7pO/aCvYPRcCKNC8db9Nupus0= =nc+5 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sun Sep 8 07:49:22 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 06:49:22 +0000 Subject: routino_3.3-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 08 Sep 2019 08:19:23 +0200 Source: routino Architecture: source Version: 3.3-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: routino (3.3-1) unstable; urgency=medium . * New upstream release. * Bump Standards-Version to 4.4.0, no changes. * Add Build-Depends-Package field to symbols files. * Update gbp.conf to use --source-only-changes by default. * Update copyright years for Andrew M. Bishop. * Refresh patches. * Add patch to fix FTBFS due to lack of python subdirectory. 
Checksums-Sha1: ebc995e94927bbbcc0fcd1d58445241e108ab75e 2335 routino_3.3-1.dsc ae32b7db64723117f1fd5592aa08934565e2d864 2527654 routino_3.3.orig.tar.gz 444c715909895c3e730b29c1526da8c4d90e6be8 29872 routino_3.3-1.debian.tar.xz a15a94571dbbc6995cfde30dc10d16cecd929f29 10599 routino_3.3-1_amd64.buildinfo Checksums-Sha256: 2d3577394e2a2b5546d6a8e80599b5f5d8aa0e878cd11685d1daa12b86acccdd 2335 routino_3.3-1.dsc f1095fa05438e9a85e6fa7d6fb334f681b96e7c6033abf1164d4de2170ea03bb 2527654 routino_3.3.orig.tar.gz 6edccd732b5d42b82b301e507102cf86cb3be7e298083212dce510e48e2705e6 29872 routino_3.3-1.debian.tar.xz 98d281556b1f80ebc807bfbe0103bac944106ff952c8acddceea6fe29c63087a 10599 routino_3.3-1_amd64.buildinfo Files: 9ea45450adcdfe754ba991e6a7f83be2 2335 misc optional routino_3.3-1.dsc 7243b64fa72ea76d32efba2475a712dd 2527654 misc optional routino_3.3.orig.tar.gz 8033f491a1a8a7b305f21d4fb51d5544 29872 misc optional routino_3.3-1.debian.tar.xz b8731ba53446911b160209395796915f 10599 misc optional routino_3.3-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl10oWsACgkQZ1DxCuiN SvFVlhAAo9ooLh4wbgEo+EcADTIY+arOrMPkAhdybhIBXyIm+fJGrK9qLYUKG4d/ xje1NAn8xbLAoWo6g8oZEnEeGZmtqRZJV8W0v8d6OsVdhYf3/wqh738bfl/O3e8I ncrEpKXNXa3AmAAdI99QtAmcC+xtg5BFMhsf1C9xe9s1h8TszweO+fQjykw2nA6J 8ua6m+y6C1qXYv/K/om94O0tdKz3zzqRuEdHBLYNAKX1l/wKp4CIxEo1peexzhxk PPfi5OcTCSHzqkxBcRgpWSfnKJCrrm8hmvacBLakXNQXhD9Cp8d11XNDfEoqztsZ vfli3SkjZlE41AUWZnfTJKkZOTgcXxkcVyL332xPO7yo/9OYRJ8Ek8WjFgmOU/0Z yXKhYWubebEthoFf2QJ05Pq0TGSC/cqCkzB2uii0ZeP35xCHDSg1NmrSKp/g2qms jKkjSYIIcqLIptigt+s5DBYSl81IrksY6uKFXceru3QxymKTpUZ9x0R3k3MZZOWm 2yPh84BJX9HLARnp2Abgs0cWfgyO6RPcKj43Dygc4ZIt/HSTyB1ibeqCGgRAoZhf vBlhLWgXhuzS9hOyYhMQV3GJFeMN3afAOsrGsYRbbnLc0JWddjtzV8sZliZpKuyI DHy+HxUQarUmTWKxJG6GU0TO2ypQw1ZXXeEeYaIK14jpQP3bNqk= =bRwe -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Sun Sep 8 07:51:04 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sun, 08 Sep 2019 06:51:04 +0000 Subject: Bug#939716: marked as done (debian-gis: replace/remove all python2 modules dependencies) References: <156791387219.24928.15343062850338890522.reportbug@zion.matrix.int> Message-ID: Your message dated Sun, 08 Sep 2019 06:48:49 +0000 with message-id and subject line Bug#939716: fixed in debian-gis 0.0.18 has caused the Debian Bug report #939716, regarding debian-gis: replace/remove all python2 modules dependencies to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 939716: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939716 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Sandro Tosi Subject: debian-gis: replace/remove all python2 modules dependencies Date: Sat, 07 Sep 2019 23:37:52 -0400 Size: 2600 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#939716: fixed in debian-gis 0.0.18 Date: Sun, 08 Sep 2019 06:48:49 +0000 Size: 5644 URL: From gitlab at salsa.debian.org Sun Sep 8 12:11:23 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 11:11:23 +0000 Subject: [Git][debian-gis-team/snaphu][pristine-tar] pristine-tar data for snaphu_2.0.1.orig.tar.gz Message-ID: <5d74e1db1d01e_73482ad95ff5bd3484193@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / snaphu Commits: d6830a64 by Antonio Valentino at 2019-09-08T11:00:07Z pristine-tar data for snaphu_2.0.1.orig.tar.gz - - - - - 2 changed files: - + snaphu_2.0.1.orig.tar.gz.delta - + snaphu_2.0.1.orig.tar.gz.id Changes: ===================================== snaphu_2.0.1.orig.tar.gz.delta ===================================== Binary files /dev/null and b/snaphu_2.0.1.orig.tar.gz.delta differ ===================================== snaphu_2.0.1.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +765e9aace38000ec833f8b4afccd22a12d28ef9d View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/commit/d6830a649424b3897a76823dd7cd61636a4a1a1d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/commit/d6830a649424b3897a76823dd7cd61636a4a1a1d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 12:11:32 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 11:11:32 +0000 Subject: [Git][debian-gis-team/snaphu][master] 7 commits: New upstream version 2.0.1 Message-ID: <5d74e1e4cfedb_73482ad95c4c1b9c8427e@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / snaphu Commits: c7fff519 by Antonio Valentino at 2019-09-08T11:00:06Z New upstream version 2.0.1 - - - - - ee12a0dc by Antonio Valentino at 2019-09-08T11:00:07Z Update upstream source from tag 'upstream/2.0.1' Update to upstream version '2.0.1' with Debian dir 8be84f9b2e752c4311fdb9c45cc8d1c0ee941943 - - - - - 16b15f9b by Antonio Valentino at 2019-09-08T11:04:40Z New upstrean release - - - - - ffddcfb6 by Antonio Valentino at 2019-09-08T11:06:12Z Refresh all patches - - - - - 1f1cb37f by Antonio Valentino at 2019-09-08T11:07:02Z Bump debhelper from old 11 to 12. Fixes lintian: package-uses-old-debhelper-compat-version See https://lintian.debian.org/tags/package-uses-old-debhelper-compat-version.html for more details. - - - - - daa9cc01 by Antonio Valentino at 2019-09-08T11:07:02Z Remove obsolete fields Name from debian/upstream/metadata. - - - - - 4b5a5809 by Antonio Valentino at 2019-09-08T11:09:25Z Set distribution to unstable - - - - - 11 changed files: - debian/changelog - − debian/compat - debian/control - debian/patches/0002-Spelling.patch - debian/upstream/metadata - src/snaphu.c - src/snaphu.h - src/snaphu_cost.c - src/snaphu_io.c - src/snaphu_solver.c - src/snaphu_tile.c Changes: ===================================== debian/changelog ===================================== @@ -1,13 +1,18 @@ -snaphu (2.0.0-2) UNRELEASED; urgency=medium +snaphu (2.0.1-1) unstable; urgency=medium [ Bas Couwenberg ] * Don't delete bin directory in clean target, included in upstream source. [ Antonio Valentino ] + * New upstream release. * debian/tests/control: - mark test as superficial using Restrictions + * debian/patches: + - refresh all patches + * Bump debhelper from old 11 to 12. 
+ * Remove obsolete fields Name from debian/upstream/metadata. - -- Bas Couwenberg Wed, 10 Jul 2019 19:47:39 +0200 + -- Antonio Valentino Sun, 08 Sep 2019 11:08:52 +0000 snaphu (2.0.0-1) unstable; urgency=medium ===================================== debian/compat deleted ===================================== @@ -1 +0,0 @@ -11 ===================================== debian/control ===================================== @@ -4,7 +4,7 @@ Uploaders: Antonio Valentino Section: non-free/science XS-Autobuild: no Priority: optional -Build-Depends: debhelper (>= 11) +Build-Depends: debhelper-compat (= 12) Standards-Version: 4.4.0 Vcs-Browser: https://salsa.debian.org/debian-gis-team/snaphu Vcs-Git: https://salsa.debian.org/debian-gis-team/snaphu.git ===================================== debian/patches/0002-Spelling.patch ===================================== @@ -21,10 +21,10 @@ index 161cc25..a4ed3c9 100644 preceding one, although round-off errors in flow-to-phase conversions may cause minor differences diff --git a/src/snaphu.h b/src/snaphu.h -index 3a9ab8d..6ec5974 100644 +index 5d43cb5..82e6d2b 100644 --- a/src/snaphu.h +++ b/src/snaphu.h -@@ -406,7 +406,7 @@ +@@ -407,7 +407,7 @@ "The parts of this software derived from the CS2 minimum cost flow\n"\ "solver written by A. V. Goldberg and B. Cherkassky are governed by the\n"\ "terms of the copyright holder of that software. Permission has been\n"\ ===================================== debian/upstream/metadata ===================================== @@ -1,4 +1,2 @@ ---- -Name: SNAPHU Other-References: https://web.stanford.edu/group/radar/softwareandlinks/sw/snaphu/ ===================================== src/snaphu.c ===================================== @@ -254,6 +254,9 @@ int Unwrap(infileT *infiles, outfileT *outfiles, paramT *params, /* see if next tile needs to be unwrapped */ if(dotilemask[nexttilerow][nexttilecol]){ + /* wait to make sure file i/o, threads, and OS are synched */ + sleep(sleepinterval); + /* fork to create new process */ fflush(NULL); pid=fork(); @@ -317,10 +320,9 @@ int Unwrap(infileT *infiles, outfileT *outfiles, paramT *params, nexttilerow++; } - /* wait a little while for file i/o before beginning next tile */ + /* increment counter of running child processes */ if(pid!=iterparams->parentpid){ nchildren++; - sleep(sleepinterval); } }else{ @@ -414,13 +416,14 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, long nflow, ncycle, mostflow, nflowdone; long candidatelistsize, candidatebagsize; long isource, nsource; + long nincreasedcostiter; long *nconnectedarr; int *nnodesperrow, *narcsperrow; short **flows, **mstcosts; float **wrappedphase, **unwrappedphase, **mag, **unwrappedest; incrcostT **incrcosts; void **costs; - totalcostT totalcost, oldtotalcost; + totalcostT totalcost, oldtotalcost, mintotalcost; nodeT **sourcelist; nodeT *source, ***apexes; nodeT **nodes, ground[1]; @@ -541,6 +544,9 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, &candidatelist,&iscandidate,&apexes,&bkts,&iincrcostfile, &incrcosts,&nodes,ground,&nnoderow,&nnodesperrow,&narcrow, &narcsperrow,nrow,ncol,¬firstloop,&totalcost,params); + oldtotalcost=totalcost; + mintotalcost=totalcost; + nincreasedcostiter=0; /* regrow regions with -G parameter */ if(params->regrowconncomps){ @@ -633,11 +639,18 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, if(notfirstloop){ oldtotalcost=totalcost; totalcost=EvaluateTotalCost(costs,flows,nrow,ncol,NULL,params); + if(totalcostoldtotalcost || (n>0 && 
totalcost==oldtotalcost)){ fflush(NULL); - fprintf(sp0,"Unexpected increase in total cost. Breaking loop\n"); - break; + fprintf(sp1,"Caution: Unexpected increase in total cost\n"); } + if(totalcost > mintotalcost){ + nincreasedcostiter++; + }else{ + nincreasedcostiter=0; + } } /* consider this flow increment done if not too many neg cycles found */ @@ -650,6 +663,12 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, /* find maximum flow on network, excluding arcs affected by masking */ mostflow=MaxNonMaskFlow(flows,mag,nrow,ncol); + if(nincreasedcostiter>=mostflow){ + fflush(NULL); + fprintf(sp0,"WARNING: Unexpected sustained increase in total cost." + " Breaking loop\n"); + break; + } /* break if we're done with all flow increments or problem is convex */ if(nflowdone>=params->maxflow || nflowdone>=mostflow || params->p>=1.0){ ===================================== src/snaphu.h ===================================== @@ -14,7 +14,7 @@ /**********************/ #define PROGRAMNAME "snaphu" -#define VERSION "2.0.0" +#define VERSION "2.0.1" #define BUGREPORTEMAIL "snaphu at gmail.com" #ifdef PI #undef PI @@ -103,6 +103,7 @@ #define DEF_VERBOSE FALSE #define DEF_AMPLITUDE TRUE #define AUTOCALCSTATMAX 0 +#define MAXNSHORTCYCLE 8192 #define USEMAXCYCLEFRACTION (-123) #define COMPLEX_DATA 1 /* file format */ #define FLOAT_DATA 2 /* file format */ ===================================== src/snaphu_cost.c ===================================== @@ -48,6 +48,13 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, long nrow, long ncol, tileparamT *tileparams, outfileT *outfiles, paramT *params); static +void MaskCost(costT *costptr); +static +void MaskSmoothCost(smoothcostT *smoothcostptr); +static +int MaskPrespecifiedArcCosts(void **costsptr, short **weights, + long nrow, long ncol, paramT *params); +static int GetIntensityAndCorrelation(float **mag, float **wrappedphase, float ***pwrptr, float ***corrptr, infileT *infiles, long linelen, long nlines, @@ -200,12 +207,20 @@ int BuildCostArrays(void ***costsptr, short ***mstcostsptr, /* build or read the statistical cost arrays unless we were told not to */ if(strlen(infiles->costinfile)){ + + /* read cost info from file */ fprintf(sp1,"Reading cost information from file %s\n",infiles->costinfile); costs=NULL; Read2DRowColFile((void ***)&costs,infiles->costinfile, linelen,nlines,tileparams,costtypesize); (*costsptr)=costs; + /* weights of arcs next to masked pixels are set to zero */ + /* make sure corresponding costs are nulled when costs are read from */ + /* file rather than internally generated since read costs are not */ + /* multiplied by weights */ + MaskPrespecifiedArcCosts(costs,weights,nrow,ncol,params); + }else if(params->costmode!=NOSTATCOSTS){ /* get intensity and correlation info */ @@ -348,16 +363,23 @@ int BuildCostArrays(void ***costsptr, short ***mstcostsptr, tempcost=negcost; } - /* clip scalar cost so it is between 0 and params->maxcost */ + /* clip scalar cost so it is between 1 and params->maxcost */ + /* note: weights used for MST algorithm will not be zero along */ + /* masked edges since they are clipped to 1, but MST is run */ + /* once on entire network, not just non-masked regions */ weights[row][col]=LClip(tempcost,MINSCALARCOST,params->maxcost); /* assign Lp costs if in Lp mode */ + /* let scalar cost be zero if costs in both directions are zero */ if(params->p>=0){ if(params->bidirlpn){ bidircosts[row][col].posweight=LClip(poscost,0,params->maxcost); 
bidircosts[row][col].negweight=LClip(negcost,0,params->maxcost); }else{ scalarcosts[row][col]=weights[row][col]; + if(poscost==0 && negcost==0){ + scalarcosts[row][col]=0; + } } } } @@ -582,10 +604,7 @@ void **BuildStatCostsTopo(float **wrappedphase, float **mag, if(colweight[row][col]==0){ /* masked pixel */ - colcost[row][col].laycost=0; - colcost[row][col].offset=LARGESHORT/2; - colcost[row][col].dzmax=LARGESHORT; - colcost[row][col].sigsq=LARGESHORT; + MaskCost(&colcost[row][col]); }else{ @@ -732,10 +751,7 @@ void **BuildStatCostsTopo(float **wrappedphase, float **mag, if(rowweight[row][col]==0){ /* masked pixel */ - rowcost[row][col].laycost=0; - rowcost[row][col].offset=LARGESHORT/2; - rowcost[row][col].dzmax=LARGESHORT; - rowcost[row][col].sigsq=LARGESHORT; + MaskCost(&rowcost[row][col]); }else{ @@ -898,10 +914,7 @@ void **BuildStatCostsDefo(float **wrappedphase, float **mag, if(colweight[row][col]==0){ /* masked pixel */ - colcost[row][col].laycost=0; - colcost[row][col].offset=0; - colcost[row][col].dzmax=LARGESHORT; - colcost[row][col].sigsq=LARGESHORT; + MaskCost(&colcost[row][col]); }else{ @@ -964,10 +977,7 @@ void **BuildStatCostsDefo(float **wrappedphase, float **mag, if(rowweight[row][col]==0){ /* masked pixel */ - rowcost[row][col].laycost=0; - rowcost[row][col].offset=0; - rowcost[row][col].dzmax=LARGESHORT; - rowcost[row][col].sigsq=LARGESHORT; + MaskCost(&rowcost[row][col]); }else{ @@ -1086,8 +1096,7 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, if(colweight[row][col]==0){ /* masked pixel */ - colcost[row][col].offset=0; - colcost[row][col].sigsq=LARGESHORT; + MaskSmoothCost(&colcost[row][col]); }else{ @@ -1138,8 +1147,7 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, if(rowweight[row][col]==0){ /* masked pixel */ - rowcost[row][col].offset=0; - rowcost[row][col].sigsq=LARGESHORT; + MaskSmoothCost(&rowcost[row][col]); }else{ @@ -1187,6 +1195,89 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, } +/* function: MaskCost() + * -------------------- + * Set values of costT structure pointed to by input pointer to give zero + * cost, as for arcs next to masked pixels. + */ +static +void MaskCost(costT *costptr){ + + /* set to special values */ + costptr->laycost=0; + costptr->offset=LARGESHORT/2; + costptr->dzmax=LARGESHORT; + costptr->sigsq=LARGESHORT; + +} + + +/* function: MaskSmoothCost() + * -------------------------- + * Set values of smoothcostT structure pointed to by input pointer to give zero + * cost, as for arcs next to masked pixels. + */ +static +void MaskSmoothCost(smoothcostT *smoothcostptr){ + + /* set to special values */ + smoothcostptr->offset=LARGESHORT/2; + smoothcostptr->sigsq=LARGESHORT; + +} + + +/* function: MaskPrespecifiedArcCosts() + * ------------------------------------ + * Loop over grid arcs and set costs to null if corresponding weights + * are null. 
+ */ +static +int MaskPrespecifiedArcCosts(void **costsptr, short **weights, + long nrow, long ncol, paramT *params){ + + long row, col, maxcol; + costT **costs; + smoothcostT **smoothcosts; + + + /* set up pointers */ + costs=NULL; + smoothcosts=NULL; + if(params->costmode==TOPO || params->costmode==DEFO){ + costs=(costT **)costsptr; + }else if(params->costmode==SMOOTH){ + smoothcosts=(smoothcostT **)costsptr; + }else{ + fprintf(sp0,"illegal cost mode in MaskPrespecifiedArcCosts()\n"); + exit(ABNORMAL_EXIT); + } + + /* loop over all arcs */ + for(row=0;row<2*nrow-1;row++){ + if(rowdzmax; offset=cost->offset; sigsq=cost->sigsq; + dzmax=cost->dzmax; laycost=cost->laycost; + + /* just return 0 if we have zero cost arc */ + if(sigsq==LARGESHORT){ + (*poscostptr)=0; + (*negcostptr)=0; + return; + } + + /* compute argument to cost function */ nshortcycle=params->nshortcycle; layfalloffconst=params->layfalloffconst; if(arcrowsigsq==LARGESHORT){ + (*poscostptr)=0; + (*negcostptr)=0; + return; + } + + /* compute argument to cost function */ nshortcycle=params->nshortcycle; layfalloffconst=params->layfalloffconst; idz1=labs(flow*nshortcycle+cost->offset); @@ -1944,11 +2053,15 @@ void CalcCostSmooth(void **costs, long flow, long arcrow, long arccol, /* get arc info */ cost=&((smoothcostT **)(costs))[arcrow][arccol]; + + /* just return 0 if we have zero cost arc */ if(cost->sigsq==LARGESHORT){ - *poscostptr=0; - *negcostptr=0; + (*poscostptr)=0; + (*negcostptr)=0; return; } + + /* compute argument to cost function */ nshortcycle=params->nshortcycle; idz1=labs(flow*nshortcycle+cost->offset); idz2pos=labs((flow+nflow)*nshortcycle+cost->offset); @@ -2238,6 +2351,24 @@ void CalcCostLPBiDir(void **costs, long flow, long arcrow, long arccol, /* function: CalcCostNonGrid() * --------------------------- * Calculates the arc cost given an array of long integer cost lookup tables. + * + * The cost array for each arc gives the cost for +/-flowmax units of + * flow around the flow value with minimum cost, which is not + * necessarily flow == 0. The offset between the flow value with + * minimum cost and flow == 0 is given by arroffset = costarr[0]. + * Positive flow values k for k = 1 to flowmax relative to this min + * cost flow value are in costarr[k]. Negative flow values k relative + * to the min cost flow from k = -1 to -flowmax costarr[flowmax-k]. + * costarr[2*flowmax+1] contains a scaling factor for extrapolating + * beyond the ends of the cost table, assuming quadratically (with an offset) + * increasing cost (subject to rounding and scaling). + * + * As of summer 2019, the rationale for how seconeary costs are + * extrapolated beyond the end of the table has been lost to time, but + * the logic at least does give a self-consistent cost function that + * is continuous at +/-flowmax and quadratically increases beyond, + * albeit not necessarily with a starting slope that has an easily + * intuitive basis. 
*/ void CalcCostNonGrid(void **costs, long flow, long arcrow, long arccol, long nflow, long nrow, paramT *params, @@ -2345,6 +2476,13 @@ long EvalCostTopo(void **costs, short **flows, long arcrow, long arccol, /* get arc info */ cost=&((costT **)(costs))[arcrow][arccol]; + + /* just return 0 if we have zero cost arc */ + if(cost->sigsq==LARGESHORT){ + return(0); + } + + /* compute argument to cost function */ if(arcrowsigsq==LARGESHORT){ + return(0); + } + + /* compute argument to cost function */ idz1=labs(flows[arcrow][arccol]*(params->nshortcycle)+cost->offset); /* calculate and return cost */ @@ -2417,9 +2562,13 @@ long EvalCostSmooth(void **costs, short **flows, long arcrow, long arccol, /* get arc info */ cost=&((smoothcostT **)(costs))[arcrow][arccol]; + + /* just return 0 if we have zero cost arc */ if(cost->sigsq==LARGESHORT){ return(0); } + + /* compute argument to cost function */ idz1=labs(flows[arcrow][arccol]*(params->nshortcycle)+cost->offset); /* calculate and return cost */ ===================================== src/snaphu_io.c ===================================== @@ -899,6 +899,11 @@ int CheckParams(infileT *infiles, outfileT *outfiles, fprintf(sp0,"defomax exceeds range of short int for given nshortcycle\n"); exit(ABNORMAL_EXIT); } + if(params->nshortcycle < 1 || params->nshortcycle > MAXNSHORTCYCLE){ + fflush(NULL); + fprintf(sp0,"illegal value for nshortcycle\n"); + exit(ABNORMAL_EXIT); + } if(params->maxnewnodeconst<=0 || params->maxnewnodeconst>1){ fflush(NULL); fprintf(sp0,"maxnewnodeconst must be between 0 and 1\n"); ===================================== src/snaphu_solver.c ===================================== @@ -2108,14 +2108,6 @@ void GetArcGrid(nodeT *from, nodeT *to, long *arcrow, long *arccol, *arccol=tocol; *arcdir=-1; }else{ -#define DIAG_GETARCGRID -#ifdef DIAG_GETARCGRID - if(!(torow>0 && nodes[torow-1][tocol].group==BOUNDARYPTR)){ - fflush(NULL); - fprintf(stderr,"BUG: should not have gotten here in GetArcGrid()\n"); - exit(1); - } -#endif *arcrow=torow+nrow-1; *arccol=tocol; *arcdir=1; @@ -2134,14 +2126,6 @@ void GetArcGrid(nodeT *from, nodeT *to, long *arcrow, long *arccol, *arccol=fromcol; *arcdir=1; }else{ -#define DIAG_GETARCGRID -#ifdef DIAG_GETARCGRID - if(!(fromrow>0 && nodes[fromrow-1][fromcol].group==BOUNDARYPTR)){ - fflush(NULL); - fprintf(stderr,"BUG: should not have gotten here in GetArcGrid()\n"); - exit(1); - } -#endif *arcrow=fromrow+nrow-1; *arccol=fromcol; *arcdir=-1; ===================================== src/snaphu_tile.c ===================================== @@ -122,6 +122,8 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, void **voidrightedgecosts, short **rightedgeflows, paramT *params, short **bulkoffsets); static +short AvgSigSq(short sigsq1, short sigsq2); +static int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, nodesuppT **nodesupp, scndryarcT **scndryarcs, long ***scndrycosts, long *nnewnodesptr, @@ -1205,6 +1207,7 @@ int AssembleTiles(outfileT *outfiles, paramT *params, long nrow, ncol, prevnrow, prevncol, nextnrow, nextncol; long n, ncycle, nflowdone, nflow, candidatelistsize, candidatebagsize; long nnodes, maxnflowcycles, arclen, narcs, sourcetilenum, flowmax; + long nincreasedcostiter; long *totarclens; long ***scndrycosts; double avgarclen; @@ -1216,7 +1219,7 @@ int AssembleTiles(outfileT *outfiles, paramT *params, short **tempregions, *regionsbelow, *regionsabove; int *nscndrynodes, *nscndryarcs; incrcostT **incrcosts; - totalcostT totalcost, oldtotalcost; + totalcostT totalcost, 
oldtotalcost, mintotalcost; nodeT *source; nodeT **scndrynodes, ***scndryapexes; signed char **iscandidate; @@ -1227,7 +1230,7 @@ int AssembleTiles(outfileT *outfiles, paramT *params, bucketT *bkts; char filename[MAXSTRLEN]; - + /* set up */ fprintf(sp1,"Assembling tiles\n"); ntilerow=params->ntilerow; @@ -1370,8 +1373,8 @@ int AssembleTiles(outfileT *outfiles, paramT *params, } scndrycosts[i][j][2*flowmax+1]=LRound(scndrycosts[i][j][2*flowmax+1] /avgarclen); - if(scndrycosts[i][j][2*flowmax+1]<1){ - scndrycosts[i][j][2*flowmax+1]=1; + if(scndrycosts[i][j][2*flowmax+1]<0){ + scndrycosts[i][j][2*flowmax+1]=0; } } } @@ -1408,13 +1411,15 @@ int AssembleTiles(outfileT *outfiles, paramT *params, incrcosts[i]=(incrcostT *)MAlloc(nscndryarcs[i]*sizeof(incrcostT)); nnodes+=nscndrynodes[i]; } - + /* set up network for secondary solver */ InitNetwork(scndryflows,&dummylong,&ncycle,&nflowdone,&dummylong,&nflow, &candidatebagsize,&candidatebag,&candidatelistsize, &candidatelist,NULL,NULL,&bkts,&dummylong,NULL,NULL,NULL, NULL,NULL,NULL,NULL,ntiles,0,¬firstloop,&totalcost,params); - + oldtotalcost=totalcost; + mintotalcost=totalcost; + nincreasedcostiter=0; /* set pointers to functions for nongrid secondary network */ CalcCost=CalcCostNonGrid; @@ -1456,10 +1461,17 @@ int AssembleTiles(outfileT *outfiles, paramT *params, oldtotalcost=totalcost; totalcost=EvaluateTotalCost((void **)scndrycosts,scndryflows,ntiles,0, nscndryarcs,params); + if(totalcostoldtotalcost || (n>0 && totalcost==oldtotalcost)){ fflush(NULL); - fprintf(sp0,"Unexpected increase in total cost. Breaking loop\n"); - break; + fprintf(sp1,"Caution: Unexpected increase in total cost\n"); + } + if(totalcost>mintotalcost){ + nincreasedcostiter++; + }else{ + nincreasedcostiter=0; } } @@ -1471,6 +1483,15 @@ int AssembleTiles(outfileT *outfiles, paramT *params, nflowdone=1; } + /* break if total cost increase is sustained */ + if(nincreasedcostiter>=params->maxflow){ + fflush(NULL); + fprintf(sp0,"WARNING: Unexpected sustained increase in total cost." 
+ " Breaking loop\n"); + break; + } + + /* break if we're done with all flow increments or problem is convex */ if(nflowdone>=params->maxflow){ break; @@ -2392,9 +2413,9 @@ int SetUpperEdge(long ncol, long tilerow, long tilecol, void **voidcosts, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - upperedgecosts[0][col].offset=nshortcycle*dpsi; - upperedgecosts[0][col].sigsq=ceil((costs[0][col].sigsq - +costsabove[col].sigsq)/2.0); + upperedgecosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + upperedgecosts[0][col].sigsq=AvgSigSq(costs[0][col].sigsq, + costsabove[col].sigsq); if(costs[0][col].dzmax>costsabove[col].dzmax){ upperedgecosts[0][col].dzmax=costs[0][col].dzmax; }else{ @@ -2406,9 +2427,9 @@ int SetUpperEdge(long ncol, long tilerow, long tilecol, void **voidcosts, upperedgecosts[0][col].laycost=costsabove[col].laycost; } }else if(CalcCost==CalcCostSmooth){ - upperedgesmoothcosts[0][col].offset=nshortcycle*dpsi; - upperedgesmoothcosts[0][col].sigsq= - ceil((smoothcosts[0][col].sigsq+smoothcostsabove[col].sigsq)/2.0); + upperedgesmoothcosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + upperedgesmoothcosts[0][col].sigsq + =AvgSigSq(smoothcosts[0][col].sigsq,smoothcostsabove[col].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidupperedgecosts)[0][col]= @@ -2528,9 +2549,9 @@ int SetLowerEdge(long nrow, long ncol, long tilerow, long tilecol, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - loweredgecosts[0][col].offset=nshortcycle*dpsi; - loweredgecosts[0][col].sigsq=ceil((costs[nrow-2][col].sigsq - +costsbelow[col].sigsq)/2.0); + loweredgecosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + loweredgecosts[0][col].sigsq=AvgSigSq(costs[nrow-2][col].sigsq, + costsbelow[col].sigsq); if(costs[nrow-2][col].dzmax>costsbelow[col].dzmax){ loweredgecosts[0][col].dzmax=costs[nrow-2][col].dzmax; }else{ @@ -2542,10 +2563,9 @@ int SetLowerEdge(long nrow, long ncol, long tilerow, long tilecol, loweredgecosts[0][col].laycost=costsbelow[col].laycost; } }else if(CalcCost==CalcCostSmooth){ - loweredgesmoothcosts[0][col].offset=nshortcycle*dpsi; - loweredgesmoothcosts[0][col].sigsq= - ceil((smoothcosts[nrow-2][col].sigsq - +smoothcostsbelow[col].sigsq)/2.0); + loweredgesmoothcosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + loweredgesmoothcosts[0][col].sigsq + =AvgSigSq(smoothcosts[nrow-2][col].sigsq,smoothcostsbelow[col].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidloweredgecosts)[0][col]= @@ -2662,10 +2682,11 @@ int SetLeftEdge(long nrow, long prevncol, long tilerow, long tilecol, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - leftedgecosts[row][0].offset=(TILEDPSICOLFACTOR*nshortcycle*dpsi); - leftedgecosts[row][0].sigsq= - ceil((costs[row+nrow-1][0].sigsq - +lastcosts[row+nrow-1][prevncol-2].sigsq)/2.0); + leftedgecosts[row][0].offset=(short )LRound(TILEDPSICOLFACTOR + *nshortcycle*dpsi); + leftedgecosts[row][0].sigsq + =AvgSigSq(costs[row+nrow-1][0].sigsq, + lastcosts[row+nrow-1][prevncol-2].sigsq); if(costs[row+nrow-1][0].dzmax>lastcosts[row+nrow-1][prevncol-2].dzmax){ leftedgecosts[row][0].dzmax=costs[row+nrow-1][0].dzmax; }else{ @@ -2680,10 +2701,10 @@ int SetLeftEdge(long nrow, long prevncol, long tilerow, long tilecol, } }else if(CalcCost==CalcCostSmooth){ leftedgesmoothcosts[row][0].offset - =(TILEDPSICOLFACTOR*nshortcycle*dpsi); - 
leftedgesmoothcosts[row][0].sigsq= - ceil((smoothcosts[row+nrow-1][0].sigsq - +lastsmoothcosts[row+nrow-1][prevncol-2].sigsq)/2.0); + =(short )LRound(TILEDPSICOLFACTOR*nshortcycle*dpsi); + leftedgesmoothcosts[row][0].sigsq + =AvgSigSq(smoothcosts[row+nrow-1][0].sigsq, + lastsmoothcosts[row+nrow-1][prevncol-2].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidleftedgecosts)[row][0]= @@ -2807,10 +2828,11 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - rightedgecosts[row][0].offset=(TILEDPSICOLFACTOR*nshortcycle*dpsi); + rightedgecosts[row][0].offset=(short )LRound(TILEDPSICOLFACTOR + *nshortcycle*dpsi); rightedgecosts[row][0].sigsq - =ceil((costs[row+nrow-1][ncol-2].sigsq - +nextcosts[row+nrow-1][0].sigsq)/2.0); + =AvgSigSq(costs[row+nrow-1][ncol-2].sigsq, + nextcosts[row+nrow-1][0].sigsq); if(costs[row+nrow-1][ncol-2].dzmax>nextcosts[row+nrow-1][0].dzmax){ rightedgecosts[row][0].dzmax=costs[row+nrow-1][ncol-2].dzmax; }else{ @@ -2823,10 +2845,10 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, } }else if(CalcCost==CalcCostSmooth){ rightedgesmoothcosts[row][0].offset - =(TILEDPSICOLFACTOR*nshortcycle*dpsi); + =(short )LRound(TILEDPSICOLFACTOR*nshortcycle*dpsi); rightedgesmoothcosts[row][0].sigsq - =ceil((smoothcosts[row+nrow-1][ncol-2].sigsq - +nextsmoothcosts[row+nrow-1][0].sigsq)/2.0); + =AvgSigSq(smoothcosts[row+nrow-1][ncol-2].sigsq, + nextsmoothcosts[row+nrow-1][0].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidrightedgecosts)[row][0]= @@ -2908,6 +2930,34 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, } +/* function: AvgSigSq() + * -------------------- + * Return average of sigsq values after chcking for special value and + * clipping to short. 
+ */ +static +short AvgSigSq(short sigsq1, short sigsq2){ + + int sigsqavg; + + + /* if either value is special LARGESHORT value, use that */ + if(sigsq1==LARGESHORT || sigsq2==LARGESHORT){ + return(LARGESHORT); + } + + /* compute average */ + sigsqavg=(int )ceil(0.5*(((int )sigsq1)+((int )sigsq2))); + + /* clip */ + sigsqavg=LClip(sigsqavg,-LARGESHORT,LARGESHORT); + + /* return */ + return((short )sigsqavg); + +} + + /* function: TraceSecondaryArc() * ----------------------------- */ @@ -2931,7 +2981,7 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, long i, row, col, nnewnodes, arclen, ntilerow, ntilecol, arcnum; long tilenum, nflow, primaryarcrow, primaryarccol, poscost, negcost, nomcost; long nnrow, nncol, calccostnrow, nnewarcs, arroffset, nshortcycle; - long mincost, mincostflow, minweight; + long mincost, mincostflow, minweight, maxcost; long *scndrycostarr; double sigsq, sumsigsqinv, tempdouble, tileedgearcweight; short **flows; @@ -3096,6 +3146,9 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, /* keep absolute cost of arc to the previous node */ if(!zerocost){ + + /* accumulate incremental cost in table for each nflow increment */ + /* offset flow in flow array temporarily by arroffset then undo below */ flows[primaryarcrow][primaryarccol]-=primaryarcdir*arroffset; nomcost=EvalCost(costs,flows,primaryarcrow,primaryarccol,calccostnrow, params); @@ -3125,22 +3178,35 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, } } flows[primaryarcrow][primaryarccol]+=primaryarcdir*arroffset; + + /* accumulate term to be used for cost growth beyond table bounds */ if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ sigsq=((costT **)costs)[primaryarcrow][primaryarccol].sigsq; }else if(CalcCost==CalcCostSmooth){ sigsq=((smoothcostT **)costs)[primaryarcrow][primaryarccol].sigsq; }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ - sigsq=((short **)costs)[primaryarcrow][primaryarccol]; + minweight=((short **)costs)[primaryarcrow][primaryarccol]; + if(minweight<1){ + sigsq=LARGESHORT; + }else{ + sigsq=1.0/(double )minweight; + } }else if(CalcCost==CalcCostL0BiDir || CalcCost==CalcCostL1BiDir || CalcCost==CalcCostL2BiDir || CalcCost==CalcCostLPBiDir){ minweight=LMin(((bidircostT **)costs)[primaryarcrow][primaryarccol] .posweight, ((bidircostT **)costs)[primaryarcrow][primaryarccol] .negweight); - sigsq=1.0/(double )minweight; + if(minweight<1){ + sigsq=LARGESHORT; + }else{ + sigsq=1.0/(double )minweight; + } } - sumsigsqinv+=(1.0/sigsq); + if(sigsqmaxcost){ + maxcost=scndrycostarr[nflow]; + } + if(scndrycostarr[flowmax+nflow]>maxcost){ + maxcost=scndrycostarr[flowmax+nflow]; + } + } + + /* if cost was all zero, treat as zero cost arc */ + if(maxcost==mincost){ + zerocost=TRUE; + sumsigsqinv=0; } /* break if cost array adequately centered on minimum cost flow */ @@ -3198,12 +3277,16 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, return(0); } - /* see if we have a secondary arc on the edge of the full-sized array */ /* these arcs have zero cost since the edge is treated as a single node */ + /* secondary arcs whose primary arcs all have zero cost are also zeroed */ if(zerocost){ /* set sum of standard deviations to indicate zero-cost secondary arc */ + scndrycostarr[0]=0; + for(nflow=1;nflow<=2*flowmax;nflow++){ + scndrycostarr[nflow]=0; + } scndrycostarr[2*flowmax+1]=ZEROCOSTARC; }else{ @@ -4042,12 +4125,6 @@ int AssembleTileConnComps(long linelen, long nlines, 
conncompsizes[nconncomp].icompfull=0; conncompsizes[nconncomp].npix=tileconncompsizes[k].npix; nconncomp++; -#define DEBUG -#ifdef DEBUG -if(nconncomp>nconncompmem){ - fprintf(sp0,"ERROR--THIS IS A BUG\n"); -} -#endif } } View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/compare/3b973465db3c5b284a97f1706f9f2daf73ebcf57...4b5a58099c5288780e48a09b2685fba31c2435b9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/compare/3b973465db3c5b284a97f1706f9f2daf73ebcf57...4b5a58099c5288780e48a09b2685fba31c2435b9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 12:11:37 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 11:11:37 +0000 Subject: [Git][debian-gis-team/snaphu] Pushed new tag upstream/2.0.1 Message-ID: <5d74e1e9cf001_73482ad95db3f3b0845ca@godard.mail> Antonio Valentino pushed new tag upstream/2.0.1 at Debian GIS Project / snaphu -- View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/tree/upstream/2.0.1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 12:11:45 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 11:11:45 +0000 Subject: [Git][debian-gis-team/snaphu][upstream] New upstream version 2.0.1 Message-ID: <5d74e1f187b0a_73482ad95ff622b084615@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / snaphu Commits: c7fff519 by Antonio Valentino at 2019-09-08T11:00:06Z New upstream version 2.0.1 - - - - - 6 changed files: - src/snaphu.c - src/snaphu.h - src/snaphu_cost.c - src/snaphu_io.c - src/snaphu_solver.c - src/snaphu_tile.c Changes: ===================================== src/snaphu.c ===================================== @@ -254,6 +254,9 @@ int Unwrap(infileT *infiles, outfileT *outfiles, paramT *params, /* see if next tile needs to be unwrapped */ if(dotilemask[nexttilerow][nexttilecol]){ + /* wait to make sure file i/o, threads, and OS are synched */ + sleep(sleepinterval); + /* fork to create new process */ fflush(NULL); pid=fork(); @@ -317,10 +320,9 @@ int Unwrap(infileT *infiles, outfileT *outfiles, paramT *params, nexttilerow++; } - /* wait a little while for file i/o before beginning next tile */ + /* increment counter of running child processes */ if(pid!=iterparams->parentpid){ nchildren++; - sleep(sleepinterval); } }else{ @@ -414,13 +416,14 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, long nflow, ncycle, mostflow, nflowdone; long candidatelistsize, candidatebagsize; long isource, nsource; + long nincreasedcostiter; long *nconnectedarr; int *nnodesperrow, *narcsperrow; short **flows, **mstcosts; float **wrappedphase, **unwrappedphase, **mag, **unwrappedest; incrcostT **incrcosts; void **costs; - totalcostT totalcost, oldtotalcost; + totalcostT totalcost, oldtotalcost, mintotalcost; nodeT **sourcelist; nodeT *source, ***apexes; nodeT **nodes, ground[1]; @@ -541,6 +544,9 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, &candidatelist,&iscandidate,&apexes,&bkts,&iincrcostfile, &incrcosts,&nodes,ground,&nnoderow,&nnodesperrow,&narcrow, &narcsperrow,nrow,ncol,¬firstloop,&totalcost,params); + oldtotalcost=totalcost; + mintotalcost=totalcost; + nincreasedcostiter=0; /* regrow regions with -G parameter */ 
if(params->regrowconncomps){ @@ -633,11 +639,18 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, if(notfirstloop){ oldtotalcost=totalcost; totalcost=EvaluateTotalCost(costs,flows,nrow,ncol,NULL,params); + if(totalcostoldtotalcost || (n>0 && totalcost==oldtotalcost)){ fflush(NULL); - fprintf(sp0,"Unexpected increase in total cost. Breaking loop\n"); - break; + fprintf(sp1,"Caution: Unexpected increase in total cost\n"); } + if(totalcost > mintotalcost){ + nincreasedcostiter++; + }else{ + nincreasedcostiter=0; + } } /* consider this flow increment done if not too many neg cycles found */ @@ -650,6 +663,12 @@ int UnwrapTile(infileT *infiles, outfileT *outfiles, paramT *params, /* find maximum flow on network, excluding arcs affected by masking */ mostflow=MaxNonMaskFlow(flows,mag,nrow,ncol); + if(nincreasedcostiter>=mostflow){ + fflush(NULL); + fprintf(sp0,"WARNING: Unexpected sustained increase in total cost." + " Breaking loop\n"); + break; + } /* break if we're done with all flow increments or problem is convex */ if(nflowdone>=params->maxflow || nflowdone>=mostflow || params->p>=1.0){ ===================================== src/snaphu.h ===================================== @@ -14,7 +14,7 @@ /**********************/ #define PROGRAMNAME "snaphu" -#define VERSION "2.0.0" +#define VERSION "2.0.1" #define BUGREPORTEMAIL "snaphu at gmail.com" #ifdef PI #undef PI @@ -103,6 +103,7 @@ #define DEF_VERBOSE FALSE #define DEF_AMPLITUDE TRUE #define AUTOCALCSTATMAX 0 +#define MAXNSHORTCYCLE 8192 #define USEMAXCYCLEFRACTION (-123) #define COMPLEX_DATA 1 /* file format */ #define FLOAT_DATA 2 /* file format */ ===================================== src/snaphu_cost.c ===================================== @@ -48,6 +48,13 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, long nrow, long ncol, tileparamT *tileparams, outfileT *outfiles, paramT *params); static +void MaskCost(costT *costptr); +static +void MaskSmoothCost(smoothcostT *smoothcostptr); +static +int MaskPrespecifiedArcCosts(void **costsptr, short **weights, + long nrow, long ncol, paramT *params); +static int GetIntensityAndCorrelation(float **mag, float **wrappedphase, float ***pwrptr, float ***corrptr, infileT *infiles, long linelen, long nlines, @@ -200,12 +207,20 @@ int BuildCostArrays(void ***costsptr, short ***mstcostsptr, /* build or read the statistical cost arrays unless we were told not to */ if(strlen(infiles->costinfile)){ + + /* read cost info from file */ fprintf(sp1,"Reading cost information from file %s\n",infiles->costinfile); costs=NULL; Read2DRowColFile((void ***)&costs,infiles->costinfile, linelen,nlines,tileparams,costtypesize); (*costsptr)=costs; + /* weights of arcs next to masked pixels are set to zero */ + /* make sure corresponding costs are nulled when costs are read from */ + /* file rather than internally generated since read costs are not */ + /* multiplied by weights */ + MaskPrespecifiedArcCosts(costs,weights,nrow,ncol,params); + }else if(params->costmode!=NOSTATCOSTS){ /* get intensity and correlation info */ @@ -348,16 +363,23 @@ int BuildCostArrays(void ***costsptr, short ***mstcostsptr, tempcost=negcost; } - /* clip scalar cost so it is between 0 and params->maxcost */ + /* clip scalar cost so it is between 1 and params->maxcost */ + /* note: weights used for MST algorithm will not be zero along */ + /* masked edges since they are clipped to 1, but MST is run */ + /* once on entire network, not just non-masked regions */ 
weights[row][col]=LClip(tempcost,MINSCALARCOST,params->maxcost); /* assign Lp costs if in Lp mode */ + /* let scalar cost be zero if costs in both directions are zero */ if(params->p>=0){ if(params->bidirlpn){ bidircosts[row][col].posweight=LClip(poscost,0,params->maxcost); bidircosts[row][col].negweight=LClip(negcost,0,params->maxcost); }else{ scalarcosts[row][col]=weights[row][col]; + if(poscost==0 && negcost==0){ + scalarcosts[row][col]=0; + } } } } @@ -582,10 +604,7 @@ void **BuildStatCostsTopo(float **wrappedphase, float **mag, if(colweight[row][col]==0){ /* masked pixel */ - colcost[row][col].laycost=0; - colcost[row][col].offset=LARGESHORT/2; - colcost[row][col].dzmax=LARGESHORT; - colcost[row][col].sigsq=LARGESHORT; + MaskCost(&colcost[row][col]); }else{ @@ -732,10 +751,7 @@ void **BuildStatCostsTopo(float **wrappedphase, float **mag, if(rowweight[row][col]==0){ /* masked pixel */ - rowcost[row][col].laycost=0; - rowcost[row][col].offset=LARGESHORT/2; - rowcost[row][col].dzmax=LARGESHORT; - rowcost[row][col].sigsq=LARGESHORT; + MaskCost(&rowcost[row][col]); }else{ @@ -898,10 +914,7 @@ void **BuildStatCostsDefo(float **wrappedphase, float **mag, if(colweight[row][col]==0){ /* masked pixel */ - colcost[row][col].laycost=0; - colcost[row][col].offset=0; - colcost[row][col].dzmax=LARGESHORT; - colcost[row][col].sigsq=LARGESHORT; + MaskCost(&colcost[row][col]); }else{ @@ -964,10 +977,7 @@ void **BuildStatCostsDefo(float **wrappedphase, float **mag, if(rowweight[row][col]==0){ /* masked pixel */ - rowcost[row][col].laycost=0; - rowcost[row][col].offset=0; - rowcost[row][col].dzmax=LARGESHORT; - rowcost[row][col].sigsq=LARGESHORT; + MaskCost(&rowcost[row][col]); }else{ @@ -1086,8 +1096,7 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, if(colweight[row][col]==0){ /* masked pixel */ - colcost[row][col].offset=0; - colcost[row][col].sigsq=LARGESHORT; + MaskSmoothCost(&colcost[row][col]); }else{ @@ -1138,8 +1147,7 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, if(rowweight[row][col]==0){ /* masked pixel */ - rowcost[row][col].offset=0; - rowcost[row][col].sigsq=LARGESHORT; + MaskSmoothCost(&rowcost[row][col]); }else{ @@ -1187,6 +1195,89 @@ void **BuildStatCostsSmooth(float **wrappedphase, float **mag, } +/* function: MaskCost() + * -------------------- + * Set values of costT structure pointed to by input pointer to give zero + * cost, as for arcs next to masked pixels. + */ +static +void MaskCost(costT *costptr){ + + /* set to special values */ + costptr->laycost=0; + costptr->offset=LARGESHORT/2; + costptr->dzmax=LARGESHORT; + costptr->sigsq=LARGESHORT; + +} + + +/* function: MaskSmoothCost() + * -------------------------- + * Set values of smoothcostT structure pointed to by input pointer to give zero + * cost, as for arcs next to masked pixels. + */ +static +void MaskSmoothCost(smoothcostT *smoothcostptr){ + + /* set to special values */ + smoothcostptr->offset=LARGESHORT/2; + smoothcostptr->sigsq=LARGESHORT; + +} + + +/* function: MaskPrespecifiedArcCosts() + * ------------------------------------ + * Loop over grid arcs and set costs to null if corresponding weights + * are null. 
+ */ +static +int MaskPrespecifiedArcCosts(void **costsptr, short **weights, + long nrow, long ncol, paramT *params){ + + long row, col, maxcol; + costT **costs; + smoothcostT **smoothcosts; + + + /* set up pointers */ + costs=NULL; + smoothcosts=NULL; + if(params->costmode==TOPO || params->costmode==DEFO){ + costs=(costT **)costsptr; + }else if(params->costmode==SMOOTH){ + smoothcosts=(smoothcostT **)costsptr; + }else{ + fprintf(sp0,"illegal cost mode in MaskPrespecifiedArcCosts()\n"); + exit(ABNORMAL_EXIT); + } + + /* loop over all arcs */ + for(row=0;row<2*nrow-1;row++){ + if(rowdzmax; offset=cost->offset; sigsq=cost->sigsq; + dzmax=cost->dzmax; laycost=cost->laycost; + + /* just return 0 if we have zero cost arc */ + if(sigsq==LARGESHORT){ + (*poscostptr)=0; + (*negcostptr)=0; + return; + } + + /* compute argument to cost function */ nshortcycle=params->nshortcycle; layfalloffconst=params->layfalloffconst; if(arcrowsigsq==LARGESHORT){ + (*poscostptr)=0; + (*negcostptr)=0; + return; + } + + /* compute argument to cost function */ nshortcycle=params->nshortcycle; layfalloffconst=params->layfalloffconst; idz1=labs(flow*nshortcycle+cost->offset); @@ -1944,11 +2053,15 @@ void CalcCostSmooth(void **costs, long flow, long arcrow, long arccol, /* get arc info */ cost=&((smoothcostT **)(costs))[arcrow][arccol]; + + /* just return 0 if we have zero cost arc */ if(cost->sigsq==LARGESHORT){ - *poscostptr=0; - *negcostptr=0; + (*poscostptr)=0; + (*negcostptr)=0; return; } + + /* compute argument to cost function */ nshortcycle=params->nshortcycle; idz1=labs(flow*nshortcycle+cost->offset); idz2pos=labs((flow+nflow)*nshortcycle+cost->offset); @@ -2238,6 +2351,24 @@ void CalcCostLPBiDir(void **costs, long flow, long arcrow, long arccol, /* function: CalcCostNonGrid() * --------------------------- * Calculates the arc cost given an array of long integer cost lookup tables. + * + * The cost array for each arc gives the cost for +/-flowmax units of + * flow around the flow value with minimum cost, which is not + * necessarily flow == 0. The offset between the flow value with + * minimum cost and flow == 0 is given by arroffset = costarr[0]. + * Positive flow values k for k = 1 to flowmax relative to this min + * cost flow value are in costarr[k]. Negative flow values k relative + * to the min cost flow from k = -1 to -flowmax costarr[flowmax-k]. + * costarr[2*flowmax+1] contains a scaling factor for extrapolating + * beyond the ends of the cost table, assuming quadratically (with an offset) + * increasing cost (subject to rounding and scaling). + * + * As of summer 2019, the rationale for how seconeary costs are + * extrapolated beyond the end of the table has been lost to time, but + * the logic at least does give a self-consistent cost function that + * is continuous at +/-flowmax and quadratically increases beyond, + * albeit not necessarily with a starting slope that has an easily + * intuitive basis. 
*/ void CalcCostNonGrid(void **costs, long flow, long arcrow, long arccol, long nflow, long nrow, paramT *params, @@ -2345,6 +2476,13 @@ long EvalCostTopo(void **costs, short **flows, long arcrow, long arccol, /* get arc info */ cost=&((costT **)(costs))[arcrow][arccol]; + + /* just return 0 if we have zero cost arc */ + if(cost->sigsq==LARGESHORT){ + return(0); + } + + /* compute argument to cost function */ if(arcrowsigsq==LARGESHORT){ + return(0); + } + + /* compute argument to cost function */ idz1=labs(flows[arcrow][arccol]*(params->nshortcycle)+cost->offset); /* calculate and return cost */ @@ -2417,9 +2562,13 @@ long EvalCostSmooth(void **costs, short **flows, long arcrow, long arccol, /* get arc info */ cost=&((smoothcostT **)(costs))[arcrow][arccol]; + + /* just return 0 if we have zero cost arc */ if(cost->sigsq==LARGESHORT){ return(0); } + + /* compute argument to cost function */ idz1=labs(flows[arcrow][arccol]*(params->nshortcycle)+cost->offset); /* calculate and return cost */ ===================================== src/snaphu_io.c ===================================== @@ -899,6 +899,11 @@ int CheckParams(infileT *infiles, outfileT *outfiles, fprintf(sp0,"defomax exceeds range of short int for given nshortcycle\n"); exit(ABNORMAL_EXIT); } + if(params->nshortcycle < 1 || params->nshortcycle > MAXNSHORTCYCLE){ + fflush(NULL); + fprintf(sp0,"illegal value for nshortcycle\n"); + exit(ABNORMAL_EXIT); + } if(params->maxnewnodeconst<=0 || params->maxnewnodeconst>1){ fflush(NULL); fprintf(sp0,"maxnewnodeconst must be between 0 and 1\n"); ===================================== src/snaphu_solver.c ===================================== @@ -2108,14 +2108,6 @@ void GetArcGrid(nodeT *from, nodeT *to, long *arcrow, long *arccol, *arccol=tocol; *arcdir=-1; }else{ -#define DIAG_GETARCGRID -#ifdef DIAG_GETARCGRID - if(!(torow>0 && nodes[torow-1][tocol].group==BOUNDARYPTR)){ - fflush(NULL); - fprintf(stderr,"BUG: should not have gotten here in GetArcGrid()\n"); - exit(1); - } -#endif *arcrow=torow+nrow-1; *arccol=tocol; *arcdir=1; @@ -2134,14 +2126,6 @@ void GetArcGrid(nodeT *from, nodeT *to, long *arcrow, long *arccol, *arccol=fromcol; *arcdir=1; }else{ -#define DIAG_GETARCGRID -#ifdef DIAG_GETARCGRID - if(!(fromrow>0 && nodes[fromrow-1][fromcol].group==BOUNDARYPTR)){ - fflush(NULL); - fprintf(stderr,"BUG: should not have gotten here in GetArcGrid()\n"); - exit(1); - } -#endif *arcrow=fromrow+nrow-1; *arccol=fromcol; *arcdir=-1; ===================================== src/snaphu_tile.c ===================================== @@ -122,6 +122,8 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, void **voidrightedgecosts, short **rightedgeflows, paramT *params, short **bulkoffsets); static +short AvgSigSq(short sigsq1, short sigsq2); +static int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, nodesuppT **nodesupp, scndryarcT **scndryarcs, long ***scndrycosts, long *nnewnodesptr, @@ -1205,6 +1207,7 @@ int AssembleTiles(outfileT *outfiles, paramT *params, long nrow, ncol, prevnrow, prevncol, nextnrow, nextncol; long n, ncycle, nflowdone, nflow, candidatelistsize, candidatebagsize; long nnodes, maxnflowcycles, arclen, narcs, sourcetilenum, flowmax; + long nincreasedcostiter; long *totarclens; long ***scndrycosts; double avgarclen; @@ -1216,7 +1219,7 @@ int AssembleTiles(outfileT *outfiles, paramT *params, short **tempregions, *regionsbelow, *regionsabove; int *nscndrynodes, *nscndryarcs; incrcostT **incrcosts; - totalcostT totalcost, oldtotalcost; + totalcostT totalcost, 
oldtotalcost, mintotalcost; nodeT *source; nodeT **scndrynodes, ***scndryapexes; signed char **iscandidate; @@ -1227,7 +1230,7 @@ int AssembleTiles(outfileT *outfiles, paramT *params, bucketT *bkts; char filename[MAXSTRLEN]; - + /* set up */ fprintf(sp1,"Assembling tiles\n"); ntilerow=params->ntilerow; @@ -1370,8 +1373,8 @@ int AssembleTiles(outfileT *outfiles, paramT *params, } scndrycosts[i][j][2*flowmax+1]=LRound(scndrycosts[i][j][2*flowmax+1] /avgarclen); - if(scndrycosts[i][j][2*flowmax+1]<1){ - scndrycosts[i][j][2*flowmax+1]=1; + if(scndrycosts[i][j][2*flowmax+1]<0){ + scndrycosts[i][j][2*flowmax+1]=0; } } } @@ -1408,13 +1411,15 @@ int AssembleTiles(outfileT *outfiles, paramT *params, incrcosts[i]=(incrcostT *)MAlloc(nscndryarcs[i]*sizeof(incrcostT)); nnodes+=nscndrynodes[i]; } - + /* set up network for secondary solver */ InitNetwork(scndryflows,&dummylong,&ncycle,&nflowdone,&dummylong,&nflow, &candidatebagsize,&candidatebag,&candidatelistsize, &candidatelist,NULL,NULL,&bkts,&dummylong,NULL,NULL,NULL, NULL,NULL,NULL,NULL,ntiles,0,¬firstloop,&totalcost,params); - + oldtotalcost=totalcost; + mintotalcost=totalcost; + nincreasedcostiter=0; /* set pointers to functions for nongrid secondary network */ CalcCost=CalcCostNonGrid; @@ -1456,10 +1461,17 @@ int AssembleTiles(outfileT *outfiles, paramT *params, oldtotalcost=totalcost; totalcost=EvaluateTotalCost((void **)scndrycosts,scndryflows,ntiles,0, nscndryarcs,params); + if(totalcostoldtotalcost || (n>0 && totalcost==oldtotalcost)){ fflush(NULL); - fprintf(sp0,"Unexpected increase in total cost. Breaking loop\n"); - break; + fprintf(sp1,"Caution: Unexpected increase in total cost\n"); + } + if(totalcost>mintotalcost){ + nincreasedcostiter++; + }else{ + nincreasedcostiter=0; } } @@ -1471,6 +1483,15 @@ int AssembleTiles(outfileT *outfiles, paramT *params, nflowdone=1; } + /* break if total cost increase is sustained */ + if(nincreasedcostiter>=params->maxflow){ + fflush(NULL); + fprintf(sp0,"WARNING: Unexpected sustained increase in total cost." 
+ " Breaking loop\n"); + break; + } + + /* break if we're done with all flow increments or problem is convex */ if(nflowdone>=params->maxflow){ break; @@ -2392,9 +2413,9 @@ int SetUpperEdge(long ncol, long tilerow, long tilecol, void **voidcosts, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - upperedgecosts[0][col].offset=nshortcycle*dpsi; - upperedgecosts[0][col].sigsq=ceil((costs[0][col].sigsq - +costsabove[col].sigsq)/2.0); + upperedgecosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + upperedgecosts[0][col].sigsq=AvgSigSq(costs[0][col].sigsq, + costsabove[col].sigsq); if(costs[0][col].dzmax>costsabove[col].dzmax){ upperedgecosts[0][col].dzmax=costs[0][col].dzmax; }else{ @@ -2406,9 +2427,9 @@ int SetUpperEdge(long ncol, long tilerow, long tilecol, void **voidcosts, upperedgecosts[0][col].laycost=costsabove[col].laycost; } }else if(CalcCost==CalcCostSmooth){ - upperedgesmoothcosts[0][col].offset=nshortcycle*dpsi; - upperedgesmoothcosts[0][col].sigsq= - ceil((smoothcosts[0][col].sigsq+smoothcostsabove[col].sigsq)/2.0); + upperedgesmoothcosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + upperedgesmoothcosts[0][col].sigsq + =AvgSigSq(smoothcosts[0][col].sigsq,smoothcostsabove[col].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidupperedgecosts)[0][col]= @@ -2528,9 +2549,9 @@ int SetLowerEdge(long nrow, long ncol, long tilerow, long tilecol, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - loweredgecosts[0][col].offset=nshortcycle*dpsi; - loweredgecosts[0][col].sigsq=ceil((costs[nrow-2][col].sigsq - +costsbelow[col].sigsq)/2.0); + loweredgecosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + loweredgecosts[0][col].sigsq=AvgSigSq(costs[nrow-2][col].sigsq, + costsbelow[col].sigsq); if(costs[nrow-2][col].dzmax>costsbelow[col].dzmax){ loweredgecosts[0][col].dzmax=costs[nrow-2][col].dzmax; }else{ @@ -2542,10 +2563,9 @@ int SetLowerEdge(long nrow, long ncol, long tilerow, long tilecol, loweredgecosts[0][col].laycost=costsbelow[col].laycost; } }else if(CalcCost==CalcCostSmooth){ - loweredgesmoothcosts[0][col].offset=nshortcycle*dpsi; - loweredgesmoothcosts[0][col].sigsq= - ceil((smoothcosts[nrow-2][col].sigsq - +smoothcostsbelow[col].sigsq)/2.0); + loweredgesmoothcosts[0][col].offset=(short )LRound(nshortcycle*dpsi); + loweredgesmoothcosts[0][col].sigsq + =AvgSigSq(smoothcosts[nrow-2][col].sigsq,smoothcostsbelow[col].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidloweredgecosts)[0][col]= @@ -2662,10 +2682,11 @@ int SetLeftEdge(long nrow, long prevncol, long tilerow, long tilecol, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - leftedgecosts[row][0].offset=(TILEDPSICOLFACTOR*nshortcycle*dpsi); - leftedgecosts[row][0].sigsq= - ceil((costs[row+nrow-1][0].sigsq - +lastcosts[row+nrow-1][prevncol-2].sigsq)/2.0); + leftedgecosts[row][0].offset=(short )LRound(TILEDPSICOLFACTOR + *nshortcycle*dpsi); + leftedgecosts[row][0].sigsq + =AvgSigSq(costs[row+nrow-1][0].sigsq, + lastcosts[row+nrow-1][prevncol-2].sigsq); if(costs[row+nrow-1][0].dzmax>lastcosts[row+nrow-1][prevncol-2].dzmax){ leftedgecosts[row][0].dzmax=costs[row+nrow-1][0].dzmax; }else{ @@ -2680,10 +2701,10 @@ int SetLeftEdge(long nrow, long prevncol, long tilerow, long tilecol, } }else if(CalcCost==CalcCostSmooth){ leftedgesmoothcosts[row][0].offset - =(TILEDPSICOLFACTOR*nshortcycle*dpsi); - 
leftedgesmoothcosts[row][0].sigsq= - ceil((smoothcosts[row+nrow-1][0].sigsq - +lastsmoothcosts[row+nrow-1][prevncol-2].sigsq)/2.0); + =(short )LRound(TILEDPSICOLFACTOR*nshortcycle*dpsi); + leftedgesmoothcosts[row][0].sigsq + =AvgSigSq(smoothcosts[row+nrow-1][0].sigsq, + lastsmoothcosts[row+nrow-1][prevncol-2].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidleftedgecosts)[row][0]= @@ -2807,10 +2828,11 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, dpsi-=1.0; } if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ - rightedgecosts[row][0].offset=(TILEDPSICOLFACTOR*nshortcycle*dpsi); + rightedgecosts[row][0].offset=(short )LRound(TILEDPSICOLFACTOR + *nshortcycle*dpsi); rightedgecosts[row][0].sigsq - =ceil((costs[row+nrow-1][ncol-2].sigsq - +nextcosts[row+nrow-1][0].sigsq)/2.0); + =AvgSigSq(costs[row+nrow-1][ncol-2].sigsq, + nextcosts[row+nrow-1][0].sigsq); if(costs[row+nrow-1][ncol-2].dzmax>nextcosts[row+nrow-1][0].dzmax){ rightedgecosts[row][0].dzmax=costs[row+nrow-1][ncol-2].dzmax; }else{ @@ -2823,10 +2845,10 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, } }else if(CalcCost==CalcCostSmooth){ rightedgesmoothcosts[row][0].offset - =(TILEDPSICOLFACTOR*nshortcycle*dpsi); + =(short )LRound(TILEDPSICOLFACTOR*nshortcycle*dpsi); rightedgesmoothcosts[row][0].sigsq - =ceil((smoothcosts[row+nrow-1][ncol-2].sigsq - +nextsmoothcosts[row+nrow-1][0].sigsq)/2.0); + =AvgSigSq(smoothcosts[row+nrow-1][ncol-2].sigsq, + nextsmoothcosts[row+nrow-1][0].sigsq); }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ ((short **)voidrightedgecosts)[row][0]= @@ -2908,6 +2930,34 @@ int SetRightEdge(long nrow, long ncol, long tilerow, long tilecol, } +/* function: AvgSigSq() + * -------------------- + * Return average of sigsq values after chcking for special value and + * clipping to short. 
+ */ +static +short AvgSigSq(short sigsq1, short sigsq2){ + + int sigsqavg; + + + /* if either value is special LARGESHORT value, use that */ + if(sigsq1==LARGESHORT || sigsq2==LARGESHORT){ + return(LARGESHORT); + } + + /* compute average */ + sigsqavg=(int )ceil(0.5*(((int )sigsq1)+((int )sigsq2))); + + /* clip */ + sigsqavg=LClip(sigsqavg,-LARGESHORT,LARGESHORT); + + /* return */ + return((short )sigsqavg); + +} + + /* function: TraceSecondaryArc() * ----------------------------- */ @@ -2931,7 +2981,7 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, long i, row, col, nnewnodes, arclen, ntilerow, ntilecol, arcnum; long tilenum, nflow, primaryarcrow, primaryarccol, poscost, negcost, nomcost; long nnrow, nncol, calccostnrow, nnewarcs, arroffset, nshortcycle; - long mincost, mincostflow, minweight; + long mincost, mincostflow, minweight, maxcost; long *scndrycostarr; double sigsq, sumsigsqinv, tempdouble, tileedgearcweight; short **flows; @@ -3096,6 +3146,9 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, /* keep absolute cost of arc to the previous node */ if(!zerocost){ + + /* accumulate incremental cost in table for each nflow increment */ + /* offset flow in flow array temporarily by arroffset then undo below */ flows[primaryarcrow][primaryarccol]-=primaryarcdir*arroffset; nomcost=EvalCost(costs,flows,primaryarcrow,primaryarccol,calccostnrow, params); @@ -3125,22 +3178,35 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, } } flows[primaryarcrow][primaryarccol]+=primaryarcdir*arroffset; + + /* accumulate term to be used for cost growth beyond table bounds */ if(CalcCost==CalcCostTopo || CalcCost==CalcCostDefo){ sigsq=((costT **)costs)[primaryarcrow][primaryarccol].sigsq; }else if(CalcCost==CalcCostSmooth){ sigsq=((smoothcostT **)costs)[primaryarcrow][primaryarccol].sigsq; }else if(CalcCost==CalcCostL0 || CalcCost==CalcCostL1 || CalcCost==CalcCostL2 || CalcCost==CalcCostLP){ - sigsq=((short **)costs)[primaryarcrow][primaryarccol]; + minweight=((short **)costs)[primaryarcrow][primaryarccol]; + if(minweight<1){ + sigsq=LARGESHORT; + }else{ + sigsq=1.0/(double )minweight; + } }else if(CalcCost==CalcCostL0BiDir || CalcCost==CalcCostL1BiDir || CalcCost==CalcCostL2BiDir || CalcCost==CalcCostLPBiDir){ minweight=LMin(((bidircostT **)costs)[primaryarcrow][primaryarccol] .posweight, ((bidircostT **)costs)[primaryarcrow][primaryarccol] .negweight); - sigsq=1.0/(double )minweight; + if(minweight<1){ + sigsq=LARGESHORT; + }else{ + sigsq=1.0/(double )minweight; + } } - sumsigsqinv+=(1.0/sigsq); + if(sigsqmaxcost){ + maxcost=scndrycostarr[nflow]; + } + if(scndrycostarr[flowmax+nflow]>maxcost){ + maxcost=scndrycostarr[flowmax+nflow]; + } + } + + /* if cost was all zero, treat as zero cost arc */ + if(maxcost==mincost){ + zerocost=TRUE; + sumsigsqinv=0; } /* break if cost array adequately centered on minimum cost flow */ @@ -3198,12 +3277,16 @@ int TraceSecondaryArc(nodeT *primaryhead, nodeT **scndrynodes, return(0); } - /* see if we have a secondary arc on the edge of the full-sized array */ /* these arcs have zero cost since the edge is treated as a single node */ + /* secondary arcs whose primary arcs all have zero cost are also zeroed */ if(zerocost){ /* set sum of standard deviations to indicate zero-cost secondary arc */ + scndrycostarr[0]=0; + for(nflow=1;nflow<=2*flowmax;nflow++){ + scndrycostarr[nflow]=0; + } scndrycostarr[2*flowmax+1]=ZEROCOSTARC; }else{ @@ -4042,12 +4125,6 @@ int AssembleTileConnComps(long linelen, long nlines, 
conncompsizes[nconncomp].icompfull=0; conncompsizes[nconncomp].npix=tileconncompsizes[k].npix; nconncomp++; -#define DEBUG -#ifdef DEBUG -if(nconncomp>nconncompmem){ - fprintf(sp0,"ERROR--THIS IS A BUG\n"); -} -#endif } } View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/commit/c7fff519947050f80cd102a77b25912796a1099d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/commit/c7fff519947050f80cd102a77b25912796a1099d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 13:03:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 12:03:57 +0000 Subject: [Git][debian-gis-team/snaphu] Pushed new tag debian/2.0.1-1 Message-ID: <5d74ee2d84daa_73482ad95ff5bd34862d6@godard.mail> Bas Couwenberg pushed new tag debian/2.0.1-1 at Debian GIS Project / snaphu -- View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/tree/debian/2.0.1-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 8 13:10:22 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 12:10:22 +0000 Subject: Processing of snaphu_2.0.1-1_amd64.changes Message-ID: snaphu_2.0.1-1_amd64.changes uploaded successfully to localhost along with the files: snaphu_2.0.1-1.dsc snaphu_2.0.1.orig.tar.gz snaphu_2.0.1-1.debian.tar.xz snaphu-dbgsym_2.0.1-1_amd64.deb snaphu_2.0.1-1_amd64.buildinfo snaphu_2.0.1-1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 8 13:24:23 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 12:24:23 +0000 Subject: snaphu_2.0.1-1_amd64.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 08 Sep 2019 11:08:52 +0000 Source: snaphu Binary: snaphu snaphu-dbgsym Architecture: source amd64 Version: 2.0.1-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Description: snaphu - Statistical-Cost, Network-Flow Algorithm for 2D Phase Unwrapping Changes: snaphu (2.0.1-1) unstable; urgency=medium . [ Bas Couwenberg ] * Don't delete bin directory in clean target, included in upstream source. . [ Antonio Valentino ] * New upstream release. * debian/tests/control: - mark test as superficial using Restrictions * debian/patches: - refresh all patches * Bump debhelper from old 11 to 12. * Remove obsolete fields Name from debian/upstream/metadata. 
Checksums-Sha1: 2a646c17ef26235f06878c9175386e9f2b64c168 1983 snaphu_2.0.1-1.dsc 95d3b392a2a87224abd7028a1d57f9a04872b15e 168419 snaphu_2.0.1.orig.tar.gz 699baf6140b709a153d10ccecbdfbb14d1532272 6240 snaphu_2.0.1-1.debian.tar.xz 509cce6009ed675e86c4cd0f59e4888276ad5d80 264968 snaphu-dbgsym_2.0.1-1_amd64.deb 2010ba7affef0403cac1c1279e5cbe3ae48f515f 6499 snaphu_2.0.1-1_amd64.buildinfo 341107a598ed77be922db9c07d7421994d5ff55d 151256 snaphu_2.0.1-1_amd64.deb Checksums-Sha256: afbb95407f567893965b31916c0e861c2cee462456539ac3ca157db953dea517 1983 snaphu_2.0.1-1.dsc ed2a05d97a05795d489ccb1653459fcd4b73a349ec2759410be280bfdde2e4b5 168419 snaphu_2.0.1.orig.tar.gz 2f4c7c630be223f336f1fa704c7985186bf64643479dc7ead96e5a80f3a672d5 6240 snaphu_2.0.1-1.debian.tar.xz 2982a2338e3c50337e30673a854e6516481b6d4e0c242e9b246d737dea98b055 264968 snaphu-dbgsym_2.0.1-1_amd64.deb b4094a81609af41618730a2d9a15d8bbfcc69e1d9ca766d7cbd9bd466656978e 6499 snaphu_2.0.1-1_amd64.buildinfo cf32907987a7aec4dc9ebad6dd245f49bb5f3f7b385f8205585d7b50a26842b4 151256 snaphu_2.0.1-1_amd64.deb Files: a8430103972277d30afd68373b5c2725 1983 non-free/science optional snaphu_2.0.1-1.dsc 8304254a9d44fc32040ac0ed85c41410 168419 non-free/science optional snaphu_2.0.1.orig.tar.gz 2b33aee1c1883d82a6dea8621797cc45 6240 non-free/science optional snaphu_2.0.1-1.debian.tar.xz f289402f6814cd5f574f06ed3dcc5ef1 264968 non-free/debug optional snaphu-dbgsym_2.0.1-1_amd64.deb 9a00d528da152a3a7c12b98ba7c7ad7a 6499 non-free/science optional snaphu_2.0.1-1_amd64.buildinfo fe831fd92966be81d3f468fc890c223f 151256 non-free/science optional snaphu_2.0.1-1_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl107hsACgkQZ1DxCuiN SvEKUA/+Pu4EbndANcwDXp+yOW/ZtKEyUvMMhmpP5oyN/iZg1gTLcWg/dfNwiHj0 i7GBPQ8BaT9Cw+VL3Jq/WDLOsKRkvcruNtkwIYfVnKa7lFLtbLoGR9rsSYpaQy6H E09u0eec3P9kY3FjhN1HTfCoXf6DvYVH1DXO+irLGW4gecn7LpJJd/oQmrK1OLGR KIgqZ1vWf/Gd3uLNTWq+4vZU+653PVbYp4gsZ6hlYykid5bqKYQtN0Y9s7XrYBrs FifhDxGYS1h7ic1a0eSxDT9xQLYrea/O0j9pnesXiqG+A9+j43SWj1jj/fn9WuG1 velA5RaYTsorCj0L597m2soAmM9I5S++f/Oo91b/KQV9w72WCQWejpHnrGIMW7TQ keMQOJiqJxxyu+LOiEs2wvZDAM5tamCvD1bgbi5S9RRvt0+ABwASUTvc6iKiaCjj TXuGUrB3ahVqVeCyBu+kpzWeD1GJT+tTkzQAjpCV90IaOLhKLhuXGpJqz6rlbJP/ 0Z6zmCTM066ulpUXDMSHm64jmQJbT1MMjLW+XJITjc3lC8HL2UZ1zWuL+E04Rug4 97NWP36sivmPn/2zx2ZYSaKW2D4DDY51M1Flx2Br8TdL3pFF4MyXZ1e71fj3qr0B qX51GW8JvOiulCmLsFX4cPydbAB0SNoUp3+hvr+KtqPZ5mpSyao= =waK7 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
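The snaphu 2.0.1 diff above replaces the direct ceil((sigsq1+sigsq2)/2.0) assignments in SetUpperEdge()/SetLowerEdge()/SetLeftEdge()/SetRightEdge() with a new AvgSigSq() helper that propagates the LARGESHORT zero-cost sentinel and clips the averaged variance to the short-integer range, while EvalCostTopo(), EvalCostDefo() and EvalCostSmooth() now return 0 immediately for arcs carrying that sentinel. The sketch below restates that logic in Python purely as an illustration: the real implementation is the C code quoted above, the cost curve here is hypothetical, and the numeric value used for LARGESHORT is an assumption rather than the constant snaphu actually defines.

    import math

    LARGESHORT = 32000   # assumed sentinel value; snaphu defines its own constant

    def avg_sigsq(sigsq1, sigsq2):
        """Restatement of the AvgSigSq() helper added in the diff above."""
        # If either arc already carries the zero-cost sentinel, keep it.
        if sigsq1 == LARGESHORT or sigsq2 == LARGESHORT:
            return LARGESHORT
        # Round the mean up, then clip to the representable short range.
        avg = math.ceil(0.5 * (sigsq1 + sigsq2))
        return max(-LARGESHORT, min(LARGESHORT, avg))

    def eval_cost(sigsq, flow, nshortcycle, offset):
        """Toy cost evaluation showing the zero-cost short-circuit."""
        if sigsq == LARGESHORT:   # zero-cost arc: skip the computation entirely
            return 0
        idz = abs(flow * nshortcycle + offset)
        return idz * idz // max(sigsq, 1)   # hypothetical quadratic cost

The practical effect is that the edge-cost setup no longer stores the result of a double-precision ceil() straight into a short field: very large per-tile variances saturate at the sentinel bound instead of overflowing, and arcs flagged as zero-cost are skipped before any arithmetic is done.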
From gitlab at salsa.debian.org Sun Sep 8 20:45:47 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 19:45:47 +0000 Subject: [Git][debian-gis-team/pyepr][pristine-tar] pristine-tar data for pyepr_1.0.0.orig.tar.gz Message-ID: <5d755a6bab6e9_73482ad95ff622b0123173@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / pyepr Commits: 1a6ef654 by Antonio Valentino at 2019-09-08T18:58:05Z pristine-tar data for pyepr_1.0.0.orig.tar.gz - - - - - 2 changed files: - + pyepr_1.0.0.orig.tar.gz.delta - + pyepr_1.0.0.orig.tar.gz.id Changes: ===================================== pyepr_1.0.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/pyepr_1.0.0.orig.tar.gz.delta differ ===================================== pyepr_1.0.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +b1f6e69d3dc958ca74ec773a65023ebbbf7daaf0 View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/1a6ef6543a7224234dd2fe217308b68ec5766846 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/1a6ef6543a7224234dd2fe217308b68ec5766846 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 20:46:14 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 19:46:14 +0000 Subject: [Git][debian-gis-team/pyepr] Pushed new tag upstream/1.0.0 Message-ID: <5d755a8659134_73482ad95c4c1b9c1235a9@godard.mail> Antonio Valentino pushed new tag upstream/1.0.0 at Debian GIS Project / pyepr -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/tree/upstream/1.0.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 8 20:46:13 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 19:46:13 +0000 Subject: [Git][debian-gis-team/pyepr][master] 5 commits: New upstream version 1.0.0 Message-ID: <5d755a85a9018_73482ad95ff5bd341233dc@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyepr Commits: f7a06a44 by Antonio Valentino at 2019-09-08T18:58:03Z New upstream version 1.0.0 - - - - - cad2d4dd by Antonio Valentino at 2019-09-08T18:58:05Z Update upstream source from tag 'upstream/1.0.0' Update to upstream version '1.0.0' with Debian dir be1fe2893d82902fca90da724d7ad74722c473ae - - - - - c525533b by Antonio Valentino at 2019-09-08T18:58:55Z New upstream release - - - - - de76193c by Antonio Valentino at 2019-09-08T19:02:48Z Update debian/copyright file - - - - - 610a3278 by Antonio Valentino at 2019-09-08T19:08:48Z Refresh all patches - - - - - 30 changed files: - + .coveragerc - .gitignore - .travis.yml - Makefile - README.rst - appveyor.yml - debian/changelog - debian/copyright - debian/patches/0001-Only-use-local-files-for-generating-sphinx-doc.patch - doc/Makefile - doc/NEWS.rst - doc/_templates/appveyor.html - + doc/_templates/codecov.html - doc/_templates/ohloh.html - doc/_templates/pypi.html - + doc/_templates/readthedocs.html - doc/_templates/travis-ci.html - doc/conf.py - doc/gdal_export_example.rst - doc/index.rst - doc/interactive_use.rst - doc/make.bat - doc/reference.rst - doc/sphinxext/ipython_console_highlighting.py - doc/usermanual.rst - requirements.txt - setup.py - src/epr.pxd - src/epr.pyx - tests/test_all.py Changes: ===================================== .coveragerc ===================================== @@ -0,0 +1,5 @@ +[run] +plugins = Cython.Coverage +source = src +branch = True +# omit = */Cython/Includes/* ===================================== .gitignore ===================================== @@ -1,2 +1,4 @@ SciTEDirectory.properties .idea +.DS_Store + ===================================== .travis.yml ===================================== @@ -1,24 +1,30 @@ language: python python: - - "2.6" - "2.7" - - "3.3" - "3.4" - "3.5" - "3.6" - # - "3.7" - # - "3.8-dev" - # - "pypy2.7" - - "pypy3.5" + - "3.7" + - "3.8-dev" + - "pypy" + - "pypy3" + +matrix: + allow_failures: + - python: "pypy3" before_install: - - sudo apt-get update -qq - - sudo apt-get install -qq libepr-api-dev + - sudo apt-get update + - sudo apt-get install -y libepr-api-dev install: - pip install -r requirements.txt + - pip install sphinx coverage codecov - if [[ $TRAVIS_PYTHON_VERSION < '3.4' ]]; then pip install -U unittest2; fi - - python setup.py build_ext --inplace -script: make PYTHON=python check +script: + - if [[ $TRAVIS_PYTHON_VERSION = '3.7' ]]; then make PYTHON=python coverage; else make PYTHON=python check; fi + +after_success: + - if [[ $TRAVIS_PYTHON_VERSION = '3.7' ]]; then codecov; fi ===================================== Makefile ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/make -f # -*- coding: utf-8 -*- -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # @@ -26,7 +26,7 @@ TEST_DATSET = tests/MER_LRC_2PTGMV20000620_104318_00000104X000_00000_00000_0001. 
EPRAPIROOT = ../epr-api .PHONY: default ext cythonize sdist eprsrc fullsdist doc clean distclean \ - check debug data upload manylinux + check debug data upload manylinux coverage ext-coverage coverage-report default: ext @@ -71,6 +71,10 @@ clean: $(MAKE) -C doc clean $(RM) -r doc/_build find . -name '*~' -delete + $(RM) *.c *.o *.html .coverage coverage.xml + $(RM) src/epr.html + $(RM) -r htmlcov + $(RM) epr.p* # workaround for Cython.Coverage bug #1985 distclean: clean $(RM) $(TEST_DATSET) @@ -78,9 +82,26 @@ distclean: clean $(RM) -r LICENSES epr-api-src $(MAKE) -C tests -f checksetup.mak distclean -check: ext $(TEST_DATSET) +check: ext data env PYTHONPATH=. $(PYTHON) tests/test_all.py --verbose +ext-coverage: src/epr.pyx + env PYEPR_COVERAGE=TRUE $(PYTHON) setup.py build_ext --inplace + +coverage: clean ext-coverage data + ln -s src/epr.p* . # workaround for Cython.Coverage bug #1985 + env PYEPR_COVERAGE=TRUE PYTHONPATH=. \ + $(PYTHON) -m coverage run --branch --source=src setup.py test + env PYTHONPATH=. $(PYTHON) -m coverage report + +coverage-report: coverage + env PYTHONPATH=. $(PYTHON) -m coverage xml -i + env PYTHONPATH=. $(PYTHON) -m cython -E CYTHON_TRACE_NOGIL=1 \ + -X linetrace=True -X language_level=3str \ + --annotate-coverage coverage.xml src/epr.pyx + env PYTHONPATH=. $(PYTHON) -m coverage html -i + cp src/epr.html htmlcov + debug: $(PYTHON) setup.py build_ext --inplace --debug @@ -92,5 +113,5 @@ $(TEST_DATSET): manylinux: # make fullsdist - # docker pull quay.io/pypa/manylinux1_x86_64 - docker run --rm -v $(shell pwd):/io quay.io/pypa/manylinux1_x86_64 sh /io/build-manylinux-wheels.sh + # docker pull quay.io/pypa/manylinux2010_x86_64 + docker run --rm -v $(shell pwd):/io quay.io/pypa/manylinux2010_x86_64 sh /io/build-manylinux-wheels.sh ===================================== README.rst ===================================== @@ -2,11 +2,43 @@ ENVISAT Product Reader Python API ================================= -:HomePage: http://avalentino.github.io/pyepr +:HomePage: https://avalentino.github.io/pyepr :Author: Antonio Valentino :Contact: antonio.valentino at tiscali.it -:Copyright: 2011-2018, Antonio Valentino -:Version: 0.9.5 +:Copyright: 2011-2019, Antonio Valentino +:Version: 1.0.0 + +.. image:: https://travis-ci.org/avalentino/pyepr.svg?branch=master + :alt: Travis-CI status page + :target: https://travis-ci.org/avalentino/pyepr + +.. image:: https://ci.appveyor.com/api/projects/status/github/avalentino/pyepr?branch=master&svg=true + :alt: AppVeyor status page + :target: https://ci.appveyor.com/project/avalentino/pyepr + +.. image:: https://img.shields.io/pypi/v/pyepr + :alt: Latest Version + :target: https://pypi.org/project/pyepr + +.. image:: https://img.shields.io/pypi/pyversions/pyepr + :alt: Supported Python versions + :target: https://pypi.org/project/pyepr + +.. image:: https://img.shields.io/pypi/l/pyepr + :alt: License + :target: https://pypi.org/project/pyepr + +.. image:: https://img.shields.io/pypi/wheel/pyepr + :alt: Wheel Status + :target: https://pypi.org/project/pyepr + +.. image:: https://readthedocs.org/projects/pyepr/badge + :alt: Documentation Status + :target: https://pyepr.readthedocs.io/en/latest + +.. 
image:: https://codecov.io/gh/avalentino/pyepr/branch/master/graph/badge.svg + :alt: Coverage Status + :target: https://codecov.io/gh/avalentino/pyepr Introduction @@ -36,28 +68,28 @@ In order to use PyEPR it is needed that the following software are correctly installed and configured: * Python2_ >= 2.6 or Python3_ >= 3.1 (including PyPy_) -* numpy_ >= 1.5.0 +* numpy_ >= 1.7.0 * `EPR API`_ >= 2.2 (optional, since PyEPR 0.7 the source tar-ball comes - with a copy of the PER C API sources) + with a copy of the EPR C API sources) * a reasonably updated C compiler (build only) -* Cython_ >= 0.15 (build only) +* Cython_ >= 0.19 (build only) * unittest2_ (only required for Python < 3.4) .. _Python2: Python_ .. _Python3: Python_ -.. _PyPy: http://pypy.org -.. _numpy: http://www.numpy.org -.. _gcc: http://gcc.gnu.org -.. _Cython: http://cython.org -.. _unittest2: https://pypi.python.org/pypi/unittest2 +.. _PyPy: https://pypy.org +.. _numpy: https://www.numpy.org +.. _gcc: https://gcc.gnu.org +.. _Cython: https://cython.org +.. _unittest2: https://pypi.org/project/unittest2 Download ======== -Official source tarballs can be downloaded form PyPi_: +Official source tar-balls can be downloaded form PyPi_: - https://pypi.python.org/pypi/pyepr + https://pypi.org/project/pyepr The source code of the development versions is available on the GitHub_ project page @@ -68,9 +100,9 @@ To clone the git_ repository the following command can be used:: $ git clone https://github.com/avalentino/pyepr.git -.. _PyPi: https://pypi.python.org/pypi +.. _PyPi: https://pypi.org .. _GitHub: https://github.com -.. _git: http://git-scm.com +.. _git: https://git-scm.com Installation @@ -100,7 +132,7 @@ To install PyEPR_ in a non-standard path:: License ======= -Copyright (C) 2011-2018 Antonio Valentino +Copyright (C) 2011-2019 Antonio Valentino PyEPR is free software: you can redistribute it and/or modify it under the terms of the `GNU General Public License`_ as published by ===================================== appveyor.yml ===================================== @@ -6,59 +6,34 @@ environment: global: - PYTHON: "C:\\conda" - MINICONDA_VERSION: "latest" CMD_IN_ENV: "cmd /E:ON /V:ON /C .\\ci-helpers\\appveyor\\windows_sdk.cmd" - # PYTHON_ARCH: "64" # needs to be set for CMD_IN_ENV to succeed. If a mix - # of 32 bit and 64 bit builds are needed, move this - # to the matrix section. + PYTHON_ARCH: "64" # needs to be set for CMD_IN_ENV to succeed. If a mix + # of 32 bit and 64 bit builds are needed, move this + # to the matrix section. 
CONDA_DEPENDENCIES: "setuptools numpy Cython unittest2" # DEBUG: True # NUMPY_VERSION: "stable" - matrix: - - platform: x86 + - PYTHON: "C:\\Miniconda-x64" PYTHON_VERSION: "2.7" - PYTHON_ARCH: "32" - - - PYTHON_VERSION: "2.7" - PYTHON_ARCH: "64" - - - platform: x86 - PYTHON_VERSION: "3.4" - PYTHON_ARCH: "32" - - - PYTHON_VERSION: "3.4" - PYTHON_ARCH: "64" - - platform: x86 - PYTHON_VERSION: "3.5" - PYTHON_ARCH: "32" - - - PYTHON_VERSION: "3.5" - PYTHON_ARCH: "64" - - - platform: x86 + - PYTHON: "C:\\Miniconda36-x64" PYTHON_VERSION: "3.6" - PYTHON_ARCH: "32" - - PYTHON_VERSION: "3.6" - PYTHON_ARCH: "64" - - - platform: x86 + - PYTHON: "C:\\Miniconda37-x64" PYTHON_VERSION: "3.7" - PYTHON_ARCH: "32" - - PYTHON_VERSION: "3.7" - PYTHON_ARCH: "64" +platform: + -x64 install: - # conda + # Set up ci-helpers - "git clone git://github.com/astropy/ci-helpers.git" - "powershell ci-helpers/appveyor/install-miniconda.ps1" - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%" - "activate test" + # epr-api - "git clone -b pyepr https://github.com/avalentino/epr-api.git" ===================================== debian/changelog ===================================== @@ -1,3 +1,12 @@ +pyepr (1.0.0-1) UNRELEASED; urgency=medium + + * New upstream release. + * Update debian/copyright file. + * debian/patches: + - refresh all patches + + -- Antonio Valentino Sun, 08 Sep 2019 18:58:19 +0000 + pyepr (0.9.5-3) unstable; urgency=medium [ Bas Couwenberg ] ===================================== debian/copyright ===================================== @@ -3,11 +3,11 @@ Upstream-Name: pyepr Source: https://github.com/avalentino/pyepr Files: * -Copyright: 2011-2018, Antonio Valentino +Copyright: 2011-2019, Antonio Valentino License: GPL-3+ Files: debian/* -Copyright: 2011-2015, Antonio Valentino +Copyright: 2011-2019, Antonio Valentino License: GPL-3+ Files: doc/sphinxext/ipython_console_highlighting.py ===================================== debian/patches/0001-Only-use-local-files-for-generating-sphinx-doc.patch ===================================== @@ -7,14 +7,14 @@ point to local object.inv files. This makes the package self contained and no download attempt happens any longer (Closes: #726859). --- - doc/conf.py | 14 ++++++++------ - 1 file changed, 8 insertions(+), 6 deletions(-) + doc/conf.py | 18 ++++++++++-------- + 1 file changed, 10 insertions(+), 8 deletions(-) diff --git a/doc/conf.py b/doc/conf.py -index 84d729f..148502f 100644 +index 5b7fbe8..ab0a3eb 100644 --- a/doc/conf.py +++ b/doc/conf.py -@@ -139,10 +139,10 @@ html_sidebars = { +@@ -136,12 +136,12 @@ html_sidebars = { 'relations.html', 'sourcelink.html', 'searchbox.html', @@ -22,14 +22,18 @@ index 84d729f..148502f 100644 - 'pypi.html', - 'travis-ci.html', - 'appveyor.html', +- 'readthedocs.html', +- 'codecov.html', + # 'ohloh.html', + # 'pypi.html', + # 'travis-ci.html', + # 'appveyor.html', ++ # 'readthedocs.html', ++ # 'codecov.html', ], } -@@ -249,8 +249,10 @@ extlinks = { +@@ -216,8 +216,10 @@ epub_exclude_files = ['search.html'] # Example configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = { @@ -41,4 +45,4 @@ index 84d729f..148502f 100644 + '/usr/share/doc/python-numpy-doc/html/objects.inv'), } - # If true, `todo` and `todoList` produce output, else they produce nothing. 
+ # -- Options for autodoc extension ------------------------------------------- ===================================== doc/Makefile ===================================== @@ -1,10 +1,10 @@ # Minimal makefile for Sphinx documentation # -# You can set these variables from the command line. -SPHINXOPTS = -SPHINXBUILD = sphinx-build -SPHINXPROJ = PyEPR +# You can set these variables from the command line, and also +# from the environment for the first two. +SPHINXOPTS ?= +SPHINXBUILD ?= sphinx-build SOURCEDIR = . BUILDDIR = _build ===================================== doc/NEWS.rst ===================================== @@ -1,11 +1,23 @@ Change history ============== +PyEPR 1.0.0 (08/09/2019) +------------------------ + +* Do not use deprecated numpy_ API (requires Cython_ >= 0.29) +* Minimal numpy_ version is now v1.7 +* Set cython_ 'language_level` explicitly to '3str' if cython_ >= v0.29, + to '2' otherwise +* Python v2.6, v3.2, v3.3 and v3.4 are now deprecated. + Support for the deprecated Python version will be removed in future + releases of PyEPR + + PyEPR 0.9.5 (23/08/2018) ------------------------ -* Fix compatibility with numpy >= 1.14: :func:`np.fromstring` - is deprecated. +* Fix compatibility with numpy_ >= 1.14: :func:`np.fromstring` + is deprecated * Update the pypi sidebar in the documentation * Use `.rst` extension for doc source files * Fix setup script to not use system libs if epr-api sources are available @@ -124,8 +136,8 @@ PyEPR 0.9 (27/02/2015) .. _pip: https://pip.pypa.io .. _setuptools: https://bitbucket.org/pypa/setuptools -.. _numpy: http://www.numpy.org -.. _Windows: http://windows.microsoft.com +.. _numpy: https://www.numpy.org +.. _Windows: https://windows.microsoft.com .. _AppVeyor: https://www.appveyor.com .. _PyPI: https://pypi.org/project/pyepr @@ -279,7 +291,7 @@ PyEPR 0.5 (25/04/2011) .. _`Python 3`: https://docs.python.org/3 .. _intersphinx: http://www.sphinx-doc.org/en/master/ext/intersphinx.html -.. _cython: http://cython.org +.. _cython: https://cython.org PyEPR 0.4 (10/04/2011) ===================================== doc/_templates/appveyor.html ===================================== @@ -1,5 +1,7 @@
[appveyor.html hunk body lost: the badge markup was HTML-scrubbed by the list archiver; surviving alt text: "AppVeyor status page"]

===================================== doc/_templates/codecov.html =====================================

@@ -0,0 +1,7 @@ [new badge template; markup HTML-scrubbed by the archiver; surviving alt text: "codecov status"]

===================================== doc/_templates/ohloh.html =====================================

@@ -1,3 +1,4 @@ [badge markup change only; HTML-scrubbed by the archiver, no text survives]

===================================== doc/_templates/pypi.html =====================================

@@ -1,14 +1,22 @@ [badge markup updated; HTML-scrubbed by the archiver; surviving alt texts: "Latest Version", "Supported Python versions", "License", "Wheel Status"]

===================================== doc/_templates/readthedocs.html =====================================

@@ -0,0 +1,7 @@ [new badge template; markup HTML-scrubbed by the archiver; surviving alt text: "readthedocs status"]

===================================== doc/_templates/travis-ci.html =====================================

@@ -1,5 +1,7 @@ [badge markup updated; HTML-scrubbed by the archiver; surviving alt text: "travis-ci status page"]
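The _templates fragments listed above are Sphinx sidebar snippets: they only appear in the rendered documentation when named in html_sidebars in doc/conf.py, which is the same list the Debian patch earlier in this digest comments out to keep the generated docs self-contained. A minimal sketch of that wiring is given below; the '**' glob key is an assumption (the hunks do not show it) and only the entries visible in the diffs are kept.

    # doc/conf.py (sketch): enable the sidebar badge templates shown above
    html_sidebars = {
        '**': [                    # assumed glob pattern; not visible in the hunks
            'relations.html',
            'sourcelink.html',
            'searchbox.html',
            'pypi.html',           # "Latest Version", "Supported Python versions", ...
            'travis-ci.html',      # "travis-ci status page" badge
            'appveyor.html',       # "AppVeyor status page" badge
            'readthedocs.html',    # "readthedocs status" badge
            'codecov.html',        # "codecov status" badge
        ],
    }

Sphinx resolves these names against templates_path = ['_templates'], which the conf.py diff below leaves unchanged.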
===================================== doc/conf.py ===================================== @@ -1,17 +1,12 @@ -#!/usr/bin/env python3 # -*- coding: utf-8 -*- # -# PyEPR documentation build configuration file, created by -# sphinx-quickstart on Sun Apr 29 18:26:52 2018. +# Configuration file for the Sphinx documentation builder. # -# This file is execfile()d with the current directory set to its -# containing dir. -# -# Note that not all possible configuration values are present in this -# autogenerated file. -# -# All configuration values have a default; values that are commented out -# serve to show the default. +# This file only contains a selection of the most common options. For a full +# list see the documentation: +# https://www.sphinx-doc.org/en/master/usage/configuration.html + +# -- Path setup -------------------------------------------------------------- # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the @@ -21,72 +16,75 @@ import os import sys sys.path.insert(0, os.path.abspath('sphinxext')) +# -- Project information ----------------------------------------------------- + +project = 'PyEPR' +copyright = '2011-2019, Antonio Valentino' +author = 'Antonio Valentino' + +def get_version(filename='../src/epr.pyx', release=False): + import re + from distutils.version import LooseVersion + + s = open(filename).read() + mobj = re.search("^__version__ = '(?P.*)'$", s, re.MULTILINE) + mobj.group('version') + + v = LooseVersion(mobj.group('version')) + + if release: + return v.vstring + else: + return '.'.join(map(str, v.version[:3])) -# -- General configuration ------------------------------------------------ +# The short X.Y version. +version = get_version() -# If your documentation needs a minimal Sphinx version, state it here. +# The full version, including alpha/beta/rc tags. +release = get_version(release=True) -needs_sphinx = '1.0' +# -- General configuration --------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ # 'sphinx.ext.autodoc', + # 'sphinx.ext.autosectionlabel', # 'sphinx.ext.autosummary', + # 'sphinx.ext.coverage', # 'sphinx.ext.doctest', + 'sphinx.ext.extlinks', + # 'sphinx.ext.githubpages', + # 'sphinx.ext.graphviz', + 'sphinx.ext.ifconfig', + # 'sphinx.ext.imgconverter', + # 'sphinx.ext.inheritance_diagram', 'sphinx.ext.intersphinx', + # 'sphinx.ext.linkcode', + # 'sphinx.ext.napoleon', 'sphinx.ext.todo', - # 'sphinx.ext.coverage', + 'sphinx.ext.viewcode', + + # Math support for HTML outputs in Sphinx 'sphinx.ext.imgmath', - # 'sphinx.ext.jsmath', # 'sphinx.ext.mathjax', - # 'sphinx.ext.graphviz', - # 'sphinx.ext.inheritance_diagram', - # 'sphinx.ext.refcounting', - 'sphinx.ext.ifconfig', - 'sphinx.ext.viewcode', - # 'sphinx.ext.githubpages', - 'sphinx.ext.extlinks', + # 'sphinx.ext.jsmath', + + # Additional extensions 'ipython_console_highlighting', + # 'IPython.sphinxext.ipython_console_highlighting', ] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] -# The suffix(es) of source filenames. -# You can specify multiple suffix as a list of string: -# -# source_suffix = ['.rst', '.md'] -source_suffix = '.rst' - # The master toctree document. master_doc = 'index' -# General information about the project. 
-project = u'PyEPR' -copyright = u'2011-2018, Antonio Valentino' -author = u'Antonio Valentino' - -# The version info for the project you're documenting, acts as replacement for -# |version| and |release|, also used in various other places throughout the -# built documents. -# -# The short X.Y version. -version = '0.9.5' -# The full version, including alpha/beta/rc tags. -release = version + '.dev0' - -# The language for content autogenerated by Sphinx. Refer to documentation -# for a list of supported languages. -# -# This is also used if you do content translation via gettext catalogs. -# Usually you set "language" from the command line for these cases. -language = None - # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. -# This patterns also effect to html_static_path and html_extra_path +# This pattern also affects html_static_path and html_extra_path. exclude_patterns = [ '_build', 'Thumbs.db', @@ -98,8 +96,7 @@ exclude_patterns = [ # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' - -# -- Options for HTML output ---------------------------------------------- +# -- Options for HTML output ------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. @@ -143,19 +140,19 @@ html_sidebars = { 'pypi.html', 'travis-ci.html', 'appveyor.html', + 'readthedocs.html', + 'codecov.html', ], } # If false, no module index is generated. html_domain_indices = False - # -- Options for HTMLHelp output ------------------------------------------ # Output file base name for HTML help builder. htmlhelp_basename = 'PyEPRdoc' - # -- Options for LaTeX output --------------------------------------------- latex_elements = { @@ -187,7 +184,6 @@ latex_documents = [ # If false, no module index is generated. latex_domain_indices = False - # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples @@ -197,7 +193,6 @@ man_pages = [ [author], 1) ] - # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples @@ -209,31 +204,23 @@ texinfo_documents = [ 'Miscellaneous'), ] - # -- Options for Epub output ---------------------------------------------- -# Bibliographic Dublin Core info. -epub_title = project -epub_author = author -epub_publisher = author -epub_copyright = copyright - -# The unique identifier of the text. This can be a ISBN number -# or the project homepage. -# -# epub_identifier = '' - -# A unique identification for the text. -# -# epub_uid = '' - # A list of files that should not be packed into the epub file. epub_exclude_files = ['search.html'] -# -- Extensions configuration -------------------------------------------------- +# -- Extension configuration ------------------------------------------------- + +# -- Options for intersphinx extension --------------------------------------- + +# Example configuration for intersphinx: refer to the Python standard library. 
+intersphinx_mapping = { + 'python': ('https://docs.python.org/3', None), + 'numpy': ('https://docs.scipy.org/doc/numpy', None), +} -# Autodoc configuration +# -- Options for autodoc extension ------------------------------------------- #autoclass_content = 'both' #autodoc_default_flags = ['members', 'undoc-members', 'show-inheritance'] # #,'inherited-members'] @@ -241,18 +228,13 @@ epub_exclude_files = ['search.html'] # Auto summary generation #autosummary_generate = ['reference'] - +# -- Options for extlinks extension ------------------------------------------ # External links configuration extlinks = { 'issue': ('https://github.com/avalentino/pyepr/issues/%s', 'gh-'), } -# Example configuration for intersphinx: refer to the Python standard library. -intersphinx_mapping = { - 'python': ('https://docs.python.org/3', None), - 'numpy': ('https://docs.scipy.org/doc/numpy', None), -} +# -- Options for todo extension ---------------------------------------------- # If true, `todo` and `todoList` produce output, else they produce nothing. todo_include_todos = True - ===================================== doc/gdal_export_example.rst ===================================== @@ -225,7 +225,7 @@ Complete listing :language: python -.. _GDAL: https://www.gdal.org +.. _GDAL: https://gdal.org .. _PyEPR: https://github.com/avalentino/pyepr .. _ENVISAT: https://envisat.esa.int ===================================== doc/index.rst ===================================== @@ -12,7 +12,7 @@ ENVISAT Product Reader Python API :HomePage: http://avalentino.github.io/pyepr :Author: Antonio Valentino :Contact: antonio.valentino at tiscali.it -:Copyright: 2011-2018, Antonio Valentino +:Copyright: 2011-2019, Antonio Valentino :Version: |release| @@ -60,7 +60,8 @@ ENVISAT Product Reader Python API Online documentation for other PyEpr_ versions: * `latest `_ development - * `0.9.5 `_ (latest stable) + * `1.0.0 `_ (latest stable) + * `0.9.5 `_ * `0.9.4 `_ * `0.9.3 `_ * `0.9.2 `_ @@ -80,7 +81,7 @@ License .. index:: license -Copyright (C) 2011-2018 Antonio Valentino +Copyright (C) 2011-2019 Antonio Valentino PyEPR is free software: you can redistribute it and/or modify it under the terms of the `GNU General Public License`_ as published by ===================================== doc/interactive_use.rst ===================================== @@ -20,7 +20,7 @@ ESA_ web site. .. _PyEPR: https://github.com/avalentino/pyepr .. _ENVISAT: https://envisat.esa.int .. _ASAR: https://earth.esa.int/handbooks/asar/CNTR.html -.. _Jupyter: http://jupyter.org/ +.. _Jupyter: https://jupyter.org/ .. _matplotlib: https://matplotlib.org .. _`free sample`: https://earth.esa.int/services/sample_products/asar/IMP/ASA_IMP_1PNUPA20060202_062233_000000152044_00435_20529_3110.N1.gz .. _ESA: https://earth.esa.int @@ -43,7 +43,7 @@ available classes and functions:: Jupyter console 5.2.0 - Python 3.6.5 (default, Apr 1 2018, 05:46:30) + Python 3.6.5 (default, Apr 1 2018, 05:46:30) Type "copyright", "credits" or "license" for more information. IPython 5.5.0 -- An enhanced Interactive Python. @@ -81,7 +81,7 @@ available classes and functions:: .. _ESA: https://earth.esa.int In [3]: epr.__version__, epr.EPR_C_API_VERSION - Out[3]: ('0.9.1', '2.3dev') + Out[3]: ('1.0.0', '2.3dev') .. index:: __version__ ===================================== doc/make.bat ===================================== @@ -9,7 +9,6 @@ if "%SPHINXBUILD%" == "" ( ) set SOURCEDIR=. 
set BUILDDIR=_build -set SPHINXPROJ=PyEPR if "%1" == "" goto help @@ -26,11 +25,11 @@ if errorlevel 9009 ( exit /b 1 ) -%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% +%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% goto end :help -%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% +%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% :end popd ===================================== doc/reference.rst ===================================== @@ -192,7 +192,6 @@ Product "(", ")", "NOT", "AND", "OR". Valid bit-mask expression are for example ``flags.LAND OR flags.CLOUD`` or ``NOT flags.WATER AND flags.TURBID_S`` - :param xoffset: across-track co-ordinate in pixel co-ordinates (zero-based) of the upper right corner of the source-region @@ -208,7 +207,7 @@ Product .. seealso:: :func:`create_bitmask_raster`. - .. method:: close + .. method:: close Closes the :class:`Product` product and free the underlying file descriptor. @@ -221,7 +220,7 @@ Product once; only the first call, however, will have an effect. - .. method:: flush() + .. method:: flush() Flush the file stream. @@ -1350,7 +1349,7 @@ EPRError :param message: error message - :pram code: + :param code: EPR error code ===================================== doc/sphinxext/ipython_console_highlighting.py ===================================== @@ -1,114 +1,543 @@ -"""reST directive for syntax-highlighting ipython interactive sessions. - -XXX - See what improvements can be made based on the new (as of Sept 2009) -'pycon' lexer for the python console. At the very least it will give better -highlighted tracebacks. +### IPython/lib/lexers.py #################################################### +# -*- coding: utf-8 -*- """ +Defines a variety of Pygments lexers for highlighting IPython code. + +This includes: + + IPythonLexer, IPython3Lexer + Lexers for pure IPython (python + magic/shell commands) + + IPythonPartialTracebackLexer, IPythonTracebackLexer + Supports 2.x and 3.x via keyword `python3`. The partial traceback + lexer reads everything but the Python code appearing in a traceback. + The full lexer combines the partial lexer with an IPython lexer. + IPythonConsoleLexer + A lexer for IPython console sessions, with support for tracebacks. + + IPyLexer + A friendly lexer which examines the first line of text and from it, + decides whether to use an IPython lexer or an IPython console lexer. + This is probably the only lexer that needs to be explicitly added + to Pygments. + +""" +#----------------------------------------------------------------------------- +# Copyright (c) 2013, the IPython Development Team. +# +# Distributed under the terms of the Modified BSD License. +# +# The full license is in the file COPYING.txt, distributed with this software. 
#----------------------------------------------------------------------------- -# Needed modules # Standard library import re # Third party -from pygments.lexer import Lexer, do_insertions -from pygments.lexers.agile import (PythonConsoleLexer, PythonLexer, - PythonTracebackLexer) -from pygments.token import Comment, Generic +from pygments.lexers import BashLexer, PythonLexer, Python3Lexer +from pygments.lexer import ( + Lexer, DelegatingLexer, RegexLexer, do_insertions, bygroups, using, +) +from pygments.token import ( + Generic, Keyword, Literal, Name, Operator, Other, Text, Error, +) +from pygments.util import get_bool_opt -from sphinx import highlighting +# Local -#----------------------------------------------------------------------------- -# Global constants line_re = re.compile('.*?\n') -#----------------------------------------------------------------------------- -# Code begins - classes and functions +__all__ = ['build_ipy_lexer', 'IPython3Lexer', 'IPythonLexer', + 'IPythonPartialTracebackLexer', 'IPythonTracebackLexer', + 'IPythonConsoleLexer', 'IPyLexer'] + +ipython_tokens = [ + (r"(?s)(\s*)(%%)(\w+)(.*)", bygroups(Text, Operator, Keyword, Text)), + (r'(?s)(^\s*)(%%!)([^\n]*\n)(.*)', bygroups(Text, Operator, Text, using(BashLexer))), + (r"(%%?)(\w+)(\?\??)$", bygroups(Operator, Keyword, Operator)), + (r"\b(\?\??)(\s*)$", bygroups(Operator, Text)), + (r'(%)(sx|sc|system)(.*)(\n)', bygroups(Operator, Keyword, + using(BashLexer), Text)), + (r'(%)(\w+)(.*\n)', bygroups(Operator, Keyword, Text)), + (r'^(!!)(.+)(\n)', bygroups(Operator, using(BashLexer), Text)), + (r'(!)(?!=)(.+)(\n)', bygroups(Operator, using(BashLexer), Text)), + (r'^(\s*)(\?\??)(\s*%{0,2}[\w\.\*]*)', bygroups(Text, Operator, Text)), + (r'(\s*%{0,2}[\w\.\*]*)(\?\??)(\s*)$', bygroups(Text, Operator, Text)), +] + +def build_ipy_lexer(python3): + """Builds IPython lexers depending on the value of `python3`. + + The lexer inherits from an appropriate Python lexer and then adds + information about IPython specific keywords (i.e. magic commands, + shell commands, etc.) + + Parameters + ---------- + python3 : bool + If `True`, then build an IPython lexer from a Python 3 lexer. + + """ + # It would be nice to have a single IPython lexer class which takes + # a boolean `python3`. But since there are two Python lexer classes, + # we will also have two IPython lexer classes. + if python3: + PyLexer = Python3Lexer + name = 'IPython3' + aliases = ['ipython3'] + doc = """IPython3 Lexer""" + else: + PyLexer = PythonLexer + name = 'IPython' + aliases = ['ipython2', 'ipython'] + doc = """IPython Lexer""" + + tokens = PyLexer.tokens.copy() + tokens['root'] = ipython_tokens + tokens['root'] + + attrs = {'name': name, 'aliases': aliases, 'filenames': [], + '__doc__': doc, 'tokens': tokens} + + return type(name, (PyLexer,), attrs) + + +IPython3Lexer = build_ipy_lexer(python3=True) +IPythonLexer = build_ipy_lexer(python3=False) + + +class IPythonPartialTracebackLexer(RegexLexer): + """ + Partial lexer for IPython tracebacks. + + Handles all the non-python output. This works for both Python 2.x and 3.x. + + """ + name = 'IPython Partial Traceback' + + tokens = { + 'root': [ + # Tracebacks for syntax errors have a different style. + # For both types of tracebacks, we mark the first line with + # Generic.Traceback. For syntax errors, we mark the filename + # as we mark the filenames for non-syntax tracebacks. + # + # These two regexps define how IPythonConsoleLexer finds a + # traceback. 
+ # + ## Non-syntax traceback + (r'^(\^C)?(-+\n)', bygroups(Error, Generic.Traceback)), + ## Syntax traceback + (r'^( File)(.*)(, line )(\d+\n)', + bygroups(Generic.Traceback, Name.Namespace, + Generic.Traceback, Literal.Number.Integer)), + + # (Exception Identifier)(Whitespace)(Traceback Message) + (r'(?u)(^[^\d\W]\w*)(\s*)(Traceback.*?\n)', + bygroups(Name.Exception, Generic.Whitespace, Text)), + # (Module/Filename)(Text)(Callee)(Function Signature) + # Better options for callee and function signature? + (r'(.*)( in )(.*)(\(.*\)\n)', + bygroups(Name.Namespace, Text, Name.Entity, Name.Tag)), + # Regular line: (Whitespace)(Line Number)(Python Code) + (r'(\s*?)(\d+)(.*?\n)', + bygroups(Generic.Whitespace, Literal.Number.Integer, Other)), + # Emphasized line: (Arrow)(Line Number)(Python Code) + # Using Exception token so arrow color matches the Exception. + (r'(-*>?\s?)(\d+)(.*?\n)', + bygroups(Name.Exception, Literal.Number.Integer, Other)), + # (Exception Identifier)(Message) + (r'(?u)(^[^\d\W]\w*)(:.*?\n)', + bygroups(Name.Exception, Text)), + # Tag everything else as Other, will be handled later. + (r'.*\n', Other), + ], + } + + +class IPythonTracebackLexer(DelegatingLexer): + """ + IPython traceback lexer. + + For doctests, the tracebacks can be snipped as much as desired with the + exception to the lines that designate a traceback. For non-syntax error + tracebacks, this is the line of hyphens. For syntax error tracebacks, + this is the line which lists the File and line number. + + """ + # The lexer inherits from DelegatingLexer. The "root" lexer is an + # appropriate IPython lexer, which depends on the value of the boolean + # `python3`. First, we parse with the partial IPython traceback lexer. + # Then, any code marked with the "Other" token is delegated to the root + # lexer. + # + name = 'IPython Traceback' + aliases = ['ipythontb'] + + def __init__(self, **options): + self.python3 = get_bool_opt(options, 'python3', False) + if self.python3: + self.aliases = ['ipython3tb'] + else: + self.aliases = ['ipython2tb', 'ipythontb'] + + if self.python3: + IPyLexer = IPython3Lexer + else: + IPyLexer = IPythonLexer + + DelegatingLexer.__init__(self, IPyLexer, + IPythonPartialTracebackLexer, **options) class IPythonConsoleLexer(Lexer): """ - For IPython console output or doctests, such as: + An IPython console lexer for IPython code-blocks and doctests, such as: - .. sourcecode:: ipython + .. code-block:: rst - In [1]: a = 'foo' + .. code-block:: ipythonconsole - In [2]: a - Out[2]: 'foo' + In [1]: a = 'foo' - In [3]: print a - foo + In [2]: a + Out[2]: 'foo' - In [4]: 1 / 0 + In [3]: print a + foo - Notes: + In [4]: 1 / 0 - - Tracebacks are not currently supported. - - It assumes the default IPython prompts, not customized ones. + Support is also provided for IPython exceptions: + + .. code-block:: rst + + .. 
code-block:: ipythonconsole + + In [1]: raise Exception + + --------------------------------------------------------------------------- + Exception Traceback (most recent call last) + in () + ----> 1 raise Exception + + Exception: + """ - name = 'IPython console session' - aliases = ['ipython'] + aliases = ['ipythonconsole'] mimetypes = ['text/x-ipython-console'] - input_prompt = re.compile("(In \[[0-9]+\]: )|( \.\.\.+:)") - output_prompt = re.compile("(Out\[[0-9]+\]: )|( \.\.\.+:)") - continue_prompt = re.compile(" \.\.\.+:") - tb_start = re.compile("\-+") - def get_tokens_unprocessed(self, text): - pylexer = PythonLexer(**self.options) - tblexer = PythonTracebackLexer(**self.options) + # The regexps used to determine what is input and what is output. + # The default prompts for IPython are: + # + # in = 'In [#]: ' + # continuation = ' .D.: ' + # template = 'Out[#]: ' + # + # Where '#' is the 'prompt number' or 'execution count' and 'D' + # D is a number of dots matching the width of the execution count + # + in1_regex = r'In \[[0-9]+\]: ' + in2_regex = r' \.\.+\.: ' + out_regex = r'Out\[[0-9]+\]: ' + + #: The regex to determine when a traceback starts. + ipytb_start = re.compile(r'^(\^C)?(-+\n)|^( File)(.*)(, line )(\d+\n)') + + def __init__(self, **options): + """Initialize the IPython console lexer. + + Parameters + ---------- + python3 : bool + If `True`, then the console inputs are parsed using a Python 3 + lexer. Otherwise, they are parsed using a Python 2 lexer. + in1_regex : RegexObject + The compiled regular expression used to detect the start + of inputs. Although the IPython configuration setting may have a + trailing whitespace, do not include it in the regex. If `None`, + then the default input prompt is assumed. + in2_regex : RegexObject + The compiled regular expression used to detect the continuation + of inputs. Although the IPython configuration setting may have a + trailing whitespace, do not include it in the regex. If `None`, + then the default input prompt is assumed. + out_regex : RegexObject + The compiled regular expression used to detect outputs. If `None`, + then the default output prompt is assumed. + + """ + self.python3 = get_bool_opt(options, 'python3', False) + if self.python3: + self.aliases = ['ipython3console'] + else: + self.aliases = ['ipython2console', 'ipythonconsole'] + + in1_regex = options.get('in1_regex', self.in1_regex) + in2_regex = options.get('in2_regex', self.in2_regex) + out_regex = options.get('out_regex', self.out_regex) + + # So that we can work with input and output prompts which have been + # rstrip'd (possibly by editors) we also need rstrip'd variants. If + # we do not do this, then such prompts will be tagged as 'output'. + # The reason can't just use the rstrip'd variants instead is because + # we want any whitespace associated with the prompt to be inserted + # with the token. This allows formatted code to be modified so as hide + # the appearance of prompts, with the whitespace included. One example + # use of this is in copybutton.js from the standard lib Python docs. + in1_regex_rstrip = in1_regex.rstrip() + '\n' + in2_regex_rstrip = in2_regex.rstrip() + '\n' + out_regex_rstrip = out_regex.rstrip() + '\n' - curcode = '' - insertions = [] + # Compile and save them all. 
+ attrs = ['in1_regex', 'in2_regex', 'out_regex', + 'in1_regex_rstrip', 'in2_regex_rstrip', 'out_regex_rstrip'] + for attr in attrs: + self.__setattr__(attr, re.compile(locals()[attr])) + + Lexer.__init__(self, **options) + + if self.python3: + pylexer = IPython3Lexer + tblexer = IPythonTracebackLexer + else: + pylexer = IPythonLexer + tblexer = IPythonTracebackLexer + + self.pylexer = pylexer(**options) + self.tblexer = tblexer(**options) + + self.reset() + + def reset(self): + self.mode = 'output' + self.index = 0 + self.buffer = u'' + self.insertions = [] + + def buffered_tokens(self): + """ + Generator of unprocessed tokens after doing insertions and before + changing to a new state. + + """ + if self.mode == 'output': + tokens = [(0, Generic.Output, self.buffer)] + elif self.mode == 'input': + tokens = self.pylexer.get_tokens_unprocessed(self.buffer) + else: # traceback + tokens = self.tblexer.get_tokens_unprocessed(self.buffer) + + for i, t, v in do_insertions(self.insertions, tokens): + # All token indexes are relative to the buffer. + yield self.index + i, t, v + + # Clear it all + self.index += len(self.buffer) + self.buffer = u'' + self.insertions = [] + + def get_mci(self, line): + """ + Parses the line and returns a 3-tuple: (mode, code, insertion). + + `mode` is the next mode (or state) of the lexer, and is always equal + to 'input', 'output', or 'tb'. + + `code` is a portion of the line that should be added to the buffer + corresponding to the next mode and eventually lexed by another lexer. + For example, `code` could be Python code if `mode` were 'input'. + + `insertion` is a 3-tuple (index, token, text) representing an + unprocessed "token" that will be inserted into the stream of tokens + that are created from the buffer once we change modes. This is usually + the input or output prompt. + + In general, the next mode depends on current mode and on the contents + of `line`. + + """ + # To reduce the number of regex match checks, we have multiple + # 'if' blocks instead of 'if-elif' blocks. + + # Check for possible end of input + in2_match = self.in2_regex.match(line) + in2_match_rstrip = self.in2_regex_rstrip.match(line) + if (in2_match and in2_match.group().rstrip() == line.rstrip()) or \ + in2_match_rstrip: + end_input = True + else: + end_input = False + if end_input and self.mode != 'tb': + # Only look for an end of input when not in tb mode. + # An ellipsis could appear within the traceback. + mode = 'output' + code = u'' + insertion = (0, Generic.Prompt, line) + return mode, code, insertion + + # Check for output prompt + out_match = self.out_regex.match(line) + out_match_rstrip = self.out_regex_rstrip.match(line) + if out_match or out_match_rstrip: + mode = 'output' + if out_match: + idx = out_match.end() + else: + idx = out_match_rstrip.end() + code = line[idx:] + # Use the 'heading' token for output. We cannot use Generic.Error + # since it would conflict with exceptions. + insertion = (0, Generic.Heading, line[:idx]) + return mode, code, insertion + + + # Check for input or continuation prompt (non stripped version) + in1_match = self.in1_regex.match(line) + if in1_match or (in2_match and self.mode != 'tb'): + # New input or when not in tb, continued input. + # We do not check for continued input when in tb since it is + # allowable to replace a long stack with an ellipsis. 
+ mode = 'input' + if in1_match: + idx = in1_match.end() + else: # in2_match + idx = in2_match.end() + code = line[idx:] + insertion = (0, Generic.Prompt, line[:idx]) + return mode, code, insertion + + # Check for input or continuation prompt (stripped version) + in1_match_rstrip = self.in1_regex_rstrip.match(line) + if in1_match_rstrip or (in2_match_rstrip and self.mode != 'tb'): + # New input or when not in tb, continued input. + # We do not check for continued input when in tb since it is + # allowable to replace a long stack with an ellipsis. + mode = 'input' + if in1_match_rstrip: + idx = in1_match_rstrip.end() + else: # in2_match + idx = in2_match_rstrip.end() + code = line[idx:] + insertion = (0, Generic.Prompt, line[:idx]) + return mode, code, insertion + + # Check for traceback + if self.ipytb_start.match(line): + mode = 'tb' + code = line + insertion = None + return mode, code, insertion + + # All other stuff... + if self.mode in ('input', 'output'): + # We assume all other text is output. Multiline input that + # does not use the continuation marker cannot be detected. + # For example, the 3 in the following is clearly output: + # + # In [1]: print 3 + # 3 + # + # But the following second line is part of the input: + # + # In [2]: while True: + # print True + # + # In both cases, the 2nd line will be 'output'. + # + mode = 'output' + else: + mode = 'tb' + + code = line + insertion = None + + return mode, code, insertion + + def get_tokens_unprocessed(self, text): + self.reset() for match in line_re.finditer(text): line = match.group() - input_prompt = self.input_prompt.match(line) - continue_prompt = self.continue_prompt.match(line.rstrip()) - output_prompt = self.output_prompt.match(line) - if line.startswith("#"): - insertions.append((len(curcode), - [(0, Comment, line)])) - elif input_prompt is not None: - insertions.append((len(curcode), - [(0, Generic.Prompt, input_prompt.group())])) - curcode += line[input_prompt.end():] - elif continue_prompt is not None: - insertions.append((len(curcode), - [(0, Generic.Prompt, continue_prompt.group())])) - curcode += line[continue_prompt.end():] - elif output_prompt is not None: - # Use the 'error' token for output. We should probably make - # our own token, but error is typicaly in a bright color like - # red, so it works fine for our output prompts. - insertions.append((len(curcode), - [(0, Generic.Error, output_prompt.group())])) - curcode += line[output_prompt.end():] - else: - if curcode: - for item in do_insertions(insertions, - pylexer.get_tokens_unprocessed(curcode)): - yield item - curcode = '' - insertions = [] - yield match.start(), Generic.Output, line - if curcode: - for item in do_insertions(insertions, - pylexer.get_tokens_unprocessed(curcode)): - yield item + mode, code, insertion = self.get_mci(line) + + if mode != self.mode: + # Yield buffered tokens before transitioning to new mode. + for token in self.buffered_tokens(): + yield token + self.mode = mode + + if insertion: + self.insertions.append((len(self.buffer), [insertion])) + self.buffer += code + + for token in self.buffered_tokens(): + yield token + +class IPyLexer(Lexer): + """ + Primary lexer for all IPython-like code. + + This is a simple helper lexer. If the first line of the text begins with + "In \[[0-9]+\]:", then the entire text is parsed with an IPython console + lexer. If not, then the entire text is parsed with an IPython lexer. + + The goal is to reduce the number of lexers that are registered + with Pygments. 
+ + """ + name = 'IPy session' + aliases = ['ipy'] + + def __init__(self, **options): + self.python3 = get_bool_opt(options, 'python3', False) + if self.python3: + self.aliases = ['ipy3'] + else: + self.aliases = ['ipy2', 'ipy'] + + Lexer.__init__(self, **options) + self.IPythonLexer = IPythonLexer(**options) + self.IPythonConsoleLexer = IPythonConsoleLexer(**options) + + def get_tokens_unprocessed(self, text): + # Search for the input prompt anywhere...this allows code blocks to + # begin with comments as well. + if re.match(r'.*(In \[[0-9]+\]:)', text.strip(), re.DOTALL): + lex = self.IPythonConsoleLexer + else: + lex = self.IPythonLexer + for token in lex.get_tokens_unprocessed(text): + yield token + + +### IPython/sphinxext/ipython_console_highighting.py ######################### +""" +reST directive for syntax-highlighting ipython interactive sessions. + +""" + +from sphinx import highlighting +# from IPython.lib.lexers import IPyLexer def setup(app): """Setup as a sphinx extension.""" # This is only a lexer, so adding it below to pygments appears sufficient. - # But if somebody knows that the right API usage should be to do that via + # But if somebody knows what the right API usage should be to do that via # sphinx, by all means fix it here. At least having this setup.py # suppresses the sphinx warning we'd get without it. - pass + metadata = {'parallel_read_safe': True, 'parallel_write_safe': True} + return metadata -#----------------------------------------------------------------------------- -# Register the extension as a valid pygments lexer -highlighting.lexers['ipython'] = IPythonConsoleLexer() +# Register the extension as a valid pygments lexer. +# Alternatively, we could register the lexer with pygments instead. This would +# require using setuptools entrypoints: http://pygments.org/docs/plugins + +ipy2 = IPyLexer(python3=False) +ipy3 = IPyLexer(python3=True) + +highlighting.lexers['ipython'] = ipy2 +highlighting.lexers['ipython2'] = ipy2 +highlighting.lexers['ipython3'] = ipy3 ===================================== doc/usermanual.rst ===================================== @@ -63,7 +63,7 @@ In order to use PyEPR it is needed that the following software are correctly installed and configured: * Python2_ >= 2.6 or Python3_ >= 3.1 (including PyPy_) -* numpy_ >= 1.5.0 +* numpy_ >= 1.7.0 * `EPR API`_ >= 2.2 (optional, since PyEPR 0.7 the source tar-ball comes with a copy of the EPR C API sources) * a reasonably updated C compiler [#]_ (build only) @@ -82,9 +82,9 @@ correctly installed and configured: .. _Python2: Python_ .. _Python3: Python_ .. _PyPy: http://pypy.org -.. _numpy: http://www.numpy.org +.. _numpy: https://www.numpy.org .. _gcc: http://gcc.gnu.org -.. _Cython: http://cython.org +.. _Cython: https://cython.org .. _unittest2: https://pypi.org/project/unittest2 ===================================== requirements.txt ===================================== @@ -1,3 +1,3 @@ -numpy>=1.5 +numpy>=1.7 cython>=0.19 unittest2;python_version<"3.4" ===================================== setup.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. 
# @@ -26,12 +26,15 @@ import sys import glob +PYEPR_COVERAGE = False + + def get_version(filename): with open(filename) as fd: data = fd.read() mobj = re.search( - '''^__version__\s*=\s*(?P['"])(?P\d+(\.\d+)*.*)(?P=q)''', + r'''^__version__\s*=\s*(?P['"])(?P\d+(\.\d+)*.*)(?P=q)''', data, re.MULTILINE) return mobj.group('version') @@ -62,10 +65,14 @@ print('HAVE_SETUPTOOLS: {0}'.format(HAVE_SETUPTOOLS)) try: from Cython.Build import cythonize + from Cython import __version__ as CYTHON_VERSION HAVE_CYTHON = True except ImportError: HAVE_CYTHON = False + CYTHON_VERSION = None print('HAVE_CYTHON: {0}'.format(HAVE_CYTHON)) +if HAVE_CYTHON: + print('CYTHON_VERSION: {0}'.format(CYTHON_VERSION)) # @COMPATIBILITY: Extension is an old style class in Python 2 @@ -151,18 +158,44 @@ def get_extension(): sys.argv.remove(arg) break + define_macros = [] + + # @NOTE: uses the CYTHON_VERSION global variable + if HAVE_CYTHON and CYTHON_VERSION >= '0.29': + define_macros.append( + ('NPY_NO_DEPRECATED_API', 'NPY_1_7_API_VERSION'), + ) + ext = PyEprExtension( 'epr', sources=[os.path.join('src', 'epr.pyx')], # libraries=['m'], - # define_macros=[('NPY_NO_DEPRECATED_API', 'NPY_1_7_API_VERSION'),], + define_macros=define_macros, eprsrcdir=eprsrcdir, ) - # @NOTE: uses the HAVE_CYTHON global variable + # @NOTE: uses the HAVE_CYTHON and CYTHON_VERSION global variables if HAVE_CYTHON: - extlist = cythonize([ext]) + if CYTHON_VERSION >= '0.29': + language_level = '3str' + else: + language_level = '2' + print('CYTHON_LANGUAGE_LEVEL: {0}'.format(language_level)) + + compiler_directives = dict( + language_level=language_level, + ) + + if PYEPR_COVERAGE: + compiler_directives['linetrace'] = True + + extlist = cythonize([ext], compiler_directives=compiler_directives) ext = extlist[0] + + if PYEPR_COVERAGE: + ext.define_macros.extend([ + ('CYTHON_TRACE_NOGIL', '1'), + ]) else: ext.convert_pyx_sources_to_lang() @@ -203,12 +236,12 @@ any data field contained in a product file. 
'Operating System :: POSIX', 'Programming Language :: Python', 'Programming Language :: Python :: 2', - 'Programming Language :: Python :: 2.6', + 'Programming Language :: Python :: 2.6', # deprecated 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', - 'Programming Language :: Python :: 3.2', - 'Programming Language :: Python :: 3.3', - 'Programming Language :: Python :: 3.4', + 'Programming Language :: Python :: 3.2', # deprecated + 'Programming Language :: Python :: 3.3', # deprecated + 'Programming Language :: Python :: 3.4', # deprecated 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Programming Language :: Python :: 3.7', @@ -239,8 +272,8 @@ def setup_package(): if HAVE_SETUPTOOLS: config['test_suite'] = get_collector() - config.setdefault('setup_requires', []).append('numpy>=1.5') - config.setdefault('install_requires', []).append('numpy>=1.5') + config.setdefault('setup_requires', []).append('numpy>=1.7') + config.setdefault('install_requires', []).append('numpy>=1.7') if ext.setup_requires_cython: config['setup_requires'].append('cython>=0.19') @@ -248,4 +281,11 @@ def setup_package(): if __name__ == '__main__': + if '--coverage' in sys.argv or 'PYEPR_COVERAGE' in os.environ: + PYEPR_COVERAGE = True + if '--coverage' in sys.argv: + sys.argv.remove('--coverage') + + print('PYEPR_COVERAGE:', PYEPR_COVERAGE) + setup_package() ===================================== src/epr.pxd ===================================== @@ -2,7 +2,7 @@ # PyEPR - Python bindings for ENVISAT Product Reader API # -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # ===================================== src/epr.pyx ===================================== @@ -2,7 +2,7 @@ # PyEPR - Python bindings for ENVISAT Product Reader API # -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # @@ -33,15 +33,15 @@ products. It provides access to the data either on a geophysical The raw data access makes it possible to read any data field contained in a product file. -.. _PyEPR: http://avalentino.github.com/pyepr -.. _Python: http://www.python.org +.. _PyEPR: https://avalentino.github.io/pyepr +.. _Python: https://www.python.org .. _`EPR API`: https://github.com/bcdev/epr-api .. _ENVISAT: http://envisat.esa.int .. 
_ESA: http://earth.esa.int """ -__version__ = '0.9.5' +__version__ = '1.0.0' from libc cimport errno from libc cimport stdio @@ -69,6 +69,7 @@ np.import_array() import os import sys +import atexit from collections import namedtuple import numpy as np @@ -195,6 +196,48 @@ _MODEL_MAP = { } +ctypedef fused T: + np.uint8_t + np.int8_t + np.uint16_t + np.int16_t + np.uint32_t + np.int32_t + np.float32_t + np.float64_t + np.npy_byte + + +cdef const void* _view_to_ptr(T[:] a): + return &a[0] + + +cdef const void* _to_ptr(np.ndarray a, EPR_DataTypeId etype): + cdef const void *p = NULL + if etype == e_tid_uchar: + p = _view_to_ptr[np.uint8_t](a) + elif etype == e_tid_char: + p = _view_to_ptr[np.int8_t](a) + elif etype == e_tid_ushort: + p = _view_to_ptr[np.uint16_t](a) + elif etype == e_tid_short: + p = _view_to_ptr[np.int16_t](a) + elif etype == e_tid_uint: + p = _view_to_ptr[np.uint32_t](a) + elif etype == e_tid_int: + p = _view_to_ptr[np.int32_t](a) + elif etype == e_tid_float: + p = _view_to_ptr[np.float32_t](a) + elif etype == e_tid_double: + p = _view_to_ptr[np.float64_t](a) + elif etype == e_tid_string: + p = _view_to_ptr[np.npy_byte](a) + else: + raise ValueError('unexpected type ID: %d' % etype) + + return p + + class EPRError(Exception): """EPR API error.""" @@ -275,11 +318,11 @@ cdef class _CLib: cdef bytes msg # @TODO: check - #if EPR_C_API_VERSION != '2.2': - # raise ImportError('C library version not supported: "%s"' % - # EPR_C_API_VERSION) + # if EPR_C_API_VERSION != '2.2': + # raise ImportError( + # 'C library version not supported: "%s"' % EPR_C_API_VERSION) - #if epr_init_api(e_log_warning, epr_log_message, NULL): + # if epr_init_api(e_log_warning, epr_log_message, NULL): if epr_init_api(e_log_warning, NULL, NULL): msg = epr_get_last_err_message() epr_clear_err() @@ -388,7 +431,7 @@ cdef class DSD(EprObject): if isinstance(self, Dataset): (self._parent).check_closed_product() else: - #elif isinstance(self, Product): + # elif isinstance(self, Product): (self._parent).check_closed_product() property index: @@ -560,7 +603,6 @@ cdef class Field(EprObject): return offset - def print_(self, ostream=None): """print_(self, ostream=None) @@ -589,9 +631,9 @@ cdef class Field(EprObject): pyepr_check_errors() - #def dump_field(self): - # epr_dump_field(self._ptr) - # pyepr_check_errors() + # def dump_field(self): + # epr_dump_field(self._ptr) + # pyepr_check_errors() def get_unit(self): """get_unit(self) @@ -704,8 +746,8 @@ cdef class Field(EprObject): if index != 0: raise ValueError('invalid index: %d' % index) val = epr_get_field_elem_as_str(self._ptr) - #elif etype == e_tid_spare: - # val = epr_get_field_elem_as_str(self._ptr) + # elif etype == e_tid_spare: + # val = epr_get_field_elem_as_str(self._ptr) elif etype == e_tid_time: if index != 0: raise ValueError('invalid index: %d' % index) @@ -810,15 +852,15 @@ cdef class Field(EprObject): buf = epr_get_field_elem_as_str(self._ptr) if buf is NULL: pyepr_null_ptr_error(msg) - #elif etype == e_tid_unknown: - # pass - #elif etype = e_tid_spare: - # pass + # elif etype == e_tid_unknown: + # pass + # elif etype = e_tid_spare: + # pass else: raise ValueError('invalid field type') out = np.PyArray_SimpleNewFromData(nd, shape, dtype, buf) - #np.PyArray_CLEARFLAG(out, NPY_ARRAY_WRITEABLE) # new in numpy 1.7 + # np.PyArray_CLEARFLAG(out, NPY_ARRAY_WRITEABLE) # new in numpy 1.7 # Make the ndarray keep a reference to this object np.set_array_base(out, self) @@ -837,10 +879,11 @@ cdef class Field(EprObject): cdef long field_offset cdef char* buf cdef 
EPR_DataTypeId etype = epr_get_field_type(self._ptr) + cdef const void* p = NULL dtype = _DTYPE_MAP[etype] - elems = elems.astype(dtype) + elems = np.ascontiguousarray(elems, dtype=dtype) record = self._parent dataset = record._parent @@ -853,16 +896,18 @@ cdef class Field(EprObject): field_offset = index * elemsize file_offset = self._get_offset(absolute=1) buf = self._ptr.elems + field_offset + p = _to_ptr(elems, etype) - cstring.memcpy(buf, elems.data, datasize) + with nogil: + cstring.memcpy(buf, p, datasize) if SWAP_BYTES: elems = elems.byteswap() + p = _to_ptr(elems, etype) with nogil: stdio.fseek(istream, file_offset + field_offset, stdio.SEEK_SET) - ret = stdio.fwrite(elems.data, elemsize, nelems, - product._ptr.istream) + ret = stdio.fwrite(p, elemsize, nelems, product._ptr.istream) if ret != nelems: raise IOError( 'write error: %d of %d bytes written' % (ret, datasize)) @@ -939,7 +984,6 @@ cdef class Field(EprObject): cdef EPR_FieldInfo* info = self._ptr.info return info.tot_size - # --- high level interface ------------------------------------------------ def __repr__(self): return 'epr.Field("%s") %d %s elements' % (self.get_name(), @@ -1021,7 +1065,7 @@ cdef class Field(EprObject): n = epr_get_data_type_size(epr_get_field_type(p1)) if n != 0: n *= epr_get_field_num_elems(p1) - #pyepr_check_errors() + # pyepr_check_errors() if n <= 0: # @TODO: check return True @@ -1053,7 +1097,7 @@ cdef class Field(EprObject): n = epr_get_data_type_size(epr_get_field_type(p1)) if n != 0: n *= epr_get_field_num_elems(p1) - #pyepr_check_errors() + # pyepr_check_errors() if n <= 0: # @TODO: check return False @@ -1127,14 +1171,14 @@ cdef class Record(EprObject): if isinstance(self._parent, Dataset): (self._parent).check_closed_product() else: - #elif isinstance(self._parent, Product): + # elif isinstance(self._parent, Product): (self._parent).check_closed_product() cdef inline _check_write_mode(self): if isinstance(self._parent, Dataset): (self._parent)._check_write_mode() else: - #elif isinstance(self._parent, Product): + # elif isinstance(self._parent, Product): (self._parent)._check_write_mode() cdef inline uint _get_offset(self, bint absolure=0): @@ -1491,7 +1535,7 @@ cdef class Raster(EprObject): """ if (x < 0 or x >= self._ptr.raster_width or - y < 0 or y >= self._ptr.raster_height): + y < 0 or y >= self._ptr.raster_height): raise ValueError('index out of range: x=%d, y=%d' % (x, y)) cdef EPR_EDataTypeId dtype = self._ptr.data_type @@ -2131,7 +2175,6 @@ cdef class Band(EprObject): self.check_closed_product() return self._ptr.magic - property _field_index: """Index or the field (within the dataset) containing the raw data used to create the band's pixel values. @@ -2470,8 +2513,8 @@ cdef class Product(EprObject): """ if self._ptr is not NULL: - #if '+' in self.mode: - # stdio.fflush(self._ptr.istream) + # if '+' in self.mode: + # stdio.fflush(self._ptr.istream) epr_close_product(self._ptr) pyepr_check_errors() self._ptr = NULL @@ -2880,8 +2923,8 @@ cdef class Product(EprObject): return [self.get_band_at(idx) for idx in range(num_bands)] # @TODO: iter on both datasets and bands (??) 
- #def __iter__(self): - # return itertools.chain((self.datasets(), self.bands())) + # def __iter__(self): + # return itertools.chain((self.datasets(), self.bands())) def __repr__(self): return 'epr.Product(%s) %d datasets, %d bands' % (self.id_string, @@ -2940,9 +2983,6 @@ def open(filename, mode='rb'): _EPR_C_LIB = _CLib.__new__(_CLib) -import atexit - - @atexit.register def _close_api(): # ensure that all EprObject(s) are collected before removing the last ===================================== tests/test_all.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # @@ -26,6 +26,7 @@ import gzip import shutil import numbers import operator +import platform import tempfile import functools import contextlib @@ -40,6 +41,8 @@ try: from unittest import skipIf as _skipIf, TestCase as _TestCase if not hasattr(_TestCase, 'subTest'): raise ImportError + if not hasattr(_TestCase, 'assertRaisesRegex'): + raise ImportError except ImportError: import unittest2 as unittest else: @@ -520,9 +523,9 @@ class TestProductHighLevelAPI(unittest.TestCase): # pass def test_repr(self): - pattern = ('epr\.Product\((?P\w+)\) ' - '(?P\d+) datasets, ' - '(?P\d+) bands') + pattern = (r'epr\.Product\((?P\w+)\) ' + r'(?P\d+) datasets, ' + r'(?P\d+) bands') mobj = re.match(pattern, repr(self.product)) self.assertNotEqual(mobj, None) @@ -748,7 +751,7 @@ class TestDatasetHighLevelAPI(unittest.TestCase): self.assertEqual(index, self.dataset.get_num_records()) def test_repr(self): - pattern = 'epr\.Dataset\((?P\w+)\) (?P\d+) records' + pattern = r'epr\.Dataset\((?P\w+)\) (?P\d+) records' mobj = re.match(pattern, repr(self.dataset)) self.assertNotEqual(mobj, None) self.assertEqual(mobj.group('name'), self.dataset.get_name()) @@ -1321,8 +1324,8 @@ class TestBandHighLevelAPI(unittest.TestCase): self.product.close() def test_repr(self): - pattern = ('epr.Band\((?P\w+)\) of ' - 'epr.Product\((?P\w+)\)') + pattern = (r'epr.Band\((?P\w+)\) of ' + r'epr.Product\((?P\w+)\)') for band in self.product.bands(): mobj = re.match(pattern, repr(band)) self.assertNotEqual(mobj, None) @@ -1642,8 +1645,8 @@ class TestRasterHighLevelAPI(unittest.TestCase): self.RASTER_WIDTH, self.RASTER_HEIGHT) def test_repr(self): - pattern = (' (?P\w+) ' - '\((?P\d+)L x (?P\d+)P\)') + pattern = (r' (?P\w+) ' + r'\((?P\d+)L x (?P\d+)P\)') mobj = re.match(pattern, repr(self.raster)) self.assertNotEqual(mobj, None) self.assertEqual(mobj.group('data_type'), @@ -1865,7 +1868,7 @@ class TestMultipleRecordsHighLevelAPI(unittest.TestCase): self.product.close() def test_repr(self): - pattern = ' (?P\d+) fields' + pattern = r' (?P\d+) fields' for record in self.dataset: mobj = re.match(pattern, repr(record)) self.assertNotEqual(mobj, None) @@ -2429,8 +2432,8 @@ class TestFieldHighLevelAPI(unittest.TestCase): self.record = dataset.read_record(0) def test_repr(self): - pattern = ('epr\.Field\("(?P.+)"\) (?P\d+) ' - '(?P\w+) elements') + pattern = (r'epr\.Field\("(?P.+)"\) (?P\d+) ' + r'(?P\w+) elements') for field in self.record: mobj = re.match(pattern, repr(field)) self.assertNotEqual(mobj, None) @@ -2678,7 +2681,7 @@ class TestDsdHighLevelAPI(unittest.TestCase): self.dsd = product.get_dsd_at(0) def test_repr(self): - pattern = 'epr\.DSD\("(?P.+)"\)' + pattern = r'epr\.DSD\("(?P.+)"\)' mobj = re.match(pattern, repr(self.dsd)) self.assertNotEqual(mobj, None) self.assertEqual(mobj.group('name'), 
self.dsd.ds_name) @@ -2854,34 +2857,6 @@ class TestSampleModelFunctions(unittest.TestCase): class TestDirectInstantiation(unittest.TestCase): MSG_PATTERN = '"%s" class cannot be instantiated from Python' - if sys.version_info[:2] >= (3, 2): - # @COMPATIBILITY: python >= 3.2 - pass - elif sys.version_info[:2] in ((2, 7), (3, 1)): - # @COMPATIBILITY: unittest2, python2.7, python3.1 - assertRaisesRegex = unittest.TestCase.assertRaisesRegexp - else: - - # @COMPATIBILITY: python < 2.7 - def assertRaisesRegex(self, expected_exception, expected_regexp, - callable_obj=None, *args, **kwargs): - try: - callable_obj(*args, **kwargs) - except expected_exception as exc_value: - import types - if isinstance(expected_regexp, types.StringTypes): - expected_regexp = re.compile(expected_regexp) - if not expected_regexp.search(str(exc_value)): - raise self.failureException( - '"%s" does not match "%s"' % (expected_regexp.pattern, - str(exc_value))) - else: - if hasattr(expected_exception, '__name__'): - excName = expected_exception.__name__ - else: - excName = str(expected_exception) - raise self.failureException("%s not raised" % excName) - def test_direct_dsd_instantiation(self): pattern = self.MSG_PATTERN % epr.DSD.__name__ self.assertRaisesRegex(TypeError, pattern, epr.DSD) @@ -2915,6 +2890,9 @@ class TestLibVersion(unittest.TestCase): self.assertTrue(isinstance(epr.EPR_C_API_VERSION, str)) +# only PyPy 3 seems to be affected + at unittest.skipIf(platform.python_implementation() == 'PyPy', + 'skip memory leak check on PyPy') @unittest.skipIf(resource is None, '"resource" module not available') class TestMemoryLeaks(unittest.TestCase): # See gh-10 (https://github.com/avalentino/pyepr/issues/10) View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/compare/03545fcf958311f5682eedec354d9e3b3b1d4b0d...610a3278b7a2fab18b2fd05b14db96eb1c6caa6d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/compare/03545fcf958311f5682eedec354d9e3b3b1d4b0d...610a3278b7a2fab18b2fd05b14db96eb1c6caa6d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 8 20:46:30 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 19:46:30 +0000 Subject: [Git][debian-gis-team/pyepr][upstream] New upstream version 1.0.0 Message-ID: <5d755a96add1c_73483fbbb23f22dc123677@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / pyepr Commits: f7a06a44 by Antonio Valentino at 2019-09-08T18:58:03Z New upstream version 1.0.0 - - - - - 27 changed files: - + .coveragerc - .gitignore - .travis.yml - Makefile - README.rst - appveyor.yml - doc/Makefile - doc/NEWS.rst - doc/_templates/appveyor.html - + doc/_templates/codecov.html - doc/_templates/ohloh.html - doc/_templates/pypi.html - + doc/_templates/readthedocs.html - doc/_templates/travis-ci.html - doc/conf.py - doc/gdal_export_example.rst - doc/index.rst - doc/interactive_use.rst - doc/make.bat - doc/reference.rst - doc/sphinxext/ipython_console_highlighting.py - doc/usermanual.rst - requirements.txt - setup.py - src/epr.pxd - src/epr.pyx - tests/test_all.py Changes: ===================================== .coveragerc ===================================== @@ -0,0 +1,5 @@ +[run] +plugins = Cython.Coverage +source = src +branch = True +# omit = */Cython/Includes/* ===================================== .gitignore ===================================== @@ -1,2 +1,4 @@ SciTEDirectory.properties .idea +.DS_Store + ===================================== .travis.yml ===================================== @@ -1,24 +1,30 @@ language: python python: - - "2.6" - "2.7" - - "3.3" - "3.4" - "3.5" - "3.6" - # - "3.7" - # - "3.8-dev" - # - "pypy2.7" - - "pypy3.5" + - "3.7" + - "3.8-dev" + - "pypy" + - "pypy3" + +matrix: + allow_failures: + - python: "pypy3" before_install: - - sudo apt-get update -qq - - sudo apt-get install -qq libepr-api-dev + - sudo apt-get update + - sudo apt-get install -y libepr-api-dev install: - pip install -r requirements.txt + - pip install sphinx coverage codecov - if [[ $TRAVIS_PYTHON_VERSION < '3.4' ]]; then pip install -U unittest2; fi - - python setup.py build_ext --inplace -script: make PYTHON=python check +script: + - if [[ $TRAVIS_PYTHON_VERSION = '3.7' ]]; then make PYTHON=python coverage; else make PYTHON=python check; fi + +after_success: + - if [[ $TRAVIS_PYTHON_VERSION = '3.7' ]]; then codecov; fi ===================================== Makefile ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/make -f # -*- coding: utf-8 -*- -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # @@ -26,7 +26,7 @@ TEST_DATSET = tests/MER_LRC_2PTGMV20000620_104318_00000104X000_00000_00000_0001. EPRAPIROOT = ../epr-api .PHONY: default ext cythonize sdist eprsrc fullsdist doc clean distclean \ - check debug data upload manylinux + check debug data upload manylinux coverage ext-coverage coverage-report default: ext @@ -71,6 +71,10 @@ clean: $(MAKE) -C doc clean $(RM) -r doc/_build find . -name '*~' -delete + $(RM) *.c *.o *.html .coverage coverage.xml + $(RM) src/epr.html + $(RM) -r htmlcov + $(RM) epr.p* # workaround for Cython.Coverage bug #1985 distclean: clean $(RM) $(TEST_DATSET) @@ -78,9 +82,26 @@ distclean: clean $(RM) -r LICENSES epr-api-src $(MAKE) -C tests -f checksetup.mak distclean -check: ext $(TEST_DATSET) +check: ext data env PYTHONPATH=. 
$(PYTHON) tests/test_all.py --verbose +ext-coverage: src/epr.pyx + env PYEPR_COVERAGE=TRUE $(PYTHON) setup.py build_ext --inplace + +coverage: clean ext-coverage data + ln -s src/epr.p* . # workaround for Cython.Coverage bug #1985 + env PYEPR_COVERAGE=TRUE PYTHONPATH=. \ + $(PYTHON) -m coverage run --branch --source=src setup.py test + env PYTHONPATH=. $(PYTHON) -m coverage report + +coverage-report: coverage + env PYTHONPATH=. $(PYTHON) -m coverage xml -i + env PYTHONPATH=. $(PYTHON) -m cython -E CYTHON_TRACE_NOGIL=1 \ + -X linetrace=True -X language_level=3str \ + --annotate-coverage coverage.xml src/epr.pyx + env PYTHONPATH=. $(PYTHON) -m coverage html -i + cp src/epr.html htmlcov + debug: $(PYTHON) setup.py build_ext --inplace --debug @@ -92,5 +113,5 @@ $(TEST_DATSET): manylinux: # make fullsdist - # docker pull quay.io/pypa/manylinux1_x86_64 - docker run --rm -v $(shell pwd):/io quay.io/pypa/manylinux1_x86_64 sh /io/build-manylinux-wheels.sh + # docker pull quay.io/pypa/manylinux2010_x86_64 + docker run --rm -v $(shell pwd):/io quay.io/pypa/manylinux2010_x86_64 sh /io/build-manylinux-wheels.sh ===================================== README.rst ===================================== @@ -2,11 +2,43 @@ ENVISAT Product Reader Python API ================================= -:HomePage: http://avalentino.github.io/pyepr +:HomePage: https://avalentino.github.io/pyepr :Author: Antonio Valentino :Contact: antonio.valentino at tiscali.it -:Copyright: 2011-2018, Antonio Valentino -:Version: 0.9.5 +:Copyright: 2011-2019, Antonio Valentino +:Version: 1.0.0 + +.. image:: https://travis-ci.org/avalentino/pyepr.svg?branch=master + :alt: Travis-CI status page + :target: https://travis-ci.org/avalentino/pyepr + +.. image:: https://ci.appveyor.com/api/projects/status/github/avalentino/pyepr?branch=master&svg=true + :alt: AppVeyor status page + :target: https://ci.appveyor.com/project/avalentino/pyepr + +.. image:: https://img.shields.io/pypi/v/pyepr + :alt: Latest Version + :target: https://pypi.org/project/pyepr + +.. image:: https://img.shields.io/pypi/pyversions/pyepr + :alt: Supported Python versions + :target: https://pypi.org/project/pyepr + +.. image:: https://img.shields.io/pypi/l/pyepr + :alt: License + :target: https://pypi.org/project/pyepr + +.. image:: https://img.shields.io/pypi/wheel/pyepr + :alt: Wheel Status + :target: https://pypi.org/project/pyepr + +.. image:: https://readthedocs.org/projects/pyepr/badge + :alt: Documentation Status + :target: https://pyepr.readthedocs.io/en/latest + +.. image:: https://codecov.io/gh/avalentino/pyepr/branch/master/graph/badge.svg + :alt: Coverage Status + :target: https://codecov.io/gh/avalentino/pyepr Introduction @@ -36,28 +68,28 @@ In order to use PyEPR it is needed that the following software are correctly installed and configured: * Python2_ >= 2.6 or Python3_ >= 3.1 (including PyPy_) -* numpy_ >= 1.5.0 +* numpy_ >= 1.7.0 * `EPR API`_ >= 2.2 (optional, since PyEPR 0.7 the source tar-ball comes - with a copy of the PER C API sources) + with a copy of the EPR C API sources) * a reasonably updated C compiler (build only) -* Cython_ >= 0.15 (build only) +* Cython_ >= 0.19 (build only) * unittest2_ (only required for Python < 3.4) .. _Python2: Python_ .. _Python3: Python_ -.. _PyPy: http://pypy.org -.. _numpy: http://www.numpy.org -.. _gcc: http://gcc.gnu.org -.. _Cython: http://cython.org -.. _unittest2: https://pypi.python.org/pypi/unittest2 +.. _PyPy: https://pypy.org +.. _numpy: https://www.numpy.org +.. _gcc: https://gcc.gnu.org +.. 
_Cython: https://cython.org +.. _unittest2: https://pypi.org/project/unittest2 Download ======== -Official source tarballs can be downloaded form PyPi_: +Official source tar-balls can be downloaded form PyPi_: - https://pypi.python.org/pypi/pyepr + https://pypi.org/project/pyepr The source code of the development versions is available on the GitHub_ project page @@ -68,9 +100,9 @@ To clone the git_ repository the following command can be used:: $ git clone https://github.com/avalentino/pyepr.git -.. _PyPi: https://pypi.python.org/pypi +.. _PyPi: https://pypi.org .. _GitHub: https://github.com -.. _git: http://git-scm.com +.. _git: https://git-scm.com Installation @@ -100,7 +132,7 @@ To install PyEPR_ in a non-standard path:: License ======= -Copyright (C) 2011-2018 Antonio Valentino +Copyright (C) 2011-2019 Antonio Valentino PyEPR is free software: you can redistribute it and/or modify it under the terms of the `GNU General Public License`_ as published by ===================================== appveyor.yml ===================================== @@ -6,59 +6,34 @@ environment: global: - PYTHON: "C:\\conda" - MINICONDA_VERSION: "latest" CMD_IN_ENV: "cmd /E:ON /V:ON /C .\\ci-helpers\\appveyor\\windows_sdk.cmd" - # PYTHON_ARCH: "64" # needs to be set for CMD_IN_ENV to succeed. If a mix - # of 32 bit and 64 bit builds are needed, move this - # to the matrix section. + PYTHON_ARCH: "64" # needs to be set for CMD_IN_ENV to succeed. If a mix + # of 32 bit and 64 bit builds are needed, move this + # to the matrix section. CONDA_DEPENDENCIES: "setuptools numpy Cython unittest2" # DEBUG: True # NUMPY_VERSION: "stable" - matrix: - - platform: x86 + - PYTHON: "C:\\Miniconda-x64" PYTHON_VERSION: "2.7" - PYTHON_ARCH: "32" - - - PYTHON_VERSION: "2.7" - PYTHON_ARCH: "64" - - - platform: x86 - PYTHON_VERSION: "3.4" - PYTHON_ARCH: "32" - - - PYTHON_VERSION: "3.4" - PYTHON_ARCH: "64" - - platform: x86 - PYTHON_VERSION: "3.5" - PYTHON_ARCH: "32" - - - PYTHON_VERSION: "3.5" - PYTHON_ARCH: "64" - - - platform: x86 + - PYTHON: "C:\\Miniconda36-x64" PYTHON_VERSION: "3.6" - PYTHON_ARCH: "32" - - PYTHON_VERSION: "3.6" - PYTHON_ARCH: "64" - - - platform: x86 + - PYTHON: "C:\\Miniconda37-x64" PYTHON_VERSION: "3.7" - PYTHON_ARCH: "32" - - PYTHON_VERSION: "3.7" - PYTHON_ARCH: "64" +platform: + -x64 install: - # conda + # Set up ci-helpers - "git clone git://github.com/astropy/ci-helpers.git" - "powershell ci-helpers/appveyor/install-miniconda.ps1" - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%" - "activate test" + # epr-api - "git clone -b pyepr https://github.com/avalentino/epr-api.git" ===================================== doc/Makefile ===================================== @@ -1,10 +1,10 @@ # Minimal makefile for Sphinx documentation # -# You can set these variables from the command line. -SPHINXOPTS = -SPHINXBUILD = sphinx-build -SPHINXPROJ = PyEPR +# You can set these variables from the command line, and also +# from the environment for the first two. +SPHINXOPTS ?= +SPHINXBUILD ?= sphinx-build SOURCEDIR = . BUILDDIR = _build ===================================== doc/NEWS.rst ===================================== @@ -1,11 +1,23 @@ Change history ============== +PyEPR 1.0.0 (08/09/2019) +------------------------ + +* Do not use deprecated numpy_ API (requires Cython_ >= 0.29) +* Minimal numpy_ version is now v1.7 +* Set cython_ 'language_level` explicitly to '3str' if cython_ >= v0.29, + to '2' otherwise +* Python v2.6, v3.2, v3.3 and v3.4 are now deprecated. 
+ Support for the deprecated Python version will be removed in future + releases of PyEPR + + PyEPR 0.9.5 (23/08/2018) ------------------------ -* Fix compatibility with numpy >= 1.14: :func:`np.fromstring` - is deprecated. +* Fix compatibility with numpy_ >= 1.14: :func:`np.fromstring` + is deprecated * Update the pypi sidebar in the documentation * Use `.rst` extension for doc source files * Fix setup script to not use system libs if epr-api sources are available @@ -124,8 +136,8 @@ PyEPR 0.9 (27/02/2015) .. _pip: https://pip.pypa.io .. _setuptools: https://bitbucket.org/pypa/setuptools -.. _numpy: http://www.numpy.org -.. _Windows: http://windows.microsoft.com +.. _numpy: https://www.numpy.org +.. _Windows: https://windows.microsoft.com .. _AppVeyor: https://www.appveyor.com .. _PyPI: https://pypi.org/project/pyepr @@ -279,7 +291,7 @@ PyEPR 0.5 (25/04/2011) .. _`Python 3`: https://docs.python.org/3 .. _intersphinx: http://www.sphinx-doc.org/en/master/ext/intersphinx.html -.. _cython: http://cython.org +.. _cython: https://cython.org PyEPR 0.4 (10/04/2011) ===================================== doc/_templates/appveyor.html ===================================== @@ -1,5 +1,7 @@
[appveyor.html badge markup: the HTML tags of this hunk were stripped in the archive rendering; only the alt text "AppVeyor status page" survives. The hunk reflows the badge link (5 lines -> 7 lines).]

===================================== doc/_templates/codecov.html =====================================
@@ -0,0 +1,7 @@
[new template; HTML tags stripped in the archive rendering; it adds a "codecov status" badge link.]

===================================== doc/_templates/ohloh.html =====================================
@@ -1,3 +1,4 @@
[HTML tags stripped in the archive rendering; small markup change only (3 lines -> 4 lines).]

===================================== doc/_templates/pypi.html =====================================
@@ -1,14 +1,22 @@
[HTML tags stripped in the archive rendering; the hunk reflows the "Latest Version", "Supported Python versions", "License" and "Wheel Status" badge links.]

===================================== doc/_templates/readthedocs.html =====================================
@@ -0,0 +1,7 @@
[new template; HTML tags stripped in the archive rendering; it adds a "readthedocs status" badge link.]

===================================== doc/_templates/travis-ci.html =====================================
@@ -1,5 +1,7 @@
[HTML tags stripped in the archive rendering; the hunk reflows the "travis-ci status page" badge link.]

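The doc/conf.py hunk that follows derives both the short X.Y version and the full release string for Sphinx by scanning src/epr.pyx for its __version__ assignment. The sketch below is illustrative only and is not part of the quoted patch; it assumes the same __version__ = '...' convention used in that hunk, while the helper name and the example version string are made up for the demonstration.

# Sketch only: pull a version string such as  __version__ = '1.0.0'  out of a
# source file and derive both a short X.Y.Z form and the full release string,
# the same idea used by get_version() in the doc/conf.py hunk below.
import re
from distutils.version import LooseVersion

def parse_version(filename):
    with open(filename) as fd:
        text = fd.read()
    mobj = re.search(r"^__version__\s*=\s*'([^']+)'", text, re.MULTILINE)
    full = LooseVersion(mobj.group(1))
    short = '.'.join(map(str, full.version[:3]))
    return short, full.vstring

# For __version__ = '1.0.0.dev0' this returns ('1.0.0', '1.0.0.dev0').
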
===================================== doc/conf.py ===================================== @@ -1,17 +1,12 @@ -#!/usr/bin/env python3 # -*- coding: utf-8 -*- # -# PyEPR documentation build configuration file, created by -# sphinx-quickstart on Sun Apr 29 18:26:52 2018. +# Configuration file for the Sphinx documentation builder. # -# This file is execfile()d with the current directory set to its -# containing dir. -# -# Note that not all possible configuration values are present in this -# autogenerated file. -# -# All configuration values have a default; values that are commented out -# serve to show the default. +# This file only contains a selection of the most common options. For a full +# list see the documentation: +# https://www.sphinx-doc.org/en/master/usage/configuration.html + +# -- Path setup -------------------------------------------------------------- # If extensions (or modules to document with autodoc) are in another directory, # add these directories to sys.path here. If the directory is relative to the @@ -21,72 +16,75 @@ import os import sys sys.path.insert(0, os.path.abspath('sphinxext')) +# -- Project information ----------------------------------------------------- + +project = 'PyEPR' +copyright = '2011-2019, Antonio Valentino' +author = 'Antonio Valentino' + +def get_version(filename='../src/epr.pyx', release=False): + import re + from distutils.version import LooseVersion + + s = open(filename).read() + mobj = re.search("^__version__ = '(?P.*)'$", s, re.MULTILINE) + mobj.group('version') + + v = LooseVersion(mobj.group('version')) + + if release: + return v.vstring + else: + return '.'.join(map(str, v.version[:3])) -# -- General configuration ------------------------------------------------ +# The short X.Y version. +version = get_version() -# If your documentation needs a minimal Sphinx version, state it here. +# The full version, including alpha/beta/rc tags. +release = get_version(release=True) -needs_sphinx = '1.0' +# -- General configuration --------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ # 'sphinx.ext.autodoc', + # 'sphinx.ext.autosectionlabel', # 'sphinx.ext.autosummary', + # 'sphinx.ext.coverage', # 'sphinx.ext.doctest', + 'sphinx.ext.extlinks', + # 'sphinx.ext.githubpages', + # 'sphinx.ext.graphviz', + 'sphinx.ext.ifconfig', + # 'sphinx.ext.imgconverter', + # 'sphinx.ext.inheritance_diagram', 'sphinx.ext.intersphinx', + # 'sphinx.ext.linkcode', + # 'sphinx.ext.napoleon', 'sphinx.ext.todo', - # 'sphinx.ext.coverage', + 'sphinx.ext.viewcode', + + # Math support for HTML outputs in Sphinx 'sphinx.ext.imgmath', - # 'sphinx.ext.jsmath', # 'sphinx.ext.mathjax', - # 'sphinx.ext.graphviz', - # 'sphinx.ext.inheritance_diagram', - # 'sphinx.ext.refcounting', - 'sphinx.ext.ifconfig', - 'sphinx.ext.viewcode', - # 'sphinx.ext.githubpages', - 'sphinx.ext.extlinks', + # 'sphinx.ext.jsmath', + + # Additional extensions 'ipython_console_highlighting', + # 'IPython.sphinxext.ipython_console_highlighting', ] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] -# The suffix(es) of source filenames. -# You can specify multiple suffix as a list of string: -# -# source_suffix = ['.rst', '.md'] -source_suffix = '.rst' - # The master toctree document. master_doc = 'index' -# General information about the project. 
-project = u'PyEPR' -copyright = u'2011-2018, Antonio Valentino' -author = u'Antonio Valentino' - -# The version info for the project you're documenting, acts as replacement for -# |version| and |release|, also used in various other places throughout the -# built documents. -# -# The short X.Y version. -version = '0.9.5' -# The full version, including alpha/beta/rc tags. -release = version + '.dev0' - -# The language for content autogenerated by Sphinx. Refer to documentation -# for a list of supported languages. -# -# This is also used if you do content translation via gettext catalogs. -# Usually you set "language" from the command line for these cases. -language = None - # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. -# This patterns also effect to html_static_path and html_extra_path +# This pattern also affects html_static_path and html_extra_path. exclude_patterns = [ '_build', 'Thumbs.db', @@ -98,8 +96,7 @@ exclude_patterns = [ # The name of the Pygments (syntax highlighting) style to use. pygments_style = 'sphinx' - -# -- Options for HTML output ---------------------------------------------- +# -- Options for HTML output ------------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. @@ -143,19 +140,19 @@ html_sidebars = { 'pypi.html', 'travis-ci.html', 'appveyor.html', + 'readthedocs.html', + 'codecov.html', ], } # If false, no module index is generated. html_domain_indices = False - # -- Options for HTMLHelp output ------------------------------------------ # Output file base name for HTML help builder. htmlhelp_basename = 'PyEPRdoc' - # -- Options for LaTeX output --------------------------------------------- latex_elements = { @@ -187,7 +184,6 @@ latex_documents = [ # If false, no module index is generated. latex_domain_indices = False - # -- Options for manual page output --------------------------------------- # One entry per manual page. List of tuples @@ -197,7 +193,6 @@ man_pages = [ [author], 1) ] - # -- Options for Texinfo output ------------------------------------------- # Grouping the document tree into Texinfo files. List of tuples @@ -209,31 +204,23 @@ texinfo_documents = [ 'Miscellaneous'), ] - # -- Options for Epub output ---------------------------------------------- -# Bibliographic Dublin Core info. -epub_title = project -epub_author = author -epub_publisher = author -epub_copyright = copyright - -# The unique identifier of the text. This can be a ISBN number -# or the project homepage. -# -# epub_identifier = '' - -# A unique identification for the text. -# -# epub_uid = '' - # A list of files that should not be packed into the epub file. epub_exclude_files = ['search.html'] -# -- Extensions configuration -------------------------------------------------- +# -- Extension configuration ------------------------------------------------- + +# -- Options for intersphinx extension --------------------------------------- + +# Example configuration for intersphinx: refer to the Python standard library. 
+intersphinx_mapping = { + 'python': ('https://docs.python.org/3', None), + 'numpy': ('https://docs.scipy.org/doc/numpy', None), +} -# Autodoc configuration +# -- Options for autodoc extension ------------------------------------------- #autoclass_content = 'both' #autodoc_default_flags = ['members', 'undoc-members', 'show-inheritance'] # #,'inherited-members'] @@ -241,18 +228,13 @@ epub_exclude_files = ['search.html'] # Auto summary generation #autosummary_generate = ['reference'] - +# -- Options for extlinks extension ------------------------------------------ # External links configuration extlinks = { 'issue': ('https://github.com/avalentino/pyepr/issues/%s', 'gh-'), } -# Example configuration for intersphinx: refer to the Python standard library. -intersphinx_mapping = { - 'python': ('https://docs.python.org/3', None), - 'numpy': ('https://docs.scipy.org/doc/numpy', None), -} +# -- Options for todo extension ---------------------------------------------- # If true, `todo` and `todoList` produce output, else they produce nothing. todo_include_todos = True - ===================================== doc/gdal_export_example.rst ===================================== @@ -225,7 +225,7 @@ Complete listing :language: python -.. _GDAL: https://www.gdal.org +.. _GDAL: https://gdal.org .. _PyEPR: https://github.com/avalentino/pyepr .. _ENVISAT: https://envisat.esa.int ===================================== doc/index.rst ===================================== @@ -12,7 +12,7 @@ ENVISAT Product Reader Python API :HomePage: http://avalentino.github.io/pyepr :Author: Antonio Valentino :Contact: antonio.valentino at tiscali.it -:Copyright: 2011-2018, Antonio Valentino +:Copyright: 2011-2019, Antonio Valentino :Version: |release| @@ -60,7 +60,8 @@ ENVISAT Product Reader Python API Online documentation for other PyEpr_ versions: * `latest `_ development - * `0.9.5 `_ (latest stable) + * `1.0.0 `_ (latest stable) + * `0.9.5 `_ * `0.9.4 `_ * `0.9.3 `_ * `0.9.2 `_ @@ -80,7 +81,7 @@ License .. index:: license -Copyright (C) 2011-2018 Antonio Valentino +Copyright (C) 2011-2019 Antonio Valentino PyEPR is free software: you can redistribute it and/or modify it under the terms of the `GNU General Public License`_ as published by ===================================== doc/interactive_use.rst ===================================== @@ -20,7 +20,7 @@ ESA_ web site. .. _PyEPR: https://github.com/avalentino/pyepr .. _ENVISAT: https://envisat.esa.int .. _ASAR: https://earth.esa.int/handbooks/asar/CNTR.html -.. _Jupyter: http://jupyter.org/ +.. _Jupyter: https://jupyter.org/ .. _matplotlib: https://matplotlib.org .. _`free sample`: https://earth.esa.int/services/sample_products/asar/IMP/ASA_IMP_1PNUPA20060202_062233_000000152044_00435_20529_3110.N1.gz .. _ESA: https://earth.esa.int @@ -43,7 +43,7 @@ available classes and functions:: Jupyter console 5.2.0 - Python 3.6.5 (default, Apr 1 2018, 05:46:30) + Python 3.6.5 (default, Apr 1 2018, 05:46:30) Type "copyright", "credits" or "license" for more information. IPython 5.5.0 -- An enhanced Interactive Python. @@ -81,7 +81,7 @@ available classes and functions:: .. _ESA: https://earth.esa.int In [3]: epr.__version__, epr.EPR_C_API_VERSION - Out[3]: ('0.9.1', '2.3dev') + Out[3]: ('1.0.0', '2.3dev') .. index:: __version__ ===================================== doc/make.bat ===================================== @@ -9,7 +9,6 @@ if "%SPHINXBUILD%" == "" ( ) set SOURCEDIR=. 
set BUILDDIR=_build -set SPHINXPROJ=PyEPR if "%1" == "" goto help @@ -26,11 +25,11 @@ if errorlevel 9009 ( exit /b 1 ) -%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% +%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% goto end :help -%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% +%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O% :end popd ===================================== doc/reference.rst ===================================== @@ -192,7 +192,6 @@ Product "(", ")", "NOT", "AND", "OR". Valid bit-mask expression are for example ``flags.LAND OR flags.CLOUD`` or ``NOT flags.WATER AND flags.TURBID_S`` - :param xoffset: across-track co-ordinate in pixel co-ordinates (zero-based) of the upper right corner of the source-region @@ -208,7 +207,7 @@ Product .. seealso:: :func:`create_bitmask_raster`. - .. method:: close + .. method:: close Closes the :class:`Product` product and free the underlying file descriptor. @@ -221,7 +220,7 @@ Product once; only the first call, however, will have an effect. - .. method:: flush() + .. method:: flush() Flush the file stream. @@ -1350,7 +1349,7 @@ EPRError :param message: error message - :pram code: + :param code: EPR error code ===================================== doc/sphinxext/ipython_console_highlighting.py ===================================== @@ -1,114 +1,543 @@ -"""reST directive for syntax-highlighting ipython interactive sessions. - -XXX - See what improvements can be made based on the new (as of Sept 2009) -'pycon' lexer for the python console. At the very least it will give better -highlighted tracebacks. +### IPython/lib/lexers.py #################################################### +# -*- coding: utf-8 -*- """ +Defines a variety of Pygments lexers for highlighting IPython code. + +This includes: + + IPythonLexer, IPython3Lexer + Lexers for pure IPython (python + magic/shell commands) + + IPythonPartialTracebackLexer, IPythonTracebackLexer + Supports 2.x and 3.x via keyword `python3`. The partial traceback + lexer reads everything but the Python code appearing in a traceback. + The full lexer combines the partial lexer with an IPython lexer. + IPythonConsoleLexer + A lexer for IPython console sessions, with support for tracebacks. + + IPyLexer + A friendly lexer which examines the first line of text and from it, + decides whether to use an IPython lexer or an IPython console lexer. + This is probably the only lexer that needs to be explicitly added + to Pygments. + +""" +#----------------------------------------------------------------------------- +# Copyright (c) 2013, the IPython Development Team. +# +# Distributed under the terms of the Modified BSD License. +# +# The full license is in the file COPYING.txt, distributed with this software. 
#----------------------------------------------------------------------------- -# Needed modules # Standard library import re # Third party -from pygments.lexer import Lexer, do_insertions -from pygments.lexers.agile import (PythonConsoleLexer, PythonLexer, - PythonTracebackLexer) -from pygments.token import Comment, Generic +from pygments.lexers import BashLexer, PythonLexer, Python3Lexer +from pygments.lexer import ( + Lexer, DelegatingLexer, RegexLexer, do_insertions, bygroups, using, +) +from pygments.token import ( + Generic, Keyword, Literal, Name, Operator, Other, Text, Error, +) +from pygments.util import get_bool_opt -from sphinx import highlighting +# Local -#----------------------------------------------------------------------------- -# Global constants line_re = re.compile('.*?\n') -#----------------------------------------------------------------------------- -# Code begins - classes and functions +__all__ = ['build_ipy_lexer', 'IPython3Lexer', 'IPythonLexer', + 'IPythonPartialTracebackLexer', 'IPythonTracebackLexer', + 'IPythonConsoleLexer', 'IPyLexer'] + +ipython_tokens = [ + (r"(?s)(\s*)(%%)(\w+)(.*)", bygroups(Text, Operator, Keyword, Text)), + (r'(?s)(^\s*)(%%!)([^\n]*\n)(.*)', bygroups(Text, Operator, Text, using(BashLexer))), + (r"(%%?)(\w+)(\?\??)$", bygroups(Operator, Keyword, Operator)), + (r"\b(\?\??)(\s*)$", bygroups(Operator, Text)), + (r'(%)(sx|sc|system)(.*)(\n)', bygroups(Operator, Keyword, + using(BashLexer), Text)), + (r'(%)(\w+)(.*\n)', bygroups(Operator, Keyword, Text)), + (r'^(!!)(.+)(\n)', bygroups(Operator, using(BashLexer), Text)), + (r'(!)(?!=)(.+)(\n)', bygroups(Operator, using(BashLexer), Text)), + (r'^(\s*)(\?\??)(\s*%{0,2}[\w\.\*]*)', bygroups(Text, Operator, Text)), + (r'(\s*%{0,2}[\w\.\*]*)(\?\??)(\s*)$', bygroups(Text, Operator, Text)), +] + +def build_ipy_lexer(python3): + """Builds IPython lexers depending on the value of `python3`. + + The lexer inherits from an appropriate Python lexer and then adds + information about IPython specific keywords (i.e. magic commands, + shell commands, etc.) + + Parameters + ---------- + python3 : bool + If `True`, then build an IPython lexer from a Python 3 lexer. + + """ + # It would be nice to have a single IPython lexer class which takes + # a boolean `python3`. But since there are two Python lexer classes, + # we will also have two IPython lexer classes. + if python3: + PyLexer = Python3Lexer + name = 'IPython3' + aliases = ['ipython3'] + doc = """IPython3 Lexer""" + else: + PyLexer = PythonLexer + name = 'IPython' + aliases = ['ipython2', 'ipython'] + doc = """IPython Lexer""" + + tokens = PyLexer.tokens.copy() + tokens['root'] = ipython_tokens + tokens['root'] + + attrs = {'name': name, 'aliases': aliases, 'filenames': [], + '__doc__': doc, 'tokens': tokens} + + return type(name, (PyLexer,), attrs) + + +IPython3Lexer = build_ipy_lexer(python3=True) +IPythonLexer = build_ipy_lexer(python3=False) + + +class IPythonPartialTracebackLexer(RegexLexer): + """ + Partial lexer for IPython tracebacks. + + Handles all the non-python output. This works for both Python 2.x and 3.x. + + """ + name = 'IPython Partial Traceback' + + tokens = { + 'root': [ + # Tracebacks for syntax errors have a different style. + # For both types of tracebacks, we mark the first line with + # Generic.Traceback. For syntax errors, we mark the filename + # as we mark the filenames for non-syntax tracebacks. + # + # These two regexps define how IPythonConsoleLexer finds a + # traceback. 
+ # + ## Non-syntax traceback + (r'^(\^C)?(-+\n)', bygroups(Error, Generic.Traceback)), + ## Syntax traceback + (r'^( File)(.*)(, line )(\d+\n)', + bygroups(Generic.Traceback, Name.Namespace, + Generic.Traceback, Literal.Number.Integer)), + + # (Exception Identifier)(Whitespace)(Traceback Message) + (r'(?u)(^[^\d\W]\w*)(\s*)(Traceback.*?\n)', + bygroups(Name.Exception, Generic.Whitespace, Text)), + # (Module/Filename)(Text)(Callee)(Function Signature) + # Better options for callee and function signature? + (r'(.*)( in )(.*)(\(.*\)\n)', + bygroups(Name.Namespace, Text, Name.Entity, Name.Tag)), + # Regular line: (Whitespace)(Line Number)(Python Code) + (r'(\s*?)(\d+)(.*?\n)', + bygroups(Generic.Whitespace, Literal.Number.Integer, Other)), + # Emphasized line: (Arrow)(Line Number)(Python Code) + # Using Exception token so arrow color matches the Exception. + (r'(-*>?\s?)(\d+)(.*?\n)', + bygroups(Name.Exception, Literal.Number.Integer, Other)), + # (Exception Identifier)(Message) + (r'(?u)(^[^\d\W]\w*)(:.*?\n)', + bygroups(Name.Exception, Text)), + # Tag everything else as Other, will be handled later. + (r'.*\n', Other), + ], + } + + +class IPythonTracebackLexer(DelegatingLexer): + """ + IPython traceback lexer. + + For doctests, the tracebacks can be snipped as much as desired with the + exception to the lines that designate a traceback. For non-syntax error + tracebacks, this is the line of hyphens. For syntax error tracebacks, + this is the line which lists the File and line number. + + """ + # The lexer inherits from DelegatingLexer. The "root" lexer is an + # appropriate IPython lexer, which depends on the value of the boolean + # `python3`. First, we parse with the partial IPython traceback lexer. + # Then, any code marked with the "Other" token is delegated to the root + # lexer. + # + name = 'IPython Traceback' + aliases = ['ipythontb'] + + def __init__(self, **options): + self.python3 = get_bool_opt(options, 'python3', False) + if self.python3: + self.aliases = ['ipython3tb'] + else: + self.aliases = ['ipython2tb', 'ipythontb'] + + if self.python3: + IPyLexer = IPython3Lexer + else: + IPyLexer = IPythonLexer + + DelegatingLexer.__init__(self, IPyLexer, + IPythonPartialTracebackLexer, **options) class IPythonConsoleLexer(Lexer): """ - For IPython console output or doctests, such as: + An IPython console lexer for IPython code-blocks and doctests, such as: - .. sourcecode:: ipython + .. code-block:: rst - In [1]: a = 'foo' + .. code-block:: ipythonconsole - In [2]: a - Out[2]: 'foo' + In [1]: a = 'foo' - In [3]: print a - foo + In [2]: a + Out[2]: 'foo' - In [4]: 1 / 0 + In [3]: print a + foo - Notes: + In [4]: 1 / 0 - - Tracebacks are not currently supported. - - It assumes the default IPython prompts, not customized ones. + Support is also provided for IPython exceptions: + + .. code-block:: rst + + .. 
code-block:: ipythonconsole + + In [1]: raise Exception + + --------------------------------------------------------------------------- + Exception Traceback (most recent call last) + in () + ----> 1 raise Exception + + Exception: + """ - name = 'IPython console session' - aliases = ['ipython'] + aliases = ['ipythonconsole'] mimetypes = ['text/x-ipython-console'] - input_prompt = re.compile("(In \[[0-9]+\]: )|( \.\.\.+:)") - output_prompt = re.compile("(Out\[[0-9]+\]: )|( \.\.\.+:)") - continue_prompt = re.compile(" \.\.\.+:") - tb_start = re.compile("\-+") - def get_tokens_unprocessed(self, text): - pylexer = PythonLexer(**self.options) - tblexer = PythonTracebackLexer(**self.options) + # The regexps used to determine what is input and what is output. + # The default prompts for IPython are: + # + # in = 'In [#]: ' + # continuation = ' .D.: ' + # template = 'Out[#]: ' + # + # Where '#' is the 'prompt number' or 'execution count' and 'D' + # D is a number of dots matching the width of the execution count + # + in1_regex = r'In \[[0-9]+\]: ' + in2_regex = r' \.\.+\.: ' + out_regex = r'Out\[[0-9]+\]: ' + + #: The regex to determine when a traceback starts. + ipytb_start = re.compile(r'^(\^C)?(-+\n)|^( File)(.*)(, line )(\d+\n)') + + def __init__(self, **options): + """Initialize the IPython console lexer. + + Parameters + ---------- + python3 : bool + If `True`, then the console inputs are parsed using a Python 3 + lexer. Otherwise, they are parsed using a Python 2 lexer. + in1_regex : RegexObject + The compiled regular expression used to detect the start + of inputs. Although the IPython configuration setting may have a + trailing whitespace, do not include it in the regex. If `None`, + then the default input prompt is assumed. + in2_regex : RegexObject + The compiled regular expression used to detect the continuation + of inputs. Although the IPython configuration setting may have a + trailing whitespace, do not include it in the regex. If `None`, + then the default input prompt is assumed. + out_regex : RegexObject + The compiled regular expression used to detect outputs. If `None`, + then the default output prompt is assumed. + + """ + self.python3 = get_bool_opt(options, 'python3', False) + if self.python3: + self.aliases = ['ipython3console'] + else: + self.aliases = ['ipython2console', 'ipythonconsole'] + + in1_regex = options.get('in1_regex', self.in1_regex) + in2_regex = options.get('in2_regex', self.in2_regex) + out_regex = options.get('out_regex', self.out_regex) + + # So that we can work with input and output prompts which have been + # rstrip'd (possibly by editors) we also need rstrip'd variants. If + # we do not do this, then such prompts will be tagged as 'output'. + # The reason can't just use the rstrip'd variants instead is because + # we want any whitespace associated with the prompt to be inserted + # with the token. This allows formatted code to be modified so as hide + # the appearance of prompts, with the whitespace included. One example + # use of this is in copybutton.js from the standard lib Python docs. + in1_regex_rstrip = in1_regex.rstrip() + '\n' + in2_regex_rstrip = in2_regex.rstrip() + '\n' + out_regex_rstrip = out_regex.rstrip() + '\n' - curcode = '' - insertions = [] + # Compile and save them all. 
+ attrs = ['in1_regex', 'in2_regex', 'out_regex', + 'in1_regex_rstrip', 'in2_regex_rstrip', 'out_regex_rstrip'] + for attr in attrs: + self.__setattr__(attr, re.compile(locals()[attr])) + + Lexer.__init__(self, **options) + + if self.python3: + pylexer = IPython3Lexer + tblexer = IPythonTracebackLexer + else: + pylexer = IPythonLexer + tblexer = IPythonTracebackLexer + + self.pylexer = pylexer(**options) + self.tblexer = tblexer(**options) + + self.reset() + + def reset(self): + self.mode = 'output' + self.index = 0 + self.buffer = u'' + self.insertions = [] + + def buffered_tokens(self): + """ + Generator of unprocessed tokens after doing insertions and before + changing to a new state. + + """ + if self.mode == 'output': + tokens = [(0, Generic.Output, self.buffer)] + elif self.mode == 'input': + tokens = self.pylexer.get_tokens_unprocessed(self.buffer) + else: # traceback + tokens = self.tblexer.get_tokens_unprocessed(self.buffer) + + for i, t, v in do_insertions(self.insertions, tokens): + # All token indexes are relative to the buffer. + yield self.index + i, t, v + + # Clear it all + self.index += len(self.buffer) + self.buffer = u'' + self.insertions = [] + + def get_mci(self, line): + """ + Parses the line and returns a 3-tuple: (mode, code, insertion). + + `mode` is the next mode (or state) of the lexer, and is always equal + to 'input', 'output', or 'tb'. + + `code` is a portion of the line that should be added to the buffer + corresponding to the next mode and eventually lexed by another lexer. + For example, `code` could be Python code if `mode` were 'input'. + + `insertion` is a 3-tuple (index, token, text) representing an + unprocessed "token" that will be inserted into the stream of tokens + that are created from the buffer once we change modes. This is usually + the input or output prompt. + + In general, the next mode depends on current mode and on the contents + of `line`. + + """ + # To reduce the number of regex match checks, we have multiple + # 'if' blocks instead of 'if-elif' blocks. + + # Check for possible end of input + in2_match = self.in2_regex.match(line) + in2_match_rstrip = self.in2_regex_rstrip.match(line) + if (in2_match and in2_match.group().rstrip() == line.rstrip()) or \ + in2_match_rstrip: + end_input = True + else: + end_input = False + if end_input and self.mode != 'tb': + # Only look for an end of input when not in tb mode. + # An ellipsis could appear within the traceback. + mode = 'output' + code = u'' + insertion = (0, Generic.Prompt, line) + return mode, code, insertion + + # Check for output prompt + out_match = self.out_regex.match(line) + out_match_rstrip = self.out_regex_rstrip.match(line) + if out_match or out_match_rstrip: + mode = 'output' + if out_match: + idx = out_match.end() + else: + idx = out_match_rstrip.end() + code = line[idx:] + # Use the 'heading' token for output. We cannot use Generic.Error + # since it would conflict with exceptions. + insertion = (0, Generic.Heading, line[:idx]) + return mode, code, insertion + + + # Check for input or continuation prompt (non stripped version) + in1_match = self.in1_regex.match(line) + if in1_match or (in2_match and self.mode != 'tb'): + # New input or when not in tb, continued input. + # We do not check for continued input when in tb since it is + # allowable to replace a long stack with an ellipsis. 
+ mode = 'input' + if in1_match: + idx = in1_match.end() + else: # in2_match + idx = in2_match.end() + code = line[idx:] + insertion = (0, Generic.Prompt, line[:idx]) + return mode, code, insertion + + # Check for input or continuation prompt (stripped version) + in1_match_rstrip = self.in1_regex_rstrip.match(line) + if in1_match_rstrip or (in2_match_rstrip and self.mode != 'tb'): + # New input or when not in tb, continued input. + # We do not check for continued input when in tb since it is + # allowable to replace a long stack with an ellipsis. + mode = 'input' + if in1_match_rstrip: + idx = in1_match_rstrip.end() + else: # in2_match + idx = in2_match_rstrip.end() + code = line[idx:] + insertion = (0, Generic.Prompt, line[:idx]) + return mode, code, insertion + + # Check for traceback + if self.ipytb_start.match(line): + mode = 'tb' + code = line + insertion = None + return mode, code, insertion + + # All other stuff... + if self.mode in ('input', 'output'): + # We assume all other text is output. Multiline input that + # does not use the continuation marker cannot be detected. + # For example, the 3 in the following is clearly output: + # + # In [1]: print 3 + # 3 + # + # But the following second line is part of the input: + # + # In [2]: while True: + # print True + # + # In both cases, the 2nd line will be 'output'. + # + mode = 'output' + else: + mode = 'tb' + + code = line + insertion = None + + return mode, code, insertion + + def get_tokens_unprocessed(self, text): + self.reset() for match in line_re.finditer(text): line = match.group() - input_prompt = self.input_prompt.match(line) - continue_prompt = self.continue_prompt.match(line.rstrip()) - output_prompt = self.output_prompt.match(line) - if line.startswith("#"): - insertions.append((len(curcode), - [(0, Comment, line)])) - elif input_prompt is not None: - insertions.append((len(curcode), - [(0, Generic.Prompt, input_prompt.group())])) - curcode += line[input_prompt.end():] - elif continue_prompt is not None: - insertions.append((len(curcode), - [(0, Generic.Prompt, continue_prompt.group())])) - curcode += line[continue_prompt.end():] - elif output_prompt is not None: - # Use the 'error' token for output. We should probably make - # our own token, but error is typicaly in a bright color like - # red, so it works fine for our output prompts. - insertions.append((len(curcode), - [(0, Generic.Error, output_prompt.group())])) - curcode += line[output_prompt.end():] - else: - if curcode: - for item in do_insertions(insertions, - pylexer.get_tokens_unprocessed(curcode)): - yield item - curcode = '' - insertions = [] - yield match.start(), Generic.Output, line - if curcode: - for item in do_insertions(insertions, - pylexer.get_tokens_unprocessed(curcode)): - yield item + mode, code, insertion = self.get_mci(line) + + if mode != self.mode: + # Yield buffered tokens before transitioning to new mode. + for token in self.buffered_tokens(): + yield token + self.mode = mode + + if insertion: + self.insertions.append((len(self.buffer), [insertion])) + self.buffer += code + + for token in self.buffered_tokens(): + yield token + +class IPyLexer(Lexer): + """ + Primary lexer for all IPython-like code. + + This is a simple helper lexer. If the first line of the text begins with + "In \[[0-9]+\]:", then the entire text is parsed with an IPython console + lexer. If not, then the entire text is parsed with an IPython lexer. + + The goal is to reduce the number of lexers that are registered + with Pygments. 
+ + """ + name = 'IPy session' + aliases = ['ipy'] + + def __init__(self, **options): + self.python3 = get_bool_opt(options, 'python3', False) + if self.python3: + self.aliases = ['ipy3'] + else: + self.aliases = ['ipy2', 'ipy'] + + Lexer.__init__(self, **options) + self.IPythonLexer = IPythonLexer(**options) + self.IPythonConsoleLexer = IPythonConsoleLexer(**options) + + def get_tokens_unprocessed(self, text): + # Search for the input prompt anywhere...this allows code blocks to + # begin with comments as well. + if re.match(r'.*(In \[[0-9]+\]:)', text.strip(), re.DOTALL): + lex = self.IPythonConsoleLexer + else: + lex = self.IPythonLexer + for token in lex.get_tokens_unprocessed(text): + yield token + + +### IPython/sphinxext/ipython_console_highighting.py ######################### +""" +reST directive for syntax-highlighting ipython interactive sessions. + +""" + +from sphinx import highlighting +# from IPython.lib.lexers import IPyLexer def setup(app): """Setup as a sphinx extension.""" # This is only a lexer, so adding it below to pygments appears sufficient. - # But if somebody knows that the right API usage should be to do that via + # But if somebody knows what the right API usage should be to do that via # sphinx, by all means fix it here. At least having this setup.py # suppresses the sphinx warning we'd get without it. - pass + metadata = {'parallel_read_safe': True, 'parallel_write_safe': True} + return metadata -#----------------------------------------------------------------------------- -# Register the extension as a valid pygments lexer -highlighting.lexers['ipython'] = IPythonConsoleLexer() +# Register the extension as a valid pygments lexer. +# Alternatively, we could register the lexer with pygments instead. This would +# require using setuptools entrypoints: http://pygments.org/docs/plugins + +ipy2 = IPyLexer(python3=False) +ipy3 = IPyLexer(python3=True) + +highlighting.lexers['ipython'] = ipy2 +highlighting.lexers['ipython2'] = ipy2 +highlighting.lexers['ipython3'] = ipy3 ===================================== doc/usermanual.rst ===================================== @@ -63,7 +63,7 @@ In order to use PyEPR it is needed that the following software are correctly installed and configured: * Python2_ >= 2.6 or Python3_ >= 3.1 (including PyPy_) -* numpy_ >= 1.5.0 +* numpy_ >= 1.7.0 * `EPR API`_ >= 2.2 (optional, since PyEPR 0.7 the source tar-ball comes with a copy of the EPR C API sources) * a reasonably updated C compiler [#]_ (build only) @@ -82,9 +82,9 @@ correctly installed and configured: .. _Python2: Python_ .. _Python3: Python_ .. _PyPy: http://pypy.org -.. _numpy: http://www.numpy.org +.. _numpy: https://www.numpy.org .. _gcc: http://gcc.gnu.org -.. _Cython: http://cython.org +.. _Cython: https://cython.org .. _unittest2: https://pypi.org/project/unittest2 ===================================== requirements.txt ===================================== @@ -1,3 +1,3 @@ -numpy>=1.5 +numpy>=1.7 cython>=0.19 unittest2;python_version<"3.4" ===================================== setup.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. 
# @@ -26,12 +26,15 @@ import sys import glob +PYEPR_COVERAGE = False + + def get_version(filename): with open(filename) as fd: data = fd.read() mobj = re.search( - '''^__version__\s*=\s*(?P['"])(?P\d+(\.\d+)*.*)(?P=q)''', + r'''^__version__\s*=\s*(?P['"])(?P\d+(\.\d+)*.*)(?P=q)''', data, re.MULTILINE) return mobj.group('version') @@ -62,10 +65,14 @@ print('HAVE_SETUPTOOLS: {0}'.format(HAVE_SETUPTOOLS)) try: from Cython.Build import cythonize + from Cython import __version__ as CYTHON_VERSION HAVE_CYTHON = True except ImportError: HAVE_CYTHON = False + CYTHON_VERSION = None print('HAVE_CYTHON: {0}'.format(HAVE_CYTHON)) +if HAVE_CYTHON: + print('CYTHON_VERSION: {0}'.format(CYTHON_VERSION)) # @COMPATIBILITY: Extension is an old style class in Python 2 @@ -151,18 +158,44 @@ def get_extension(): sys.argv.remove(arg) break + define_macros = [] + + # @NOTE: uses the CYTHON_VERSION global variable + if HAVE_CYTHON and CYTHON_VERSION >= '0.29': + define_macros.append( + ('NPY_NO_DEPRECATED_API', 'NPY_1_7_API_VERSION'), + ) + ext = PyEprExtension( 'epr', sources=[os.path.join('src', 'epr.pyx')], # libraries=['m'], - # define_macros=[('NPY_NO_DEPRECATED_API', 'NPY_1_7_API_VERSION'),], + define_macros=define_macros, eprsrcdir=eprsrcdir, ) - # @NOTE: uses the HAVE_CYTHON global variable + # @NOTE: uses the HAVE_CYTHON and CYTHON_VERSION global variables if HAVE_CYTHON: - extlist = cythonize([ext]) + if CYTHON_VERSION >= '0.29': + language_level = '3str' + else: + language_level = '2' + print('CYTHON_LANGUAGE_LEVEL: {0}'.format(language_level)) + + compiler_directives = dict( + language_level=language_level, + ) + + if PYEPR_COVERAGE: + compiler_directives['linetrace'] = True + + extlist = cythonize([ext], compiler_directives=compiler_directives) ext = extlist[0] + + if PYEPR_COVERAGE: + ext.define_macros.extend([ + ('CYTHON_TRACE_NOGIL', '1'), + ]) else: ext.convert_pyx_sources_to_lang() @@ -203,12 +236,12 @@ any data field contained in a product file. 
'Operating System :: POSIX', 'Programming Language :: Python', 'Programming Language :: Python :: 2', - 'Programming Language :: Python :: 2.6', + 'Programming Language :: Python :: 2.6', # deprecated 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', - 'Programming Language :: Python :: 3.2', - 'Programming Language :: Python :: 3.3', - 'Programming Language :: Python :: 3.4', + 'Programming Language :: Python :: 3.2', # deprecated + 'Programming Language :: Python :: 3.3', # deprecated + 'Programming Language :: Python :: 3.4', # deprecated 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Programming Language :: Python :: 3.7', @@ -239,8 +272,8 @@ def setup_package(): if HAVE_SETUPTOOLS: config['test_suite'] = get_collector() - config.setdefault('setup_requires', []).append('numpy>=1.5') - config.setdefault('install_requires', []).append('numpy>=1.5') + config.setdefault('setup_requires', []).append('numpy>=1.7') + config.setdefault('install_requires', []).append('numpy>=1.7') if ext.setup_requires_cython: config['setup_requires'].append('cython>=0.19') @@ -248,4 +281,11 @@ def setup_package(): if __name__ == '__main__': + if '--coverage' in sys.argv or 'PYEPR_COVERAGE' in os.environ: + PYEPR_COVERAGE = True + if '--coverage' in sys.argv: + sys.argv.remove('--coverage') + + print('PYEPR_COVERAGE:', PYEPR_COVERAGE) + setup_package() ===================================== src/epr.pxd ===================================== @@ -2,7 +2,7 @@ # PyEPR - Python bindings for ENVISAT Product Reader API # -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # ===================================== src/epr.pyx ===================================== @@ -2,7 +2,7 @@ # PyEPR - Python bindings for ENVISAT Product Reader API # -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # @@ -33,15 +33,15 @@ products. It provides access to the data either on a geophysical The raw data access makes it possible to read any data field contained in a product file. -.. _PyEPR: http://avalentino.github.com/pyepr -.. _Python: http://www.python.org +.. _PyEPR: https://avalentino.github.io/pyepr +.. _Python: https://www.python.org .. _`EPR API`: https://github.com/bcdev/epr-api .. _ENVISAT: http://envisat.esa.int .. 
_ESA: http://earth.esa.int """ -__version__ = '0.9.5' +__version__ = '1.0.0' from libc cimport errno from libc cimport stdio @@ -69,6 +69,7 @@ np.import_array() import os import sys +import atexit from collections import namedtuple import numpy as np @@ -195,6 +196,48 @@ _MODEL_MAP = { } +ctypedef fused T: + np.uint8_t + np.int8_t + np.uint16_t + np.int16_t + np.uint32_t + np.int32_t + np.float32_t + np.float64_t + np.npy_byte + + +cdef const void* _view_to_ptr(T[:] a): + return &a[0] + + +cdef const void* _to_ptr(np.ndarray a, EPR_DataTypeId etype): + cdef const void *p = NULL + if etype == e_tid_uchar: + p = _view_to_ptr[np.uint8_t](a) + elif etype == e_tid_char: + p = _view_to_ptr[np.int8_t](a) + elif etype == e_tid_ushort: + p = _view_to_ptr[np.uint16_t](a) + elif etype == e_tid_short: + p = _view_to_ptr[np.int16_t](a) + elif etype == e_tid_uint: + p = _view_to_ptr[np.uint32_t](a) + elif etype == e_tid_int: + p = _view_to_ptr[np.int32_t](a) + elif etype == e_tid_float: + p = _view_to_ptr[np.float32_t](a) + elif etype == e_tid_double: + p = _view_to_ptr[np.float64_t](a) + elif etype == e_tid_string: + p = _view_to_ptr[np.npy_byte](a) + else: + raise ValueError('unexpected type ID: %d' % etype) + + return p + + class EPRError(Exception): """EPR API error.""" @@ -275,11 +318,11 @@ cdef class _CLib: cdef bytes msg # @TODO: check - #if EPR_C_API_VERSION != '2.2': - # raise ImportError('C library version not supported: "%s"' % - # EPR_C_API_VERSION) + # if EPR_C_API_VERSION != '2.2': + # raise ImportError( + # 'C library version not supported: "%s"' % EPR_C_API_VERSION) - #if epr_init_api(e_log_warning, epr_log_message, NULL): + # if epr_init_api(e_log_warning, epr_log_message, NULL): if epr_init_api(e_log_warning, NULL, NULL): msg = epr_get_last_err_message() epr_clear_err() @@ -388,7 +431,7 @@ cdef class DSD(EprObject): if isinstance(self, Dataset): (self._parent).check_closed_product() else: - #elif isinstance(self, Product): + # elif isinstance(self, Product): (self._parent).check_closed_product() property index: @@ -560,7 +603,6 @@ cdef class Field(EprObject): return offset - def print_(self, ostream=None): """print_(self, ostream=None) @@ -589,9 +631,9 @@ cdef class Field(EprObject): pyepr_check_errors() - #def dump_field(self): - # epr_dump_field(self._ptr) - # pyepr_check_errors() + # def dump_field(self): + # epr_dump_field(self._ptr) + # pyepr_check_errors() def get_unit(self): """get_unit(self) @@ -704,8 +746,8 @@ cdef class Field(EprObject): if index != 0: raise ValueError('invalid index: %d' % index) val = epr_get_field_elem_as_str(self._ptr) - #elif etype == e_tid_spare: - # val = epr_get_field_elem_as_str(self._ptr) + # elif etype == e_tid_spare: + # val = epr_get_field_elem_as_str(self._ptr) elif etype == e_tid_time: if index != 0: raise ValueError('invalid index: %d' % index) @@ -810,15 +852,15 @@ cdef class Field(EprObject): buf = epr_get_field_elem_as_str(self._ptr) if buf is NULL: pyepr_null_ptr_error(msg) - #elif etype == e_tid_unknown: - # pass - #elif etype = e_tid_spare: - # pass + # elif etype == e_tid_unknown: + # pass + # elif etype = e_tid_spare: + # pass else: raise ValueError('invalid field type') out = np.PyArray_SimpleNewFromData(nd, shape, dtype, buf) - #np.PyArray_CLEARFLAG(out, NPY_ARRAY_WRITEABLE) # new in numpy 1.7 + # np.PyArray_CLEARFLAG(out, NPY_ARRAY_WRITEABLE) # new in numpy 1.7 # Make the ndarray keep a reference to this object np.set_array_base(out, self) @@ -837,10 +879,11 @@ cdef class Field(EprObject): cdef long field_offset cdef char* buf cdef 
EPR_DataTypeId etype = epr_get_field_type(self._ptr) + cdef const void* p = NULL dtype = _DTYPE_MAP[etype] - elems = elems.astype(dtype) + elems = np.ascontiguousarray(elems, dtype=dtype) record = self._parent dataset = record._parent @@ -853,16 +896,18 @@ cdef class Field(EprObject): field_offset = index * elemsize file_offset = self._get_offset(absolute=1) buf = self._ptr.elems + field_offset + p = _to_ptr(elems, etype) - cstring.memcpy(buf, elems.data, datasize) + with nogil: + cstring.memcpy(buf, p, datasize) if SWAP_BYTES: elems = elems.byteswap() + p = _to_ptr(elems, etype) with nogil: stdio.fseek(istream, file_offset + field_offset, stdio.SEEK_SET) - ret = stdio.fwrite(elems.data, elemsize, nelems, - product._ptr.istream) + ret = stdio.fwrite(p, elemsize, nelems, product._ptr.istream) if ret != nelems: raise IOError( 'write error: %d of %d bytes written' % (ret, datasize)) @@ -939,7 +984,6 @@ cdef class Field(EprObject): cdef EPR_FieldInfo* info = self._ptr.info return info.tot_size - # --- high level interface ------------------------------------------------ def __repr__(self): return 'epr.Field("%s") %d %s elements' % (self.get_name(), @@ -1021,7 +1065,7 @@ cdef class Field(EprObject): n = epr_get_data_type_size(epr_get_field_type(p1)) if n != 0: n *= epr_get_field_num_elems(p1) - #pyepr_check_errors() + # pyepr_check_errors() if n <= 0: # @TODO: check return True @@ -1053,7 +1097,7 @@ cdef class Field(EprObject): n = epr_get_data_type_size(epr_get_field_type(p1)) if n != 0: n *= epr_get_field_num_elems(p1) - #pyepr_check_errors() + # pyepr_check_errors() if n <= 0: # @TODO: check return False @@ -1127,14 +1171,14 @@ cdef class Record(EprObject): if isinstance(self._parent, Dataset): (self._parent).check_closed_product() else: - #elif isinstance(self._parent, Product): + # elif isinstance(self._parent, Product): (self._parent).check_closed_product() cdef inline _check_write_mode(self): if isinstance(self._parent, Dataset): (self._parent)._check_write_mode() else: - #elif isinstance(self._parent, Product): + # elif isinstance(self._parent, Product): (self._parent)._check_write_mode() cdef inline uint _get_offset(self, bint absolure=0): @@ -1491,7 +1535,7 @@ cdef class Raster(EprObject): """ if (x < 0 or x >= self._ptr.raster_width or - y < 0 or y >= self._ptr.raster_height): + y < 0 or y >= self._ptr.raster_height): raise ValueError('index out of range: x=%d, y=%d' % (x, y)) cdef EPR_EDataTypeId dtype = self._ptr.data_type @@ -2131,7 +2175,6 @@ cdef class Band(EprObject): self.check_closed_product() return self._ptr.magic - property _field_index: """Index or the field (within the dataset) containing the raw data used to create the band's pixel values. @@ -2470,8 +2513,8 @@ cdef class Product(EprObject): """ if self._ptr is not NULL: - #if '+' in self.mode: - # stdio.fflush(self._ptr.istream) + # if '+' in self.mode: + # stdio.fflush(self._ptr.istream) epr_close_product(self._ptr) pyepr_check_errors() self._ptr = NULL @@ -2880,8 +2923,8 @@ cdef class Product(EprObject): return [self.get_band_at(idx) for idx in range(num_bands)] # @TODO: iter on both datasets and bands (??) 
- #def __iter__(self): - # return itertools.chain((self.datasets(), self.bands())) + # def __iter__(self): + # return itertools.chain((self.datasets(), self.bands())) def __repr__(self): return 'epr.Product(%s) %d datasets, %d bands' % (self.id_string, @@ -2940,9 +2983,6 @@ def open(filename, mode='rb'): _EPR_C_LIB = _CLib.__new__(_CLib) -import atexit - - @atexit.register def _close_api(): # ensure that all EprObject(s) are collected before removing the last ===================================== tests/test_all.py ===================================== @@ -1,7 +1,7 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- -# Copyright (C) 2011-2018, Antonio Valentino +# Copyright (C) 2011-2019, Antonio Valentino # # This file is part of PyEPR. # @@ -26,6 +26,7 @@ import gzip import shutil import numbers import operator +import platform import tempfile import functools import contextlib @@ -40,6 +41,8 @@ try: from unittest import skipIf as _skipIf, TestCase as _TestCase if not hasattr(_TestCase, 'subTest'): raise ImportError + if not hasattr(_TestCase, 'assertRaisesRegex'): + raise ImportError except ImportError: import unittest2 as unittest else: @@ -520,9 +523,9 @@ class TestProductHighLevelAPI(unittest.TestCase): # pass def test_repr(self): - pattern = ('epr\.Product\((?P\w+)\) ' - '(?P\d+) datasets, ' - '(?P\d+) bands') + pattern = (r'epr\.Product\((?P\w+)\) ' + r'(?P\d+) datasets, ' + r'(?P\d+) bands') mobj = re.match(pattern, repr(self.product)) self.assertNotEqual(mobj, None) @@ -748,7 +751,7 @@ class TestDatasetHighLevelAPI(unittest.TestCase): self.assertEqual(index, self.dataset.get_num_records()) def test_repr(self): - pattern = 'epr\.Dataset\((?P\w+)\) (?P\d+) records' + pattern = r'epr\.Dataset\((?P\w+)\) (?P\d+) records' mobj = re.match(pattern, repr(self.dataset)) self.assertNotEqual(mobj, None) self.assertEqual(mobj.group('name'), self.dataset.get_name()) @@ -1321,8 +1324,8 @@ class TestBandHighLevelAPI(unittest.TestCase): self.product.close() def test_repr(self): - pattern = ('epr.Band\((?P\w+)\) of ' - 'epr.Product\((?P\w+)\)') + pattern = (r'epr.Band\((?P\w+)\) of ' + r'epr.Product\((?P\w+)\)') for band in self.product.bands(): mobj = re.match(pattern, repr(band)) self.assertNotEqual(mobj, None) @@ -1642,8 +1645,8 @@ class TestRasterHighLevelAPI(unittest.TestCase): self.RASTER_WIDTH, self.RASTER_HEIGHT) def test_repr(self): - pattern = (' (?P\w+) ' - '\((?P\d+)L x (?P\d+)P\)') + pattern = (r' (?P\w+) ' + r'\((?P\d+)L x (?P\d+)P\)') mobj = re.match(pattern, repr(self.raster)) self.assertNotEqual(mobj, None) self.assertEqual(mobj.group('data_type'), @@ -1865,7 +1868,7 @@ class TestMultipleRecordsHighLevelAPI(unittest.TestCase): self.product.close() def test_repr(self): - pattern = ' (?P\d+) fields' + pattern = r' (?P\d+) fields' for record in self.dataset: mobj = re.match(pattern, repr(record)) self.assertNotEqual(mobj, None) @@ -2429,8 +2432,8 @@ class TestFieldHighLevelAPI(unittest.TestCase): self.record = dataset.read_record(0) def test_repr(self): - pattern = ('epr\.Field\("(?P.+)"\) (?P\d+) ' - '(?P\w+) elements') + pattern = (r'epr\.Field\("(?P.+)"\) (?P\d+) ' + r'(?P\w+) elements') for field in self.record: mobj = re.match(pattern, repr(field)) self.assertNotEqual(mobj, None) @@ -2678,7 +2681,7 @@ class TestDsdHighLevelAPI(unittest.TestCase): self.dsd = product.get_dsd_at(0) def test_repr(self): - pattern = 'epr\.DSD\("(?P.+)"\)' + pattern = r'epr\.DSD\("(?P.+)"\)' mobj = re.match(pattern, repr(self.dsd)) self.assertNotEqual(mobj, None) self.assertEqual(mobj.group('name'), 
self.dsd.ds_name) @@ -2854,34 +2857,6 @@ class TestSampleModelFunctions(unittest.TestCase): class TestDirectInstantiation(unittest.TestCase): MSG_PATTERN = '"%s" class cannot be instantiated from Python' - if sys.version_info[:2] >= (3, 2): - # @COMPATIBILITY: python >= 3.2 - pass - elif sys.version_info[:2] in ((2, 7), (3, 1)): - # @COMPATIBILITY: unittest2, python2.7, python3.1 - assertRaisesRegex = unittest.TestCase.assertRaisesRegexp - else: - - # @COMPATIBILITY: python < 2.7 - def assertRaisesRegex(self, expected_exception, expected_regexp, - callable_obj=None, *args, **kwargs): - try: - callable_obj(*args, **kwargs) - except expected_exception as exc_value: - import types - if isinstance(expected_regexp, types.StringTypes): - expected_regexp = re.compile(expected_regexp) - if not expected_regexp.search(str(exc_value)): - raise self.failureException( - '"%s" does not match "%s"' % (expected_regexp.pattern, - str(exc_value))) - else: - if hasattr(expected_exception, '__name__'): - excName = expected_exception.__name__ - else: - excName = str(expected_exception) - raise self.failureException("%s not raised" % excName) - def test_direct_dsd_instantiation(self): pattern = self.MSG_PATTERN % epr.DSD.__name__ self.assertRaisesRegex(TypeError, pattern, epr.DSD) @@ -2915,6 +2890,9 @@ class TestLibVersion(unittest.TestCase): self.assertTrue(isinstance(epr.EPR_C_API_VERSION, str)) +# only PyPy 3 seems to be affected + at unittest.skipIf(platform.python_implementation() == 'PyPy', + 'skip memory leak check on PyPy') @unittest.skipIf(resource is None, '"resource" module not available') class TestMemoryLeaks(unittest.TestCase): # See gh-10 (https://github.com/avalentino/pyepr/issues/10) View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/f7a06a4415b5e15ddcf3845fb1534d79b385589e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/f7a06a4415b5e15ddcf3845fb1534d79b385589e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 21:31:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 20:31:09 +0000 Subject: [Git][debian-gis-team/gdal-grass][master] 2 commits: Disable as-needed linking on Debian too, gcc-9 enables it by default. Message-ID: <5d75650d6cfd5_73482ad95ff622b01261b4@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / gdal-grass Commits: 250b9d31 by Bas Couwenberg at 2019-09-08T20:20:52Z Disable as-needed linking on Debian too, gcc-9 enables it by default. - - - - - 9b702e1a by Bas Couwenberg at 2019-09-08T20:21:52Z Set distribution to unstable. - - - - - 2 changed files: - debian/changelog - debian/rules Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libgdal-grass (2.4.2-3) unstable; urgency=medium + + * Disable as-needed linking on Debian too, gcc-9 enables it by default. + + -- Bas Couwenberg Sun, 08 Sep 2019 22:21:39 +0200 + libgdal-grass (2.4.2-2) unstable; urgency=medium * Update packaging for GRASS 7.8.0. 
===================================== debian/rules ===================================== @@ -7,6 +7,9 @@ # Enable hardening build flags export DEB_BUILD_MAINT_OPTIONS=hardening=+all +# Don't link with as-needed to prevent missing libraries +export DEB_LDFLAGS_MAINT_APPEND=-Wl,--no-as-needed + # Disable PIE on Ubuntu where it's still problematic VENDOR_DERIVES_FROM_UBUNTU ?= $(shell dpkg-vendor --derives-from Ubuntu && echo yes) DISTRIBUTION_RELEASE := $(shell lsb_release -cs) @@ -15,7 +18,6 @@ ifeq ($(VENDOR_DERIVES_FROM_UBUNTU),yes) ifneq (,$(filter $(DISTRIBUTION_RELEASE),xenial bionic)) export DEB_BUILD_MAINT_OPTIONS=hardening=+all,-pie endif - export DEB_LDFLAGS_MAINT_APPEND=-Wl,--no-as-needed endif PKGNAME=$(shell grep Package: debian/control | head -1 | cut -d' ' -f2) View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/c200aa699c82a1534ef2689f899949d7ea7efa39...9b702e1ac0bf38a0dc7e3f7ee387c0a0d25774c9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/c200aa699c82a1534ef2689f899949d7ea7efa39...9b702e1ac0bf38a0dc7e3f7ee387c0a0d25774c9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 21:31:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 20:31:45 +0000 Subject: [Git][debian-gis-team/gdal-grass] Pushed new tag debian/2.4.2-3 Message-ID: <5d756531d5728_73483fbbb23f22dc12635@godard.mail> Bas Couwenberg pushed new tag debian/2.4.2-3 at Debian GIS Project / gdal-grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/tree/debian/2.4.2-3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 21:40:28 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 20:40:28 +0000 Subject: [Git][debian-gis-team/gdal-grass][experimental] 2 commits: Disable as-needed linking on Debian too, gcc-9 enables it by default. Message-ID: <5d75673c55ed7_73483fbbb23f22dc1275d9@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / gdal-grass Commits: 551e548c by Bas Couwenberg at 2019-09-08T20:23:00Z Disable as-needed linking on Debian too, gcc-9 enables it by default. - - - - - 50ec7fba by Bas Couwenberg at 2019-09-08T20:27:39Z Set distribution to experimental. - - - - - 2 changed files: - debian/changelog - debian/rules Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libgdal-grass (3.0.1-1~exp3) experimental; urgency=medium + + * Disable as-needed linking on Debian too, gcc-9 enables it by default. + + -- Bas Couwenberg Sun, 08 Sep 2019 22:27:30 +0200 + libgdal-grass (3.0.1-1~exp2) experimental; urgency=medium * Update packaging for GRASS 7.8.0. 
===================================== debian/rules ===================================== @@ -7,6 +7,9 @@ # Enable hardening build flags export DEB_BUILD_MAINT_OPTIONS=hardening=+all +# Don't link with as-needed to prevent missing libraries +export DEB_LDFLAGS_MAINT_APPEND=-Wl,--no-as-needed + # Disable PIE on Ubuntu where it's still problematic VENDOR_DERIVES_FROM_UBUNTU ?= $(shell dpkg-vendor --derives-from Ubuntu && echo yes) DISTRIBUTION_RELEASE := $(shell lsb_release -cs) @@ -15,7 +18,6 @@ ifeq ($(VENDOR_DERIVES_FROM_UBUNTU),yes) ifneq (,$(filter $(DISTRIBUTION_RELEASE),xenial bionic)) export DEB_BUILD_MAINT_OPTIONS=hardening=+all,-pie endif - export DEB_LDFLAGS_MAINT_APPEND=-Wl,--no-as-needed endif PKGNAME=$(shell grep Package: debian/control | head -1 | cut -d' ' -f2) View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/8d42e1a07ec9413c0046821a25d0b225e92b30fa...50ec7fba528b82240c760531dc1b20acf7a42ffc -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/compare/8d42e1a07ec9413c0046821a25d0b225e92b30fa...50ec7fba528b82240c760531dc1b20acf7a42ffc You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 8 21:40:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 08 Sep 2019 20:40:32 +0000 Subject: [Git][debian-gis-team/gdal-grass] Pushed new tag debian/3.0.1-1_exp3 Message-ID: <5d756740d686f_73483fbbb23f22dc127799@godard.mail> Bas Couwenberg pushed new tag debian/3.0.1-1_exp3 at Debian GIS Project / gdal-grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/tree/debian/3.0.1-1_exp3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 8 21:42:49 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 20:42:49 +0000 Subject: Processing of libgdal-grass_2.4.2-3_source.changes Message-ID: libgdal-grass_2.4.2-3_source.changes uploaded successfully to localhost along with the files: libgdal-grass_2.4.2-3.dsc libgdal-grass_2.4.2-3.debian.tar.xz libgdal-grass_2.4.2-3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 8 21:52:51 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 20:52:51 +0000 Subject: Processing of libgdal-grass_3.0.1-1~exp3_source.changes Message-ID: libgdal-grass_3.0.1-1~exp3_source.changes uploaded successfully to localhost along with the files: libgdal-grass_3.0.1-1~exp3.dsc libgdal-grass_3.0.1-1~exp3.debian.tar.xz libgdal-grass_3.0.1-1~exp3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 8 22:04:22 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 21:04:22 +0000 Subject: libgdal-grass_2.4.2-3_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 08 Sep 2019 22:21:39 +0200 Source: libgdal-grass Architecture: source Version: 2.4.2-3 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: libgdal-grass (2.4.2-3) unstable; urgency=medium . 
* Disable as-needed linking on Debian too, gcc-9 enables it by default. Checksums-Sha1: 90497fd96285111dd7e13e71ecbbe8bd4ca26b6d 2112 libgdal-grass_2.4.2-3.dsc cdc37e93b3c12ab42fb867856459ebe7b9ef9b67 8292 libgdal-grass_2.4.2-3.debian.tar.xz 61563b6ff70e824cb6aefa89d101074d82322a38 15715 libgdal-grass_2.4.2-3_amd64.buildinfo Checksums-Sha256: 8e9012f20c9d10875f0ebb3fda2ca3992c0875fe1754eb0e52545054fae22263 2112 libgdal-grass_2.4.2-3.dsc a3786be39ea7a29022931e9d4af81235747dc75e0ae438bfa7e59f84f9ce0c01 8292 libgdal-grass_2.4.2-3.debian.tar.xz d1cd9fed88cde43f83d15e1d0aa4ed4d65d9f159f37726c2cc2b7a6ce6871d51 15715 libgdal-grass_2.4.2-3_amd64.buildinfo Files: a2eef570e15b6a9ee442ef4b55599607 2112 science optional libgdal-grass_2.4.2-3.dsc 42b265fb34498e41a5c8743c9ac562db 8292 science optional libgdal-grass_2.4.2-3.debian.tar.xz dbafc6e344caf456ff02a7f7b7a96cac 15715 science optional libgdal-grass_2.4.2-3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl11ZPsACgkQZ1DxCuiN SvEWdg/+O369GUPRB0oNXz3EfqG7/ubcLRHI8KigK+EB2Q9poR7n790uvChJfa3U PL4qN557QCyk7JmLxrJ7KDfEj1PyYlenSxFzol94WzZMjcT+OtD3ql+5mEviMw89 VOFOIQYUMq+zJALBRvHmQcDjUzr7fnJiA+DCvre5XN/tG6PbyAmezwI+AJZt5Dor P7fxO2PSbJJXFmE44TtB4saYt+BqtWwusGqppL260R3uiqmtkYnTsCLZqtCyzYfM dnCwCTQy+aEnJCNwl+EWvIPAyFWl5wuMBJJ5zaFYqhiBI1qJ6YsqvTv9LivTsikU uL4cH1r3lgXneak9X6kYBaNeJb9NzhS8Ah24p1wvNaZJIQEMZsB9A8sLbNXHWiu8 qhfxU+6aDUGxMT6SMRvt50e1CjzibMcNKGFm3u+dbGrNJUIAMzQ3TLfA7yiWE0p3 vr4uFnrIVSvH2uChLwpXlCk+f1CjTSTpVo+OAOKGXv+q4fcb01uy2f4YSnPszulf aNxx+XmMUweiaKlpnq0XTqv6A5K7YSp0NgrsN6JYnW3vlssJuWO4W/1tJkZWC86F ftSYigo4PQ5XIWGA+aTxwxc63bGve9kZ1gqkWL4DLds0/lbNUGS5KJ/xFksfrfoW za6h6jJ2jkks/W4d2L/DY6eiMXYL7fWdwDqw/rDne1Bb4g3APRs= =RNO9 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sun Sep 8 22:04:27 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 08 Sep 2019 21:04:27 +0000 Subject: libgdal-grass_3.0.1-1~exp3_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 08 Sep 2019 22:27:30 +0200 Source: libgdal-grass Architecture: source Version: 3.0.1-1~exp3 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: libgdal-grass (3.0.1-1~exp3) experimental; urgency=medium . * Disable as-needed linking on Debian too, gcc-9 enables it by default. 
Checksums-Sha1: c7ce747de483cea27c8dc4b0e13debe8ec524dcb 2148 libgdal-grass_3.0.1-1~exp3.dsc d0a1d588e2987d058a1dc6f2e109508dbbdc6a6c 8280 libgdal-grass_3.0.1-1~exp3.debian.tar.xz b61feb5b0efb1c6395ad4b3c6dbea9ef9b3b1884 15602 libgdal-grass_3.0.1-1~exp3_amd64.buildinfo Checksums-Sha256: 738154db038bc4fea9830e0df87d75b9f0e733765f9dfc7dffe78ba4681c3c6e 2148 libgdal-grass_3.0.1-1~exp3.dsc 4ed40782f3b004986691b012589dab01f19bcb871258210d1c46da1f6548145b 8280 libgdal-grass_3.0.1-1~exp3.debian.tar.xz b732cd16a6489c1135787db1c8e1edff9c2f52d9ba48bd28d5a16a36d1cfff31 15602 libgdal-grass_3.0.1-1~exp3_amd64.buildinfo Files: d0695a5e8066b3a75464c53bbd71a0dd 2148 science optional libgdal-grass_3.0.1-1~exp3.dsc acf17654259e756e7f25119b688f3b48 8280 science optional libgdal-grass_3.0.1-1~exp3.debian.tar.xz ed72ab385742d2cbd8957ea533fe696a 15602 science optional libgdal-grass_3.0.1-1~exp3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl11Zy4ACgkQZ1DxCuiN SvHk+hAApZsS8cmjqarqDFrwFUfLUmqpNiUCesPBBb2duhR9hUgP9VG5fAsJ+A0y xM9JRH5BHjhJCfr78eR2xf7Rxa1Q5YE3Nipvu54AUtc1X8G7YYSDolCScp8QbDWH iGcL5dCRZICO5ouqCTRjl5/ojRa0Vcm4FxOQRdUk/QEPsKrfyC8hwHmXZDAzcdlO LLgzRqg2RLMHZ6TRzJjrNiGfLHkuFKDlv4eTUIuEvMzxJawyEzmRm6nPrL3Ry/pU cA2yGZYJzmdKfdZZTYVbfT7pCahQbRRY1u2AYlTbEkwVtrzAg991xzc10vtjFP2v vVdClaejazJslkepm03KNac3w/tbyzE/8AWyMq40KD8V5vJYeSKmofOP1Af8gz/q blu4DFaEvqgTUqmAQBT6qMIFDQBAQFrXcNQUNN6839FmoYzp+AFvwVu+HZ6MhICK zRqYRPMp/+tuAyw96eM8pjj/qB29mQ3H77hSj43XOfWnmhxhLAaezWtgpKZMsCTz Frk86MlYC0mGOmOIq/2NYxytaXmGyk0gsAMn52hirnwHiS1ymYAvy1CI64yPJ9I0 nNmtuyMWSZmnEZHypy6kk0vjOdhD9+QkJVBOSsJL/JlM1Nl42DYzU9I3t99wRclv 6epUYww+BtJx7ZNrFFpiDraOGdB8zBZzDKObUAJlOrL/6xdHZG0= =25Y+ -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sun Sep 8 22:40:48 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 08 Sep 2019 21:40:48 +0000 Subject: [Git][debian-gis-team/pyepr][master] Set distribution to unstable Message-ID: <5d7575604f44_73482ad9615dca90131453@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyepr Commits: 925ad8b7 by Antonio Valentino at 2019-09-08T21:37:09Z Set distribution to unstable - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,11 +1,11 @@ -pyepr (1.0.0-1) UNRELEASED; urgency=medium +pyepr (1.0.0-1) unstable; urgency=medium * New upstream release. * Update debian/copyright file. * debian/patches: - refresh all patches - -- Antonio Valentino Sun, 08 Sep 2019 18:58:19 +0000 + -- Antonio Valentino Sun, 08 Sep 2019 21:36:41 +0000 pyepr (0.9.5-3) unstable; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/925ad8b7e747a79d3a207cb85ad3a591e5230e3c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/925ad8b7e747a79d3a207cb85ad3a591e5230e3c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From noreply at release.debian.org Mon Sep 9 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Mon, 09 Sep 2019 04:39:23 +0000 Subject: pdal 2.0.1+ds-1 MIGRATED to testing Message-ID: FYI: The status of the pdal source package in Debian's testing distribution has changed. Previous version: 1.9.1+ds-2 Current version: 2.0.1+ds-1 -- This email is automatically generated once a day. 
As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Mon Sep 9 05:49:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 04:49:35 +0000 Subject: [Git][debian-gis-team/routino][pristine-tar] 2 commits: pristine-tar data for routino_3.3.orig.tar.gz Message-ID: <5d75d9df2601c_73483fbbbee4219014316c@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / routino Commits: 750e3d59 by Bas Couwenberg at 2019-09-08T05:46:48Z pristine-tar data for routino_3.3.orig.tar.gz - - - - - 1e9b52b5 by Bas Couwenberg at 2019-09-09T04:23:50Z pristine-tar data for routino_3.3.1.orig.tar.gz - - - - - 4 changed files: - + routino_3.3.1.orig.tar.gz.delta - + routino_3.3.1.orig.tar.gz.id - + routino_3.3.orig.tar.gz.delta - + routino_3.3.orig.tar.gz.id Changes: ===================================== routino_3.3.1.orig.tar.gz.delta ===================================== Binary files /dev/null and b/routino_3.3.1.orig.tar.gz.delta differ ===================================== routino_3.3.1.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +80beb44e610e0642b19d6460def314da5267071b ===================================== routino_3.3.orig.tar.gz.delta ===================================== Binary files /dev/null and b/routino_3.3.orig.tar.gz.delta differ ===================================== routino_3.3.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +0579ccf2d913be7af52e43be14e124a0699f84f8 View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/ff45d14804146b7c9d7d883f3af3b8edef89518f...1e9b52b5adf8ab24ef175b194382d44416441428 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/ff45d14804146b7c9d7d883f3af3b8edef89518f...1e9b52b5adf8ab24ef175b194382d44416441428 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 9 05:49:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 04:49:46 +0000 Subject: [Git][debian-gis-team/routino][upstream] 2 commits: New upstream version 3.3 Message-ID: <5d75d9ea609a6_73482ad961b55c3814336@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / routino Commits: 01598e6d by Bas Couwenberg at 2019-09-08T05:46:40Z New upstream version 3.3 - - - - - 87a9a0ae by Bas Couwenberg at 2019-09-09T04:23:42Z New upstream version 3.3.1 - - - - - 29 changed files: - ChangeLog - Makefile - Makefile.conf - doc/INSTALL.txt - doc/NEWS.txt - doc/README.txt - doc/TAGGING.txt - doc/html/installation.html - doc/html/readme.html - doc/html/tagging.html - extras/README.txt - extras/errorlog/summarise-log.pl - extras/find-fixme/Makefile - extras/find-fixme/fixme-dumper.c - extras/find-fixme/web/www/fixme.leaflet.js - extras/find-fixme/web/www/fixme.openlayers.js - + extras/find-fixme/web/www/fixme.openlayers2.js - extras/statistics/Makefile - extras/statistics/dumper.c - extras/statistics/update.sh - extras/tagmodifier/Makefile - + python/Makefile - + python/README.txt - + python/database.py - + python/router.py - + python/setup.py - + python/src/__init__.py - + python/src/database.cc - + python/src/database.hh The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/ee49c67b8c5820878d42f26326c9f3d1524a6777...87a9a0ae346fb0c9e7836e81f2a7788b284bea25 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/ee49c67b8c5820878d42f26326c9f3d1524a6777...87a9a0ae346fb0c9e7836e81f2a7788b284bea25 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 9 05:50:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 04:50:00 +0000 Subject: [Git][debian-gis-team/routino] Pushed new tag debian/3.3.1-1 Message-ID: <5d75d9f8a7432_73483fbbbee421901435b7@godard.mail> Bas Couwenberg pushed new tag debian/3.3.1-1 at Debian GIS Project / routino -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/tree/debian/3.3.1-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 9 05:50:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 04:50:05 +0000 Subject: [Git][debian-gis-team/routino] Pushed new tag upstream/3.3.1 Message-ID: <5d75d9fd4b37f_73483fbbb23f22dc143880@godard.mail> Bas Couwenberg pushed new tag upstream/3.3.1 at Debian GIS Project / routino -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/tree/upstream/3.3.1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 9 05:50:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 04:50:04 +0000 Subject: [Git][debian-gis-team/routino][master] 5 commits: New upstream version 3.3.1 Message-ID: <5d75d9fcb67c6_73482ad9616cc82414363@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / routino Commits: 87a9a0ae by Bas Couwenberg at 2019-09-09T04:23:42Z New upstream version 3.3.1 - - - - - faa42cdf by Bas Couwenberg at 2019-09-09T04:23:50Z Update upstream source from tag 'upstream/3.3.1' Update to upstream version '3.3.1' with Debian dir c09eaffb1b7908d62415b58305ea9721ef70d064 - - - - - f246fbb1 by Bas Couwenberg at 2019-09-09T04:25:37Z New upstream release. - - - - - fd215699 by Bas Couwenberg at 2019-09-09T04:26:08Z Drop python.patch, fixed upstream. - - - - - fd0b60b5 by Bas Couwenberg at 2019-09-09T04:41:01Z Set distribution to unstable. - - - - - 21 changed files: - ChangeLog - debian/changelog - − debian/patches/python.patch - debian/patches/series - doc/NEWS.txt - doc/README.txt - doc/html/readme.html - + python/Makefile - + python/README.txt - + python/database.py - + python/router.py - + python/setup.py - + python/src/__init__.py - + python/src/database.cc - + python/src/database.hh - + python/src/database.i - + python/src/router.i - + python/test/Makefile - + python/test/run-database-tests.sh - + python/test/run-one-test.sh - + python/test/run-router-tests.sh Changes: ===================================== ChangeLog ===================================== @@ -1,3 +1,20 @@ +2019-09-08 Andrew M. Bishop + + Version 3.3.1 released. + +2019-09-08 [r2019] Andrew M. Bishop + + * FILES, doc/NEWS.txt, doc/README.txt, doc/html/readme.html: Update + for version 3.3.1 release. + +2019-09-08 [r2017-2018] Andrew M. 
Bishop + + * python/Makefile: Make sure that 'make clean' deletes all files + generated by swig. + + * python/README.txt: Correct tiny mistake in documentation + formatting. + 2019-09-07 Andrew M. Bishop Version 3.3 released. ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +routino (3.3.1-1) unstable; urgency=medium + + * New upstream release. + * Drop python.patch, fixed upstream. + + -- Bas Couwenberg Mon, 09 Sep 2019 06:40:48 +0200 + routino (3.3-1) unstable; urgency=medium * New upstream release. ===================================== debian/patches/python.patch deleted ===================================== @@ -1,14 +0,0 @@ -Description: Remove python subdir, not included in upstream tarball. -Author: Bas Couwenberg - ---- a/Makefile -+++ b/Makefile -@@ -24,7 +24,7 @@ include Makefile.conf - - # Sub-directories and sub-makefiles - --SUBDIRS=src xml doc web extras python -+SUBDIRS=src xml doc web extras - - ######## - ===================================== debian/patches/series ===================================== @@ -6,4 +6,3 @@ mapprops hardening #map_bounds #use_openlayers -python.patch ===================================== doc/NEWS.txt ===================================== @@ -1,3 +1,14 @@ +Version 3.3.1 of Routino released : Sun Sep 8 2019 +-------------------------------------------------- + +Bug fixes: + Ensure that 'make clean' in the python directory deletes auto-generated files. + Include the python directory in the release file (include in 'FILES'). + + +Note: This version is compatible with databases from versions 2.7.1 - 3.3. + + Version 3.3 of Routino released : Sat Sep 7 2019 ------------------------------------------------ ===================================== doc/README.txt ===================================== @@ -135,6 +135,7 @@ Status Version 3.1.1 of Routino was released on 6th March 2016. Version 3.2 of Routino was released on 12th March 2017. Version 3.3 of Routino was released on 7th September 2019. + Version 3.3.1 of Routino was released on 8th September 2019. The full version history is available in the NEWS.txt file. ===================================== doc/html/readme.html ===================================== @@ -222,13 +222,27 @@ Version 3.1.1 of Routino was released on 6th March 2016. Version 3.2 of Routino was released on 12th March 2017.
 Version 3.3 of Routino was released on 7th September 2019.
+Version 3.3.1 of Routino was released on 8th September 2019.
 
 The full version history is available in the NEWS.txt file.
 
+Changes in Version 3.3.1
+
+Bug fixes:
+  Ensure that 'make clean' in the python directory deletes auto-generated files.
+  Include the python directory in the release file (include in 'FILES').
+
+Note: This version is compatible with databases from versions 2.7.1 - 3.3.
+
 Changes in Version 3.3
 
Bug fixes: ===================================== python/Makefile ===================================== @@ -0,0 +1,130 @@ +# Python interface Makefile +# +# Part of the Routino routing software. +# +# This file Copyright 2018, 2019 Andrew M. Bishop +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU Affero General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Affero General Public License for more details. +# +# You should have received a copy of the GNU Affero General Public License +# along with this program. If not, see . +# + +# All configuration is in the top-level Makefile.conf + +include ../Makefile.conf + +# Programs + +PYTHON=python3 + +SWIG=swig + +# Compilation targets + +PY_FILES=$(wildcard src/*.py) + +C_FILES=$(wildcard src/*.c) +CC_FILES=$(wildcard src/*.cc) + +SWIG_C=src/_router.c +SWIG_CC=src/_database.cc +SWIG_PY=src/router.py src/database.py + +ifneq ($(HOST),MINGW) + LIBROUTINO=../src/libroutino.so +else + LIBROUTINO=../src/routino.dll +endif + +BUILD_TIMESTAMP=build/.timestamp + +# Check that we have Python3 and swig installed + +HAVE_PYTHON=$(shell $(PYTHON) --version 2> /dev/null) + +HAVE_SWIG=$(shell $(SWIG) -version 2> /dev/null) + +ifeq ($(HAVE_PYTHON),) + $(warning Python3 not installed - skipping Python module creation) +endif + +ifeq ($(HAVE_SWIG),) + $(warning Swig not installed - skipping Python module creation) +endif + +######## + +all: $(and $(HAVE_SWIG),$(HAVE_PYTHON),all-if-python) + +all-if-python: $(BUILD_TIMESTAMP) + +######## + +$(BUILD_TIMESTAMP): $(SWIG_C) $(SWIG_CC) $(SWIG_PY) $(PY_FILES) $(C_FILES) $(CC_FILES) $(LIBROUTINO) setup.py + @rm -f $@ + $(PYTHON) setup.py build && touch $(BUILD_TIMESTAMP) + +src/_router.c : src/router.i ../src/routino.h + $(SWIG) -python -o $@ $< + +src/_database.cc : src/database.i src/database.hh + $(SWIG) -c++ -python -o $@ $< + +src/%.o : src/%.c + $(CC) -c $(CFLAGS) $< -o $@ + +src/%.o : src/%.cc + $(CXX) -c $(CFLAGS) $< -o $@ + +$(LIBROUTINO): + cd ../src && $(MAKE) all-lib + +######## + +test: $(and $(HAVE_SWIG),$(HAVE_PYTHON),test-if-python) + +test-if-python: $(BUILD_TIMESTAMP) + cd test && $(MAKE) test + +######## + +install: $(and $(HAVE_SWIG),$(HAVE_PYTHON),install-if-python) + +install-if-python: all + $(PYTHON) setup.py install --prefix $(prefix) + +######## + +clean: clean-local + cd test && $(MAKE) $@ + +clean-local: + rm -f *~ + rm -rf build + rm -f $(SWIG_C) + rm -f $(SWIG_CC) + rm -f $(SWIG_PY) + +######## + +distclean: distclean-local + cd test && $(MAKE) $@ + +distclean-local: clean-local + +######## + +.PHONY:: all test install clean distclean + +.PHONY:: all-if-python test-if-python install-if-python + +.PHONY:: clean-local distclean-local ===================================== python/README.txt ===================================== @@ -0,0 +1,56 @@ + ROUTINO PYTHON + ============== + +This directory contains a Python version 3 interface to the Routino routing +database that allows routes to be calculated and the database to be accessed. + +Compilation +----------- + +To compile the Python module run 'make'. A working Python 3 installation and +the Swig tool are required to be able to compile this Python module. 
If they +are not available then a warning will be printed but no error occur. + +Running 'make' in the top level directory will also try to build the module. + +Testing +------- + +To run the test scripts run 'make test'. The tests verify that the results of +the Python version are identical to the results of the compiled version. + +Running 'make test' in the top level directory will also try to run the tests +for the Python module. + +Installation +------------ + +To install the Python module run 'make install'. The installation directory is +the one defined in 'Makefile.conf'. + +Running 'make install' in the top level directory will also try to install the +module. + +Using - Router +-------------- + +To use the Python module normally it must be installed and the libroutino +library must also be installed in a directory that is searched for libraries. + +The Python example router 'router.py' accepts the same command line arguments as +the compiled versions. + +The Python module supports exactly the same functionality as the Routino library +(libroutino) because it is implemented simply as a wrapper around that library. +The documentation for using the library (and therefore the Python module) is +available in the files "doc/LIBRARY.txt" and "doc/html/library.html". + +Using - Database +---------------- + +To use the Python module normally it must be installed, the libroutino library +is not required for the database access functions. + +The Python script 'database.py' is an example of using the Python module for +accessing a Routino database (one created by 'make test'). No further +documentation is provided, all possible features are used in the example script. ===================================== python/database.py ===================================== @@ -0,0 +1,272 @@ +#!/usr/bin/python3 +########################################## +# Routino database access from Python. +# +# Part of the Routino routing software. +########################################## +# This file Copyright 2018 Andrew M. Bishop +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU Affero General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Affero General Public License for more details. +# +# You should have received a copy of the GNU Affero General Public License +# along with this program. If not, see . 
+########################################## + +import routino.database + + +# Database, access all attributes + +database = routino.database.LoadDatabase("../../src/test/fat", "turns") + +if database is None: + database = routino.database.LoadDatabase("../src/test/fat", "turns") + + if database is None: + print("Failed to load database") + exit(1) + +print(database) + +database_attrs = ['nnodes', 'nsegments', 'nways', 'nrelations'] + +for attr in database_attrs: + print(" Attribute: " + attr + " =", getattr(database, attr)) + +print("") + + +# A single node, access all attributes and all functions + +node=database.GetNode(0) + +print("1st node =", node) + +node_attrs = ['id', 'firstsegment', 'latitude', 'longitude', 'allow', 'flags'] +node_infos = ['', '', 'degrees', 'degrees', '[note 1]', '[note 2]'] + +for attr,info in zip(node_attrs,node_infos): + print(" Attribute: " + attr + " =", getattr(node, attr), info) + +segments = node.Segments() +print(" Function: " + "Segments()" + " = [" + ", ".join([str(segments[x]) for x in range(len(segments))]) + "]") + +print("") + + +# A single segment, access all attributes and all functions + +segment=database.GetSegment(0) + +print("1st segment =", segment) + +segment_attrs = ['id', 'node1', 'node2', 'next2', 'way', 'distance', 'flags'] +segment_infos = ['', '', '', '', '', 'km', '[note 3]'] + +for attr,info in zip(segment_attrs,segment_infos): + print(" Attribute: " + attr + " =", getattr(segment, attr), info) + +print(" Function: " + "Node1()" + " = " + str(segment.Node1())) +print(" Function: " + "Node2()" + " = " + str(segment.Node2())) +print(" Function: " + "Way()" + " = " + str(segment.Way())) + +print("") + + +# A single way, access all attributes and all functions + +way=database.GetWay(0) + +print("1st way =", way) + +way_attrs = ['id', 'name', 'allow', 'type', 'props', 'speed', 'weight', 'height', 'width', 'length'] +way_infos = ['', '', '[note 1]', '[note 4]', '[note 5]', 'km/hr [note 6]', 'tonnes [note 6]', 'metres [note 6]', 'metres [note 6]', 'metres [note 6]'] + +for attr,info in zip(way_attrs,way_infos): + print(" Attribute: " + attr + " =", getattr(way, attr), info) + +print("") + + +# A single relation, access all attributes and all functions + +relation=database.GetRelation(0) + +print("1st relation =", relation) + +relation_attrs = ['id', 'from_seg', 'via_node', 'to_seg', 'from_way', 'to_way', 'from_node', 'to_node', 'except_transport'] +relation_infos = ['', '', '', '', '', '', '', '', '[note 7]'] + +for attr,info in zip(relation_attrs,relation_infos): + print(" Attribute: " + attr + " =", getattr(relation, attr), info) + +print(" Function: " + "FromSegment()" + " = " + str(relation.FromSegment())) +print(" Function: " + "ViaNode()" + " = " + str(relation.ViaNode())) +print(" Function: " + "ToSegment()" + " = " + str(relation.ToSegment())) + +print(" Function: " + "FromWay()" + " = " + str(relation.FromWay())) +print(" Function: " + "ToWay()" + " = " + str(relation.ToWay())) + +print(" Function: " + "FromNode()" + " = " + str(relation.FromNode())) +print(" Function: " + "ToNode()" + " = " + str(relation.ToNode())) + +print("") + + +# The list of nodes as a list and an iterable (just the first 4) + +nodes=database.Nodes() + +print("len(database.Nodes()) = " + str(len(nodes))) + +print("database.Nodes() = [" + ", ".join([str(nodes[x]) for x in range(4)]) + ", ...]") + +for node in nodes: + if node.id == 4: + break + print(node) + +print("") + + +# The list of segments as a list and an iterable (just the first 4) + 
+segments=database.Segments() + +print("len(database.Segments()) = " + str(len(segments))) + +print("database.Segments() = [" + ", ".join([str(segments[x]) for x in range(4)]) + ", ...]") + +for segment in segments: + if segment.id == 4: + break + print(segment) + +print("") + + +# The list of ways as a list and an iterable (just the first 4) + +ways=database.Ways() + +print("len(database.Ways()) = " + str(len(ways))) + +print("database.Ways() = [" + ", ".join([str(ways[x]) for x in range(4)]) + ", ...]") + +for way in ways: + if way.id == 4: + break + print(way) + +print("") + + +# The list of relations as a list and an iterable (just the first 4) + +relations=database.Relations() + +print("len(database.Relations()) = " + str(len(relations))) + +print("database.Relations() = [" + ", ".join([str(relations[x]) for x in range(4)]) + ", ...]") + +for relation in relations: + if relation.id == 4: + break + print(relation) + +print("") + + +# Enumerated lists + +transports_enum = ["Transports_None", + "Transports_Foot", + "Transports_Horse", + "Transports_Wheelchair", + "Transports_Bicycle", + "Transports_Moped", + "Transports_Motorcycle", + "Transports_Motorcar", + "Transports_Goods", + "Transports_HGV", + "Transports_PSV", + "Transports_ALL"] + +nodeflags_enum = ["Nodeflag_Super", + "Nodeflag_U_Turn", + "Nodeflag_Mini_Roundabout", + "Nodeflag_Turn_Restrict", + "Nodeflag_Turn_Restrict2"] + +segmentflags_enum = ["Segmentflag_Area", + "Segmentflag_Oneway_1to2", + "Segmentflag_Oneway_2to1", + "Segmentflag_Super", + "Segmentflag_Normal"] + +properties_enum = ["Properties_None", + "Properties_Paved", + "Properties_Multilane", + "Properties_Bridge", + "Properties_Tunnel", + "Properties_FootRoute", + "Properties_BicycleRoute", + "Properties_ALL"] + +highway_enum = ["Highway_Motorway", + "Highway_Trunk", + "Highway_Primary", + "Highway_Secondary", + "Highway_Tertiary", + "Highway_Unclassified", + "Highway_Residential", + "Highway_Service", + "Highway_Track", + "Highway_Cycleway", + "Highway_Path", + "Highway_Steps", + "Highway_Ferry", + "Highway_Count", + "Highway_CycleBothWays", + "Highway_OneWay", + "Highway_Roundabout", + "Highway_Area"] + +def print_enum(list): + for item in list: + print(" routino.database."+item) + + +print("Note 1: The Node's and Way's 'allow' parameter can be the combination of these enumerated values:") +print_enum(transports_enum) +print("") +print("Note 2: The Node's 'flags' parameter can be the combination of these enumerated values:") +print_enum(nodeflags_enum) +print("") +print("Note 3: The Segment's 'flags' parameter can be the combination of these enumerated values:") +print_enum(segmentflags_enum) +print("") +print("Note 4: The Way's 'type' parameter can be one the combination of these enumerated values:") +print_enum(highway_enum) +print("") +print("Note 5: The Way's 'props' parameter can be the combination of these enumerated values:") +print_enum(properties_enum) +print("") +print("Note 6: A value of zero for a Way's speed, weight, height, width or length means that there is no limit.") +print("") +print("Note 7: The Relation's 'except_transport' parameter can be the combination of these enumerated values:") +print_enum(transports_enum) +print("") + + +import gc + +gc.collect() ===================================== python/router.py ===================================== @@ -0,0 +1,327 @@ +#!/usr/bin/python3 +########################################## +# OSM router calling libroutino library from Python. +# +# Part of the Routino routing software. 
+########################################## +# This file Copyright 2018 Andrew M. Bishop +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU Affero General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Affero General Public License for more details. +# +# You should have received a copy of the GNU Affero General Public License +# along with this program. If not, see . +########################################## + +import argparse +import sys +import os +import math + +import routino.router as routino + + +# Parse the command line arguments + +argparser = argparse.ArgumentParser(description="Calculates a route using Routino and command line data.") + +argparser.add_argument("--version", dest="version", action='store_false', help="Print the version of Routino.") + +argparser.add_argument("--dir", dest="dirname", type=str, default=None, help="The directory containing the routing database.") +argparser.add_argument("--prefix", dest="prefix", type=str, default=None, help="The filename prefix for the routing database.") + +argparser.add_argument("--profiles", dest="profiles", type=str, default=None, help="The name of the XML file containing the profiles (defaults to 'profiles.xml' with '--dir' and '--prefix' options).") +argparser.add_argument("--translations", dest="translations", type=str, default=None, help="The name of the XML file containing the translations (defaults to 'translations.xml' with '--dir' and '--prefix' options).") + +argparser.add_argument("--reverse", dest="reverse", action='store_true', help="Find a route between the waypoints in reverse order.") +argparser.add_argument("--loop", dest="loop", action='store_true', help="Find a route that returns to the first waypoint.") + +argparser.add_argument("--output-html", dest="html", action='store_true', help="Write an HTML description of the route.") +argparser.add_argument("--output-gpx-track", dest="gpx_track", action='store_true', help="Write a GPX track file with all route points.") +argparser.add_argument("--output-gpx-route", dest="gpx_route", action='store_true', help="Write a GPX route file with interesting junctions.") +argparser.add_argument("--output-text", dest="text", action='store_true', help="Write a plain text file with interesting junctions.") +argparser.add_argument("--output-text-all", dest="text_all", action='store_true', help="Write a plain text file with all route points.") +argparser.add_argument("--output-none", dest="none", action='store_true', help="Don't write any output files or read any translations. 
(If no output option is given then all are written.)") +argparser.add_argument("--output-stdout", dest="use_stdout", action='store_true', help="Write to stdout instead of a file (requires exactly one output format option, implies '--quiet').") + +argparser.add_argument("--list-html", dest="list_html", action='store_true', help="Create an HTML list of the route.") +argparser.add_argument("--list-html-all", dest="list_html_all", action='store_true', help="Create an HTML list of the route with all points.") +argparser.add_argument("--list-text", dest="list_text", action='store_true', help="Create a plain text list with interesting junctions.") +argparser.add_argument("--list-text-all", dest="list_text_all", action='store_true', help="Create a plain text list with all route points.") + +argparser.add_argument("--profile", dest="profilename", type=str, default=None, help="Select the loaded profile with this name.") +argparser.add_argument("--language", dest="language", type=str, default=None, help="Use the translations for specified language.") + +argparser.add_argument("--quickest", dest="shortest", action='store_false', help="Find the quickest route between the waypoints.") +argparser.add_argument("--shortest", dest="shortest", action='store_true', help="Find the shortest route between the waypoints.") + +argparser.add_argument("--lon", dest="lons", action='append', type=float, help="Specify the longitude of the next waypoint (can also use '--lon' to specify the n'th longitude).") +argparser.add_argument("--lat", dest="lats", action='append', type=float, help="Specify the latitude of the next waypoint (can also use '--lat' to specify the n'th latitude).") + +for i in range(1,99): + argparser.add_argument("--lon"+str(i), dest="lon"+str(i), type=float, help=argparse.SUPPRESS) + argparser.add_argument("--lat"+str(i), dest="lat"+str(i), type=float, help=argparse.SUPPRESS) + +args = argparser.parse_args() + + +# Check the specified command line options + +if args.use_stdout and (int(args.html)+int(args.gpx_track)+int(args.gpx_route)+int(args.text)+int(args.text_all))!=1: + print("Error: The '--output-stdout' option requires exactly one other output option (but not '--output-none').") + sys.exit(1) + +if not args.html and not args.gpx_track and not args.gpx_route and not args.text and not args.text_all and not args.none: + args.html=True + args.gpx_track=True + args.gpx_route=True + args.text=True + args.text_all=True + + +# Load in the selected profiles + +if args.profiles is not None: + if not os.access(args.profiles,os.F_OK): + print("Error: The '--profiles' option specifies a file '{:s}' that does not exist.".format(args.profiles)) + sys.exit(1) +else: + args.profiles=routino.FileName(args.dirname,args.prefix,"profiles.xml") + + if not os.access(args.profiles,os.F_OK): + defaultprofiles = routino.FileName("../xml/","routino","profiles.xml") + + if not os.access(defaultprofiles,os.F_OK): + print("Error: The '--profiles' option was not used and the files '{:s}' and '{:s}' do not exist.".format(args.profiles,defaultprofiles)) + sys.exit(1) + + args.profiles=defaultprofiles + +if args.profilename is None: + print("Error: A profile name must be specified") + sys.exit(1) + +if routino.ParseXMLProfiles(args.profiles): + print("Error: Cannot read the profiles in the file '{:s}'.".format(args.profiles)) + sys.exit(1) + +profile=routino.GetProfile(args.profilename) + +if profile is None: + list = routino.GetProfileNames() + + print("Error: Cannot find a profile called '{:s}' in the file 
'{:s}'.".format(args.profilename,args.profiles)) + print("Profiles available are: {:s}.".format(", ".join(list))) + sys.exit(1) + + +# Load in the selected translation + +if args.translations is not None: + if not os.access(args.translations,os.F_OK): + print("Error: The '--translations' option specifies a file '{:s}' that does not exist.".format(args.translations)) + sys.exit(1) + +else: + args.translations=routino.FileName(args.dirname,args.prefix,"translations.xml") + + if not os.access(translations,os.F_OK): + defaulttranslations = routino.FileName("../xml/","routino","translations.xml") + + if not os.access(defaulttranslations,os.F_OK): + print("Error: The '--translations' option was not used and the files '{:s}' and '{:s}' do not exist.".format(args.translations,defaulttranslations)) + sys.exit(1) + + args.translations=defaulttranslations + +if routino.ParseXMLTranslations(args.translations): + print("Error: Cannot read the translations in the file '{:s}'.".format(args.translations)) + sys.exit(1) + +if args.language is not None: + translation = routino.GetTranslation(args.language) + + if translation is None: + list1 = routino.GetTranslationLanguages() + list2 = routino.GetTranslationLanguageFullNames() + + print("Error: Cannot find a translation called '{:s}' in the file '{:s}'.".format(args.language,args.translations)) + print("Languages available are: {:s}".format(", ".join([i1+" ("+i2+")" for i1,i2 in zip(list1,list2)]))) + sys.exit(1) + +else: + translation = routino.GetTranslation("") # first in file + + if translation is None: + print("Error: No translations in '{:s}'.".format(args.translations)) + sys.exit(1) + + +# Create the numbered waypoints + +firstlatlon = True + +for i in range(1,99): + + lon = getattr(args,"lon"+str(i),None) + lat = getattr(args,"lat"+str(i),None) + + if lon is None and lat is None: + continue + + if lon is None or lat is None: + print("Error: All waypoints must have latitude and longitude.") + sys.exit(1) + + if firstlatlon: + if args.lats is not None or args.lons is not None: + print("Error: Mixing numbered and un-numbered waypoints is not allowed.") + sys.exit(1) + else: + firstlatlon = False + args.lons = [] + args.lats = [] + + args.lons.append(lon) + args.lats.append(lat) + + +# Check the waypoints are valid + +if args.lats is None or len(args.lats) < 2 or args.lons is None or len(args.lons) < 2: + print("Error: At least two waypoints must be specified.") + sys.exit(1) + +if len(args.lats) != len(args.lons): + print("Error: Number of latitudes ({:d}) and longitudes ({:d}) do not match.".format(len(lats),len(lons))) + sys.exit(1) + + +# Load in the routing database + +database = routino.LoadDatabase(args.dirname,args.prefix) + +if database is None: + print("Error: Could not load Routino database.") + sys.exit(1) + + +# Check the profile is valid for use with this database + +if routino.ValidateProfile(database,profile)!=routino.ERROR_NONE: + print("Error: Profile is invalid or not compatible with database.") + sys.exit(1) + + +# Loop through all waypoints + +nwaypoints = 0 +waypoints = [] + +for n in range(len(args.lats)): + + waypoint = routino.FindWaypoint(database, profile, args.lats[n], args.lons[n]) + + if waypoint is None: + print("Error: Cannot find node close to specified point {:d}.",n); + sys.exit(1) + + waypoints.append(waypoint) + + +# Create the route + +routing_options=0 + +if args.shortest: + routing_options |= routino.ROUTE_SHORTEST +else: + routing_options |= routino.ROUTE_QUICKEST + +if args.html : routing_options |= 
routino.ROUTE_FILE_HTML +if args.gpx_track: routing_options |= routino.ROUTE_FILE_GPX_TRACK +if args.gpx_route: routing_options |= routino.ROUTE_FILE_GPX_ROUTE +if args.text : routing_options |= routino.ROUTE_FILE_TEXT +if args.text_all : routing_options |= routino.ROUTE_FILE_TEXT_ALL + +if args.list_html : routing_options |= routino.ROUTE_LIST_HTML +if args.list_html_all: routing_options |= routino.ROUTE_LIST_HTML_ALL +if args.list_text : routing_options |= routino.ROUTE_LIST_TEXT +if args.list_text_all: routing_options |= routino.ROUTE_LIST_TEXT_ALL + +if args.reverse: routing_options |= routino.ROUTE_REVERSE +if args.loop : routing_options |= routino.ROUTE_LOOP + +route = routino.CalculateRoute(database, profile, translation, waypoints, routing_options) + +if routino.errno >= routino.ERROR_NO_ROUTE_1: + print("Error: Cannot find a route between specified waypoints") + sys.exit(1) + +if routino.errno != routino.ERROR_NONE: + print("Error: Internal error ({:d}).".format(routino.errno)) + sys.exit(1) + + +# Print the list output + +if args.list_html or args.list_html_all or args.list_text or args.list_text_all: + + list=route + first=True + last=False + + while list: + + if list.next: + last = False + else: + last = True + + print("----------------") + print("Lon,Lat: {:.5f}, {:.5f}".format((180.0/math.pi)*list.lon,(180.0/math.pi)*list.lat)) + + if args.list_html or args.list_html_all or args.list_text or args.list_text_all: + print("Dist,Time: {:.3f} km, {:.1f} minutes".format(list.dist,list.time)) + + if args.list_text_all and not first: + print("Speed: {:0f} km/hr".format(list.speed)) + + print("Point type: {:d}".format(list.type)) + + if (args.list_html or args.list_html_all or args.list_text) and not first and not last: + print("Turn: {:d} degrees".format(list.turn)) + + if ((args.list_html or args.list_html_all or args.list_text) and not last) or (args.list_text_all and not first): + print("Bearing: {:d} degrees".format(list.bearing)) + + if ((args.list_html or args.list_text) and not last) or (args.list_html_all and list.name) or (args.list_text_all and not first): + print("Name: {:s}".format(list.name)) + + if args.list_html or (args.list_html_all and list.name): + print("Desc1: {:s}".format(list.desc1)) + print("Desc2: {:s}".format(list.desc2)) + + if not last: + print("Desc3: {:s}".format(list.desc3)) + + list = list.next + first = False + + +# Tidy up and exit + +routino.DeleteRoute(route) + +routino.UnloadDatabase(database) + +routino.FreeXMLProfiles() + +routino.FreeXMLTranslations() ===================================== python/setup.py ===================================== @@ -0,0 +1,55 @@ +# Python interface setup script +# +# Part of the Routino routing software. +# +# This file Copyright 2018 Andrew M. Bishop +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU Affero General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Affero General Public License for more details. +# +# You should have received a copy of the GNU Affero General Public License +# along with this program. If not, see . 
+# + +import os +import re +from distutils.core import setup, Extension + +routino_router = Extension('routino._router', + sources = ['src/_router.c'], + include_dirs = ['../src'], + library_dirs = ['../src'], + libraries = ['routino']) + +# Note: the database needs access to all symbols, not just those +# exported by the libroutino library so it must link with the object +# files and not just the library. + +lib_files = [] + +for file in os.listdir('../src'): + if re.search("-lib.o", file) and not re.search("-slim-lib.o", file): + lib_files.append("../src/" + file) + +routino_database = Extension('routino._database', + sources = ['src/_database.cc', 'src/database.cc'], + include_dirs = ['../src'], + extra_objects = lib_files, + library_dirs = ['../src']) + +setup (name = 'Routino', + version = '1.0', + author="Andrew M. Bishop", author_email='amb at routino.org', + url='http://routino.org/', + description = 'Interfaces to Routino in Python', + packages = ['routino'], + package_dir = {'routino': 'src'}, + py_modules = ['routino', 'routino.router', 'routino.database'], + ext_modules = [routino_router, routino_database]) ===================================== python/src/__init__.py ===================================== ===================================== python/src/database.cc ===================================== @@ -0,0 +1,544 @@ +/*************************************** + Routino database access from Python. + + Part of the Routino routing software. + ******************/ /****************** + This file Copyright 2018 Andrew M. Bishop + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU Affero General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU Affero General Public License for more details. + + You should have received a copy of the GNU Affero General Public License + along with this program. If not, see . + ***************************************/ + + +#include + +extern "C" { + +#include "types.h" + +#include "nodes.h" +#include "segments.h" +#include "ways.h" +#include "relations.h" + +#include "routino.h" + +} + +#include "database.hh" + + +/* Copied from ../src/routino.c */ + +struct _Routino_Database +{ + Nodes *nodes; + Segments *segments; + Ways *ways; + Relations *relations; +}; + + +/*++++++++++++++++++++++++++++++++++++++ + Create the PythonDatabase object by loading the database and filling in some useful information. + + PythonDatabase *LoadDatabase Return a pointer to the Python view of the database. + + const char *dirname The name of the directory. + + const char *prefix The filename prefix (or NULL). + ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabase *LoadDatabase(const char *dirname,const char *prefix) +{ + Routino_Database *database = Routino_LoadDatabase(dirname, prefix); + + if(!database) + return NULL; + else + return new PythonDatabase(dirname, prefix, database); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create the PythonDatabase object by loading the database and filling in some useful information. + + PythonDatabase LoadDatabase Return a pointer to the Python view of the database. + + const char *dirname The name of the directory. 
+ ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabase *LoadDatabase(const char *dirname) +{ + return LoadDatabase(dirname,NULL); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create the PythonDatabase by passing it a loaded database. + + const char *_dirname The name of the directory. + + const char *_prefix The filename prefix (or NULL). + + Routino_Database *_database The opened database. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabase::PythonDatabase(const char *_dirname,const char *_prefix, Routino_Database *_database) +{ + database = _database; + + /* Copy the database path information */ + + dirname = new char[strlen(_dirname)+1]; + strcpy(dirname,_dirname); + + prefix = new char[strlen(_prefix)+1]; + strcpy(prefix,_prefix); + + /* Fill in the extra information */ + + nnodes = database->segments->file.number; + nsegments = database->nodes->file.number; + nways = database->ways->file.number; + nrelations = database->relations->file.trnumber; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Destroy the PythonDatabase by unloading the database. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabase::~PythonDatabase() +{ + Routino_UnloadDatabase(database); + + delete[] dirname; + delete[] prefix; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Return a pointer to a modified Node data structure for use by Python. + + PythonNode *GetNode Returns a pointer to the Python view of the node. + + index_t item The index number of the Node. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonNode *PythonDatabase::GetNode(index_t item) +{ + PythonNode *pynode=new PythonNode(this); + + Node *nodep=LookupNode(database->nodes,item,1); + double latitude,longitude; + + GetLatLong(database->nodes,item,nodep,&latitude,&longitude); + + pynode->id = item; + + pynode->firstsegment = nodep->firstseg; + + pynode->latitude = radians_to_degrees(latitude); + pynode->longitude = radians_to_degrees(longitude); + + pynode->allow = nodep->allow; + pynode->flags = nodep->flags; + + return pynode; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Return a pointer to a modified Segment data structure for use by Python. + + PythonSegment *GetSegment Returs a pointer to the Python view of the segment. + + index_t item The index number of the Segment. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonSegment *PythonDatabase::GetSegment(index_t item) +{ + PythonSegment *pysegment=new PythonSegment(this); + + Segment *segmentp=LookupSegment(database->segments,item,1); + + pysegment->id = item; + + pysegment->node1 = segmentp->node1; + pysegment->node2 = segmentp->node2; + + pysegment->next2 = segmentp->next2; + + pysegment->way = segmentp->way; + + pysegment->distance = distance_to_km(DISTANCE(segmentp->distance)); + + pysegment->flags = DISTFLAG(segmentp->distance); + + return pysegment; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Return a pointer to a modified Way data structure for use by Python. + + PythonWay *GetWay Returs a pointer to the Python view of the way. + + index_t item The index number of the Way. 
+ ++++++++++++++++++++++++++++++++++++++*/ + +PythonWay *PythonDatabase::GetWay(index_t item) +{ + PythonWay *pyway=new PythonWay(this); + + Way *wayp=LookupWay(database->ways,item,1); + char *name=WayName(database->ways,wayp); + + pyway->id = item; + + pyway->name = name; + + pyway->allow = wayp->allow; + + pyway->type = wayp->type; + + pyway->props = wayp->props; + + pyway->speed = speed_to_kph(wayp->speed); + + pyway->weight = weight_to_tonnes(wayp->weight); + pyway->height = height_to_metres(wayp->height); + pyway->width = width_to_metres(wayp->width); + pyway->length = length_to_metres(wayp->length); + + return pyway; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Return a pointer to a modified Relation data structure for use by Python. + + PythonRelation *GetRelation Returs a pointer to the Python view of the relation. + + index_t item The index number of the Relation. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonRelation *PythonDatabase::GetRelation(index_t item) +{ + PythonRelation *pyrelation=new PythonRelation(this); + + TurnRelation *relationp=LookupTurnRelation(database->relations,item,1); + + pyrelation->id = item; + + pyrelation->from_seg = relationp->from; + pyrelation->via_node = relationp->via; + pyrelation->to_seg = relationp->to; + + Node *nodep=LookupNode(database->nodes,relationp->via,1); + index_t from_way=NO_WAY,to_way=NO_WAY; + index_t from_node=NO_NODE,to_node=NO_NODE; + + Segment *segmentp=FirstSegment(database->segments,nodep,1); + + do + { + index_t seg=IndexSegment(database->segments,segmentp); + + if(seg==relationp->from) + { + from_node=OtherNode(segmentp,relationp->via); + from_way=segmentp->way; + } + + if(seg==relationp->to) + { + to_node=OtherNode(segmentp,relationp->via); + to_way=segmentp->way; + } + + segmentp=NextSegment(database->segments,segmentp,relationp->via); + } + while(segmentp); + + pyrelation->from_way = from_way; + pyrelation->to_way = to_way; + + pyrelation->from_node = from_node; + pyrelation->to_node = to_node; + + pyrelation->except_transport = relationp->except; + + return pyrelation; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create an iterator so that we can iterate through all nodes in the database. + + PythonDatabaseIter *PythonDatabase::Nodes Returns a pointer to a node iterator. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabaseIter *PythonDatabase::Nodes() +{ + return new PythonDatabaseIter(this,nnodes); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create an iterator so that we can iterate through all segments in the database. + + PythonDatabaseIter *PythonDatabase::Segments Returns a pointer to a segment iterator. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabaseIter *PythonDatabase::Segments() +{ + return new PythonDatabaseIter(this,nsegments); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create an iterator so that we can iterate through all ways in the database. + + PythonDatabaseIter *PythonDatabase::Ways Returns a pointer to a way iterator. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabaseIter *PythonDatabase::Ways() +{ + return new PythonDatabaseIter(this,nways); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create an iterator so that we can iterate through all relations in the database. + + PythonDatabaseIter *PythonDatabase::Relations Returns a pointer to a relation iterator. 
+ ++++++++++++++++++++++++++++++++++++++*/ + +PythonDatabaseIter *PythonDatabase::Relations() +{ + return new PythonDatabaseIter(this,nrelations); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Fill in the segments array so that we can access all segments on the node. + ++++++++++++++++++++++++++++++++++++++*/ + +void PythonNode::fill_segments() +{ + if(segments.size()==0) + { + Node *nodep=LookupNode(pydatabase->database->nodes,id,1); + Segment *segmentp=FirstSegment(pydatabase->database->segments,nodep,1); + + do + { + index_t seg=IndexSegment(pydatabase->database->segments,segmentp); + + segments.push_back(seg); + + segmentp=NextSegment(pydatabase->database->segments,segmentp,id); + } + while(segmentp); + } +} + + +/*++++++++++++++++++++++++++++++++++++++ + Create an iterator so that we can iterate through all segments on the node. + + PythonNodeIter *PythonNode::Segments Returns a pointer to a segment iterator. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonNodeIter *PythonNode::Segments() +{ + fill_segments(); + + PythonNodeIter *pyiter=new PythonNodeIter(this,segments.size()); + + return pyiter; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Get a segment from the set of segments on the node. + + PythonSegment *PythonNode::get_segment Returns a pointer to a segment. + + index_t n The index of the segment. + ++++++++++++++++++++++++++++++++++++++*/ + +PythonSegment *PythonNode::get_segment(index_t n) +{ + fill_segments(); + + if(n > segments.size()) + return NULL; + + return pydatabase->GetSegment(segments[n]); +} + + +/*++++++++++++++++++++++++++++++++++++++ + When acting as a list return the selected item from the iterator. + + template<> PythonNode *PythonDatabaseIter::__getitem__ Returns a pointer to a node. + + index_t n The index of the node. + ++++++++++++++++++++++++++++++++++++++*/ + +template<> PythonNode *PythonDatabaseIter::__getitem__(index_t n) +{ + return pydatabase->GetNode(n); +} + + +/*++++++++++++++++++++++++++++++++++++++ + When acting as a list return the selected item from the iterator. + + template<> PythonSegment *PythonDatabaseIter::__getitem__ Returns a pointer to a segment. + + index_t n The index of the segment. + ++++++++++++++++++++++++++++++++++++++*/ + +template<> PythonSegment *PythonDatabaseIter::__getitem__(index_t n) +{ + return pydatabase->GetSegment(n); +} + + +/*++++++++++++++++++++++++++++++++++++++ + When acting as a list return the selected item from the iterator. + + template<> PythonWay *PythonDatabaseIter::__getitem__ Returns a pointer to a way. + + index_t n The index of the way. + ++++++++++++++++++++++++++++++++++++++*/ + +template<> PythonWay *PythonDatabaseIter::__getitem__(index_t n) +{ + return pydatabase->GetWay(n); +} + + +/*++++++++++++++++++++++++++++++++++++++ + When acting as a list return the selected item from the iterator. + + template<> PythonRelation *PythonDatabaseIter::__getitem__ Returns a pointer to a relation. + + index_t n The index of the relation. + ++++++++++++++++++++++++++++++++++++++*/ + +template<> PythonRelation *PythonDatabaseIter::__getitem__(index_t n) +{ + return pydatabase->GetRelation(n); +} + + +/*++++++++++++++++++++++++++++++++++++++ + When acting as a list return the selected item from the iterator. + + template<> PythonSegment *PythonNodeIter::__getitem__ Returns a pointer to a segment. + + index_t n The index of the segment. 
+ ++++++++++++++++++++++++++++++++++++++*/ + +template<> PythonSegment *PythonNodeIter::__getitem__(index_t n) +{ + return pynode->get_segment(n); +} + + +/*++++++++++++++++++++++++++++++++++++++ + Convert a Python database to a viewable string. + + char *PythonDatabase::__str__ Returns a pointer to a statically allocated string. + ++++++++++++++++++++++++++++++++++++++*/ + +char *PythonDatabase::__str__() +{ + static char tmp[256]; + + if(prefix) + sprintf(tmp, "Database(%s,%s)", dirname, prefix); + else + sprintf(tmp, "Database(%s)", dirname); + + return tmp; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Convert a Python node to a viewable string. + + char *PythonNode::__str__ Returns a pointer to a statically allocated string. + ++++++++++++++++++++++++++++++++++++++*/ + +char *PythonNode::__str__() +{ + static char tmp[64]; + + sprintf(tmp, "Node(%" Pindex_t ")", id); + + return tmp; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Convert a Python segment to a viewable string. + + char *PythonSegment::__str__ Returns a pointer to a statically allocated string. + ++++++++++++++++++++++++++++++++++++++*/ + +char *PythonSegment::__str__() +{ + static char tmp[64]; + + sprintf(tmp, "Segment(%" Pindex_t ")", id); + + return tmp; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Convert a Python way to a viewable string. + + char *PythonWay::__str__ Returns a pointer to a statically allocated string. + ++++++++++++++++++++++++++++++++++++++*/ + +char *PythonWay::__str__() +{ + static char tmp[64]; + + sprintf(tmp, "Way(%" Pindex_t ")", id); + + return tmp; +} + + +/*++++++++++++++++++++++++++++++++++++++ + Convert a Python relation to a viewable string. + + char *PythonRelation::__str__ Returns a pointer to a statically allocated string. + ++++++++++++++++++++++++++++++++++++++*/ + +char *PythonRelation::__str__() +{ + static char tmp[64]; + + sprintf(tmp, "Relation(%" Pindex_t ")", id); + + return tmp; +} ===================================== python/src/database.hh ===================================== @@ -0,0 +1,289 @@ +/*************************************** + Header file for interface between Routino database and Python. + + Part of the Routino routing software. + ******************/ /****************** + This file Copyright 2018 Andrew M. Bishop + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU Affero General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU Affero General Public License for more details. + + You should have received a copy of the GNU Affero General Public License + along with this program. If not, see . + ***************************************/ + +#ifndef DATABASE_H +#define DATABASE_H /*+ To stop multiple inclusions. 
+*/ + +#include + +extern "C" { + +#include "types.h" + +#include "routino.h" + +} + + +/* Constants that are not automatically picked up from types.h */ + +const nodeflags_t Nodeflag_Super = NODE_SUPER; +const nodeflags_t Nodeflag_U_Turn = NODE_UTURN; +const nodeflags_t Nodeflag_Mini_Roundabout = NODE_MINIRNDBT; +const nodeflags_t Nodeflag_Turn_Restrict = NODE_TURNRSTRCT; +const nodeflags_t Nodeflag_Turn_Restrict2 = NODE_TURNRSTRCT2; + +const distance_t Segmentflag_Area = SEGMENT_AREA; +const distance_t Segmentflag_Oneway_1to2 = ONEWAY_1TO2; +const distance_t Segmentflag_Oneway_2to1 = ONEWAY_2TO1; +const distance_t Segmentflag_Super = SEGMENT_SUPER; +const distance_t Segmentflag_Normal = SEGMENT_NORMAL; + + +/* Classes (much easier to use them than C for doing this with swig) */ + +class PythonDatabase; + +class PythonNode; +class PythonSegment; +class PythonWay; +class PythonRelation; + +template class PythonDatabaseIter; +template class PythonNodeIter; + + +/* The database as seen by Python */ + +PythonDatabase *LoadDatabase(const char *dirname, const char *prefix); +PythonDatabase *LoadDatabase(const char *dirname); + +class PythonDatabase +{ +public: + PythonDatabase(const char *_dirname,const char *_prefix, Routino_Database* database); /*+ A constructor +*/ + ~PythonDatabase(); /*+ A destructor to unload the database. +*/ + + PythonNode *GetNode(index_t item); /*+ Get a single node from the database. +*/ + PythonSegment *GetSegment(index_t item); /*+ Get a single segment from the database. +*/ + PythonWay *GetWay(index_t item); /*+ Get a single way from the database. +*/ + PythonRelation *GetRelation(index_t item); /*+ Get a single relation from the database. +*/ + + PythonDatabaseIter *Nodes(); /*+ Create a node iterator to get all the nodes from the database. +*/ + PythonDatabaseIter *Segments(); /*+ Create a segment iterator to get all the segments from the database. +*/ + PythonDatabaseIter *Ways(); /*+ Create a way iterator to get all the ways from the database. +*/ + PythonDatabaseIter *Relations(); /*+ Create a relation iterator to get all the relations from the database. +*/ + + index_t nnodes; /*+ The number of nodes in the database. +*/ + index_t nsegments; /*+ The number of segments in the database. +*/ + index_t nways; /*+ The number of ways in the database. +*/ + index_t nrelations; /*+ The number of relations in the database. +*/ + + char *__str__(); /*+ Convert the Python database to a string. +*/ + + friend class PythonNode; + friend class PythonSegment; + friend class PythonWay; + friend class PythonRelation; + + private: + + char *dirname; /*+ A copy of the database directory name. +*/ + char *prefix; /*+ A copy of the database prefix. +*/ + + Routino_Database *database; /*+ The database opened using the libroutino function. +*/ +}; + + +/* A node as seen by Python - copied from ../src/nodes.h and then modified */ + +class PythonNode +{ +public: + PythonNode(PythonDatabase* _pydatabase) { pydatabase = _pydatabase; } /*+ A constructor passed the database. +*/ + + index_t id; /*+ The index of this node. +*/ + + index_t firstsegment; /*+ The index of the first segment. +*/ + + PythonNodeIter *Segments(); + + double latitude; /*+ The node latitude in degrees. +*/ + double longitude; /*+ The node longitude in degrees. +*/ + + transports_t allow; /*+ The types of transport that are allowed through the node. +*/ + nodeflags_t flags; /*+ Flags containing extra information (e.g. super-node, turn restriction). +*/ + + char *__str__(); /*+ Convert the Python node to a string. 
+*/ + + private: + + friend class PythonNodeIter; + + PythonDatabase *pydatabase; /*+ A pointer to the database that this node came from. +*/ + + std::vector segments; /*+ The list of segments for this node, only filled in after calling Segments(). +*/ + + PythonSegment *get_segment(index_t item); /*+ Get a single segment from the node. +*/ + void fill_segments(); /*+ Fill in the list of segments. +*/ +}; + + +/* A segment as seen by Python - copied from ../src/segments.h and then modified */ + +class PythonSegment +{ +public: + PythonSegment(PythonDatabase* _pydatabase) { pydatabase = _pydatabase; } /*+ A constructor passed the database. +*/ + + index_t id; /*+ The index of this segment. +*/ + + index_t node1; /*+ The index of the starting node. +*/ + index_t node2; /*+ The index of the finishing node. +*/ + + PythonNode *Node1() { return pydatabase->GetNode(node1); } + PythonNode *Node2() { return pydatabase->GetNode(node2); } + + index_t next2; /*+ The index of the next segment sharing node2. +*/ + + index_t way; /*+ The index of the way associated with the segment. +*/ + + PythonWay *Way() { return pydatabase->GetWay(way); } + + double distance; /*+ The distance between the nodes. +*/ + + distance_t flags; /*+ The flags associated with the segment. +*/ + + char *__str__(); /*+ Convert the Python segment to a string. +*/ + + private: + + PythonDatabase *pydatabase; /*+ A pointer to the database that this segment came from. +*/ +}; + + +/* A way as seen by Python - copied from ../src/ways.h and then modified */ + +class PythonWay +{ +public: + PythonWay(PythonDatabase* _pydatabase) { pydatabase = _pydatabase; } /*+ A constructor passed the database. +*/ + + index_t id; /*+ The index of this way. +*/ + + char *name; /*+ The offset of the name of the way in the names array. +*/ + + transports_t allow; /*+ The type of traffic allowed on the way. +*/ + + highway_t type; /*+ The highway type of the way. +*/ + + properties_t props; /*+ The properties of the way. +*/ + + double speed; /*+ The defined maximum speed limit of the way. +*/ + + double weight; /*+ The defined maximum weight of traffic on the way. +*/ + double height; /*+ The defined maximum height of traffic on the way. +*/ + double width; /*+ The defined maximum width of traffic on the way. +*/ + double length; /*+ The defined maximum length of traffic on the way. +*/ + + char *__str__(); /*+ Convert the Python way to a string. +*/ + + private: + + PythonDatabase *pydatabase; /*+ A pointer to the database that this segment came from. +*/ +}; + + +/* A relation as seen by Python - copied from ../src/relations.h and then modified */ + +class PythonRelation +{ +public: + PythonRelation(PythonDatabase* _pydatabase) { pydatabase = _pydatabase; } /*+ A constructor passed the database. +*/ + + index_t id; /*+ The index of this relation. +*/ + + index_t from_seg; /*+ The segment that the path comes from. +*/ + index_t via_node; /*+ The node that the path goes via. +*/ + index_t to_seg; /*+ The segment that the path goes to. +*/ + + PythonSegment *FromSegment() { return pydatabase->GetSegment(from_seg); } + PythonNode *ViaNode() { return pydatabase->GetNode(via_node); } + PythonSegment *ToSegment() { return pydatabase->GetSegment(to_seg); } + + index_t from_way; /*+ The way that the path comes from. +*/ + index_t to_way; /*+ The way that the path goes to. 
+*/ + + PythonWay *FromWay() { return pydatabase->GetWay(from_way); } + PythonWay *ToWay() { return pydatabase->GetWay(to_way); } + + index_t from_node; /*+ The node that the path comes from. +*/ + index_t to_node; /*+ The node that the path goes to. +*/ + + PythonNode *FromNode() { return pydatabase->GetNode(from_node); } + PythonNode *ToNode() { return pydatabase->GetNode(to_node); } + + transports_t except_transport; /*+ The types of transports that that this relation does not apply to. +*/ + + char *__str__(); /*+ Convert the Python relation to a string. +*/ + + private: + + PythonDatabase *pydatabase; /*+ A pointer to the database that this segment came from. +*/ +}; + + +/* A generic node/segment/way/relation iterator */ + +template class PythonDatabaseIter +{ + public: + + PythonDatabaseIter(PythonDatabase* _pydatabase, index_t _number) { pydatabase = _pydatabase; number = _number; } /*+ A constructor passed the database. +*/ + + index_t __len__() { return number; } /*+ When used as a list return the length of it. +*/ + T *__getitem__(index_t index); /*+ When used as a list get a particular item from it. +*/ + + PythonDatabaseIter *__iter__() { return this; } /*+ When used as an iterator return itself. +*/ + T *__next__() { if( next < number ) return __getitem__(next++); else return NULL; } /*+ When used as an iterator return the next item. +*/ + + private: + + index_t next=0; /*+ The next node/segment/way/relation to be returned. +*/ + index_t number; /*+ The number of nodes/segments/ways/relations in total. +*/ + + PythonDatabase *pydatabase; /*+ A pointer to the database that this node/segment/way/relation came from. +*/ +}; + + +/* A segment iterator for nodes */ + +template class PythonNodeIter +{ + public: + + PythonNodeIter(PythonNode *_pynode, index_t _number) { pynode = _pynode; number = _number; } /*+ A constructor passed the node. +*/ + + index_t __len__() { return number; } /*+ When used as a list return the length of it. +*/ + T *__getitem__(index_t index); /*+ When used as a list get a particular item from it. +*/ + + PythonNodeIter *__iter__() { return this; } /*+ When used as an iterator return itself. +*/ + T *__next__() { if( next < number ) return __getitem__(next++); else return NULL; } /*+ When used as an iterator return the next item. +*/ + + private: + + index_t next=0; /*+ The next segment to be returned. +*/ + index_t number; /*+ The number of segments in total. +*/ + + PythonNode *pynode; /*+ A pointer to the node that these segments come from. +*/ +}; + +#endif /* DATABASE_H */ ===================================== python/src/database.i ===================================== @@ -0,0 +1,220 @@ +/*************************************** + Python database interface definition. + + Part of the Routino routing software. + ******************/ /****************** + This file Copyright 2018 Andrew M. Bishop + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU Affero General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU Affero General Public License for more details. + + You should have received a copy of the GNU Affero General Public License + along with this program. If not, see . 
+ ***************************************/ + + +/* Name the module 'database' in the 'routino' package */ + +%module(package="routino.database") database + + +/* Include the 'database.hh' header file in the auto-generated code */ + +%{ +#include "database.hh" +%} + + +/* Typemaps for the special integer types used by Routino */ + +%typemap(in) index_t { $1 = PyInt_AsLong($input); } +%typemap(out) index_t { $result = PyInt_FromLong($1); } + +%typemap(in) transport_t { $1 = PyInt_AsLong($input); } +%typemap(out) transport_t { $result = PyInt_FromLong($1); } + +%typemap(in) transports_t { $1 = PyInt_AsLong($input); } +%typemap(out) transports_t { $result = PyInt_FromLong($1); } + +%typemap(in) nodeflags_t { $1 = PyInt_AsLong($input); } +%typemap(out) nodeflags_t { $result = PyInt_FromLong($1); } + +%typemap(in) highway_t { $1 = PyInt_AsLong($input); } +%typemap(out) highway_t { $result = PyInt_FromLong($1); } + +%typemap(in) properties_t { $1 = PyInt_AsLong($input); } +%typemap(out) properties_t { $result = PyInt_FromLong($1); } + +%typemap(in) distance_t { $1 = PyInt_AsLong($input); } +%typemap(out) distance_t { $result = PyInt_FromLong($1); } + + +/* Exception handling for the iterators */ + +%exception PythonDatabaseIter::__next__ { + $action + if (!result) + { + PyErr_SetString(PyExc_StopIteration, "End of iterator"); + return NULL; + } +} + +%exception PythonDatabaseIter::__next__ { + $action + if (!result) + { + PyErr_SetString(PyExc_StopIteration, "End of iterator"); + return NULL; + } +} + +%exception PythonDatabaseIter::__next__ { + $action + if (!result) + { + PyErr_SetString(PyExc_StopIteration, "End of iterator"); + return NULL; + } +} + +%exception PythonDatabaseIter::__next__ { + $action + if (!result) + { + PyErr_SetString(PyExc_StopIteration, "End of iterator"); + return NULL; + } +} + +%exception PythonNodeIter::__next__ { + $action + if (!result) + { + PyErr_SetString(PyExc_StopIteration, "End of iterator"); + return NULL; + } +} + + +/* Rename the internal data types to remove the 'Python' prefix */ + +%rename("Database") "PythonDatabase"; + +%rename("Node") "PythonNode"; +%rename("Segment") "PythonSegment"; +%rename("Way") "PythonWay"; +%rename("Relation") "PythonRelation"; + + +/* Ignore most of the constructors */ + +%ignore PythonDatabase::PythonDatabase; + +%ignore PythonNode::PythonNode; +%ignore PythonSegment::PythonSegment; +%ignore PythonWay::PythonWay; +%ignore PythonRelation::PythonRelation; + +%ignore PythonDatabaseIter::PythonDatabaseIter; +%ignore PythonDatabaseIter::PythonDatabaseIter; +%ignore PythonDatabaseIter::PythonDatabaseIter; +%ignore PythonDatabaseIter::PythonDatabaseIter; + +%ignore PythonNodeIter::PythonNodeIter; + + +/* Mark the functions that create new objects so they can be garbage collected */ + +%newobject LoadDatabase; + +%newobject PythonDatabase::GetNode; +%newobject PythonDatabase::GetSegment; +%newobject PythonDatabase::GetWay; +%newobject PythonDatabase::GetRelation; + +%newobject PythonDatabase::Nodes; +%newobject PythonDatabase::Segments; +%newobject PythonDatabase::Ways; +%newobject PythonDatabase::Relations; + +%newobject PythonNode::Segments; + +%newobject PythonSegment::Node1; +%newobject PythonSegment::Node2; +%newobject PythonSegment::Way; + +%newobject PythonRelation::FromSegment; +%newobject PythonRelation::ViaNode; +%newobject PythonRelation::ToSegment; +%newobject PythonRelation::FromWay; +%newobject PythonRelation::ToWay; +%newobject PythonRelation::FromNode; +%newobject PythonRelation::ToNode; + +%newobject 
PythonDatabaseIter::__getitem__; +%newobject PythonDatabaseIter::__next__; + +%newobject PythonDatabaseIter::__getitem__; +%newobject PythonDatabaseIter::__next__; + +%newobject PythonDatabaseIter::__getitem__; +%newobject PythonDatabaseIter::__next__; + +%newobject PythonDatabaseIter::__getitem__; +%newobject PythonDatabaseIter::__next__; + +%newobject PythonNodeIter::__getitem__; +%newobject PythonNodeIter::__next__; + + +/* Ignore most things from the types.h file except the enumerations */ + +%ignore M_PI; +%ignore NWAYPOINTS; +%ignore LAT_LONG_SCALE; +%ignore LAT_LONG_BIN; + +%ignore kph_to_speed; +%ignore tonnes_to_weight; +%ignore metres_to_height; +%ignore metres_to_width; +%ignore metres_to_length; + +%ignore HighwayType; +%ignore TransportType; +%ignore PropertyType; +%ignore HighwayName; +%ignore TransportName; +%ignore PropertyName; +%ignore HighwaysNameList; +%ignore AllowedNameList; +%ignore PropertiesNameList; +%ignore HighwayList; +%ignore TransportList; +%ignore PropertyList; + + +/* Use the 'database.hh' header file to generate the wrapper (everything is read-only) */ + +%immutable; + +%include "database.hh" +%include "../src/types.h" + + +/* Declare the specific templates */ + +%template(DatabaseNodeIter) PythonDatabaseIter; +%template(DatabaseSegmentIter) PythonDatabaseIter; +%template(DatabaseWayIter) PythonDatabaseIter; +%template(DatabaseRelationIter) PythonDatabaseIter; + +%template(NodeSegmentIter) PythonNodeIter; ===================================== python/src/router.i ===================================== @@ -0,0 +1,127 @@ +/*************************************** + Python router interface definition. + + Part of the Routino routing software. + ******************/ /****************** + This file Copyright 2018 Andrew M. Bishop + + This program is free software: you can redistribute it and/or modify + it under the terms of the GNU Affero General Public License as published by + the Free Software Foundation, either version 3 of the License, or + (at your option) any later version. + + This program is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + GNU Affero General Public License for more details. + + You should have received a copy of the GNU Affero General Public License + along with this program. If not, see . 
+ ***************************************/ + + +/* Name the module 'router' in the 'routino' package */ + +%module(package="routino.router") router + + +/* Include the 'routino.h' header file from the library in the auto-generated code */ + +%{ +#include "routino.h" +%} + + +/* Return NULL-terminated arrays of strings as a list of strings */ + +%typemap(ret) char** { + $result = PyList_New(0); + + char **p=$1; + + while(*p) + { + PyList_Append($result, PyString_FromString(*p)); + p++; + } +} + + +/* Handle lists of Routino Waypoints as an array */ + +%typemap(in) Routino_Waypoint ** { + /* Check if is a list */ + if (PyList_Check($input)) + { + int size = PyList_Size($input); + int i = 0; + $1 = (Routino_Waypoint **) malloc(size*sizeof(Routino_Waypoint *)); + for (i = 0; i < size; i++) + if (!SWIG_IsOK(SWIG_ConvertPtr(PyList_GetItem($input, i), (void **) &$1[i], $descriptor(Routino_Waypoint*), 0))) + SWIG_exception_fail(SWIG_TypeError, "in method '$symname', expecting type Routino_Waypoint"); + } else { + PyErr_SetString(PyExc_TypeError, "not a list"); + SWIG_fail; + } +} + +%typemap(freearg) Routino_Waypoint ** { + free((Routino_Waypoint *) $1); +} + + +/* Rename variables and functions by stripping 'Routino_' or 'ROUTINO_' prefixes */ + +%rename("%(regex:/R[Oo][Uu][Tt][Ii][Nn][Oo]_(.*)/\\1/)s") ""; + +/* Rename the Routino_CalculateRoute() function so we can replace with a Python wrapper */ + +%rename("_CalculateRoute") "Routino_CalculateRoute"; + +/* Rename the Routino_LoadDatabase() function so we can replace with a Python wrapper */ + +%rename("_LoadDatabase") "Routino_LoadDatabase"; + + +/* Add some custom Python code to the module */ + +%pythoncode %{ + +# Set up a replacement function for a macro in the original + +def CheckAPIVersion(): + return _router.Check_API_Version(_router.API_VERSION) + +# Set up a replacement function so that we do not need to pass the size of the list + +def CalculateRoute(database, profile, translation, waypoints, options, progress=None): + return _router._CalculateRoute(database, profile, translation, waypoints, len(waypoints), options, progress) + +# Set up a replacement function to make the second argument optional + +def LoadDatabase(dirname, prefix=None): + return _router._LoadDatabase(dirname, prefix) + +# Create a function for concatenating directory names, prefixes and filenames + +def FileName(dirname, prefix, name): + + filename="" + + if dirname is not None: + filename=dirname + "/" + + if prefix is not None: + filename += prefix + "-" + + filename += name + + return filename +%} + + +/* Use the 'routino.h' header file from the library to generate the wrapper (everything is read-only) */ + +%immutable; + +%include "../src/routino.h" ===================================== python/test/Makefile ===================================== @@ -0,0 +1,63 @@ +# Test cases Makefile +# +# Part of the Routino routing software. +# +# This file Copyright 2011-2015, 2018 Andrew M. Bishop +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU Affero General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Affero General Public License for more details. 
+# +# You should have received a copy of the GNU Affero General Public License +# along with this program. If not, see . +# + +# All configuration is in the top-level Makefile.conf + +include ../../Makefile.conf + +TESTDIR=../../src/test + +# Executables + +EXE=$(TESTDIR)/is-fast-math$(.EXE) + +######## + +all: + +######## + +test: $(EXE) + @./run-router-tests.sh + @./run-database-tests.sh + +######## + +$(EXE): + cd $(TESTDIR) && $(MAKE) test + +######## + +install: + +######## + +clean: + rm -rf results + rm -f *.log + rm -f *~ + +######## + +distclean: clean + +######## + +.PHONY:: all test install clean distclean ===================================== python/test/run-database-tests.sh ===================================== @@ -0,0 +1,14 @@ +#!/bin/sh + +# Python build location + +PYTHONPATH=`echo ../build/lib.*` +export PYTHONPATH + +# Run the test + +python3 ../database.py + +# Finish + +exit 0 ===================================== python/test/run-one-test.sh ===================================== @@ -0,0 +1,100 @@ +#!/bin/sh + +# Main tests directory + +testdir=../../src/test + +# Exit on error + +set -e + +# Test name + +name=`basename $1 .sh` + +# Libroutino location + +LD_LIBRARY_PATH=$testdir/..:$LD_LIBRARY_PATH +export LD_LIBRARY_PATH + +# Python build location + +PYTHONPATH=`echo ../build/lib.*` +export PYTHONPATH + +# Create the output directory + +dir=results + +[ -d $dir ] || mkdir $dir + +# Name related options + +osm=$testdir/$name.osm +log=$name.log + +option_prefix="--prefix=$name" +option_dir="--dir=$testdir/fat" + +# Generic program options + +option_router="--profile=motorcar --profiles=../../xml/routino-profiles.xml --translations=$testdir/copyright.xml" + + +# Run waypoints program + +run_waypoints() +{ + perl $testdir/waypoints.pl $@ +} + + +# Run planetsplitter + +run_planetsplitter() +{ + echo "Skipping planetsplitter" +} + + +# Run filedumper + +run_filedumper() +{ + echo "Skipping filedumper" +} + + +# Run the router + +run_router() +{ + waypoint=$1 + + shift + + [ -d $dir/$name-$waypoint ] || mkdir $dir/$name-$waypoint + + echo ../router.py $option_dir $option_prefix $option_osm $option_router $@ >> $log + ../router.py $option_dir $option_prefix $option_osm $option_router $@ >> $log + + mv shortest* $dir/$name-$waypoint + + echo diff -u $testdir/expected/$name-$waypoint.txt $dir/$name-$waypoint/shortest-all.txt >> $log + + if $testdir/is-fast-math; then + diff -U 0 $testdir/expected/$name-$waypoint.txt $dir/$name-$waypoint/shortest-all.txt | 2>&1 egrep '^[-+] ' || true + else + diff -u $testdir/expected/$name-$waypoint.txt $dir/$name-$waypoint/shortest-all.txt >> $log + fi +} + + +# Run the specific test script + +. $testdir/$name.sh + + +# Finish + +exit 0 ===================================== python/test/run-router-tests.sh ===================================== @@ -0,0 +1,68 @@ +#!/bin/sh + +# Main tests directory + +testdir=../../src/test + +# Overall status + +status=true + +# Functions for running tests + +run_a_test () +{ + script=$1 + shift + + if ./run-one-test.sh $script $@ ; then + echo "... passed" + else + echo "... FAILED" + status=false + fi +} + +compare_results () +{ + if diff -q -r $1 $2; then + echo "... matched" + else + echo "... match FAILED" + status=false + fi +} + + +# Initial informational message + +echo "" +$testdir/is-fast-math message + + +# Get the list of tests + +scripts=`echo $testdir/*.osm | sed -e s/.osm/.sh/g` + +# Run the scripts + +for script in $scripts; do + echo "" + echo "Testing: $script ... 
" + run_a_test $script +done + + +# Check results + +if $status; then + echo "Success: all tests passed" +else + echo "Warning: Some tests FAILED" + exit 1 +fi + + +# Finish + +exit 0 View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/c335b24e0489627e291822986b47b52706166fcd...fd0b60b5338c42c13bcca1436e3d4ee2c5589474 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/c335b24e0489627e291822986b47b52706166fcd...fd0b60b5338c42c13bcca1436e3d4ee2c5589474 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 9 05:57:23 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 09 Sep 2019 04:57:23 +0000 Subject: Processing of routino_3.3.1-1_source.changes Message-ID: routino_3.3.1-1_source.changes uploaded successfully to localhost along with the files: routino_3.3.1-1.dsc routino_3.3.1.orig.tar.gz routino_3.3.1-1.debian.tar.xz routino_3.3.1-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Mon Sep 9 06:03:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 05:03:24 +0000 Subject: [Git][debian-gis-team/pyepr] Pushed new tag debian/1.0.0-1 Message-ID: <5d75dd1cc35fa_73483fbbb23f22dc1440a7@godard.mail> Bas Couwenberg pushed new tag debian/1.0.0-1 at Debian GIS Project / pyepr -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/tree/debian/1.0.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 9 06:12:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 09 Sep 2019 05:12:25 +0000 Subject: Processing of pyepr_1.0.0-1_source.changes Message-ID: pyepr_1.0.0-1_source.changes uploaded successfully to localhost along with the files: pyepr_1.0.0-1.dsc pyepr_1.0.0.orig.tar.gz pyepr_1.0.0-1.debian.tar.xz pyepr_1.0.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 9 06:20:55 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 09 Sep 2019 05:20:55 +0000 Subject: pyepr_1.0.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 08 Sep 2019 21:36:41 +0000 Source: pyepr Architecture: source Version: 1.0.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: pyepr (1.0.0-1) unstable; urgency=medium . * New upstream release. * Update debian/copyright file. 
* debian/patches: - refresh all patches Checksums-Sha1: 5cadebdefe019b5dcd6110bd3f0e908c5b1bbc84 2245 pyepr_1.0.0-1.dsc 8c7f482695a3807b4bda8054255721ad8e4bd61a 514167 pyepr_1.0.0.orig.tar.gz e74e67565863c3e961508bf5ca0bc6bba526bd8a 7644 pyepr_1.0.0-1.debian.tar.xz 87c7dc3dc865583a0eb1ac8c58168690b221cc7a 11773 pyepr_1.0.0-1_amd64.buildinfo Checksums-Sha256: c2933a5675f414c143ddb276cab578d89312b6d0f7acf1700f17170084b4824e 2245 pyepr_1.0.0-1.dsc 367118697545e6e1758ffcfc029490313515b1cc17b9442a98ef66144df811e0 514167 pyepr_1.0.0.orig.tar.gz 27d6edd3438a3fb40b2840274ef3c1ac8b86739cdb1137e472330ff184bb7a9a 7644 pyepr_1.0.0-1.debian.tar.xz 4163ef86737cd3f287264b22ee83f5ddcd60f80b66497422426356b1bb1c22ae 11773 pyepr_1.0.0-1_amd64.buildinfo Files: 4903fccbd73fac10519d8575688814aa 2245 python optional pyepr_1.0.0-1.dsc 9462b3ce23668e67eb190efbc23067c3 514167 python optional pyepr_1.0.0.orig.tar.gz 91a6c514df37d7f63e477178c5f26dd6 7644 python optional pyepr_1.0.0-1.debian.tar.xz 3c8c70c24b039d445bb27dcbe217a725 11773 python optional pyepr_1.0.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl113Q0ACgkQZ1DxCuiN SvHuqw//SER3MLQGIps9uHrMXfkp79Ti7GYtfHH6+WY3sTz7bu24T96SoH7wesYR wM4oitqkLYAUYNkuna9xja60n1Hp4V2Qc+XcQ0AaYGThZd1i20JKl5dlvX1hFs+S WypXdwziPa7jJOVOaeYZsvXZbHCfm4kx1SXzKwGMSYqC0KaFnIta80lc9C5gB8Z8 w1M5VLIgFy15/XcHLLV0LmgPzKwWsyXOZZUxqj7F4zZzz3wmkMYoyBznAyul8gCC e39AfI7fAWMZvCIIVoD2xZ1K0N3OJlgQroCEOa0awUpa8XRkFeP1dETjHtat/ML+ r/g37NQwCj01/Z6hjxvttWucYS2VlvNt38CAwSahOwSAgxah9lnfLtf66B9NCQ+n l9mt7WB2HOR4+0HcVsOVTW/S6KBLoOxN6Q5iglrF0GOqfU2/RrrFTtKD3St/VRmI 8NydsY4h7fQHNALh1D2naIo2/M/1BhYicI98CUBgL2JCfqjoF6snqXXJVBpVIrdm sLuarbfoqfCDNPy4CkV6pOC+WyTz4XJXBFXCRRCoLCv1cFNXkBsa+gmxOly/BocC 2iMDXf62Y2Kh7QF7lrxs8tySRdvEoDFQfCSMYqlrp/HH88AC7ROtJTFU9Ot6ptsQ haYD4VWECRCPl+OtdWaanVXE/0fxtQwPtDgVC3itp8k6R2SHBzs= =WoZp -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Mon Sep 9 06:21:04 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 09 Sep 2019 05:21:04 +0000 Subject: routino_3.3.1-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 09 Sep 2019 06:40:48 +0200 Source: routino Architecture: source Version: 3.3.1-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: routino (3.3.1-1) unstable; urgency=medium . * New upstream release. * Drop python.patch, fixed upstream. 
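(The dropped python.patch ties in with the Python test harness shown earlier in this digest, python/test/run-one-test.sh, which drives the bindings through router.py. A minimal sketch of an equivalent manual invocation, using only the options that appear in that script; the "a1" prefix and the relative paths are illustrative, and each test adds its own waypoint arguments on top of these:

  # Point the interpreter at the in-tree build of the bindings and at
  # libroutino, as run-one-test.sh does.
  PYTHONPATH=`echo ../build/lib.*`; export PYTHONPATH
  LD_LIBRARY_PATH=../../src/test/..:$LD_LIBRARY_PATH; export LD_LIBRARY_PATH

  # Route against a pre-built test database; "a1" stands in for the test
  # name that the harness derives from the .osm file.
  ../router.py --dir=../../src/test/fat --prefix=a1 \
      --profile=motorcar --profiles=../../xml/routino-profiles.xml \
      --translations=../../src/test/copyright.xml
)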
Checksums-Sha1: 734803f66228af04de9f2c05af8857748c8459f1 2349 routino_3.3.1-1.dsc ec3696a9b21bf1b7bb66434db57c9aad290cbad8 2541830 routino_3.3.1.orig.tar.gz 3888882fc56ca4c96fc3fd43ac2246a83b982563 29768 routino_3.3.1-1.debian.tar.xz f32fefd9213055ab9b9e89d6c999a8f1c58601b5 10663 routino_3.3.1-1_amd64.buildinfo Checksums-Sha256: 0d72fdc2d10fb7f02ec8a5ef7029cf904d6184c768489b29aa145b3451d7dc52 2349 routino_3.3.1-1.dsc a954565ab60a5abebc47e8c6e8b496f972e8dd781810fa5548b6d7a9e3e5e135 2541830 routino_3.3.1.orig.tar.gz 9de71258c822efaf3bed64632cadccda0ba1b3abbdceb89d3156b41a33d11f8b 29768 routino_3.3.1-1.debian.tar.xz 7364a731b2bdce9c17043e2a5e7d5715a6a75ca8568db87c6bcb6bbfe286089d 10663 routino_3.3.1-1_amd64.buildinfo Files: f8608114fb18ed06c11f6e61e20199be 2349 misc optional routino_3.3.1-1.dsc 6f49844a66a6f3f682b2216edc14d12f 2541830 misc optional routino_3.3.1.orig.tar.gz a1d9ba9216319dd75080ae46e40f2725 29768 misc optional routino_3.3.1-1.debian.tar.xz bfeddd1cea97ba1719ac1b50a89a7deb 10663 misc optional routino_3.3.1-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl112b0ACgkQZ1DxCuiN SvH00BAAsp3vK1hJDy3ePmh+FNhcUDFsjr+VoMQrXRn6O0iOFPimPg/uHYRVoOQa 3o2CC3YSKiocgCnd3KQ6iWbkBEz8K/Q7rXoqYarpETDtrw96HyRi7Ef7uAqYgEva rxagAS+cxGuESXr6abduQ+B7Lfs7/ixuvdlS0avDZC9OP6SorWP3raGqE9aZMFmH dfVOkLjvuVslqYBflq+i5VHBKS66599yeg5lq8nZrjMTDaagEo+yL9Dfwjtmeouu zsXaW5gbqSV8411hxqcvveSB77/DMCFx0GgItibfyP6HPCkOw4RCojL9IBTKCJRO CRvC7t8cbf2BrFTS/bLscwtmrh2X4wduKqILOieJgeK+yoQlRgL7mZup+VThBx08 uQIiWROU9Ih8TnA9s5Oe9DZ09qKUj5I9txqdXbbWrbqBbtm6P3XUdZH3+z9CPc2H goItXWvRrBR78af7qgVDWv49b24Ya9QUBW81/dMSU8SVJwTpNFIJmP5NN4CVrUeC 7f4CrWH+4dtCqlcIq1y0nvNeWB8hKeQe6Za5Usg3uj8S+c6Vngyk6ZB1QNFAf7n7 wGzPdtE9c1ENRSmv6llDYnfMOx20NBxIJPuHj7qC+f4FCvssoCWxpri6f3CCJInH b/sP+2mA6x5cMzA/iSBMbMDW2LiJWk7CSG7TDftdULYI2Sg8BHk= =55qW -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From sebastic at xs4all.nl Mon Sep 9 17:28:39 2019 From: sebastic at xs4all.nl (Bas Couwenberg) Date: Mon, 09 Sep 2019 18:28:39 +0200 Subject: Bug#939872: fiona: FTBFS with GDAL 3 Message-ID: <156804651982.23140.198675418827472311.reportbug@osiris.linuxminded.xs4all.nl> Source: fiona Version: 1.8.6-2 Severity: important Tags: upstream User: debian-gis at lists.debian.org Usertags: gdal-3 Dear Maintainer, Your package FTBFS with GDAL 3 from experimental: ImportError: [...]/fiona/ogrext.so: undefined symbol: OSRFixup Note that the upstream issue was closed but no fix is available yet: https://github.com/Toblerity/Fiona/issues/745 Kind Regards, Bas From info at salecolmi.ru Mon Sep 9 18:57:37 2019 From: info at salecolmi.ru (Duy Ta) Date: Mon, 9 Sep 2019 20:57:37 +0300 Subject: Service Procurement Message-ID: <27njza6675276k6441ra66074pr108u0@steveefe.com> Hello We are interested in purchasing your products and services and we sincerely hope to establish a long-term business relationship with your esteemed company.Please kindly send me your latest catalogue, contact us via: alialaraadi2031 at gmail.com. Your early reply is highly appreciated. Kind Regards. Mr. Ali Abdullah Al-Araadi -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 9 20:07:22 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 09 Sep 2019 19:07:22 +0000 Subject: [Git][debian-gis-team/mapproxy][master] Update override for embedded-javascript-library. 
Message-ID: <5d76a2ea9c881_73482ad95dd616c023038c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapproxy Commits: 7a6f5002 by Bas Couwenberg at 2019-09-09T19:06:26Z Update override for embedded-javascript-library. - - - - - 2 changed files: - debian/changelog - debian/mapproxy-doc.lintian-overrides Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mapproxy (1.12.0-2) UNRELEASED; urgency=medium + + * Update override for embedded-javascript-library. + + -- Bas Couwenberg Mon, 09 Sep 2019 20:54:10 +0200 + mapproxy (1.12.0-1) unstable; urgency=medium * Move from experimental to unstable. ===================================== debian/mapproxy-doc.lintian-overrides ===================================== @@ -1,5 +1,5 @@ # libjs-twitter-bootstrap is not compatible -embedded-javascript-library usr/share/doc/mapproxy/html/_static/bootstrap-*/js/bootstrap.js please use libjs-twitter-bootstrap +embedded-javascript-library usr/share/doc/mapproxy/html/_static/bootstrap-*/js/bootstrap.js please use libjs-bootstrap font-in-non-font-package usr/share/doc/mapproxy/html/_static/boot*/fonts/* font-outside-font-dir usr/share/doc/mapproxy/html/_static/boots*/fonts/* View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/commit/7a6f5002b72e5d2687142b974bfcf63154f4fb0b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/commit/7a6f5002b72e5d2687142b974bfcf63154f4fb0b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From brenda at acae.co.za Mon Sep 9 20:56:34 2019 From: brenda at acae.co.za (Brenda Mweshi (Workshop Invitation)) Date: Mon, 9 Sep 2019 21:56:34 +0200 Subject: Reminder!! September to December Confirmed Workshops and Seminars Message-ID: <47723819670562311530417@PROD08> Thank you for Receiving this workshop invitation. You may please unsubscribe here if you no longer wish to receive our emails --- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 1.png Type: image/png Size: 46038 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 2.gif Type: image/gif Size: 646 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 3.gif Type: image/gif Size: 641 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 4.gif Type: image/gif Size: 650 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 5.gif Type: image/gif Size: 650 bytes Desc: not available URL: From owner at bugs.debian.org Mon Sep 9 22:39:32 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 09 Sep 2019 21:39:32 +0000 Subject: Processed: severity of 933407 is important, severity of 933408 is important, severity of 933411 is important ... 
References: <1568064414-3530-bts-olly@survex.com> Message-ID: Processing commands for control at bugs.debian.org: > severity 933407 important Bug #933407 [codeblocks] codeblocks: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933408 important Bug #933408 [usbprog] usbprog: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933411 important Bug #933411 [fwknop-gui] fwknop-gui: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933412 important Bug #933412 {Done: Christoph Berg } [limesuite] limesuite: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933415 important Bug #933415 [openbabel] openbabel: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933416 important Bug #933416 [filezilla] filezilla: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933417 important Bug #933417 [xchm] xchm: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933419 important Bug #933419 [darkradiant] darkradiant: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933424 important Bug #933424 [sitplus] sitplus: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933425 important Bug #933425 [sooperlooper] sooperlooper: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933426 important Bug #933426 [mediainfo] mediainfo: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933427 important Bug #933427 [delaboratory] delaboratory: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933429 important Bug #933429 [wxmaxima] wxmaxima: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933431 important Bug #933431 [gnuplot] gnuplot: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933433 important Bug #933433 [chipw] chipw: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933434 important Bug #933434 [cba] cba: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933435 important Bug #933435 [freedink-dfarc] freedink-dfarc: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933438 important Bug #933438 [spek] spek: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933440 important Bug #933440 [munipack] munipack: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933441 important Bug #933441 [pcsx2] pcsx2: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933442 important Bug #933442 [ebook2cwgui] ebook2cwgui: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933445 important Bug #933445 [sandboxgamemaker] sandboxgamemaker: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933450 important Bug #933450 {Done: Stuart Prescott } [fityk] fityk: Please 
rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933451 important Bug #933451 [pgn2web] pgn2web: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933455 important Bug #933455 [aegisub] aegisub: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933457 important Bug #933457 [gspiceui] gspiceui: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933459 important Bug #933459 [pgadmin3] pgadmin3: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933462 important Bug #933462 [wxsvg] wxsvg: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933463 important Bug #933463 [treesheets] treesheets: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933465 important Bug #933465 [freespace2-launcher-wxlauncher] freespace2-launcher-wxlauncher: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933466 important Bug #933466 [rapidsvn] rapidsvn: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933469 important Bug #933469 [mrpt] mrpt: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933473 important Bug #933473 [bossa] bossa: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933474 important Bug #933474 [cubicsdr] cubicsdr: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933475 important Bug #933475 [wxastrocapture] wxastrocapture: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933476 important Bug #933476 [stx-btree] stx-btree: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933477 important Bug #933477 [openmsx-catapult] openmsx-catapult: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933478 important Bug #933478 [ucblogo] ucblogo: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933479 important Bug #933479 [objcryst-fox] objcryst-fox: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 934096 important Bug #934096 [codelite] codelite: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 934097 important Bug #934097 [maitreya] maitreya: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 934098 important Bug #934098 [3depict] 3depict: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933439 important Bug #933439 [amule] amule: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933409 important Bug #933409 [spatialite-gui] spatialite-gui: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933458 important Bug #933458 [bochs] bochs: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > severity 933464 important Bug #933464 [saga] saga: Please rebuild against wxWidgets GTK 3 package Severity set 
to 'important' from 'normal' > severity 934099 important Bug #934099 [gnudatalanguage] gnudatalanguage: Please rebuild against wxWidgets GTK 3 package Severity set to 'important' from 'normal' > thanks Stopping processing here. Please contact me if you need assistance. -- 933407: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933407 933408: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933408 933409: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933409 933411: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933411 933412: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933412 933415: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933415 933416: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933416 933417: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933417 933419: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933419 933424: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933424 933425: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933425 933426: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933426 933427: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933427 933429: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933429 933431: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933431 933433: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933433 933434: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933434 933435: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933435 933438: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933438 933439: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933439 933440: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933440 933441: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933441 933442: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933442 933445: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933445 933450: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933450 933451: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933451 933455: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933455 933457: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933457 933458: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933458 933459: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933459 933462: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933462 933463: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933463 933464: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933464 933465: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933465 933466: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933466 933469: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933469 933473: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933473 933474: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933474 933475: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933475 933476: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933476 933477: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933477 933478: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933478 933479: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933479 934096: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=934096 934097: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=934097 934098: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=934098 934099: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=934099 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Tue Sep 10 05:23:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 04:23:11 +0000 
Subject: [Git][debian-gis-team/rasterio][master] 4 commits: New upstream version 1.0.28 Message-ID: <5d77252fdd2f5_73482ad9615dca902880be@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / rasterio Commits: d24b0b33 by Bas Couwenberg at 2019-09-10T04:07:41Z New upstream version 1.0.28 - - - - - deb7aa84 by Bas Couwenberg at 2019-09-10T04:08:32Z Update upstream source from tag 'upstream/1.0.28' Update to upstream version '1.0.28' with Debian dir 02628faf835aa75be85a50b1446584104936a249 - - - - - 9cc7fce0 by Bas Couwenberg at 2019-09-10T04:10:22Z New upstream release. - - - - - f1913cc0 by Bas Couwenberg at 2019-09-10T04:11:09Z Set distribution to unstable. - - - - - 4 changed files: - CHANGES.txt - debian/changelog - rasterio/__init__.py - rasterio/_io.pyx Changes: ===================================== CHANGES.txt ===================================== @@ -1,10 +1,18 @@ Changes ======= +1.0.28 (2019-09-09) +------------------- + +- Coercion to ``int`` was forgotten in the block size guard introduced in + 1.0.27 and code that passes string valued ``blockxsize`` and ``blockysize`` + keyword arguments to ``rasterio.open()`` was broken (#1769). This has been + fixed in 1.0.28. + 1.0.27 (2019-09-05) ------------------- -- Resolve #1744 by adding a `dtype` keyword argument to the WarpedVRT +- Resolve #1744 by adding a ``dtype`` keyword argument to the WarpedVRT constructor. It allows a user to specify the working data type for the warp operation and output. - All cases of deprecated affine right multiplication have been changed to be ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +rasterio (1.0.28-1) unstable; urgency=medium + + * Team upload. + * New upstream release. + + -- Bas Couwenberg Tue, 10 Sep 2019 06:10:59 +0200 + rasterio (1.0.27-1) unstable; urgency=medium * Team upload. ===================================== rasterio/__init__.py ===================================== @@ -42,7 +42,7 @@ import rasterio.path __all__ = ['band', 'open', 'pad', 'Env'] -__version__ = "1.0.27" +__version__ = "1.0.28" __gdal_version__ = gdal_version() # Rasterio attaches NullHandler to the 'rasterio' logger and its ===================================== rasterio/_io.pyx ===================================== @@ -1082,7 +1082,7 @@ cdef class DatasetWriterBase(DatasetReaderBase): if tiled: blockxsize = kwargs.get("blockxsize", None) blockysize = kwargs.get("blockysize", None) - if (blockxsize and blockxsize % 16) or (blockysize and blockysize % 16): + if (blockxsize and int(blockxsize) % 16) or (blockysize and int(blockysize) % 16): raise RasterBlockError("The height and width of dataset blocks must be multiples of 16") kwargs["tiled"] = "TRUE" View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/compare/bf42bcb4d332d58d7ae92c3d9336ac0e26d37ced...f1913cc06bedaa98775fc13dfb3715dde56284f3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/compare/bf42bcb4d332d58d7ae92c3d9336ac0e26d37ced...f1913cc06bedaa98775fc13dfb3715dde56284f3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Tue Sep 10 05:23:13 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 04:23:13 +0000 Subject: [Git][debian-gis-team/rasterio][pristine-tar] pristine-tar data for rasterio_1.0.28.orig.tar.gz Message-ID: <5d7725311a3c1_73482ad95dfbe5e428821e@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / rasterio Commits: b3edea25 by Bas Couwenberg at 2019-09-10T04:08:31Z pristine-tar data for rasterio_1.0.28.orig.tar.gz - - - - - 2 changed files: - + rasterio_1.0.28.orig.tar.gz.delta - + rasterio_1.0.28.orig.tar.gz.id Changes: ===================================== rasterio_1.0.28.orig.tar.gz.delta ===================================== Binary files /dev/null and b/rasterio_1.0.28.orig.tar.gz.delta differ ===================================== rasterio_1.0.28.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +16ba34d0c9a99cc8e46ee58fa9b2d4dcee75821d View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/b3edea25f2f022b2cf9aa5f163a35b2c852cd14c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/b3edea25f2f022b2cf9aa5f163a35b2c852cd14c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 05:23:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 04:23:14 +0000 Subject: [Git][debian-gis-team/rasterio][upstream] New upstream version 1.0.28 Message-ID: <5d7725324dc70_73482ad9615dca902884df@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / rasterio Commits: d24b0b33 by Bas Couwenberg at 2019-09-10T04:07:41Z New upstream version 1.0.28 - - - - - 3 changed files: - CHANGES.txt - rasterio/__init__.py - rasterio/_io.pyx Changes: ===================================== CHANGES.txt ===================================== @@ -1,10 +1,18 @@ Changes ======= +1.0.28 (2019-09-09) +------------------- + +- Coercion to ``int`` was forgotten in the block size guard introduced in + 1.0.27 and code that passes string valued ``blockxsize`` and ``blockysize`` + keyword arguments to ``rasterio.open()`` was broken (#1769). This has been + fixed in 1.0.28. + 1.0.27 (2019-09-05) ------------------- -- Resolve #1744 by adding a `dtype` keyword argument to the WarpedVRT +- Resolve #1744 by adding a ``dtype`` keyword argument to the WarpedVRT constructor. It allows a user to specify the working data type for the warp operation and output. 
- All cases of deprecated affine right multiplication have been changed to be ===================================== rasterio/__init__.py ===================================== @@ -42,7 +42,7 @@ import rasterio.path __all__ = ['band', 'open', 'pad', 'Env'] -__version__ = "1.0.27" +__version__ = "1.0.28" __gdal_version__ = gdal_version() # Rasterio attaches NullHandler to the 'rasterio' logger and its ===================================== rasterio/_io.pyx ===================================== @@ -1082,7 +1082,7 @@ cdef class DatasetWriterBase(DatasetReaderBase): if tiled: blockxsize = kwargs.get("blockxsize", None) blockysize = kwargs.get("blockysize", None) - if (blockxsize and blockxsize % 16) or (blockysize and blockysize % 16): + if (blockxsize and int(blockxsize) % 16) or (blockysize and int(blockysize) % 16): raise RasterBlockError("The height and width of dataset blocks must be multiples of 16") kwargs["tiled"] = "TRUE" View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/d24b0b33db14b288de600ff680ff19b6a0b6b339 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/d24b0b33db14b288de600ff680ff19b6a0b6b339 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 05:23:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 04:23:21 +0000 Subject: [Git][debian-gis-team/rasterio] Pushed new tag debian/1.0.28-1 Message-ID: <5d7725399e1fa_73483fbbb2ce5ca4288632@godard.mail> Bas Couwenberg pushed new tag debian/1.0.28-1 at Debian GIS Project / rasterio -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/tree/debian/1.0.28-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 05:23:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 04:23:23 +0000 Subject: [Git][debian-gis-team/rasterio] Pushed new tag upstream/1.0.28 Message-ID: <5d77253b725e5_73483fbbbf126de828881f@godard.mail> Bas Couwenberg pushed new tag upstream/1.0.28 at Debian GIS Project / rasterio -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/tree/upstream/1.0.28 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Tue Sep 10 05:33:27 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 10 Sep 2019 04:33:27 +0000 Subject: Processing of rasterio_1.0.28-1_source.changes Message-ID: rasterio_1.0.28-1_source.changes uploaded successfully to localhost along with the files: rasterio_1.0.28-1.dsc rasterio_1.0.28.orig.tar.gz rasterio_1.0.28-1.debian.tar.xz rasterio_1.0.28-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Tue Sep 10 05:36:24 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 10 Sep 2019 04:36:24 +0000 Subject: rasterio_1.0.28-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 10 Sep 2019 06:10:59 +0200 Source: rasterio Architecture: source Version: 1.0.28-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: rasterio (1.0.28-1) unstable; urgency=medium . * Team upload. * New upstream release. Checksums-Sha1: aed687dd292cde0b409440c7c724157f1cdbab20 2309 rasterio_1.0.28-1.dsc 3b7f00926158c15a26033120384ac90937ef74bf 15903850 rasterio_1.0.28.orig.tar.gz 2858022b06f98bc36ead361f7c8cbf9c8afa0a29 8300 rasterio_1.0.28-1.debian.tar.xz f4307a441d14ba9f5ef20abde60844384211f46c 13799 rasterio_1.0.28-1_amd64.buildinfo Checksums-Sha256: 09a53132e7181a518d9e6a9273bb7d5364df198c694ecc3c945e9691ca62b864 2309 rasterio_1.0.28-1.dsc fac0ad17590520e5808338798c0cc425b635ae068571513f76b679015f2ee665 15903850 rasterio_1.0.28.orig.tar.gz cd66eb660071431c11c223c2a4af8451fb0cdebad98e3e7645a206983169613f 8300 rasterio_1.0.28-1.debian.tar.xz c595213b9ff2e039ab37dab6c5d2a2eba5cab5381afda8211ad7f6ffde69e9aa 13799 rasterio_1.0.28-1_amd64.buildinfo Files: 6c3b46ff9b41bba01a621115e656ebd3 2309 python optional rasterio_1.0.28-1.dsc d1edffd303e09b9a05f7a9a6325fb778 15903850 python optional rasterio_1.0.28.orig.tar.gz 6c29865dfb59ef7051f4bdc8bcde68a4 8300 python optional rasterio_1.0.28-1.debian.tar.xz e45669a312d7473a2ec47e988797150e 13799 python optional rasterio_1.0.28-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl13JPsACgkQZ1DxCuiN SvGbLQ/+PZCCW8DszaGkQuujVsxza4fCIx4hNsTKvyyINBLH9PcAf4h4bL1uBtYa Y1Fd4paILiDStJOvxhxd6oXg7bNHt1UBcUDimKj25XN3JwKvQsgDWrpifeOUKPz0 dcesSg9qsPovbGFh1EALSsp2BNuZ+76aRlU3ywbeJ1+2vJ74n56dEjmAZHB2DXk+ r+bpE3t7MOvqs04ARwrShsCmaEDoSaZjC6nNIx925Ug8/7wN4GQDRwKxfPknldLe 3bk27Ls7DxJzZauSyShK9GsBNRbz1woajkRS2N3BDBzlSy4pvKzescSSc0eBknNS LVAdCgkjgV8ivr1L8Y6xEENKkb8aANsXZc8+7elTLNorZ7+RhuesCD9r1Va4RLWr i3a+TlMvcmA76NS5T3CGUpXxdg9Dbl6jGEqCDH4FWkJpvvmEQ/DapYk6lQQtwe58 +bTrOAQNCczynFy8Vgn/Rw5f/q7VgB9nDN4nFaFR5pWr4o30pHz5g1TOYcGPHwqH /exc9A+MPh7v30GoUzEzuwG+dj7WoE4sbEIiS8hVvBj/2bS0FZAW9x+op3cyS8v2 nm8ACNTDY+2eTJ6JiwEdtR7dMsE4MG6kyNB9QCuSM95EbaOizSR4mhVP2gsyKnB1 u3rJahnO/o8TeWC4MF3Lek52EMbFBrsP6Wqy6cMeAGvw/67ZCQ4= =RxI+ -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From noreply at release.debian.org Tue Sep 10 05:39:10 2019 From: noreply at release.debian.org (Debian testing watch) Date: Tue, 10 Sep 2019 04:39:10 +0000 Subject: libgeotiff 1.5.1-2 MIGRATED to testing Message-ID: FYI: The status of the libgeotiff source package in Debian's testing distribution has changed. Previous version: 1.5.1-1 Current version: 1.5.1-2 -- This email is automatically generated once a day. 
As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Tue Sep 10 05:39:11 2019 From: noreply at release.debian.org (Debian testing watch) Date: Tue, 10 Sep 2019 04:39:11 +0000 Subject: python-affine 2.3.0-1 MIGRATED to testing Message-ID: FYI: The status of the python-affine source package in Debian's testing distribution has changed. Previous version: 2.2.2-2 Current version: 2.3.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Tue Sep 10 14:13:17 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Tue, 10 Sep 2019 13:13:17 +0000 Subject: [Git][debian-gis-team/gdal-grass] Deleted tag ubuntu/2.4.2-1.bionic1 Message-ID: <5d77a16da1d6e_73482ad963a7a41035931b@godard.mail> Martin Landa deleted tag ubuntu/2.4.2-1.bionic1 at Debian GIS Project / gdal-grass -- You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 14:18:15 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Tue, 10 Sep 2019 13:18:15 +0000 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Revert "Rebuild 2.4.2 for bionic" Message-ID: <5d77a297c0adb_73483fbbb2544a543598a6@godard.mail> Martin Landa pushed to branch ubuntu/bionic at Debian GIS Project / gdal-grass Commits: b24f5cdf by Martin Landa at 2019-09-10T13:11:31Z Revert "Rebuild 2.4.2 for bionic" This reverts commit 61453b3e5bfe8f1d40f6d85f835733ea93146c06. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,4 +1,4 @@ -libgdal-grass (2.4.2-1~bionic1) bionic; urgency=medium +libgdal-grass (2.4.2-2~bionic1) bionic; urgency=medium * Rebuild for bionic. View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/commit/b24f5cdfc56f56d6df86d4b2eb30941ca7eb666f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/commit/b24f5cdfc56f56d6df86d4b2eb30941ca7eb666f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 14:18:21 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Tue, 10 Sep 2019 13:18:21 +0000 Subject: [Git][debian-gis-team/gdal-grass] Pushed new tag ubuntu/2.4.2-2.bionic1 Message-ID: <5d77a29d86d7b_73483fbbb2544a5436005e@godard.mail> Martin Landa pushed new tag ubuntu/2.4.2-2.bionic1 at Debian GIS Project / gdal-grass -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/tree/ubuntu/2.4.2-2.bionic1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From landa.martin at gmail.com Tue Sep 10 14:21:03 2019 From: landa.martin at gmail.com (Martin Landa) Date: Tue, 10 Sep 2019 15:21:03 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: Hi, ne 8. 9. 2019 v 7:46 odesílatel Sebastiaan Couwenberg napsal: > 2.4.1 would be the previous patch version. > > I still don't see any inconsistency. I meant libgdal-grass_2.4.2-*2*~bionic1 vs gdal - 2.4.2+dfsg-*1*~bionic0 . But it's a problem at end. Sorry for the noise. Martin -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From landa.martin at gmail.com Tue Sep 10 14:27:09 2019 From: landa.martin at gmail.com (Martin Landa) Date: Tue, 10 Sep 2019 15:27:09 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: Hi, ne 8. 9. 2019 v 7:53 odesílatel Sebastiaan Couwenberg napsal: > Did you resolve the merge conflict incorrectly perhaps? you are right, my fault. Sorry for the noise, Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From sebastic at xs4all.nl Tue Sep 10 14:29:23 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Tue, 10 Sep 2019 15:29:23 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: On 9/10/19 3:21 PM, Martin Landa wrote: > ne 8. 9. 2019 v 7:46 odesílatel Sebastiaan Couwenberg napsal: >> 2.4.1 would be the previous patch version. >> >> I still don't see any inconsistency. > > I meant libgdal-grass_2.4.2-*2*~bionic1 vs gdal - > 2.4.2+dfsg-*1*~bionic0 . But it's a problem at end. Sorry for the > noise. Martin Just for the record, that's the package revision. That does is specific to the package and does not need to be in sync. The libgdal-grass package requires that it's the same GDAL upstream version, 2.4.2 in this case. The package revision is incremented every time packaging changes are published, this revision is reset when the upstream version changes. Backports have an additional revision to indicate which package revision they backport to another distribution. If there is only a packaging change for the backport, e.g. to fix merge conflict, the backport revision is incremented (~bionic1 -> ~bionic2 in this case, or ~bpo10+1 -> bpo10+2 for buster-backports). 
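The ordering this scheme relies on can be checked with dpkg itself; a minimal sketch, assuming any system with dpkg installed (the echo messages are purely illustrative):

  # dpkg --compare-versions exits 0 when the stated relation holds.
  # A ~bionicN backport sorts below the Debian revision it derives from:
  dpkg --compare-versions '2.4.2-2~bionic1' lt '2.4.2-2' && echo 'backport precedes 2.4.2-2'
  # Bumping only the backport revision keeps the ordering within the series:
  dpkg --compare-versions '2.4.2-2~bionic1' lt '2.4.2-2~bionic2' && echo 'bionic1 precedes bionic2'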
Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From landa.martin at gmail.com Tue Sep 10 14:46:45 2019 From: landa.martin at gmail.com (Martin Landa) Date: Tue, 10 Sep 2019 15:46:45 +0200 Subject: [Git][debian-gis-team/gdal-grass][ubuntu/bionic] Rebuild 2.4.2 for bionic In-Reply-To: References: <5d74066b93988_577b3f91d43b7c0c13304c0@godard.mail> <7f6937d6-f2b5-ecc9-eda5-5529faa735a1@xs4all.nl> <7a8449cc-a474-6bc6-2d1d-5ee0eca6ec40@xs4all.nl> <37e57ad3-5605-654f-6733-b4b19a0071bc@xs4all.nl> Message-ID: Hi, út 10. 9. 2019 v 15:29 odesílatel Sebastiaan Couwenberg napsal: > Just for the record, that's the package revision. That does is specific > to the package and does not need to be in sync. thanks for clarification. Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From gitlab at salsa.debian.org Tue Sep 10 16:09:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 15:09:21 +0000 Subject: [Git][debian-gis-team/saga][master] Update lintian override for spelling-error-in-binary. Message-ID: <5d77bca133754_73482ad9639a29d43714e9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / saga Commits: c4b15c0d by Bas Couwenberg at 2019-09-10T15:09:12Z Update lintian override for spelling-error-in-binary. - - - - - 2 changed files: - debian/changelog - debian/saga.lintian-overrides Changes: ===================================== debian/changelog ===================================== @@ -3,6 +3,7 @@ saga (7.3.0+dfsg-2) UNRELEASED; urgency=medium * Team upload. * Switch to wxWidgets GTK 3 implementation. (closes: #933464) + * Update lintian override for spelling-error-in-binary. -- Bas Couwenberg Tue, 30 Jul 2019 17:02:03 +0200 ===================================== debian/saga.lintian-overrides ===================================== @@ -12,5 +12,5 @@ hardening-no-fortify-functions * # [This is the first released version of the Lesser GPL. It also counts # as the successor of the GNU Library Public License, version 2, hence # the version number 2.1.] -spelling-error-in-binary usr/bin/saga_gui GNU Library Public License GNU Library General Public License +spelling-error-in-binary usr/bin/saga_gui "GNU Library Public License" "GNU Library General Public License" View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/c4b15c0df0916fbf5f5faa10c6105f4c2f689609 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/c4b15c0df0916fbf5f5faa10c6105f4c2f689609 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastic at xs4all.nl Tue Sep 10 20:52:49 2019 From: sebastic at xs4all.nl (Bas Couwenberg) Date: Tue, 10 Sep 2019 21:52:49 +0200 Subject: Bug#939989: transition: gdal Message-ID: <156814516935.1319.13264348075841746906.reportbug@osiris.linuxminded.xs4all.nl> Package: release.debian.org Severity: normal User: release.debian.org at packages.debian.org Usertags: transition Control: block -1 by 939872 939891 931944 Control: forwarded -1 https://release.debian.org/transitions/html/auto-gdal.html For the Debian GIS team I'd like to transition to GDAL 3.x. This is the next step in the major update of the GIS stack after PROJ 6. All reverse dependencies rebuilt successfully with GDAL 3.0.1 from experimental as summarized below, except fiona, mysql-workbench & vtk7. 
The fiona issue is actually related to GDAL 3, mysql-workbench FTBFS due to gcc-9 & -Werror, and vtk7 hasn't been updated for PROJ 6 yet. libgdal-grass doesn't need a binNMU as the 3.0.1 version will be uploaded to unstable instead. Transition: gdal libgdal20 (2.4.2+dfsg-1+b2) -> libgdal26 (3.0.1+dfsg-1~exp3) The status of the most recent rebuilds is as follows. dans-gdal-scripts (0.24-3) OK fiona (1.8.6-2) FTBFS (#939872) gazebo (9.6.0-2) OK gmt (5.4.5+dfsg-2) OK libcitygml (2.0.9-2) OK libosmium (2.15.2-1) OK mapcache (1.8.0-1) OK mapnik (3.0.22+ds1-1) OK mapproxy (1.12.0-1) OK mapserver (7.4.1-1) OK mysql-workbench (8.0.17+dfsg-1) FTBFS (#939891) ncl (6.6.2-1) OK node-srs (0.4.8+dfsg-4) OK octave-mapping (1.2.1-4) OK openorienteering-mapper (0.8.4-2) OK openscenegraph (3.2.3+dfsg1-3) OK pdal (2.0.1+ds-1) OK pgsql-ogr-fdw (1.0.8-1) OK pktools (2.6.7.6+ds-2) OK postgis (2.5.3+dfsg-1) OK pprepair (0.0~20170614-dd91a21-3) OK prepair (0.7.1-3) OK python-django (2:2.2.5-1) OK qmapshack (1.13.1-1) OK r-cran-mi (1.0-7) OK r-cran-rgdal (1.4-4-1) OK r-cran-sf (0.7-7+dfsg-1) OK r-cran-tmvtnorm (1.4-10-3) OK rasterio (1.0.28-1) OK sumo (1.1.0+dfsg1-1) OK vtk6 (6.3.0+dfsg2-3) OK vtk7 (7.1.1+dfsg1-12) FTBFS (#931944) cloudcompare (2.10.3-3) OK grass (7.8.0-1) OK opencv (3.2.0+dfsg-6) OK openscenegraph-3.4 (3.4.1+dfsg1-5) OK osmcoastline (2.2.4-1) OK pyosmium (2.15.3-1) OK libgdal-grass (2.4.2-3 / 3.0.1-1~exp3) FTBFS / OK osgearth (2.10.2+dfsg-1) OK otb (6.6.1+dfsg-3) OK qgis (3.4.11+dfsg-2) OK saga (7.3.0+dfsg-1) OK Kind Regards, Bas From owner at bugs.debian.org Tue Sep 10 20:54:09 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Tue, 10 Sep 2019 19:54:09 +0000 Subject: Processed: transition: gdal References: <156814516935.1319.13264348075841746906.reportbug@osiris.linuxminded.xs4all.nl> <156814516935.1319.13264348075841746906.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: Processing control commands: > block -1 by 939872 939891 931944 Bug #939989 [release.debian.org] transition: gdal 939989 was not blocked by any bugs. 939989 was not blocking any bugs. Added blocking bug(s) of 939989: 939872, 939891, and 931944 > forwarded -1 https://release.debian.org/transitions/html/auto-gdal.html Bug #939989 [release.debian.org] transition: gdal Set Bug forwarded-to-address to 'https://release.debian.org/transitions/html/auto-gdal.html'. -- 939989: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939989 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Tue Sep 10 21:03:52 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Tue, 10 Sep 2019 20:03:52 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag ubuntu/3.4.11+dfsg-2.bionic1 Message-ID: <5d7801a872c8_73482ad95d7dabac4013ea@godard.mail> Martin Landa pushed new tag ubuntu/3.4.11+dfsg-2.bionic1 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/ubuntu/3.4.11+dfsg-2.bionic1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Tue Sep 10 21:06:08 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Tue, 10 Sep 2019 20:06:08 +0000 Subject: [Git][debian-gis-team/qgis] Deleted tag ubuntu/3.4.11+dfsg-2.bionic1 Message-ID: <5d780230c0709_73483fbbbe60bd50401517@godard.mail> Martin Landa deleted tag ubuntu/3.4.11+dfsg-2.bionic1 at Debian GIS Project / qgis -- You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 21:42:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 20:42:05 +0000 Subject: [Git][debian-gis-team/otb][master] 4 commits: Add patch to fix FTBFS with OSSIM 2.9.1. Message-ID: <5d780a9d399ab_73483fbbbe7c03304069dd@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / otb Commits: 0e75d6a1 by Bas Couwenberg at 2019-09-10T18:55:07Z Add patch to fix FTBFS with OSSIM 2.9.1. - - - - - ef9fb978 by Bas Couwenberg at 2019-09-10T18:55:07Z Add patch to support GDAL 3. - - - - - ef932316 by Bas Couwenberg at 2019-09-10T19:37:58Z Set distribution to unstable. - - - - - 6393d308 by Bas Couwenberg at 2019-09-10T20:41:37Z Use ${python3:Depends} substvar for python3-otb. - - - - - 6 changed files: - debian/changelog - debian/control - + debian/patches/gdal.patch - + debian/patches/ossim.patch - debian/patches/series - debian/rules Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,16 @@ +otb (6.6.1+dfsg-4) UNRELEASED; urgency=medium + + * Use ${python3:Depends} substvar for python3-otb. + + -- Bas Couwenberg Tue, 10 Sep 2019 22:40:51 +0200 + +otb (6.6.1+dfsg-3) unstable; urgency=medium + + * Add patch to fix FTBFS with OSSIM 2.9.1. + * Add patch to support GDAL 3. + + -- Bas Couwenberg Tue, 10 Sep 2019 21:37:47 +0200 + otb (6.6.1+dfsg-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -960,10 +960,10 @@ Package: python3-otb Architecture: any Section: python Depends: libotb-apps (= ${binary:Version}), + dh-python, + ${python3:Depends}, ${shlibs:Depends}, ${misc:Depends}, - python3, - dh-python Description: ORFEO Toolbox Python API for applications (Python 3) ORFEO Toolbox (OTB) is distributed as an open source library of image processing algorithms. OTB is based on the medical image processing library ===================================== debian/patches/gdal.patch ===================================== @@ -0,0 +1,14 @@ +Description: Add support for GDAL 3. +Author: Bas Couwenberg + +--- a/Modules/ThirdParty/GDAL/otb-module-init.cmake ++++ b/Modules/ThirdParty/GDAL/otb-module-init.cmake +@@ -95,7 +95,7 @@ if(EXISTS "${TEMP}/gdalVersion.txt") + file(READ "${TEMP}/gdalVersion.txt" _GDAL_VERSION_STRING) + #can't we use GDAL_VERSION_NUM ? + string(SUBSTRING ${_GDAL_VERSION_STRING} 0 2 VER2) +- if("${VER2}" STREQUAL "2.") ++ if("${VER2}" STREQUAL "2." OR "${VER2}" STREQUAL "3.") + set(OTB_USE_GDAL_20 true CACHE INTERNAL "True if GDAL >= 2.0.0 has been detected" FORCE ) + else() + set(OTB_USE_GDAL_20 false CACHE INTERNAL "True if GDAL >= 2.0.0 has been detected" FORCE ) ===================================== debian/patches/ossim.patch ===================================== @@ -0,0 +1,1182 @@ +Description: Fix FTBFS with OSSIM 2.9.1. 
+Author: Bas Couwenberg + +--- a/Modules/ThirdParty/OssimPlugins/src/gdal/ossimOgcWktTranslator.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/gdal/ossimOgcWktTranslator.cpp +@@ -128,7 +128,7 @@ ossimString ossimOgcWktTranslator::fromO + << ": " + << ( ossimUnitTypeLut::instance()-> + getEntryString(units).c_str() ) +- << endl; ++ << std::endl; + break; + } + } // End of switch (units) +@@ -491,11 +491,11 @@ ossimString ossimOgcWktTranslator::fromO + } + else + { +- cerr << "ossimOgcWktTranslator::fromOssimKwl:\n" +- << "Projection translation for " +- << projType +- << " not supported " +- << endl; ++ std::cerr << "ossimOgcWktTranslator::fromOssimKwl:\n" ++ << "Projection translation for " ++ << projType ++ << " not supported " ++ << std::endl; + } + + if(pcsCodeVal >= EPSG_CODE_MAX) +@@ -532,10 +532,10 @@ ossimString ossimOgcWktTranslator::fromO + } + else + { +- cerr << "ossimOgcWktTranslator::fromOssimKwl: Datum translation for " +- << datumType +- <<" not supported" +- << endl; ++ std::cerr << "ossimOgcWktTranslator::fromOssimKwl: Datum translation for " ++ << datumType ++ <<" not supported" ++ << std::endl; + } + } + +@@ -700,7 +700,7 @@ bool ossimOgcWktTranslator::toOssimKwl( + { + ossimNotify(ossimNotifyLevel_DEBUG) + << MODULE << "DEBUG:" +- << "\nossimProj = " << ossimProj << endl; ++ << "\nossimProj = " << ossimProj << std::endl; + } + + kwl.add(prefix, ossimKeywordNames::TYPE_KW, ossimProj.c_str(), true); +@@ -916,7 +916,7 @@ bool ossimOgcWktTranslator::toOssimKwl( + << "Projection conversion to OSSIM not supported !!!!!!!!!\n" + << "Please send the following string to the development staff\n" + << "to be added to the transaltion to OSSIM\n" +- << wkt << endl; ++ << wkt << std::endl; + } + return false; + } +@@ -1055,7 +1055,7 @@ ossimString ossimOgcWktTranslator::wktTo + ossimString ossimOgcWktTranslator::ossimToWktDatum(const ossimString& datum)const + { + ossimString result; +- map::const_iterator i = theOssimToWktDatumTranslation.find(datum); ++ std::map::const_iterator i = theOssimToWktDatumTranslation.find(datum); + if(i != theOssimToWktDatumTranslation.end()) + { + result = (*i).second; +@@ -1066,7 +1066,7 @@ ossimString ossimOgcWktTranslator::ossim + ossimString ossimOgcWktTranslator::wktToOssimProjection(const ossimString& datum)const + { + std::string result; +- map::const_iterator i = ++ std::map::const_iterator i = + theWktToOssimProjectionTranslation.find(datum); + if(i != theWktToOssimProjectionTranslation.end()) + { +@@ -1078,7 +1078,7 @@ ossimString ossimOgcWktTranslator::wktTo + ossimString ossimOgcWktTranslator::ossimToWktProjection(const ossimString& datum)const + { + ossimString result; +- map::const_iterator i = ++ std::map::const_iterator i = + theOssimToWktProjectionTranslation.find(datum); + if(i != theOssimToWktProjectionTranslation.end()) + { +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimAlosPalsarModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimAlosPalsarModel.cpp +@@ -217,7 +217,7 @@ namespace ossimplugins + /* + * Leader file data reading + */ +- std::ifstream leaderFile(leaFilename.c_str(), ios::in | ios::binary); ++ std::ifstream leaderFile(leaFilename.c_str(), std::ios::in | std::ios::binary); + leaderFile >> *theAlosPalsarLeader; + leaderFile.close(); + +@@ -241,7 +241,7 @@ namespace ossimplugins + /* + * Read header of data file for image size info + */ +- std::ifstream dataFile(datFilename.c_str(), ios::in | ios::binary); ++ std::ifstream dataFile(datFilename.c_str(), std::ios::in | std::ios::binary); + dataFile >> 
*theAlosPalsarData; + dataFile.close(); + +@@ -682,7 +682,7 @@ namespace ossimplugins + + bool ossimAlosPalsarModel::isAlosPalsarLeader(const ossimFilename& file) const + { +- std::ifstream candidate(file.c_str(), ios::in | ios::binary); ++ std::ifstream candidate(file.c_str(), std::ios::in | std::ios::binary); + char alosFileName[16]; + + candidate.seekg(48); +@@ -745,7 +745,7 @@ namespace ossimplugins + + bool ossimAlosPalsarModel::isAlosPalsarData(const ossimFilename& file) const + { +- std::ifstream candidate(file.c_str(), ios::in | ios::binary); ++ std::ifstream candidate(file.c_str(), std::ios::in | std::ios::binary); + char alosFileName[16]; + + candidate.seekg(48); +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimEnvisatAsarModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimEnvisatAsarModel.cpp +@@ -125,7 +125,7 @@ namespace ossimplugins + * Opening and test of the file + */ + ossimFilename Filename = file; +- ifstream dataFile(Filename.c_str(), ios::in | ios::binary); ++ std::ifstream dataFile(Filename.c_str(), std::ios::in | std::ios::binary); + if (dataFile.eof()) + { + dataFile.close(); +@@ -368,7 +368,7 @@ namespace ossimplugins + // Capture the original flags. + std::ios_base::fmtflags f = out.flags(); + +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\nossimEnvisatAsarModel data members:\n" + << "_pixel_spacing: " << _pixel_spacing << "\n" + << "_n_srgr: " << _n_srgr << "\n"; +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimGeometricSarSensorModel.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimGeometricSarSensorModel.h +@@ -118,7 +118,7 @@ public: + * @param position Position of the sensor at line line + * @param speed Speed of the sensor at line line + */ +- virtual bool getPlatformPositionAtLine(double line, vector& position, vector& speed); ++ virtual bool getPlatformPositionAtLine(double line, std::vector& position, std::vector& speed); + + /** + * @brief This function is able to convert image coordinates into world +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimRadarSatModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimRadarSatModel.cpp +@@ -158,24 +158,24 @@ bool ossimRadarSatModel::open(const ossi + ossimFilename dataFilePath; + ossimFilename volumeDirectoryFilePath; + std::string input_file = file; +- string::size_type loc_DAT = input_file.find( "DAT_01", 0 ); +- string::size_type loc_dat = input_file.find( "dat_01", 0 ); +- if ( (loc_DAT != string::npos ) || ( loc_dat != string::npos ) ) ++ std::string::size_type loc_DAT = input_file.find( "DAT_01", 0 ); ++ std::string::size_type loc_dat = input_file.find( "dat_01", 0 ); ++ if ( (loc_DAT != std::string::npos ) || ( loc_dat != std::string::npos ) ) + { + dataFilePath = input_file.c_str(); +- if (loc_DAT != string::npos ) input_file.replace(loc_DAT, 6, "VDF_DAT"); +- if (loc_dat != string::npos ) input_file.replace(loc_dat, 6, "vdf_dat"); ++ if (loc_DAT != std::string::npos ) input_file.replace(loc_DAT, 6, "VDF_DAT"); ++ if (loc_dat != std::string::npos ) input_file.replace(loc_dat, 6, "vdf_dat"); + volumeDirectoryFilePath = input_file.c_str(); + } + else + { +- string::size_type loc_VDF = input_file.find( "VDF_DAT", 0 ); +- string::size_type loc_vdf = input_file.find( "vdf_dat", 0 ); +- if ( (loc_VDF != string::npos ) || ( loc_vdf != string::npos ) ) ++ std::string::size_type loc_VDF = input_file.find( "VDF_DAT", 0 ); ++ std::string::size_type loc_vdf = input_file.find( "vdf_dat", 0 ); 
++ if ( (loc_VDF != std::string::npos ) || ( loc_vdf != std::string::npos ) ) + { + volumeDirectoryFilePath = input_file.c_str(); +- if (loc_VDF != string::npos ) input_file.replace(loc_VDF, 7, "DAT_01"); +- if (loc_vdf != string::npos ) input_file.replace(loc_vdf, 7, "dat_01"); ++ if (loc_VDF != std::string::npos ) input_file.replace(loc_VDF, 7, "DAT_01"); ++ if (loc_vdf != std::string::npos ) input_file.replace(loc_vdf, 7, "dat_01"); + dataFilePath = input_file.c_str(); + } + else +@@ -224,7 +224,7 @@ bool ossimRadarSatModel::open(const ossi + + RadarSatRecordHeader headerVDF; + VolumeDirFactory factoryVDF; +- ifstream volumeDirFile (volumeDirectoryFilePath.c_str(), ios::in|ios::binary); ++ std::ifstream volumeDirFile (volumeDirectoryFilePath.c_str(), std::ios::in|std::ios::binary); + volumeDirFile>>headerVDF; + if(volumeDirFile.eof()) + { +@@ -269,7 +269,7 @@ bool ossimRadarSatModel::open(const ossi + //Reading of the remaining of the volume directory file + + volumeDirFile.close(); +- volumeDirFile.open(volumeDirectoryFilePath.c_str(), ios::in | ios::binary); ++ volumeDirFile.open(volumeDirectoryFilePath.c_str(), std::ios::in | std::ios::binary); + volumeDirFile >> *_volumeDir; + volumeDirFile.close(); + +@@ -280,7 +280,7 @@ bool ossimRadarSatModel::open(const ossi + + RadarSatRecordHeader headerDAT; + DataFactory factoryDAT; +- ifstream dataFile (dataFilePath.c_str(), ios::in|ios::binary); ++ std::ifstream dataFile (dataFilePath.c_str(), std::ios::in|std::ios::binary); + dataFile>>headerDAT; + if(dataFile.eof()) + { +@@ -302,7 +302,7 @@ bool ossimRadarSatModel::open(const ossi + /* + * Reading the remaining of the data file + */ +- dataFile.open(dataFilePath.c_str(), ios::in|ios::binary); ++ dataFile.open(dataFilePath.c_str(), std::ios::in|std::ios::binary); + dataFile>>*_data; + dataFile.close(); + +@@ -329,12 +329,12 @@ bool ossimRadarSatModel::open(const ossi + * Warning : the filename case has to be homogenous + */ + std::string leader_file = dataFilePath; +- string::size_type loc = leader_file.find( "DAT_01", 0 ); +- if( loc != string::npos ) leader_file.replace(loc, 6, "LEA_01" ); // upper case test ++ std::string::size_type loc = leader_file.find( "DAT_01", 0 ); ++ if( loc != std::string::npos ) leader_file.replace(loc, 6, "LEA_01" ); // upper case test + else + { + loc = leader_file.find( "dat_01", 0 ); +- if( loc != string::npos ) leader_file.replace(loc, 6, "lea_01" ); // lower case test ++ if( loc != std::string::npos ) leader_file.replace(loc, 6, "lea_01" ); // lower case test + else + { + ossimNotify(ossimNotifyLevel_DEBUG) << "File Name not coherent (searching for *DAT_01* or *dat_01*) : " << file << std::endl; +@@ -355,7 +355,7 @@ bool ossimRadarSatModel::open(const ossi + /* + * Leader file data reading + */ +- ifstream leaderFile (leaderFilePath.c_str(), ios::in|ios::binary); ++ std::ifstream leaderFile (leaderFilePath.c_str(), std::ios::in|std::ios::binary); + leaderFile>>*_leader; + leaderFile.close(); + if(traceDebug()) +@@ -370,11 +370,11 @@ bool ossimRadarSatModel::open(const ossi + */ + std::string trailer_file = dataFilePath; + loc = trailer_file.find( "DAT_01", 0 ); +- if( loc != string::npos ) trailer_file.replace(loc, 6, "TRA_01" ); // upper case test ++ if( loc != std::string::npos ) trailer_file.replace(loc, 6, "TRA_01" ); // upper case test + else + { + loc = trailer_file.find( "dat_01", 0 ); +- if( loc != string::npos ) trailer_file.replace(loc, 6, "tra_01" ); // lower case test ++ if( loc != std::string::npos ) trailer_file.replace(loc, 6, "tra_01" ); 
// lower case test + else + { + ossimNotify(ossimNotifyLevel_DEBUG) << "File Name not coherent (searching for *DAT_01* or *dat_01*) : " << file << std::endl; +@@ -395,7 +395,7 @@ bool ossimRadarSatModel::open(const ossi + /* + * Trailer file data reading + */ +- ifstream trailerFile (trailerFilePath.c_str(), ios::in|ios::binary); ++ std::ifstream trailerFile (trailerFilePath.c_str(), std::ios::in|std::ios::binary); + trailerFile>>*_trailer; + trailerFile.close(); + if(traceDebug()) +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatDimapSupportData.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatDimapSupportData.h +@@ -163,7 +163,7 @@ public: + //--- + // Convenient method to print important image info: + //--- +- void printInfo (ostream& os) const; ++ void printInfo (std::ostream& os) const; + + virtual bool saveState(ossimKeywordlist& kwl, + const char* prefix = 0)const; +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimErsSarModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimErsSarModel.cpp +@@ -195,7 +195,7 @@ namespace ossimplugins + /* + * Leader file data reading + */ +- std::ifstream leaderFile(leaFilename.c_str(), ios::in | ios::binary); ++ std::ifstream leaderFile(leaFilename.c_str(), std::ios::in | std::ios::binary); + leaderFile >> *theErsSarleader; + leaderFile.close(); + +@@ -613,8 +613,8 @@ namespace ossimplugins + ossimString filename(kwl.find("filename")); + filename.upcase(); + //std::transform(filename.begin(), filename.end(), filename.begin(), toupper); +- string::size_type loc = filename.find("PRI"); +- if (loc != string::npos) ++ std::string::size_type loc = filename.find("PRI"); ++ if (loc != std::string::npos) + { + _isProductGeoreferenced = true; + } +@@ -646,7 +646,7 @@ namespace ossimplugins + + bool ossimErsSarModel::isErsLeader(const ossimFilename& file) const + { +- std::ifstream candidate(file.c_str(), ios::in | ios::binary); ++ std::ifstream candidate(file.c_str(), std::ios::in | std::ios::binary); + char ersFileName[16]; + + candidate.seekg(48); +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatDimapSupportData.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatDimapSupportData.cpp +@@ -325,7 +325,7 @@ bool ossimFormosatDimapSupportData::load + //--- + // Check that it is a FORMOSAT DIMAP file format + //--- +- vector > xml_nodes; ++ std::vector > xml_nodes; + xml_nodes.clear(); + ossimString xpath = "/Dimap_Document/Dataset_Sources/Source_Information/Scene_Source/MISSION"; + xmlDocument->findNodes(xpath, xml_nodes); +@@ -802,7 +802,7 @@ void ossimFormosatDimapSupportData::getG + } + } + +-void ossimFormosatDimapSupportData::printInfo(ostream& os) const ++void ossimFormosatDimapSupportData::printInfo(std::ostream& os) const + { + ossimString corr_att = "NO"; + if (theStarTrackerUsed) +@@ -1521,7 +1521,7 @@ bool ossimFormosatDimapSupportData::pars + static const char MODULE[] = "ossimFormosatDimapSupportData::parsePart1"; + + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the ImageSize: +@@ -2350,7 +2350,7 @@ bool ossimFormosatDimapSupportData::init + { + ossimNotify(ossimNotifyLevel_DEBUG) + << "DEBUG:\nCould not find: " << xpath +- << endl; ++ << std::endl; + } + return false; + } +@@ -2385,7 +2385,7 @@ bool ossimFormosatDimapSupportData::init + ossimRefPtr xmlDocument) + { + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the Image ID: +@@ -2399,7 +2399,7 @@ bool ossimFormosatDimapSupportData::init + 
{ + ossimNotify(ossimNotifyLevel_DEBUG) + << "DEBUG:\nCould not find: " << xpath +- << endl; ++ << std::endl; + } + return false; + } +@@ -2411,7 +2411,7 @@ bool ossimFormosatDimapSupportData::init + ossimRefPtr xmlDocument) + { + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the mission index (Formosat 1 or 2): +@@ -2590,7 +2590,7 @@ bool ossimFormosatDimapSupportData::init + ossimRefPtr xmlDocument) + { + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Corner points: +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimGeometricSarSensorModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimGeometricSarSensorModel.cpp +@@ -133,7 +133,7 @@ namespace ossimplugins + return time; + } + +- bool ossimGeometricSarSensorModel::getPlatformPositionAtLine(double line, vector& position, vector& speed) ++ bool ossimGeometricSarSensorModel::getPlatformPositionAtLine(double line, std::vector& position, std::vector& speed) + { + JSDDateTime time = getTime(line); + return _platformPosition->getPlatformPositionAtTime(time,position,speed); +@@ -532,13 +532,13 @@ namespace ossimplugins + // if (result) + // { + // ossimNotify(ossimNotifyLevel_DEBUG) +-// << "calling saveState to verify loadState..." << endl; ++// << "calling saveState to verify loadState..." << std::endl; + + // ossimKeywordlist kwl2; + // saveState(kwl2, 0); + + // ossimNotify(ossimNotifyLevel_DEBUG) +-// << "saveState result after loadState:" << kwl2 << endl; ++// << "saveState result after loadState:" << kwl2 << std::endl; + // } + + if (traceDebug()) +@@ -574,7 +574,7 @@ namespace ossimplugins + + std::ostream& ossimGeometricSarSensorModel::print(std::ostream& out) const + { +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\nossimGeometricSarSensorModel class data members:\n"; + + const char* prefix = 0; +@@ -631,7 +631,7 @@ bool ossimGeometricSarSensorModel::creat + + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_NOTICE)<<"\nComputing coarse grid..."<buildGrid(theImageClipRect, this, 500.00, true, false); + +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimPleiadesDimapSupportData.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimPleiadesDimapSupportData.cpp +@@ -99,7 +99,7 @@ namespace ossimplugins + ossimString xpath, + ossimString& nodeValue) + { +- vector > xml_nodes; ++ std::vector > xml_nodes; + + xmlDocument->findNodes(xpath, xml_nodes); + if (xml_nodes.size() == 0) +@@ -286,7 +286,7 @@ namespace ossimplugins + theSwathLastCol = 0; + } + +- void ossimPleiadesDimapSupportData::printInfo(ostream& os) const ++ void ossimPleiadesDimapSupportData::printInfo(std::ostream& os) const + { + + os << "\n----------------- Info on Pleiades Image -------------------" +@@ -1249,7 +1249,7 @@ namespace ossimplugins + ossimRefPtr xmlDocument) + { + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the Image ID: +@@ -1292,7 +1292,7 @@ namespace ossimplugins + ossimRefPtr xmlDocument) + { + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Corner points: +@@ -1574,7 +1574,7 @@ namespace ossimplugins + { + static const char MODULE[] = "ossimPleiadesDimapSupportData::parseRPCMetadata"; + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the Global RFM - Direct Model - Bias: +@@ -1919,7 +1919,7 @@ namespace ossimplugins + { + 
// static const char MODULE[] = "ossimPleiadesDimapSupportData::parseMetadataIdentification"; + +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + theXmlDocumentRoot = "/PHR_Dimap_Document"; + +@@ -1971,7 +1971,7 @@ namespace ossimplugins + { + // static const char MODULE[] = "ossimPleiadesDimapSupportData::parseMetadataIdentification"; + +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + theXmlDocumentRoot = "/DIMAP_Document"; + +@@ -1990,7 +1990,7 @@ namespace ossimplugins + theXmlDocumentRoot = "/PHR_DIMAP_Document"; + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the old root: " << theXmlDocumentRoot << endl; ++ ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the old root: " << theXmlDocumentRoot << std::endl; + } + + xml_nodes.clear(); +@@ -2005,7 +2005,7 @@ namespace ossimplugins + theXmlDocumentRoot = "/Dimap_Document"; + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the new root: " << theXmlDocumentRoot << endl; ++ ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the new root: " << theXmlDocumentRoot << std::endl; + } + + xml_nodes.clear(); +@@ -2017,7 +2017,7 @@ namespace ossimplugins + setErrorStatus(); + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nCould not find: " << xpath << endl; ++ ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nCould not find: " << xpath << std::endl; + } + return false; + } +@@ -2113,7 +2113,7 @@ namespace ossimplugins + { + // static const char MODULE[] = "ossimPleiadesDimapSupportData::parseProcessingInformation"; + +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + + //--- +@@ -2156,7 +2156,7 @@ namespace ossimplugins + bool ossimPleiadesDimapSupportData::parseRasterData(ossimRefPtr xmlDocument) + { + static const char MODULE[] = "ossimPleiadesDimapSupportData::parseRasterData"; +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + //--- + // Fetch if the product file is linked to one or many JP2 files: +@@ -2471,7 +2471,7 @@ namespace ossimplugins + bool ossimPleiadesDimapSupportData::parseGeometricData(ossimRefPtr xmlDocument) + { + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + xml_nodes.clear(); + if (theDIMAPVersion == OSSIM_PLEIADES_DIMAPv1) +@@ -2760,7 +2760,7 @@ namespace ossimplugins + { + // static const char MODULE[] = "ossimPleiadesDimapSupportData::parseDatasetSources"; + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the mission index (1A or 1B) ? +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimPleiadesDimapSupportData.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimPleiadesDimapSupportData.h +@@ -83,7 +83,7 @@ namespace ossimplugins + //--- + // Convenient method to print important image info: + //--- +- void printInfo (ostream& os) const; ++ void printInfo (std::ostream& os) const; + + /** + * Method to save the state of the object to a keyword list. +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimPleiadesModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimPleiadesModel.cpp +@@ -107,8 +107,8 @@ namespace ossimplugins + // Capture stream flags since we are going to mess with them. 
+ std::ios_base::fmtflags f = out.flags(); + +- out << "\nDump of ossimPleiadesModel at address " << (hex) << this +- << (dec) ++ out << "\nDump of ossimPleiadesModel at address " << (std::hex) << this ++ << (std::dec) + << "\n------------------------------------------------" + << "\n theImageID = " << theImageID + << "\n theImageSize = " << theImageSize +@@ -116,7 +116,7 @@ namespace ossimplugins + << "\n theRefImgPt = " << theRefImgPt + << "\n theProcessingLevel = " << theSupportData->getProcessingLevel() + << "\n------------------------------------------------" +- << "\n " << endl; ++ << "\n " << std::endl; + + // Set the flags back. + out.flags(f); +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatModel.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatModel.h +@@ -95,7 +95,7 @@ public: + * Writes a template of geom keywords processed by loadState and saveState + * to output stream. + */ +- static void writeGeomTemplate(ostream& os); ++ static void writeGeomTemplate(std::ostream& os); + + /*! + * Given an image point and height, initializes worldPoint. +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSpot6DimapSupportData.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSpot6DimapSupportData.cpp +@@ -99,7 +99,7 @@ namespace ossimplugins + ossimString xpath, + ossimString& nodeValue) + { +- vector > xml_nodes; ++ std::vector > xml_nodes; + + xmlDocument->findNodes(xpath, xml_nodes); + if (xml_nodes.size() == 0) +@@ -269,7 +269,7 @@ namespace ossimplugins + theSpecId = ""; + } + +- void ossimSpot6DimapSupportData::printInfo(ostream& os) const ++ void ossimSpot6DimapSupportData::printInfo(std::ostream& os) const + { + + os << "\n----------------- Info on Spot6 Image -------------------" +@@ -1110,7 +1110,7 @@ namespace ossimplugins + ossimRefPtr xmlDocument) + { + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the Image ID: +@@ -1139,7 +1139,7 @@ namespace ossimplugins + ossimRefPtr xmlDocument) + { + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Corner points: +@@ -1401,7 +1401,7 @@ namespace ossimplugins + { + static const char MODULE[] = "ossimSpot6DimapSupportData::parseRPCMetadata"; + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the Global RFM - Direct Model - Bias: +@@ -1612,7 +1612,7 @@ namespace ossimplugins + { + static const char MODULE[] = "ossimSpot6DimapSupportData::parseMetadataIdentification"; + +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + theXmlDocumentRoot = "/DIMAP_Document"; + +@@ -1632,7 +1632,7 @@ namespace ossimplugins + theXmlDocumentRoot = "/SPOT_DIMAP_Document"; + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the old root: " << theXmlDocumentRoot << endl; ++ ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the old root: " << theXmlDocumentRoot << std::endl; + } + + xml_nodes.clear(); +@@ -1647,7 +1647,7 @@ namespace ossimplugins + theXmlDocumentRoot = "/Dimap_Document"; + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the new root: " << theXmlDocumentRoot << endl; ++ ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nTry to use the new root: " << theXmlDocumentRoot << std::endl; + } + + xml_nodes.clear(); +@@ -1659,7 +1659,7 @@ namespace ossimplugins + setErrorStatus(); + if (traceDebug()) + { +- ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nCould not 
find: " << xpath << endl; ++ ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG:\nCould not find: " << xpath << std::endl; + } + return false; + } +@@ -1758,7 +1758,7 @@ namespace ossimplugins + { + static const char MODULE[] = "ossimSpot6DimapSupportData::parseProcessingInformation"; + +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + + //--- +@@ -1787,7 +1787,7 @@ namespace ossimplugins + bool ossimSpot6DimapSupportData::parseRasterData(ossimRefPtr xmlDocument) + { + static const char MODULE[] = "ossimSpot6DimapSupportData::parseRasterData"; +- vector > xml_nodes; ++ std::vector > xml_nodes; + ossimString xpath, nodeValue; + //--- + // Fetch if the product file is linked to one or many JP2 files: +@@ -2005,7 +2005,7 @@ namespace ossimplugins + bool ossimSpot6DimapSupportData::parseGeometricData(ossimRefPtr xmlDocument) + { + ossimString xpath; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + xml_nodes.clear(); + xpath = "/Geometric_Data/Use_Area/Located_Geometric_Values"; //DIMAPv2 +@@ -2125,7 +2125,7 @@ namespace ossimplugins + { + static const char MODULE[] = "ossimSpot6DimapSupportData::parseDatasetSources"; + ossimString xpath, nodeValue; +- vector > xml_nodes; ++ std::vector > xml_nodes; + + //--- + // Fetch the mission index (1A or 1B) ? +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSpot6DimapSupportData.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSpot6DimapSupportData.h +@@ -82,7 +82,7 @@ namespace ossimplugins + //--- + // Convenient method to print important image info: + //--- +- void printInfo (ostream& os) const; ++ void printInfo (std::ostream& os) const; + + /** + * Method to save the state of the object to a keyword list. +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTerraSarModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTerraSarModel.cpp +@@ -856,7 +856,7 @@ std::ostream& ossimplugins::ossimTerraSa + // Capture the original flags. 
+ std::ios_base::fmtflags f = out.flags(); + +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\nossimTerraSarModelclass data members:\n" + << SR_GR_R0_KW << _SrToGr_R0 << "\n"; + +@@ -1546,7 +1546,7 @@ bool ossimplugins::ossimTerraSarModel::i + if (traceDebug()) + { + ossimNotify(ossimNotifyLevel_DEBUG) +- << "result for tsDoc.initSensorParams " << result << endl; ++ << "result for tsDoc.initSensorParams " << result << std::endl; + } + + if (!result) +@@ -2436,7 +2436,7 @@ bool ossimplugins::ossimTerraSarModel::f + } + + +-void ossimplugins::ossimTerraSarModel::printInfo(ostream& os) const ++void ossimplugins::ossimTerraSarModel::printInfo(std::ostream& os) const + { + os << "\n----------------- General Info on TSX-1 Image -------------------" + << "\n " +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTerraSarModel.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTerraSarModel.h +@@ -119,7 +119,7 @@ namespace ossimplugins + //--- + // Convenient method to print important image info: + //--- +- void printInfo (ostream& os) const; ++ void printInfo (std::ostream& os) const; + + private: + +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTileMapModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTileMapModel.cpp +@@ -184,7 +184,7 @@ namespace ossimplugins + std::ostream& ossimTileMapModel::print(std::ostream& os) const + { + os << "\nDump of ossimTileMapModel object at " +- << hex << this << ":\n" ++ << std::hex << this << ":\n" + << "\nTileMapModel -- Dump of all data members: " + << "\n theImageID: " << theImageID.chars() + << "\n theImageSize: " << theImageSize +@@ -193,7 +193,7 @@ namespace ossimplugins + << "\n theGSD.line: " << theGSD.line + << "\n theGSD.samp: " << theGSD.samp + << "\n qDepth: " << qDepth +- << endl; ++ << std::endl; + + return ossimSensorModel::print(os); + } +@@ -282,7 +282,7 @@ namespace ossimplugins + // Writes a sample kwl to output stream. + // + //***************************************************************************** +- void ossimTileMapModel::writeGeomTemplate(ostream& os) ++ void ossimTileMapModel::writeGeomTemplate(std::ostream& os) + { + if (traceExec()) ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG ossimTileMapModel::writeGeomTemplate: entering..." << std::endl; + +@@ -290,7 +290,7 @@ namespace ossimplugins + "//**************************************************************\n" + "// Template for TileMap model keywordlist\n" + "//**************************************************************\n" +- << ossimKeywordNames::TYPE_KW << ": " << "ossimTileMapModel" << endl; ++ << ossimKeywordNames::TYPE_KW << ": " << "ossimTileMapModel" << std::endl; + + + if (traceExec()) ossimNotify(ossimNotifyLevel_DEBUG) << "DEBUG ossimTileMapModel::writeGeomTemplate: returning..." << std::endl; +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTileMapModel.h ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimTileMapModel.h +@@ -96,7 +96,7 @@ public: + * Writes a template of geom keywords processed by loadState and saveState + * to output stream. + */ +- static void writeGeomTemplate(ostream& os); ++ static void writeGeomTemplate(std::ostream& os); + + //*** + // Overrides base class pure virtual. 
+--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatModel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimFormosatModel.cpp +@@ -472,8 +472,8 @@ std::ostream& ossimplugins::ossimFormosa + // Capture stream flags since we are going to mess with them. + std::ios_base::fmtflags f = out.flags(); + +- out << "\nDump of ossimFormosatModel at address " << (hex) << this +- << (dec) ++ out << "\nDump of ossimFormosatModel at address " << (std::hex) << this ++ << (std::dec) + << "\n------------------------------------------------" + << "\n theImageID = " << theImageID + << "\n theMetadataFile = " << theMetaDataFile +@@ -495,7 +495,7 @@ std::ostream& ossimplugins::ossimFormosa + << "\n theYawRate = " << theYawRate + << "\n theFocalLenOffset = " << theFocalLenOffset + << "\n------------------------------------------------" +- << "\n " << endl; ++ << "\n " << std::endl; + + // Set the flags back. + out.flags(f); +@@ -596,14 +596,14 @@ void ossimplugins::ossimFormosatModel::i + { + ossimNotify(ossimNotifyLevel_DEBUG) + << "DEBUG:\n\t Psi_x = " << Psi_x +- << "\n\t Psi_y = " << Psi_y << endl; ++ << "\n\t Psi_y = " << Psi_y << std::endl; + } + + ossimColumnVector3d u_sat (-tan(Psi_y), tan(Psi_x), -(1.0 + theFocalLenOffset)); + if (traceDebug() || runtime_dbflag) + { + ossimNotify(ossimNotifyLevel_DEBUG) +- << "DEBUG \n\t u_sat = " << u_sat << endl; ++ << "DEBUG \n\t u_sat = " << u_sat << std::endl; + } + + // +@@ -617,7 +617,7 @@ void ossimplugins::ossimFormosatModel::i + { + ossimNotify(ossimNotifyLevel_DEBUG) + << "DEBUG:\n\t theSatToOrbRotation = " << satToOrbit +- << "\n\t u_orb = " << u_orb << endl; ++ << "\n\t u_orb = " << u_orb << std::endl; + } + + // +@@ -648,7 +648,7 @@ void ossimplugins::ossimFormosatModel::i + { + ossimNotify(ossimNotifyLevel_DEBUG) + << "DEBUG:\n\t orbToEcfRotation = " << orbToEcfRotation +- << "\n\t u_ecf = " << u_ecf << endl; ++ << "\n\t u_ecf = " << u_ecf << std::endl; + } + + // +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimRadarSat2Model.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimRadarSat2Model.cpp +@@ -115,7 +115,7 @@ double ossimRadarSat2Model::getSlantRang + << "\n(col-_refPoint->get_pix_col()) " + << (col-_refPoint->get_pix_col()) + << "\n_refPoint->get_pix_col() : " << _refPoint->get_pix_col() +- << "\n relativeGroundRange : " << relativeGroundRange << endl; ++ << "\n relativeGroundRange : " << relativeGroundRange << std::endl; + } + + int numSet = FindSRGRSetNumber((_refPoint->get_ephemeris())->get_date()) ; +@@ -303,7 +303,7 @@ bool ossimRadarSat2Model::open(const oss + ossimNotify(ossimNotifyLevel_DEBUG) + << "theImageClipRect : " << theImageClipRect + << "ul, ur, lr, ll " << ul << ", " << ur +- << ", " << lr << " , " << ll << endl; ++ << ", " << lr << " , " << ll << std::endl; + } + + setGroundRect(ul, ur, lr, ll); // ossimSensorModel method. +@@ -328,7 +328,7 @@ std::ostream& ossimRadarSat2Model::print + // Capture the original flags. 
+ std::ios_base::fmtflags f = out.flags(); + +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\nossimRadarSat2Model class data members:\n" + << "_n_srgr: " << _n_srgr << "\n"; + +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimRadarSat2ProductDoc.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimRadarSat2ProductDoc.cpp +@@ -711,10 +711,10 @@ RPCModel ossimRadarSat2ProductDoc::getRp + double longitudeScale = 0; + double heightScale = 0; + +- vector lineNumeratorCoefficients = vector(20,0); +- vector lineDenominatorCoefficients = vector(20,0); +- vector pixelNumeratorCoefficients = vector(20,0); +- vector pixelDenominatorCoefficients = vector(20,0); ++ std::vector lineNumeratorCoefficients = std::vector(20,0); ++ std::vector lineDenominatorCoefficients = std::vector(20,0); ++ std::vector pixelNumeratorCoefficients = std::vector(20,0); ++ std::vector pixelDenominatorCoefficients = std::vector(20,0); + + //the final string outputs to the text file + +@@ -724,66 +724,66 @@ RPCModel ossimRadarSat2ProductDoc::getRp + if (rs2Check) + { + if (!ossim::getPath(searchbiasError, xdoc, biasErrorStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + biasError = biasErrorStr.toDouble(); + + if (!ossim::getPath(searchrandomError, xdoc, randomErrorStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + randomError = randomErrorStr.toDouble(); + + if (!ossim::getPath(searchlineFitQuality, xdoc, lineFitQualityStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + lineFitQuality = lineFitQualityStr.toDouble(); + + if (!ossim::getPath(searchpixelFitQuality, xdoc, pixelFitQualityStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + pixelFitQuality = pixelFitQualityStr.toDouble(); + + if (!ossim::getPath(searchlineOffset, xdoc, lineOffsetStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + lineOffset = lineOffsetStr.toDouble(); + + if (!ossim::getPath(searchpixelOffset, xdoc, pixelOffsetStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + pixelOffset = pixelOffsetStr.toDouble(); + + if (!ossim::getPath(searchlatitudeOffset, xdoc, latitudeOffsetStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + latitudeOffset = latitudeOffsetStr.toDouble(); + + if (!ossim::getPath(searchlongitudeOffset, xdoc, longitudeOffsetStr)) +- 
ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + longitudeOffset = longitudeOffsetStr.toDouble(); + + if (!ossim::getPath(searchheightOffset, xdoc, heightOffsetStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + heightOffset = heightOffsetStr.toDouble(); + + // -------------- + + if (!ossim::getPath(searchlineScale, xdoc, lineScaleStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + lineScale = lineScaleStr.toDouble(); + + + if (!ossim::getPath(searchpixelScale, xdoc, pixelScaleStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + pixelScale = pixelScaleStr.toDouble(); + + + if (!ossim::getPath(searchlatitudeScale, xdoc, latitudeScaleStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + latitudeScale = latitudeScaleStr.toDouble(); + + // ----------------------- + + if (!ossim::getPath(searchlongitudeScale, xdoc, longitudeScaleStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + longitudeScale = longitudeScaleStr.toDouble(); + + + if (!ossim::getPath(searchheightScale, xdoc, heightScaleStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + heightScale = heightScaleStr.toDouble(); + + // ---- parameters for reading in coeefs ------------ +@@ -794,7 +794,7 @@ RPCModel ossimRadarSat2ProductDoc::getRp + + + if (!ossim::getPath(searchlineNumeratorCoefficients, xdoc, lineNumeratorCoefficientsStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + + + string lineNumeratorCoefficientsStr_N = lineNumeratorCoefficientsStr[0]; +@@ -810,7 +810,7 @@ RPCModel ossimRadarSat2ProductDoc::getRp + // ------------------ + + if (!ossim::getPath(searchlineDenominatorCoefficients, xdoc, lineDenominatorCoefficientsStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + + + string lineDenominatorCoefficientsStr_N = lineDenominatorCoefficientsStr[0]; +@@ -827,7 +827,7 @@ RPCModel ossimRadarSat2ProductDoc::getRp + // ------------------ + + if (!ossim::getPath(searchpixelNumeratorCoefficients, xdoc, pixelNumeratorCoefficientsStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT 
INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + + string pixelNumeratorCoefficientsStr_N = pixelNumeratorCoefficientsStr[0]; + +@@ -843,7 +843,7 @@ RPCModel ossimRadarSat2ProductDoc::getRp + // ------------------ + + if (!ossim::getPath(searchpixelDenominatorCoefficients, xdoc, pixelDenominatorCoefficientsStr)) +- ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << endl; ++ ossimNotify(ossimNotifyLevel_WARN) << "ERROR: UNABLE TO FIND RS2 RPC COEFFICIENT INFORMATION" << std::endl; + + string pixelDenominatorCoefficientsStr_N = pixelDenominatorCoefficientsStr[0]; + +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSentinel1Model.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSentinel1Model.cpp +@@ -115,14 +115,14 @@ namespace ossimplugins + // Capture stream flags since we are going to mess with them. + std::ios_base::fmtflags f = out.flags(); + +- out << "\nDump of ossimSentinel1Model at address " << hex << this +- << dec ++ out << "\nDump of ossimSentinel1Model at address " << std::hex << this ++ << std::dec + << "\n------------------------------------------------" + << "\n theImageID = " << theImageID + << "\n theImageSize = " << theImageSize + + << "\n------------------------------------------------" +- << "\n " << endl; ++ << "\n " << std::endl; + + // Set the flags back. + out.flags(f); +@@ -371,7 +371,7 @@ namespace ossimplugins + const ossimString prefix = "support_data."; + const ossimString xpath = "/xfdu:XFDU/dataObjectSection/dataObject"; + +- vector > xml_nodes; ++ std::vector > xml_nodes; + + theManifestDoc->findNodes(xpath, xml_nodes); + +@@ -1169,9 +1169,9 @@ namespace ossimplugins + double ossimSentinel1Model::getBandTerrainHeight(ossimXmlDocument const& productXmlDocument) + { + double heightSum = 0.0; +- vector< ossimXmlNodePtr > heightList; ++ std::vector< ossimXmlNodePtr > heightList; + productXmlDocument.findNodes("/product/generalAnnotation/terrainHeightList/terrainHeight", heightList); +- vector::const_iterator it = heightList.begin(); ++ std::vector::const_iterator it = heightList.begin(); + for ( ; it != heightList.end() ; ++it) + { + heightSum += getOptionalTextFromFirstNode(**it, "value").toFloat64(); +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSpot6Model.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/ossimSpot6Model.cpp +@@ -107,8 +107,8 @@ namespace ossimplugins + // Capture stream flags since we are going to mess with them. + std::ios_base::fmtflags f = out.flags(); + +- out << "\nDump of ossimSpot6Model at address " << (hex) << this +- << (dec) ++ out << "\nDump of ossimSpot6Model at address " << (std::hex) << this ++ << (std::dec) + << "\n------------------------------------------------" + << "\n theImageID = " << theImageID + << "\n theImageSize = " << theImageSize +@@ -116,7 +116,7 @@ namespace ossimplugins + << "\n theRefImgPt = " << theRefImgPt + << "\n theProcessingLevel = " << theSupportData->getProcessingLevel() + << "\n------------------------------------------------" +- << "\n " << endl; ++ << "\n " << std::endl; + + // Set the flags back. 
+ out.flags(f); +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/otb/IncidenceAngles.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/otb/IncidenceAngles.cpp +@@ -148,7 +148,7 @@ bool IncidenceAngles::loadState(const os + + std::ostream& IncidenceAngles::print(std::ostream& out) const + { +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\n IncidentAngles class data members:\n"; + + const char* prefix = 0; +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/otb/Noise.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/otb/Noise.cpp +@@ -149,7 +149,7 @@ bool Noise::loadState(const ossimKeyword + + std::ostream& Noise::print(std::ostream& out) const + { +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\n Noise class data members:\n"; + + const char* prefix = 0; +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/otb/RadarSat2NoiseLevel.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/otb/RadarSat2NoiseLevel.cpp +@@ -313,7 +313,7 @@ bool RadarSat2NoiseLevel::loadState(cons + + std::ostream& RadarSat2NoiseLevel::print(std::ostream& out) const + { +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\n RadarSat2NoiseLevel class data members:\n"; + + //const char* prefix = 0; +--- a/Modules/ThirdParty/OssimPlugins/src/ossim/otb/SceneCoord.cpp ++++ b/Modules/ThirdParty/OssimPlugins/src/ossim/otb/SceneCoord.cpp +@@ -152,7 +152,7 @@ bool SceneCoord::loadState(const ossimKe + + std::ostream& SceneCoord::print(std::ostream& out) const + { +- out << setprecision(15) << setiosflags(ios::fixed) ++ out << std::setprecision(15) << std::setiosflags(std::ios::fixed) + << "\n SceneCoord class data members:\n"; + + const char* prefix = 0; +--- a/Modules/Adapters/OSSIMAdapters/src/otbMapProjectionAdapter.cxx ++++ b/Modules/Adapters/OSSIMAdapters/src/otbMapProjectionAdapter.cxx +@@ -240,7 +240,7 @@ bool MapProjectionAdapter::InstantiatePr + //see discussion in May 2009 on ossim list; + //a better solution might be available... 
+ std::string projectionString(kwl.find("type")); +- if (projectionString.find("ossimEquDistCylProjection") != string::npos) ++ if (projectionString.find("ossimEquDistCylProjection") != std::string::npos) + { + otbMsgDevMacro(<< "WARNING: Not instantiating a ossimEquDistCylProjection: " << projectionString); + otbMsgDevMacro(<< "Wkt was: " << kwl); +--- a/Modules/Adapters/OSSIMAdapters/test/otbOssimElevManagerTest4.cxx ++++ b/Modules/Adapters/OSSIMAdapters/test/otbOssimElevManagerTest4.cxx +@@ -94,7 +94,7 @@ int otbOssimElevManagerTest4(int argc, c + + std::ofstream file; + std::cout << outfname << std::endl; +- file.open(outfname, ios::binary | ios::out); ++ file.open(outfname, std::ios::binary | std::ios::out); + + file.write(reinterpret_cast(image), sizeof(double) * size[0] * size[1]); + file.close(); ===================================== debian/patches/series ===================================== @@ -1,2 +1,4 @@ version-install.patch 0001-BUG-remove-.Fixup-for-GDAL-2.5-compatibility.patch +ossim.patch +gdal.patch ===================================== debian/rules ===================================== @@ -106,7 +106,7 @@ override_dh_installdocs: dh_installdocs -A NOTICE override_dh_python3: - dh_python3 - dh_numpy3 + dh_python3 -ppython3-otb + dh_numpy3 -ppython3-otb .PHONY: templates View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/compare/39a39fd22c7587d6219653f6f75e071a3513a225...6393d30889e10df37be062795c0933b742daf4b5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/compare/39a39fd22c7587d6219653f6f75e071a3513a225...6393d30889e10df37be062795c0933b742daf4b5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 10 21:42:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 10 Sep 2019 20:42:07 +0000 Subject: [Git][debian-gis-team/otb] Pushed new tag debian/6.6.1+dfsg-3 Message-ID: <5d780a9f56c58_73483fbbbe7c0330407174@godard.mail> Bas Couwenberg pushed new tag debian/6.6.1+dfsg-3 at Debian GIS Project / otb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/tree/debian/6.6.1+dfsg-3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Tue Sep 10 21:54:24 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 10 Sep 2019 20:54:24 +0000 Subject: Processing of otb_6.6.1+dfsg-3_source.changes Message-ID: otb_6.6.1+dfsg-3_source.changes uploaded successfully to localhost along with the files: otb_6.6.1+dfsg-3.dsc otb_6.6.1+dfsg-3.debian.tar.xz otb_6.6.1+dfsg-3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Tue Sep 10 22:24:46 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 10 Sep 2019 21:24:46 +0000 Subject: otb_6.6.1+dfsg-3_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 10 Sep 2019 21:37:47 +0200 Source: otb Architecture: source Version: 6.6.1+dfsg-3 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: otb (6.6.1+dfsg-3) unstable; urgency=medium . * Add patch to fix FTBFS with OSSIM 2.9.1. * Add patch to support GDAL 3. 
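Most of the large diff in the commit mail above is ossim.patch: it explicitly qualifies C++ standard-library names (std::ifstream, std::ios, std::string::npos, std::setprecision, std::endl, ...), so the plugin code no longer relies on a 'using namespace std;' that the OSSIM 2.9.1 headers apparently no longer provide. Below is a minimal sketch of the before/after pattern; the function and variable names (dumpFixed, path, value) are invented for illustration and are not taken from the OTB sources.

    #include <fstream>
    #include <iomanip>
    #include <ostream>
    #include <string>

    // Before the patch the plugin code wrote, for example:
    //   ifstream in(path.c_str(), ios::in | ios::binary);
    //   out << setprecision(15) << setiosflags(ios::fixed) << value << endl;
    // which only compiles while some header injects 'using namespace std;'.
    // The patched form spells every standard-library name out:
    void dumpFixed(const std::string& path, double value, std::ostream& out)
    {
        std::ifstream in(path.c_str(), std::ios::in | std::ios::binary);
        if (in.is_open())
        {
            out << std::setprecision(15) << std::setiosflags(std::ios::fixed)
                << value << std::endl;
        }
    }

The debian/rules hunk in the same commit simply restricts dh_python3 and dh_numpy3 to the python3-otb binary package (-ppython3-otb), so the Python helpers no longer act on the other library packages.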
Checksums-Sha1: 54f2586b4e468d0b0313d69b0ecb84575004eb7d 6657 otb_6.6.1+dfsg-3.dsc ff96f74257b00ecb7126b6721b6ddb68d6a03cb6 259388 otb_6.6.1+dfsg-3.debian.tar.xz 369f6de7712097e2f523c57e5812cf35b5c6e71f 60434 otb_6.6.1+dfsg-3_amd64.buildinfo Checksums-Sha256: 8282221bcecdd87714303b78196a80efb196d8aa74835619c5080638a4448cb6 6657 otb_6.6.1+dfsg-3.dsc 7981de6d6ade80164586ef390d25026da941e9a4708ed6d3fd86d12259539919 259388 otb_6.6.1+dfsg-3.debian.tar.xz a0d0b67970c7cd319ae54a1b87873c1b81446e9f700613512d722f626b7c95d3 60434 otb_6.6.1+dfsg-3_amd64.buildinfo Files: bc9a43014d6c0d37faa320d9a0bc794b 6657 science optional otb_6.6.1+dfsg-3.dsc c2f892567948a63e136d130dac92acd0 259388 science optional otb_6.6.1+dfsg-3.debian.tar.xz d7a31379fb01f015f4c389e9f636935b 60434 science optional otb_6.6.1+dfsg-3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl14CtUACgkQZ1DxCuiN SvEE3Q//TOuVcBuAHs0L0gR1Ic1nAWXiAPaqKI9xJ0/vZOt3hWH8SRJ1SUX0Zzmo cM8IlOLIAkbD5BRBI54NMvG2AFkIHCAkySiLjRWgS+BZFedihnzilqdX55N/Km/q 3lnvjykvFIpxwqQ3UqkNL9GjFOK+/94OWrkfzjIVv8SaRMUe3oKhJ967fyNUlsfh t6Ggg+Qyfu4AafqlPJZ/fFrRcIxRd58DndXRxtf21XQi3p8wtMRf/AxUTcMp8cU2 iHITMX/RM0yRKNHaFMEM4x4jQDDo5za5BLYpsEBBHozKfkchvovfYC8tJTZz9sk2 7kiNE3L8V4hOVirHrWj9HJJ7qYXb+Z51zYWOMWjGcsP5hKdxjpkMXYg9WFbzMoCq BpmxITaSacym3WpBlfOHfimUrvAZfez8sSIdwxBAD88BSIC9v2ksd23DvfN7cGcV dOql7/S6/pmFRA8CqRBLYyLQ6VUV0KvBFiZRYLkAX/AuVza+Vq08Apy+u2REfo8v DD1BB547yDJZfCbgx1O2x0KoBaMZ6mU5Oi3MyBQsBreU/edWOY+75o1plKK+621s DTRZgpu0x1FhAosOjNhTmC3U+RVJKWUiaFRpMgFdLvDTy9p1hh1e3rWeiX7TVohs 9b+r4cbMR+ES9Yik4PN4VEaqfZI/nTSjVx3mViMihfoLvxjdh24= =g1kh -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Wed Sep 11 06:48:42 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 05:48:42 +0000 Subject: [Git][debian-gis-team/otb][master] Update symbols for amd64 & i386. Message-ID: <5d788abaaabd5_73482ad95d7dabac4301f8@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / otb Commits: 3d5857bb by Bas Couwenberg at 2019-09-11T05:48:30Z Update symbols for amd64 & i386.
- - - - - 20 changed files: - debian/changelog - debian/libotbapplicationengine-6.6-1.symbols - debian/libotbcarto-6.6-1.symbols - debian/libotbcommandlineparser-6.6-1.symbols - debian/libotbcommon-6.6-1.symbols - debian/libotbedge-6.6-1.symbols - debian/libotbextendedfilename-6.6-1.symbols - debian/libotbgdaladapters-6.6-1.symbols - debian/libotbice-6.6-1.symbols - debian/libotbimagebase-6.6-1.symbols - debian/libotbimageio-6.6-1.symbols - debian/libotbiogdal-6.6-1.symbols - debian/libotbiokml-6.6-1.symbols - debian/libotbmapla-6.6-1.symbols - debian/libotbmathparserx-6.6-1.symbols - debian/libotbmetadata-6.6-1.symbols - debian/libotbmonteverdicore-6.6-1.symbols - debian/libotbmonteverdigui-6.6-1.symbols - debian/libotbossimadapters-6.6-1.symbols - debian/libotbossimplugins-6.6-1.symbols The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/3d5857bb2ce95d2b55af0d0199935b3e0837820d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/3d5857bb2ce95d2b55af0d0199935b3e0837820d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 11 08:58:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 07:58:44 +0000 Subject: [Git][debian-gis-team/fiona] Pushed new tag debian/1.8.6-3 Message-ID: <5d78a93448263_73483fbbbe60bd50446360@godard.mail> Bas Couwenberg pushed new tag debian/1.8.6-3 at Debian GIS Project / fiona -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/tree/debian/1.8.6-3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 11 08:58:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 07:58:48 +0000 Subject: [Git][debian-gis-team/fiona][master] 2 commits: Add patch to fix FTBFS with GDAL 3. (closes: #939872) Message-ID: <5d78a93874158_73482ad9639a29d44464d8@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / fiona Commits: e9fc223e by Bas Couwenberg at 2019-09-11T07:48:09Z Add patch to fix FTBFS with GDAL 3. (closes: #939872) - - - - - e285e491 by Bas Couwenberg at 2019-09-11T07:48:09Z Set distribution to unstable. - - - - - 3 changed files: - debian/changelog - + debian/patches/gdal3.patch - debian/patches/series Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,11 @@ +fiona (1.8.6-3) unstable; urgency=medium + + * Team upload. + * Add patch to fix FTBFS with GDAL 3. + (closes: #939872) + + -- Bas Couwenberg Wed, 11 Sep 2019 09:39:15 +0200 + fiona (1.8.6-2) unstable; urgency=medium * Team upload. ===================================== debian/patches/gdal3.patch ===================================== @@ -0,0 +1,2558 @@ +Description: Fix FTBFS with GDAL 3. 
+Author: Bas Couwenberg + +--- /dev/null ++++ b/fiona/_shim3.pxd +@@ -0,0 +1,14 @@ ++include "ogrext3.pxd" ++ ++cdef bint is_field_null(void *feature, int n) ++cdef void set_field_null(void *feature, int n) ++cdef void gdal_flush_cache(void *cogr_ds) ++cdef void* gdal_open_vector(const char *path_c, int mode, drivers, options) except NULL ++cdef void* gdal_create(void* cogr_driver, const char *path_c, options) except NULL ++cdef OGRErr gdal_start_transaction(void *cogr_ds, int force) ++cdef OGRErr gdal_commit_transaction(void *cogr_ds) ++cdef OGRErr gdal_rollback_transaction(void *cogr_ds) ++cdef OGRFieldSubType get_field_subtype(void *fielddefn) ++cdef void set_field_subtype(void *fielddefn, OGRFieldSubType subtype) ++cdef bint check_capability_create_layer(void *cogr_ds) ++cdef void *get_linear_geometry(void *geom) +--- /dev/null ++++ b/fiona/_shim3.pyx +@@ -0,0 +1,134 @@ ++"""Shims on top of ogrext for GDAL versions >= 3.0""" ++ ++cdef extern from "ogr_api.h": ++ ++ int OGR_F_IsFieldNull(void *feature, int n) ++ ++ ++from fiona.ogrext2 cimport * ++from fiona._err cimport exc_wrap_pointer ++from fiona._err import cpl_errs, CPLE_BaseError, FionaNullPointerError ++from fiona.errors import DriverError ++ ++import logging ++ ++ ++log = logging.getLogger(__name__) ++ ++ ++cdef bint is_field_null(void *feature, int n): ++ if OGR_F_IsFieldNull(feature, n): ++ return True ++ elif not OGR_F_IsFieldSet(feature, n): ++ return True ++ else: ++ return False ++ ++ ++cdef void set_field_null(void *feature, int n): ++ OGR_F_SetFieldNull(feature, n) ++ ++ ++cdef void gdal_flush_cache(void *cogr_ds): ++ with cpl_errs: ++ GDALFlushCache(cogr_ds) ++ ++ ++cdef void* gdal_open_vector(char* path_c, int mode, drivers, options) except NULL: ++ cdef void* cogr_ds = NULL ++ cdef char **drvs = NULL ++ cdef void* drv = NULL ++ cdef char **open_opts = NULL ++ ++ flags = GDAL_OF_VECTOR | GDAL_OF_VERBOSE_ERROR ++ if mode == 1: ++ flags |= GDAL_OF_UPDATE ++ else: ++ flags |= GDAL_OF_READONLY ++ ++ if drivers: ++ for name in drivers: ++ name_b = name.encode() ++ name_c = name_b ++ drv = GDALGetDriverByName(name_c) ++ if drv != NULL: ++ drvs = CSLAddString(drvs, name_c) ++ ++ for k, v in options.items(): ++ ++ if v is None: ++ continue ++ ++ k = k.upper().encode('utf-8') ++ if isinstance(v, bool): ++ v = ('ON' if v else 'OFF').encode('utf-8') ++ else: ++ v = str(v).encode('utf-8') ++ log.debug("Set option %r: %r", k, v) ++ open_opts = CSLAddNameValue(open_opts, k, v) ++ ++ open_opts = CSLAddNameValue(open_opts, "VALIDATE_OPEN_OPTIONS", "NO") ++ ++ try: ++ cogr_ds = exc_wrap_pointer( ++ GDALOpenEx(path_c, flags, drvs, open_opts, NULL) ++ ) ++ return cogr_ds ++ except FionaNullPointerError: ++ raise DriverError("Failed to open dataset (mode={}): {}".format(mode, path_c.decode("utf-8"))) ++ except CPLE_BaseError as exc: ++ raise DriverError(str(exc)) ++ finally: ++ CSLDestroy(drvs) ++ CSLDestroy(open_opts) ++ ++ ++cdef void* gdal_create(void* cogr_driver, const char *path_c, options) except NULL: ++ cdef char **creation_opts = NULL ++ cdef void *cogr_ds = NULL ++ ++ for k, v in options.items(): ++ k = k.upper().encode('utf-8') ++ if isinstance(v, bool): ++ v = ('ON' if v else 'OFF').encode('utf-8') ++ else: ++ v = str(v).encode('utf-8') ++ log.debug("Set option %r: %r", k, v) ++ creation_opts = CSLAddNameValue(creation_opts, k, v) ++ ++ try: ++ return exc_wrap_pointer(GDALCreate(cogr_driver, path_c, 0, 0, 0, GDT_Unknown, creation_opts)) ++ except FionaNullPointerError: ++ raise DriverError("Failed to create dataset: 
{}".format(path_c.decode("utf-8"))) ++ except CPLE_BaseError as exc: ++ raise DriverError(str(exc)) ++ finally: ++ CSLDestroy(creation_opts) ++ ++ ++cdef OGRErr gdal_start_transaction(void* cogr_ds, int force): ++ return GDALDatasetStartTransaction(cogr_ds, force) ++ ++ ++cdef OGRErr gdal_commit_transaction(void* cogr_ds): ++ return GDALDatasetCommitTransaction(cogr_ds) ++ ++ ++cdef OGRErr gdal_rollback_transaction(void* cogr_ds): ++ return GDALDatasetRollbackTransaction(cogr_ds) ++ ++ ++cdef OGRFieldSubType get_field_subtype(void *fielddefn): ++ return OGR_Fld_GetSubType(fielddefn) ++ ++ ++cdef void set_field_subtype(void *fielddefn, OGRFieldSubType subtype): ++ OGR_Fld_SetSubType(fielddefn, subtype) ++ ++ ++cdef bint check_capability_create_layer(void *cogr_ds): ++ return GDALDatasetTestCapability(cogr_ds, ODsCCreateLayer) ++ ++ ++cdef void *get_linear_geometry(void *geom): ++ return OGR_G_GetLinearGeometry(geom, 0.0, NULL) +--- a/setup.py ++++ b/setup.py +@@ -77,7 +77,8 @@ class sdist_multi_gdal(sdist): + sources = { + "_shim1": "_shim", + "_shim2": "_shim", +- "_shim22": "_shim" ++ "_shim22": "_shim", ++ "_shim3": "_shim" + } + for src_a, src_b in sources.items(): + shutil.copy('fiona/{}.pyx'.format(src_a), 'fiona/{}.pyx'.format(src_b)) +@@ -211,6 +212,9 @@ if source_is_repo and "clean" not in sys + "Cython is required to build from a repo.") + sys.exit(1) + ++ ogrext_file = 'fiona/ogrext.pyx' ++ crs_file = 'fiona/_crs.pyx' ++ + if gdalversion.startswith("1"): + shutil.copy('fiona/_shim1.pyx', 'fiona/_shim.pyx') + shutil.copy('fiona/_shim1.pxd', 'fiona/_shim.pxd') +@@ -223,17 +227,25 @@ if source_is_repo and "clean" not in sys + log.info("Building Fiona for gdal 2.0.x-2.1.x: {0}".format(gdalversion)) + shutil.copy('fiona/_shim2.pyx', 'fiona/_shim.pyx') + shutil.copy('fiona/_shim2.pxd', 'fiona/_shim.pxd') ++ elif gdal_major_version == 3: ++ log.info("Building Fiona for gdal 3.0+: {0}".format(gdalversion)) ++ shutil.copy('fiona/_shim3.pyx', 'fiona/_shim.pyx') ++ shutil.copy('fiona/_shim3.pxd', 'fiona/_shim.pxd') ++ ++ ogrext_file = 'fiona/ogrext3.pyx' ++ crs_file = 'fiona/_crs3.pyx' ++ + + ext_modules = cythonize([ + Extension('fiona._geometry', ['fiona/_geometry.pyx'], **ext_options), + Extension('fiona.schema', ['fiona/schema.pyx'], **ext_options), + Extension('fiona._transform', ['fiona/_transform.pyx'], **ext_options_cpp), +- Extension('fiona._crs', ['fiona/_crs.pyx'], **ext_options), ++ Extension('fiona._crs', [crs_file], **ext_options), + Extension('fiona._env', ['fiona/_env.pyx'], **ext_options), + Extension('fiona._drivers', ['fiona/_drivers.pyx'], **ext_options), + Extension('fiona._err', ['fiona/_err.pyx'], **ext_options), + Extension('fiona._shim', ['fiona/_shim.pyx'], **ext_options), +- Extension('fiona.ogrext', ['fiona/ogrext.pyx'], **ext_options) ++ Extension('fiona.ogrext', [ogrext_file], **ext_options) + ], + compiler_directives={"language_level": "3"} + ) +--- /dev/null ++++ b/fiona/ogrext3.pxd +@@ -0,0 +1,327 @@ ++# Copyright (c) 2007, Sean C. Gillies ++# All rights reserved. 
++# See ../LICENSE.txt ++ ++from libc.stdio cimport FILE ++ ++ ++cdef extern from "ogr_core.h": ++ ++ ctypedef int OGRErr ++ ++ ctypedef enum OGRwkbGeometryType: ++ wkbUnknown ++ wkbPoint ++ wkbLineString ++ wkbPolygon ++ wkbMultiPoint ++ wkbMultiLineString ++ wkbMultiPolygon ++ wkbGeometryCollection ++ wkbCircularString ++ wkbCompoundCurve ++ wkbCurvePolygon ++ wkbMultiCurve ++ wkbMultiSurface ++ wkbCurve ++ wkbSurface ++ wkbPolyhedralSurface ++ wkbTIN ++ wkbTriangle ++ wkbNone ++ wkbLinearRing ++ wkbCircularStringZ ++ wkbCompoundCurveZ ++ wkbCurvePolygonZ ++ wkbMultiCurveZ ++ wkbMultiSurfaceZ ++ wkbCurveZ ++ wkbSurfaceZ ++ wkbPolyhedralSurfaceZ ++ wkbTINZ ++ wkbTriangleZ ++ wkbPointM ++ wkbLineStringM ++ wkbPolygonM ++ wkbMultiPointM ++ wkbMultiLineStringM ++ wkbMultiPolygonM ++ wkbGeometryCollectionM ++ wkbCircularStringM ++ wkbCompoundCurveM ++ wkbCurvePolygonM ++ wkbMultiCurveM ++ wkbMultiSurfaceM ++ wkbCurveM ++ wkbSurfaceM ++ wkbPolyhedralSurfaceM ++ wkbTINM ++ wkbTriangleM ++ wkbPointZM ++ wkbLineStringZM ++ wkbPolygonZM ++ wkbMultiPointZM ++ wkbMultiLineStringZM ++ wkbMultiPolygonZM ++ wkbGeometryCollectionZM ++ wkbCircularStringZM ++ wkbCompoundCurveZM ++ wkbCurvePolygonZM ++ wkbMultiCurveZM ++ wkbMultiSurfaceZM ++ wkbCurveZM ++ wkbSurfaceZM ++ wkbPolyhedralSurfaceZM ++ wkbTINZM ++ wkbTriangleZM ++ wkbPoint25D ++ wkbLineString25D ++ wkbPolygon25D ++ wkbMultiPoint25D ++ wkbMultiLineString25D ++ wkbMultiPolygon25D ++ wkbGeometryCollection25D ++ ++ ctypedef enum OGRFieldType: ++ OFTInteger ++ OFTIntegerList ++ OFTReal ++ OFTRealList ++ OFTString ++ OFTStringList ++ OFTWideString ++ OFTWideStringList ++ OFTBinary ++ OFTDate ++ OFTTime ++ OFTDateTime ++ OFTInteger64 ++ OFTInteger64List ++ OFTMaxType ++ ++ ctypedef int OGRFieldSubType ++ cdef int OFSTNone = 0 ++ cdef int OFSTBoolean = 1 ++ cdef int OFSTInt16 = 2 ++ cdef int OFSTFloat32 = 3 ++ cdef int OFSTMaxSubType = 3 ++ ++ ctypedef struct OGREnvelope: ++ double MinX ++ double MaxX ++ double MinY ++ double MaxY ++ ++ char * OGRGeometryTypeToName(int) ++ ++ ++ char * ODsCCreateLayer = "CreateLayer" ++ char * ODsCDeleteLayer = "DeleteLayer" ++ ++ ++cdef extern from "gdal.h": ++ char * GDALVersionInfo (char *pszRequest) ++ void * GDALGetDriverByName(const char * pszName) ++ void * GDALOpenEx(const char * pszFilename, ++ unsigned int nOpenFlags, ++ const char *const *papszAllowedDrivers, ++ const char *const *papszOpenOptions, ++ const char *const *papszSiblingFiles ++ ) ++ int GDAL_OF_UPDATE ++ int GDAL_OF_READONLY ++ int GDAL_OF_VECTOR ++ int GDAL_OF_VERBOSE_ERROR ++ int GDALDatasetGetLayerCount(void * hds) ++ void * GDALDatasetGetLayer(void * hDS, int iLayer) ++ void * GDALDatasetGetLayerByName(void * hDS, char * pszName) ++ void GDALClose(void * hDS) ++ void * GDALCreate(void * hDriver, ++ const char * pszFilename, ++ int nXSize, ++ int nYSize, ++ int nBands, ++ GDALDataType eBandType, ++ char ** papszOptions) ++ void * GDALDatasetCreateLayer(void * hDS, ++ const char * pszName, ++ void * hSpatialRef, ++ int eType, ++ char ** papszOptions) ++ int GDALDatasetDeleteLayer(void * hDS, int iLayer) ++ void GDALFlushCache(void * hDS) ++ char * GDALGetDriverShortName(void * hDriver) ++ char * GDALGetDatasetDriver (void * hDataset) ++ int GDALDeleteDataset(void * hDriver, const char * pszFilename) ++ OGRErr GDALDatasetStartTransaction (void * hDataset, int bForce) ++ OGRErr GDALDatasetCommitTransaction (void * hDataset) ++ OGRErr GDALDatasetRollbackTransaction (void * hDataset) ++ int GDALDatasetTestCapability (void * hDataset, char *) ++ ++ 
++ ctypedef enum GDALDataType: ++ GDT_Unknown ++ GDT_Byte ++ GDT_UInt16 ++ GDT_Int16 ++ GDT_UInt32 ++ GDT_Int32 ++ GDT_Float32 ++ GDT_Float64 ++ GDT_CInt16 ++ GDT_CInt32 ++ GDT_CFloat32 ++ GDT_CFloat64 ++ GDT_TypeCount ++ ++cdef extern from "gdal_version.h": ++ int GDAL_COMPUTE_VERSION(int maj, int min, int rev) ++ ++cdef extern from "cpl_conv.h": ++ void * CPLMalloc (size_t) ++ void CPLFree (void *ptr) ++ void CPLSetThreadLocalConfigOption (char *key, char *val) ++ const char *CPLGetConfigOption (char *, char *) ++ ++ ++cdef extern from "cpl_string.h": ++ char ** CSLAddNameValue (char **list, const char *name, const char *value) ++ char ** CSLSetNameValue (char **list, const char *name, const char *value) ++ void CSLDestroy (char **list) ++ char ** CSLAddString(char **list, const char *string) ++ ++ ++cdef extern from "cpl_vsi.h" nogil: ++ ctypedef int vsi_l_offset ++ ctypedef FILE VSILFILE ++ ++ unsigned char *VSIGetMemFileBuffer(const char *path, ++ vsi_l_offset *data_len, ++ int take_ownership) ++ VSILFILE *VSIFileFromMemBuffer(const char *path, void *data, ++ vsi_l_offset data_len, int take_ownership) ++ VSILFILE* VSIFOpenL(const char *path, const char *mode) ++ int VSIFCloseL(VSILFILE *fp) ++ int VSIUnlink(const char *path) ++ ++ int VSIFFlushL(VSILFILE *fp) ++ size_t VSIFReadL(void *buffer, size_t nSize, size_t nCount, VSILFILE *fp) ++ int VSIFSeekL(VSILFILE *fp, vsi_l_offset nOffset, int nWhence) ++ vsi_l_offset VSIFTellL(VSILFILE *fp) ++ int VSIFTruncateL(VSILFILE *fp, vsi_l_offset nNewSize) ++ size_t VSIFWriteL(void *buffer, size_t nSize, size_t nCount, VSILFILE *fp) ++ int VSIUnlink (const char * pathname) ++ ++ ++cdef extern from "ogr_srs_api.h": ++ ++ ctypedef void * OGRSpatialReferenceH ++ ++ void OSRCleanup () ++ OGRSpatialReferenceH OSRClone (OGRSpatialReferenceH srs) ++ int OSRExportToProj4 (OGRSpatialReferenceH srs, char **params) ++ int OSRExportToWkt (OGRSpatialReferenceH srs, char **params) ++ int OSRImportFromEPSG (OGRSpatialReferenceH, int code) ++ int OSRImportFromProj4 (OGRSpatialReferenceH srs, const char *proj) ++ int OSRSetFromUserInput (OGRSpatialReferenceH srs, const char *input) ++ int OSRAutoIdentifyEPSG (OGRSpatialReferenceH srs) ++ const char * OSRGetAuthorityName (OGRSpatialReferenceH srs, const char *key) ++ const char * OSRGetAuthorityCode (OGRSpatialReferenceH srs, const char *key) ++ OGRSpatialReferenceH OSRNewSpatialReference (char *wkt) ++ void OSRRelease (OGRSpatialReferenceH srs) ++ void * OCTNewCoordinateTransformation (OGRSpatialReferenceH source, OGRSpatialReferenceH dest) ++ void OCTDestroyCoordinateTransformation (void *source) ++ int OCTTransform (void *ct, int nCount, double *x, double *y, double *z) ++ ++cdef extern from "ogr_api.h": ++ ++ const char * OGR_Dr_GetName (void *driver) ++ void * OGR_Dr_CreateDataSource (void *driver, const char *path, char **options) ++ int OGR_Dr_DeleteDataSource (void *driver, char *) ++ void * OGR_Dr_Open (void *driver, const char *path, int bupdate) ++ int OGR_Dr_TestCapability (void *driver, const char *) ++ int OGR_DS_DeleteLayer (void *datasource, int n) ++ void * OGR_F_Create (void *featuredefn) ++ void OGR_F_Destroy (void *feature) ++ long OGR_F_GetFID (void *feature) ++ int OGR_F_IsFieldSet (void *feature, int n) ++ int OGR_F_GetFieldAsDateTime (void *feature, int n, int *y, int *m, int *d, int *h, int *m, int *s, int *z) ++ double OGR_F_GetFieldAsDouble (void *feature, int n) ++ int OGR_F_GetFieldAsInteger (void *feature, int n) ++ char * OGR_F_GetFieldAsString (void *feature, int n) ++ unsigned 
char * OGR_F_GetFieldAsBinary(void *feature, int n, int *s) ++ int OGR_F_GetFieldCount (void *feature) ++ void * OGR_F_GetFieldDefnRef (void *feature, int n) ++ int OGR_F_GetFieldIndex (void *feature, char *name) ++ void * OGR_F_GetGeometryRef (void *feature) ++ void * OGR_F_StealGeometry (void *feature) ++ void OGR_F_SetFieldDateTime (void *feature, int n, int y, int m, int d, int hh, int mm, int ss, int tz) ++ void OGR_F_SetFieldDouble (void *feature, int n, double value) ++ void OGR_F_SetFieldInteger (void *feature, int n, int value) ++ void OGR_F_SetFieldString (void *feature, int n, char *value) ++ void OGR_F_SetFieldBinary (void *feature, int n, int l, unsigned char *value) ++ void OGR_F_SetFieldNull (void *feature, int n) # new in GDAL 2.2 ++ int OGR_F_SetGeometryDirectly (void *feature, void *geometry) ++ void * OGR_FD_Create (char *name) ++ int OGR_FD_GetFieldCount (void *featuredefn) ++ void * OGR_FD_GetFieldDefn (void *featuredefn, int n) ++ int OGR_FD_GetGeomType (void *featuredefn) ++ char * OGR_FD_GetName (void *featuredefn) ++ void * OGR_Fld_Create (char *name, OGRFieldType fieldtype) ++ void OGR_Fld_Destroy (void *fielddefn) ++ char * OGR_Fld_GetNameRef (void *fielddefn) ++ int OGR_Fld_GetPrecision (void *fielddefn) ++ int OGR_Fld_GetType (void *fielddefn) ++ int OGR_Fld_GetWidth (void *fielddefn) ++ void OGR_Fld_Set (void *fielddefn, char *name, int fieldtype, int width, int precision, int justification) ++ void OGR_Fld_SetPrecision (void *fielddefn, int n) ++ void OGR_Fld_SetWidth (void *fielddefn, int n) ++ OGRFieldSubType OGR_Fld_GetSubType(void *fielddefn) ++ void OGR_Fld_SetSubType(void *fielddefn, OGRFieldSubType subtype) ++ OGRErr OGR_G_AddGeometryDirectly (void *geometry, void *part) ++ void OGR_G_AddPoint (void *geometry, double x, double y, double z) ++ void OGR_G_AddPoint_2D (void *geometry, double x, double y) ++ void OGR_G_CloseRings (void *geometry) ++ void * OGR_G_CreateGeometry (int wkbtypecode) ++ void OGR_G_DestroyGeometry (void *geometry) ++ unsigned char * OGR_G_ExportToJson (void *geometry) ++ void OGR_G_ExportToWkb (void *geometry, int endianness, char *buffer) ++ int OGR_G_GetCoordinateDimension (void *geometry) ++ int OGR_G_GetGeometryCount (void *geometry) ++ unsigned char * OGR_G_GetGeometryName (void *geometry) ++ int OGR_G_GetGeometryType (void *geometry) ++ void * OGR_G_GetGeometryRef (void *geometry, int n) ++ int OGR_G_GetPointCount (void *geometry) ++ double OGR_G_GetX (void *geometry, int n) ++ double OGR_G_GetY (void *geometry, int n) ++ double OGR_G_GetZ (void *geometry, int n) ++ void OGR_G_ImportFromWkb (void *geometry, unsigned char *bytes, int nbytes) ++ int OGR_G_WkbSize (void *geometry) ++ void * OGR_G_ForceToMultiPolygon (void *geometry) ++ void * OGR_G_ForceToPolygon (void *geometry) ++ void * OGR_G_Clone(void *geometry) ++ OGRErr OGR_L_CreateFeature (void *layer, void *feature) ++ OGRErr OGR_L_CreateField (void *layer, void *fielddefn, int flexible) ++ OGRErr OGR_L_GetExtent (void *layer, void *extent, int force) ++ void * OGR_L_GetFeature (void *layer, int n) ++ int OGR_L_GetFeatureCount (void *layer, int m) ++ void * OGR_G_GetLinearGeometry (void *hGeom, double dfMaxAngleStepSizeDegrees, char **papszOptions) ++ void * OGR_L_GetLayerDefn (void *layer) ++ char * OGR_L_GetName (void *layer) ++ void * OGR_L_GetNextFeature (void *layer) ++ void * OGR_L_GetSpatialFilter (void *layer) ++ void * OGR_L_GetSpatialRef (void *layer) ++ void OGR_L_ResetReading (void *layer) ++ void OGR_L_SetSpatialFilter (void *layer, void *geometry) ++ 
void OGR_L_SetSpatialFilterRect ( ++ void *layer, double minx, double miny, double maxx, double maxy ++ ) ++ int OGR_L_TestCapability (void *layer, char *name) ++ void * OGRGetDriverByName (char *) ++ void * OGROpen (char *path, int mode, void *x) ++ void * OGROpenShared (char *path, int mode, void *x) ++ int OGRReleaseDataSource (void *datasource) ++ OGRErr OGR_L_SetIgnoredFields (void *layer, const char **papszFields) ++ OGRErr OGR_L_SetNextByIndex (void *layer, long nIndex) ++ long long OGR_F_GetFieldAsInteger64 (void *feature, int n) ++ void OGR_F_SetFieldInteger64 (void *feature, int n, long long value) +--- /dev/null ++++ b/fiona/ogrext3.pyx +@@ -0,0 +1,1765 @@ ++# These are extension functions and classes using the OGR C API. ++ ++from __future__ import absolute_import ++ ++import datetime ++import json ++import locale ++import logging ++import os ++import warnings ++import math ++import uuid ++from collections import namedtuple ++ ++from six import integer_types, string_types, text_type ++ ++from fiona._shim cimport * ++ ++from fiona._geometry cimport ( ++ GeomBuilder, OGRGeomBuilder, geometry_type_code, ++ normalize_geometry_type_code, base_geometry_type_code) ++from fiona._err cimport exc_wrap_int, exc_wrap_pointer, exc_wrap_vsilfile ++ ++import fiona ++from fiona._env import GDALVersion, get_gdal_version_num ++from fiona._err import cpl_errs, FionaNullPointerError, CPLE_BaseError, CPLE_OpenFailedError ++from fiona._geometry import GEOMETRY_TYPES ++from fiona import compat ++from fiona.errors import ( ++ DriverError, DriverIOError, SchemaError, CRSError, FionaValueError, ++ TransactionError, GeometryTypeValidationError, DatasetDeleteError, ++ FionaDeprecationWarning) ++from fiona.compat import OrderedDict ++from fiona.rfc3339 import parse_date, parse_datetime, parse_time ++from fiona.rfc3339 import FionaDateType, FionaDateTimeType, FionaTimeType ++from fiona.schema import FIELD_TYPES, FIELD_TYPES_MAP, normalize_field_type ++from fiona.path import vsi_path ++ ++from fiona._shim cimport is_field_null ++ ++from libc.stdlib cimport malloc, free ++from libc.string cimport strcmp ++from cpython cimport PyBytes_FromStringAndSize, PyBytes_AsString ++ ++ ++cdef extern from "ogr_api.h" nogil: ++ ++ ctypedef void * OGRLayerH ++ ctypedef void * OGRDataSourceH ++ ctypedef void * OGRSFDriverH ++ ctypedef void * OGRFieldDefnH ++ ctypedef void * OGRFeatureDefnH ++ ctypedef void * OGRFeatureH ++ ctypedef void * OGRGeometryH ++ ++ ++log = logging.getLogger(__name__) ++ ++DEFAULT_TRANSACTION_SIZE = 20000 ++ ++# OGR Driver capability ++cdef const char * ODrCCreateDataSource = "CreateDataSource" ++cdef const char * ODrCDeleteDataSource = "DeleteDataSource" ++ ++# OGR Layer capability ++cdef const char * OLC_RANDOMREAD = "RandomRead" ++cdef const char * OLC_SEQUENTIALWRITE = "SequentialWrite" ++cdef const char * OLC_RANDOMWRITE = "RandomWrite" ++cdef const char * OLC_FASTSPATIALFILTER = "FastSpatialFilter" ++cdef const char * OLC_FASTFEATURECOUNT = "FastFeatureCount" ++cdef const char * OLC_FASTGETEXTENT = "FastGetExtent" ++cdef const char * OLC_FASTSETNEXTBYINDEX = "FastSetNextByIndex" ++cdef const char * OLC_CREATEFIELD = "CreateField" ++cdef const char * OLC_CREATEGEOMFIELD = "CreateGeomField" ++cdef const char * OLC_DELETEFIELD = "DeleteField" ++cdef const char * OLC_REORDERFIELDS = "ReorderFields" ++cdef const char * OLC_ALTERFIELDDEFN = "AlterFieldDefn" ++cdef const char * OLC_DELETEFEATURE = "DeleteFeature" ++cdef const char * OLC_STRINGSASUTF8 = "StringsAsUTF8" ++cdef const char * 
OLC_TRANSACTIONS = "Transactions" ++ ++# OGR integer error types. ++ ++OGRERR_NONE = 0 ++OGRERR_NOT_ENOUGH_DATA = 1 # not enough data to deserialize */ ++OGRERR_NOT_ENOUGH_MEMORY = 2 ++OGRERR_UNSUPPORTED_GEOMETRY_TYPE = 3 ++OGRERR_UNSUPPORTED_OPERATION = 4 ++OGRERR_CORRUPT_DATA = 5 ++OGRERR_FAILURE = 6 ++OGRERR_UNSUPPORTED_SRS = 7 ++OGRERR_INVALID_HANDLE = 8 ++ ++ ++def _explode(coords): ++ """Explode a GeoJSON geometry's coordinates object and yield ++ coordinate tuples. As long as the input is conforming, the type of ++ the geometry doesn't matter.""" ++ for e in coords: ++ if isinstance(e, (float, int)): ++ yield coords ++ break ++ else: ++ for f in _explode(e): ++ yield f ++ ++ ++def _bounds(geometry): ++ """Bounding box of a GeoJSON geometry""" ++ try: ++ xyz = tuple(zip(*list(_explode(geometry['coordinates'])))) ++ return min(xyz[0]), min(xyz[1]), max(xyz[0]), max(xyz[1]) ++ except (KeyError, TypeError): ++ return None ++ ++ ++cdef int GDAL_VERSION_NUM = get_gdal_version_num() ++ ++ ++# Feature extension classes and functions follow. ++ ++cdef class FeatureBuilder: ++ """Build Fiona features from OGR feature pointers. ++ ++ No OGR objects are allocated by this function and the feature ++ argument is not destroyed. ++ """ ++ ++ cdef build(self, void *feature, encoding='utf-8', bbox=False, driver=None, ignore_fields=None, ignore_geometry=False): ++ """Build a Fiona feature object from an OGR feature ++ ++ Parameters ++ ---------- ++ feature : void * ++ The OGR feature # TODO: use a real typedef ++ encoding : str ++ The encoding of OGR feature attributes ++ bbox : bool ++ Not used ++ driver : str ++ OGR format driver name like 'GeoJSON' ++ ignore_fields : sequence ++ A sequence of field names that will be ignored and omitted ++ in the Fiona feature properties ++ ignore_geometry : bool ++ Flag for whether the OGR geometry field is to be ignored ++ ++ Returns ++ ------- ++ dict ++ """ ++ cdef void *fdefn = NULL ++ cdef int i ++ cdef int y = 0 ++ cdef int m = 0 ++ cdef int d = 0 ++ cdef int hh = 0 ++ cdef int mm = 0 ++ cdef int ss = 0 ++ cdef int tz = 0 ++ cdef unsigned char *data = NULL ++ cdef int l ++ cdef int retval ++ cdef int fieldsubtype ++ cdef const char *key_c = NULL ++ ++ # Skeleton of the feature to be returned. ++ fid = OGR_F_GetFID(feature) ++ props = OrderedDict() ++ fiona_feature = { ++ "type": "Feature", ++ "id": str(fid), ++ "properties": props, ++ } ++ ++ ignore_fields = set(ignore_fields or []) ++ ++ # Iterate over the fields of the OGR feature. 
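++ # Null fields become None; other field values are converted to Python types (dates and times as ISO 8601 strings, binary as bytes).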
++ for i in range(OGR_F_GetFieldCount(feature)): ++ fdefn = OGR_F_GetFieldDefnRef(feature, i) ++ if fdefn == NULL: ++ raise ValueError("Null feature definition") ++ key_c = OGR_Fld_GetNameRef(fdefn) ++ if key_c == NULL: ++ raise ValueError("Null field name reference") ++ key_b = key_c ++ key = key_b.decode(encoding) ++ ++ if key in ignore_fields: ++ continue ++ ++ fieldtypename = FIELD_TYPES[OGR_Fld_GetType(fdefn)] ++ fieldsubtype = get_field_subtype(fdefn) ++ if not fieldtypename: ++ log.warning( ++ "Skipping field %s: invalid type %s", ++ key, ++ OGR_Fld_GetType(fdefn)) ++ continue ++ ++ # TODO: other types ++ fieldtype = FIELD_TYPES_MAP[fieldtypename] ++ ++ if is_field_null(feature, i): ++ props[key] = None ++ ++ elif fieldtypename is 'int32': ++ if fieldsubtype == OFSTBoolean: ++ props[key] = bool(OGR_F_GetFieldAsInteger(feature, i)) ++ else: ++ props[key] = OGR_F_GetFieldAsInteger(feature, i) ++ ++ elif fieldtype is int: ++ if fieldsubtype == OFSTBoolean: ++ props[key] = bool(OGR_F_GetFieldAsInteger64(feature, i)) ++ else: ++ props[key] = OGR_F_GetFieldAsInteger64(feature, i) ++ ++ elif fieldtype is float: ++ props[key] = OGR_F_GetFieldAsDouble(feature, i) ++ ++ elif fieldtype is text_type: ++ try: ++ val = OGR_F_GetFieldAsString(feature, i) ++ val = val.decode(encoding) ++ except UnicodeDecodeError: ++ log.warning( ++ "Failed to decode %s using %s codec", val, encoding) ++ ++ # Does the text contain a JSON object? Let's check. ++ # Let's check as cheaply as we can. ++ if driver == 'GeoJSON' and val.startswith('{'): ++ try: ++ val = json.loads(val) ++ except ValueError as err: ++ log.warning(str(err)) ++ ++ # Now add to the properties object. ++ props[key] = val ++ ++ elif fieldtype in (FionaDateType, FionaTimeType, FionaDateTimeType): ++ retval = OGR_F_GetFieldAsDateTime( ++ feature, i, &y, &m, &d, &hh, &mm, &ss, &tz) ++ try: ++ if fieldtype is FionaDateType: ++ props[key] = datetime.date(y, m, d).isoformat() ++ elif fieldtype is FionaTimeType: ++ props[key] = datetime.time(hh, mm, ss).isoformat() ++ else: ++ props[key] = datetime.datetime( ++ y, m, d, hh, mm, ss).isoformat() ++ except ValueError as err: ++ log.exception(err) ++ props[key] = None ++ ++ elif fieldtype is bytes: ++ data = OGR_F_GetFieldAsBinary(feature, i, &l) ++ props[key] = data[:l] ++ ++ else: ++ log.debug("%s: None, fieldtype: %r, %r" % (key, fieldtype, fieldtype in string_types)) ++ props[key] = None ++ ++ cdef void *cogr_geometry = NULL ++ cdef void *org_geometry = NULL ++ ++ if not ignore_geometry: ++ cogr_geometry = OGR_F_GetGeometryRef(feature) ++ ++ if cogr_geometry is not NULL: ++ ++ code = base_geometry_type_code(OGR_G_GetGeometryType(cogr_geometry)) ++ ++ if 8 <= code <= 14: # Curves. ++ cogr_geometry = get_linear_geometry(cogr_geometry) ++ geom = GeomBuilder().build(cogr_geometry) ++ OGR_G_DestroyGeometry(cogr_geometry) ++ ++ elif 15 <= code <= 17: ++ # We steal the geometry: the geometry of the in-memory feature is now null ++ # and we are responsible for cogr_geometry. 
++ org_geometry = OGR_F_StealGeometry(feature) ++ ++ if code in (15, 16): ++ cogr_geometry = OGR_G_ForceToMultiPolygon(org_geometry) ++ elif code == 17: ++ cogr_geometry = OGR_G_ForceToPolygon(org_geometry) ++ ++ geom = GeomBuilder().build(cogr_geometry) ++ OGR_G_DestroyGeometry(cogr_geometry) ++ ++ else: ++ geom = GeomBuilder().build(cogr_geometry) ++ ++ fiona_feature["geometry"] = geom ++ ++ else: ++ ++ fiona_feature["geometry"] = None ++ ++ return fiona_feature ++ ++ ++cdef class OGRFeatureBuilder: ++ ++ """Builds an OGR Feature from a Fiona feature mapping. ++ ++ Allocates one OGR Feature which should be destroyed by the caller. ++ Borrows a layer definition from the collection. ++ """ ++ ++ cdef void * build(self, feature, collection) except NULL: ++ cdef void *cogr_geometry = NULL ++ cdef const char *string_c = NULL ++ cdef WritingSession session ++ session = collection.session ++ cdef void *cogr_layer = session.cogr_layer ++ if cogr_layer == NULL: ++ raise ValueError("Null layer") ++ cdef void *cogr_featuredefn = OGR_L_GetLayerDefn(cogr_layer) ++ if cogr_featuredefn == NULL: ++ raise ValueError("Null feature definition") ++ cdef void *cogr_feature = OGR_F_Create(cogr_featuredefn) ++ if cogr_feature == NULL: ++ raise ValueError("Null feature") ++ ++ if feature['geometry'] is not None: ++ cogr_geometry = OGRGeomBuilder().build( ++ feature['geometry']) ++ OGR_F_SetGeometryDirectly(cogr_feature, cogr_geometry) ++ ++ # OGR_F_SetFieldString takes encoded strings ('bytes' in Python 3). ++ encoding = session._get_internal_encoding() ++ ++ for key, value in feature['properties'].items(): ++ log.debug( ++ "Looking up %s in %s", key, repr(session._schema_mapping)) ++ ogr_key = session._schema_mapping[key] ++ ++ schema_type = normalize_field_type(collection.schema['properties'][key]) ++ ++ log.debug("Normalizing schema type for key %r in schema %r to %r", key, collection.schema['properties'], schema_type) ++ ++ try: ++ key_bytes = ogr_key.encode(encoding) ++ except UnicodeDecodeError: ++ log.warning("Failed to encode %s using %s codec", key, encoding) ++ key_bytes = ogr_key ++ key_c = key_bytes ++ i = OGR_F_GetFieldIndex(cogr_feature, key_c) ++ if i < 0: ++ continue ++ ++ # Special case: serialize dicts to assist OGR. ++ if isinstance(value, dict): ++ value = json.dumps(value) ++ ++ # Continue over the standard OGR types. 
++ if isinstance(value, integer_types): ++ ++ log.debug("Setting field %r, type %r, to value %r", i, schema_type, value) ++ ++ if schema_type == 'int32': ++ OGR_F_SetFieldInteger(cogr_feature, i, value) ++ else: ++ OGR_F_SetFieldInteger64(cogr_feature, i, value) ++ ++ elif isinstance(value, float): ++ OGR_F_SetFieldDouble(cogr_feature, i, value) ++ elif (isinstance(value, string_types) ++ and schema_type in ['date', 'time', 'datetime']): ++ if schema_type == 'date': ++ y, m, d, hh, mm, ss, ff = parse_date(value) ++ elif schema_type == 'time': ++ y, m, d, hh, mm, ss, ff = parse_time(value) ++ else: ++ y, m, d, hh, mm, ss, ff = parse_datetime(value) ++ OGR_F_SetFieldDateTime( ++ cogr_feature, i, y, m, d, hh, mm, ss, 0) ++ elif (isinstance(value, datetime.date) ++ and schema_type == 'date'): ++ y, m, d = value.year, value.month, value.day ++ OGR_F_SetFieldDateTime( ++ cogr_feature, i, y, m, d, 0, 0, 0, 0) ++ elif (isinstance(value, datetime.datetime) ++ and schema_type == 'datetime'): ++ y, m, d = value.year, value.month, value.day ++ hh, mm, ss = value.hour, value.minute, value.second ++ OGR_F_SetFieldDateTime( ++ cogr_feature, i, y, m, d, hh, mm, ss, 0) ++ elif (isinstance(value, datetime.time) ++ and schema_type == 'time'): ++ hh, mm, ss = value.hour, value.minute, value.second ++ OGR_F_SetFieldDateTime( ++ cogr_feature, i, 0, 0, 0, hh, mm, ss, 0) ++ elif isinstance(value, bytes) and schema_type == "bytes": ++ string_c = value ++ OGR_F_SetFieldBinary(cogr_feature, i, len(value), ++ string_c) ++ elif isinstance(value, string_types): ++ try: ++ value_bytes = value.encode(encoding) ++ except UnicodeDecodeError: ++ log.warning( ++ "Failed to encode %s using %s codec", value, encoding) ++ value_bytes = value ++ string_c = value_bytes ++ OGR_F_SetFieldString(cogr_feature, i, string_c) ++ elif value is None: ++ set_field_null(cogr_feature, i) ++ else: ++ raise ValueError("Invalid field type %s" % type(value)) ++ log.debug("Set field %s: %r" % (key, value)) ++ return cogr_feature ++ ++ ++cdef _deleteOgrFeature(void *cogr_feature): ++ """Delete an OGR feature""" ++ if cogr_feature is not NULL: ++ OGR_F_Destroy(cogr_feature) ++ cogr_feature = NULL ++ ++ ++def featureRT(feature, collection): ++ # For testing purposes only, leaks the JSON data ++ cdef void *cogr_feature = OGRFeatureBuilder().build(feature, collection) ++ cdef void *cogr_geometry = OGR_F_GetGeometryRef(cogr_feature) ++ if cogr_geometry == NULL: ++ raise ValueError("Null geometry") ++ result = FeatureBuilder().build( ++ cogr_feature, ++ encoding='utf-8', ++ bbox=False, ++ driver=collection.driver ++ ) ++ _deleteOgrFeature(cogr_feature) ++ return result ++ ++ ++# Collection-related extension classes and functions ++ ++cdef class Session: ++ ++ cdef void *cogr_ds ++ cdef void *cogr_layer ++ cdef object _fileencoding ++ cdef object _encoding ++ cdef object collection ++ ++ def __init__(self): ++ self.cogr_ds = NULL ++ self.cogr_layer = NULL ++ self._fileencoding = None ++ self._encoding = None ++ ++ def __dealloc__(self): ++ self.stop() ++ ++ def start(self, collection, **kwargs): ++ cdef const char *path_c = NULL ++ cdef const char *name_c = NULL ++ cdef void *drv = NULL ++ cdef void *ds = NULL ++ cdef char **ignore_fields = NULL ++ ++ path_b = collection.path.encode('utf-8') ++ path_c = path_b ++ ++ self._fileencoding = kwargs.get('encoding') or collection.encoding ++ ++ # We have two ways of specifying drivers to try. Resolve the ++ # values into a single set of driver short names. 
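++ # A single explicit driver takes precedence over enabled_drivers; otherwise the allowed-driver set is left unset.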
++ if collection._driver: ++ drivers = set([collection._driver]) ++ elif collection.enabled_drivers: ++ drivers = set(collection.enabled_drivers) ++ else: ++ drivers = None ++ ++ encoding = kwargs.pop('encoding', None) ++ if encoding: ++ kwargs['encoding'] = encoding.upper() ++ ++ self.cogr_ds = gdal_open_vector(path_c, 0, drivers, kwargs) ++ ++ if isinstance(collection.name, string_types): ++ name_b = collection.name.encode('utf-8') ++ name_c = name_b ++ self.cogr_layer = GDALDatasetGetLayerByName(self.cogr_ds, name_c) ++ elif isinstance(collection.name, int): ++ self.cogr_layer = GDALDatasetGetLayer(self.cogr_ds, collection.name) ++ name_c = OGR_L_GetName(self.cogr_layer) ++ name_b = name_c ++ collection.name = name_b.decode('utf-8') ++ ++ if self.cogr_layer == NULL: ++ raise ValueError("Null layer: " + repr(collection.name)) ++ ++ encoding = self._get_internal_encoding() ++ ++ if collection.ignore_fields: ++ try: ++ for name in collection.ignore_fields: ++ try: ++ name_b = name.encode(encoding) ++ except AttributeError: ++ raise TypeError("Ignored field \"{}\" has type \"{}\", expected string".format(name, name.__class__.__name__)) ++ ignore_fields = CSLAddString(ignore_fields, name_b) ++ OGR_L_SetIgnoredFields(self.cogr_layer, ignore_fields) ++ finally: ++ CSLDestroy(ignore_fields) ++ ++ self.collection = collection ++ ++ cpdef stop(self): ++ self.cogr_layer = NULL ++ if self.cogr_ds != NULL: ++ GDALClose(self.cogr_ds) ++ self.cogr_ds = NULL ++ ++ def get_fileencoding(self): ++ """DEPRECATED""" ++ warnings.warn("get_fileencoding is deprecated and will be removed in a future version.", FionaDeprecationWarning) ++ return self._fileencoding ++ ++ def _get_fallback_encoding(self): ++ """Determine a format-specific fallback encoding to use when using OGR_F functions ++ ++ Parameters ++ ---------- ++ None ++ ++ Returns ++ ------- ++ str ++ ++ """ ++ if "Shapefile" in self.get_driver(): ++ return 'iso-8859-1' ++ else: ++ return locale.getpreferredencoding() ++ ++ ++ def _get_internal_encoding(self): ++ """Determine the encoding to use when use OGR_F functions ++ ++ Parameters ++ ---------- ++ None ++ ++ Returns ++ ------- ++ str ++ ++ Notes ++ ----- ++ If the layer implements RFC 23 support for UTF-8, the return ++ value will be 'utf-8' and callers can be certain that this is ++ correct. If the layer does not have the OLC_STRINGSASUTF8 ++ capability marker, it is not possible to know exactly what the ++ internal encoding is and this method returns best guesses. That ++ means ISO-8859-1 for shapefiles and the locale's preferred ++ encoding for other formats such as CSV files. 
++ ++ """ ++ if OGR_L_TestCapability(self.cogr_layer, OLC_STRINGSASUTF8): ++ return 'utf-8' ++ else: ++ return self._fileencoding or self._get_fallback_encoding() ++ ++ def get_length(self): ++ if self.cogr_layer == NULL: ++ raise ValueError("Null layer") ++ return OGR_L_GetFeatureCount(self.cogr_layer, 0) ++ ++ def get_driver(self): ++ cdef void *cogr_driver = GDALGetDatasetDriver(self.cogr_ds) ++ if cogr_driver == NULL: ++ raise ValueError("Null driver") ++ cdef const char *name = OGR_Dr_GetName(cogr_driver) ++ driver_name = name ++ return driver_name.decode() ++ ++ def get_schema(self): ++ cdef int i ++ cdef int n ++ cdef void *cogr_featuredefn = NULL ++ cdef void *cogr_fielddefn = NULL ++ cdef const char *key_c ++ props = [] ++ ++ if self.cogr_layer == NULL: ++ raise ValueError("Null layer") ++ ++ if self.collection.ignore_fields: ++ ignore_fields = self.collection.ignore_fields ++ else: ++ ignore_fields = set() ++ ++ cogr_featuredefn = OGR_L_GetLayerDefn(self.cogr_layer) ++ if cogr_featuredefn == NULL: ++ raise ValueError("Null feature definition") ++ ++ encoding = self._get_internal_encoding() ++ ++ n = OGR_FD_GetFieldCount(cogr_featuredefn) ++ ++ for i from 0 <= i < n: ++ cogr_fielddefn = OGR_FD_GetFieldDefn(cogr_featuredefn, i) ++ if cogr_fielddefn == NULL: ++ raise ValueError("Null field definition") ++ ++ key_c = OGR_Fld_GetNameRef(cogr_fielddefn) ++ key_b = key_c ++ ++ if not bool(key_b): ++ raise ValueError("Invalid field name ref: %s" % key) ++ ++ key = key_b.decode(encoding) ++ ++ if key in ignore_fields: ++ log.debug("By request, ignoring field %r", key) ++ continue ++ ++ fieldtypename = FIELD_TYPES[OGR_Fld_GetType(cogr_fielddefn)] ++ if not fieldtypename: ++ log.warning( ++ "Skipping field %s: invalid type %s", ++ key, ++ OGR_Fld_GetType(cogr_fielddefn)) ++ continue ++ ++ val = fieldtypename ++ if fieldtypename == 'float': ++ fmt = "" ++ width = OGR_Fld_GetWidth(cogr_fielddefn) ++ if width: # and width != 24: ++ fmt = ":%d" % width ++ precision = OGR_Fld_GetPrecision(cogr_fielddefn) ++ if precision: # and precision != 15: ++ fmt += ".%d" % precision ++ val = "float" + fmt ++ elif fieldtypename in ('int32', 'int64'): ++ fmt = "" ++ width = OGR_Fld_GetWidth(cogr_fielddefn) ++ if width: ++ fmt = ":%d" % width ++ val = 'int' + fmt ++ elif fieldtypename == 'str': ++ fmt = "" ++ width = OGR_Fld_GetWidth(cogr_fielddefn) ++ if width: ++ fmt = ":%d" % width ++ val = fieldtypename + fmt ++ ++ props.append((key, val)) ++ ++ ret = {"properties": OrderedDict(props)} ++ ++ if not self.collection.ignore_geometry: ++ code = normalize_geometry_type_code( ++ OGR_FD_GetGeomType(cogr_featuredefn)) ++ ret["geometry"] = GEOMETRY_TYPES[code] ++ ++ return ret ++ ++ def get_crs(self): ++ """Get the layer's CRS ++ ++ Returns ++ ------- ++ CRS ++ ++ """ ++ cdef char *proj_c = NULL ++ cdef const char *auth_key = NULL ++ cdef const char *auth_val = NULL ++ cdef void *cogr_crs = NULL ++ ++ if self.cogr_layer == NULL: ++ raise ValueError("Null layer") ++ ++ try: ++ cogr_crs = exc_wrap_pointer(OGR_L_GetSpatialRef(self.cogr_layer)) ++ # TODO: we don't intend to use try/except for flow control ++ # this is a work around for a GDAL issue. 
++ except FionaNullPointerError: ++ log.debug("Layer has no coordinate system") ++ ++ if cogr_crs is not NULL: ++ ++ log.debug("Got coordinate system") ++ crs = {} ++ ++ try: ++ ++ retval = OSRAutoIdentifyEPSG(cogr_crs) ++ if retval > 0: ++ log.info("Failed to auto identify EPSG: %d", retval) ++ ++ try: ++ auth_key = exc_wrap_pointer(OSRGetAuthorityName(cogr_crs, NULL)) ++ auth_val = exc_wrap_pointer(OSRGetAuthorityCode(cogr_crs, NULL)) ++ ++ except CPLE_BaseError as exc: ++ log.debug("{}".format(exc)) ++ ++ if auth_key != NULL and auth_val != NULL: ++ key_b = auth_key ++ key = key_b.decode('utf-8') ++ if key == 'EPSG': ++ val_b = auth_val ++ val = val_b.decode('utf-8') ++ crs['init'] = "epsg:" + val ++ ++ else: ++ OSRExportToProj4(cogr_crs, &proj_c) ++ if proj_c == NULL: ++ raise ValueError("Null projection") ++ proj_b = proj_c ++ log.debug("Params: %s", proj_b) ++ value = proj_b.decode() ++ value = value.strip() ++ for param in value.split(): ++ kv = param.split("=") ++ if len(kv) == 2: ++ k, v = kv ++ try: ++ v = float(v) ++ if v % 1 == 0: ++ v = int(v) ++ except ValueError: ++ # Leave v as a string ++ pass ++ elif len(kv) == 1: ++ k, v = kv[0], True ++ else: ++ raise ValueError("Unexpected proj parameter %s" % param) ++ k = k.lstrip("+") ++ crs[k] = v ++ ++ finally: ++ CPLFree(proj_c) ++ return crs ++ ++ else: ++ log.debug("Projection not found (cogr_crs was NULL)") ++ ++ return {} ++ ++ def get_crs_wkt(self): ++ cdef char *proj_c = NULL ++ cdef void *cogr_crs = NULL ++ ++ if self.cogr_layer == NULL: ++ raise ValueError("Null layer") ++ ++ try: ++ cogr_crs = exc_wrap_pointer(OGR_L_GetSpatialRef(self.cogr_layer)) ++ ++ # TODO: we don't intend to use try/except for flow control ++ # this is a work around for a GDAL issue. ++ except FionaNullPointerError: ++ log.debug("Layer has no coordinate system") ++ except fiona._err.CPLE_OpenFailedError as exc: ++ log.debug("A support file wasn't opened. See the preceding ERROR level message.") ++ cogr_crs = OGR_L_GetSpatialRef(self.cogr_layer) ++ log.debug("Called OGR_L_GetSpatialRef() again without error checking.") ++ if cogr_crs == NULL: ++ raise exc ++ ++ if cogr_crs is not NULL: ++ log.debug("Got coordinate system") ++ ++ try: ++ OSRExportToWkt(cogr_crs, &proj_c) ++ if proj_c == NULL: ++ raise ValueError("Null projection") ++ proj_b = proj_c ++ crs_wkt = proj_b.decode('utf-8') ++ ++ finally: ++ CPLFree(proj_c) ++ return crs_wkt ++ ++ else: ++ log.debug("Projection not found (cogr_crs was NULL)") ++ return "" ++ ++ def get_extent(self): ++ cdef OGREnvelope extent ++ ++ if self.cogr_layer == NULL: ++ raise ValueError("Null layer") ++ ++ result = OGR_L_GetExtent(self.cogr_layer, &extent, 1) ++ return (extent.MinX, extent.MinY, extent.MaxX, extent.MaxY) ++ ++ def has_feature(self, fid): ++ """Provides access to feature data by FID. ++ ++ Supports Collection.__contains__(). ++ """ ++ cdef void * cogr_feature ++ fid = int(fid) ++ cogr_feature = OGR_L_GetFeature(self.cogr_layer, fid) ++ if cogr_feature != NULL: ++ _deleteOgrFeature(cogr_feature) ++ return True ++ else: ++ return False ++ ++ def get_feature(self, fid): ++ """Provides access to feature data by FID. ++ ++ Supports Collection.__contains__(). 
++ """ ++ cdef void * cogr_feature ++ fid = int(fid) ++ cogr_feature = OGR_L_GetFeature(self.cogr_layer, fid) ++ if cogr_feature != NULL: ++ feature = FeatureBuilder().build( ++ cogr_feature, ++ encoding=self._get_internal_encoding(), ++ bbox=False, ++ driver=self.collection.driver, ++ ignore_fields=self.collection.ignore_fields, ++ ignore_geometry=self.collection.ignore_geometry, ++ ) ++ _deleteOgrFeature(cogr_feature) ++ return feature ++ else: ++ raise KeyError("There is no feature with fid {!r}".format(fid)) ++ ++ get = get_feature ++ ++ # TODO: Make this an alias for get_feature in a future version. ++ def __getitem__(self, item): ++ cdef void * cogr_feature ++ if isinstance(item, slice): ++ warnings.warn("Collection slicing is deprecated and will be disabled in a future version.", FionaDeprecationWarning) ++ itr = Iterator(self.collection, item.start, item.stop, item.step) ++ log.debug("Slice: %r", item) ++ return list(itr) ++ elif isinstance(item, int): ++ index = item ++ # from the back ++ if index < 0: ++ ftcount = OGR_L_GetFeatureCount(self.cogr_layer, 0) ++ if ftcount == -1: ++ raise IndexError( ++ "collection's dataset does not support negative indexes") ++ index += ftcount ++ cogr_feature = OGR_L_GetFeature(self.cogr_layer, index) ++ if cogr_feature == NULL: ++ return None ++ feature = FeatureBuilder().build( ++ cogr_feature, ++ encoding=self._get_internal_encoding(), ++ bbox=False, ++ driver=self.collection.driver, ++ ignore_fields=self.collection.ignore_fields, ++ ignore_geometry=self.collection.ignore_geometry, ++ ) ++ _deleteOgrFeature(cogr_feature) ++ return feature ++ ++ def isactive(self): ++ if self.cogr_layer != NULL and self.cogr_ds != NULL: ++ return 1 ++ else: ++ return 0 ++ ++ ++cdef class WritingSession(Session): ++ ++ cdef object _schema_mapping ++ ++ def start(self, collection, **kwargs): ++ cdef void *cogr_srs = NULL ++ cdef char **options = NULL ++ cdef const char *path_c = NULL ++ cdef const char *driver_c = NULL ++ cdef const char *name_c = NULL ++ cdef const char *proj_c = NULL ++ cdef const char *fileencoding_c = NULL ++ cdef OGRFieldSubType field_subtype ++ cdef int ret ++ path = collection.path ++ self.collection = collection ++ ++ userencoding = kwargs.get('encoding') ++ ++ if collection.mode == 'a': ++ ++ if not os.path.exists(path): ++ raise OSError("No such file or directory %s" % path) ++ ++ try: ++ path_b = path.encode('utf-8') ++ except UnicodeDecodeError: ++ path_b = path ++ path_c = path_b ++ ++ try: ++ self.cogr_ds = gdal_open_vector(path_c, 1, None, kwargs) ++ ++ if isinstance(collection.name, string_types): ++ name_b = collection.name.encode('utf-8') ++ name_c = name_b ++ self.cogr_layer = exc_wrap_pointer(GDALDatasetGetLayerByName(self.cogr_ds, name_c)) ++ ++ elif isinstance(collection.name, int): ++ self.cogr_layer = exc_wrap_pointer(GDALDatasetGetLayer(self.cogr_ds, collection.name)) ++ ++ except CPLE_BaseError as exc: ++ OGRReleaseDataSource(self.cogr_ds) ++ self.cogr_ds = NULL ++ self.cogr_layer = NULL ++ raise DriverError(u"{}".format(exc)) ++ ++ else: ++ self._fileencoding = userencoding or self._get_fallback_encoding() ++ ++ elif collection.mode == 'w': ++ ++ try: ++ path_b = path.encode('utf-8') ++ except UnicodeDecodeError: ++ path_b = path ++ path_c = path_b ++ ++ driver_b = collection.driver.encode() ++ driver_c = driver_b ++ cogr_driver = exc_wrap_pointer(GDALGetDriverByName(driver_c)) ++ ++ # Our most common use case is the creation of a new data ++ # file and historically we've assumed that it's a file on ++ # the local 
filesystem and queryable via os.path. ++ # ++ # TODO: remove the assumption. ++ if not os.path.exists(path): ++ log.debug("File doesn't exist. Creating a new one...") ++ cogr_ds = gdal_create(cogr_driver, path_c, {}) ++ ++ # TODO: revisit the logic in the following blocks when we ++ # change the assumption above. ++ else: ++ if collection.driver == "GeoJSON" and os.path.exists(path): ++ # manually remove geojson file as GDAL doesn't do this for us ++ os.unlink(path) ++ try: ++ # attempt to open existing dataset in write mode ++ cogr_ds = gdal_open_vector(path_c, 1, None, kwargs) ++ except DriverError: ++ # failed, attempt to create it ++ cogr_ds = gdal_create(cogr_driver, path_c, kwargs) ++ else: ++ # check capability of creating a new layer in the existing dataset ++ capability = check_capability_create_layer(cogr_ds) ++ if GDAL_VERSION_NUM < 2000000 and collection.driver == "GeoJSON": ++ # GeoJSON driver tells lies about it's capability ++ capability = False ++ if not capability or collection.name is None: ++ # unable to use existing dataset, recreate it ++ GDALClose(cogr_ds) ++ cogr_ds = NULL ++ cogr_ds = gdal_create(cogr_driver, path_c, kwargs) ++ ++ self.cogr_ds = cogr_ds ++ ++ # Set the spatial reference system from the crs given to the ++ # collection constructor. We by-pass the crs_wkt and crs ++ # properties because they aren't accessible until the layer ++ # is constructed (later). ++ try: ++ ++ col_crs = collection._crs_wkt or collection._crs ++ ++ if col_crs: ++ cogr_srs = exc_wrap_pointer(OSRNewSpatialReference(NULL)) ++ ++ # First, check for CRS strings like "EPSG:3857". ++ if isinstance(col_crs, string_types): ++ proj_b = col_crs.encode('utf-8') ++ proj_c = proj_b ++ OSRSetFromUserInput(cogr_srs, proj_c) ++ ++ elif isinstance(col_crs, compat.DICT_TYPES): ++ # EPSG is a special case. ++ init = col_crs.get('init') ++ if init: ++ log.debug("Init: %s", init) ++ auth, val = init.split(':') ++ if auth.upper() == 'EPSG': ++ log.debug("Setting EPSG: %s", val) ++ OSRImportFromEPSG(cogr_srs, int(val)) ++ else: ++ params = [] ++ col_crs['wktext'] = True ++ for k, v in col_crs.items(): ++ if v is True or (k in ('no_defs', 'wktext') and v): ++ params.append("+%s" % k) ++ else: ++ params.append("+%s=%s" % (k, v)) ++ proj = " ".join(params) ++ log.debug("PROJ.4 to be imported: %r", proj) ++ proj_b = proj.encode('utf-8') ++ proj_c = proj_b ++ OSRImportFromProj4(cogr_srs, proj_c) ++ ++ else: ++ raise ValueError("Invalid CRS") ++ ++ ++ except (ValueError, CPLE_BaseError) as exc: ++ OGRReleaseDataSource(self.cogr_ds) ++ self.cogr_ds = NULL ++ self.cogr_layer = NULL ++ raise CRSError(u"{}".format(exc)) ++ ++ # Determine which encoding to use. The encoding parameter given to ++ # the collection constructor takes highest precedence, then ++ # 'iso-8859-1' (for shapefiles), then the system's default encoding ++ # as last resort. ++ sysencoding = locale.getpreferredencoding() ++ self._fileencoding = userencoding or ("Shapefile" in collection.driver and 'iso-8859-1') or sysencoding ++ ++ if "Shapefile" in collection.driver: ++ if self._fileencoding: ++ fileencoding_b = self._fileencoding.upper().encode('utf-8') ++ fileencoding_c = fileencoding_b ++ options = CSLSetNameValue(options, "ENCODING", fileencoding_c) ++ ++ # Does the layer exist already? If so, we delete it. 
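++ # Collect the existing layer names first so collection.name can be matched either by name or by index.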
++ layer_count = GDALDatasetGetLayerCount(self.cogr_ds) ++ layer_names = [] ++ for i in range(layer_count): ++ cogr_layer = GDALDatasetGetLayer(cogr_ds, i) ++ name_c = OGR_L_GetName(cogr_layer) ++ name_b = name_c ++ layer_names.append(name_b.decode('utf-8')) ++ ++ idx = -1 ++ if isinstance(collection.name, string_types): ++ if collection.name in layer_names: ++ idx = layer_names.index(collection.name) ++ elif isinstance(collection.name, int): ++ if collection.name >= 0 and collection.name < layer_count: ++ idx = collection.name ++ if idx >= 0: ++ log.debug("Deleted pre-existing layer at %s", collection.name) ++ GDALDatasetDeleteLayer(self.cogr_ds, idx) ++ ++ # Create the named layer in the datasource. ++ name_b = collection.name.encode('utf-8') ++ name_c = name_b ++ ++ for k, v in kwargs.items(): ++ ++ if v is None: ++ continue ++ ++ # We need to remove encoding from the layer creation ++ # options if we're not creating a shapefile. ++ if k == 'encoding' and "Shapefile" not in collection.driver: ++ continue ++ ++ k = k.upper().encode('utf-8') ++ ++ if isinstance(v, bool): ++ v = ('ON' if v else 'OFF').encode('utf-8') ++ else: ++ v = str(v).encode('utf-8') ++ log.debug("Set option %r: %r", k, v) ++ options = CSLAddNameValue(options, k, v) ++ ++ geometry_type = collection.schema.get("geometry", "Unknown") ++ if not isinstance(geometry_type, string_types) and geometry_type is not None: ++ geometry_types = set(geometry_type) ++ if len(geometry_types) > 1: ++ geometry_type = "Unknown" ++ else: ++ geometry_type = geometry_types.pop() ++ if geometry_type == "Any" or geometry_type is None: ++ geometry_type = "Unknown" ++ geometry_code = geometry_type_code(geometry_type) ++ ++ try: ++ self.cogr_layer = exc_wrap_pointer( ++ GDALDatasetCreateLayer( ++ self.cogr_ds, name_c, cogr_srs, ++ geometry_code, options)) ++ ++ except Exception as exc: ++ OGRReleaseDataSource(self.cogr_ds) ++ self.cogr_ds = NULL ++ raise DriverIOError(u"{}".format(exc)) ++ ++ finally: ++ if options != NULL: ++ CSLDestroy(options) ++ ++ # Shapefile layers make a copy of the passed srs. GPKG ++ # layers, on the other hand, increment its reference ++ # count. OSRRelease() is the safe way to release ++ # OGRSpatialReferenceH. ++ if cogr_srs != NULL: ++ OSRRelease(cogr_srs) ++ ++ log.debug("Created layer %s", collection.name) ++ ++ # Next, make a layer definition from the given schema properties, ++ # which are an ordered dict since Fiona 1.0.1. ++ ++ encoding = self._get_internal_encoding() ++ ++ for key, value in collection.schema['properties'].items(): ++ ++ log.debug("Begin creating field: %r value: %r", key, value) ++ ++ field_subtype = OFSTNone ++ ++ # Convert 'long' to 'int'. See ++ # https://github.com/Toblerity/Fiona/issues/101. ++ if fiona.gdal_version.major >= 2 and value in ('int', 'long'): ++ value = 'int64' ++ elif value == 'int': ++ value = 'int32' ++ ++ if value == 'bool': ++ value = 'int32' ++ field_subtype = OFSTBoolean ++ ++ # Is there a field width/precision? ++ width = precision = None ++ if ':' in value: ++ value, fmt = value.split(':') ++ ++ log.debug("Field format parsing, value: %r, fmt: %r", value, fmt) ++ ++ if '.' 
in fmt: ++ width, precision = map(int, fmt.split('.')) ++ else: ++ width = int(fmt) ++ ++ if value == 'int': ++ if GDAL_VERSION_NUM >= 2000000 and (width == 0 or width >= 10): ++ value = 'int64' ++ else: ++ value = 'int32' ++ ++ field_type = FIELD_TYPES.index(value) ++ ++ try: ++ key_bytes = key.encode(encoding) ++ cogr_fielddefn = exc_wrap_pointer(OGR_Fld_Create(key_bytes, field_type)) ++ if width: ++ OGR_Fld_SetWidth(cogr_fielddefn, width) ++ if precision: ++ OGR_Fld_SetPrecision(cogr_fielddefn, precision) ++ if field_subtype != OFSTNone: ++ # subtypes are new in GDAL 2.x, ignored in 1.x ++ set_field_subtype(cogr_fielddefn, field_subtype) ++ exc_wrap_int(OGR_L_CreateField(self.cogr_layer, cogr_fielddefn, 1)) ++ ++ except (UnicodeEncodeError, CPLE_BaseError) as exc: ++ OGRReleaseDataSource(self.cogr_ds) ++ self.cogr_ds = NULL ++ self.cogr_layer = NULL ++ raise SchemaError(u"{}".format(exc)) ++ ++ else: ++ OGR_Fld_Destroy(cogr_fielddefn) ++ log.debug("End creating field %r", key) ++ ++ # Mapping of the Python collection schema to the munged ++ # OGR schema. ++ ogr_schema = self.get_schema() ++ self._schema_mapping = dict(zip( ++ collection.schema['properties'].keys(), ++ ogr_schema['properties'].keys() )) ++ ++ log.debug("Writing started") ++ ++ def writerecs(self, records, collection): ++ """Writes buffered records to OGR.""" ++ cdef void *cogr_driver ++ cdef void *cogr_feature ++ cdef int features_in_transaction = 0 ++ ++ cdef void *cogr_layer = self.cogr_layer ++ if cogr_layer == NULL: ++ raise ValueError("Null layer") ++ ++ schema_geom_type = collection.schema['geometry'] ++ cogr_driver = GDALGetDatasetDriver(self.cogr_ds) ++ driver_name = OGR_Dr_GetName(cogr_driver).decode("utf-8") ++ ++ valid_geom_types = collection._valid_geom_types ++ def validate_geometry_type(record): ++ if record["geometry"] is None: ++ return True ++ return record["geometry"]["type"].lstrip("3D ") in valid_geom_types ++ ++ log.debug("Starting transaction (initial)") ++ result = gdal_start_transaction(self.cogr_ds, 0) ++ if result == OGRERR_FAILURE: ++ raise TransactionError("Failed to start transaction") ++ ++ schema_props_keys = set(collection.schema['properties'].keys()) ++ for record in records: ++ log.debug("Creating feature in layer: %s" % record) ++ # Validate against collection's schema. 
++ if set(record['properties'].keys()) != schema_props_keys: ++ raise ValueError( ++ "Record does not match collection schema: %r != %r" % ( ++ record['properties'].keys(), ++ list(schema_props_keys) )) ++ if not validate_geometry_type(record): ++ raise GeometryTypeValidationError( ++ "Record's geometry type does not match " ++ "collection schema's geometry type: %r != %r" % ( ++ record['geometry']['type'], ++ collection.schema['geometry'] )) ++ ++ cogr_feature = OGRFeatureBuilder().build(record, collection) ++ result = OGR_L_CreateFeature(cogr_layer, cogr_feature) ++ if result != OGRERR_NONE: ++ raise RuntimeError("Failed to write record: %s" % record) ++ _deleteOgrFeature(cogr_feature) ++ ++ features_in_transaction += 1 ++ if features_in_transaction == DEFAULT_TRANSACTION_SIZE: ++ log.debug("Comitting transaction (intermediate)") ++ result = gdal_commit_transaction(self.cogr_ds) ++ if result == OGRERR_FAILURE: ++ raise TransactionError("Failed to commit transaction") ++ log.debug("Starting transaction (intermediate)") ++ result = gdal_start_transaction(self.cogr_ds, 0) ++ if result == OGRERR_FAILURE: ++ raise TransactionError("Failed to start transaction") ++ features_in_transaction = 0 ++ ++ log.debug("Comitting transaction (final)") ++ result = gdal_commit_transaction(self.cogr_ds) ++ if result == OGRERR_FAILURE: ++ raise TransactionError("Failed to commit transaction") ++ ++ def sync(self, collection): ++ """Syncs OGR to disk.""" ++ cdef void *cogr_ds = self.cogr_ds ++ cdef void *cogr_layer = self.cogr_layer ++ if cogr_ds == NULL: ++ raise ValueError("Null data source") ++ ++ ++ gdal_flush_cache(cogr_ds) ++ log.debug("Flushed data source cache") ++ ++cdef class Iterator: ++ ++ """Provides iterated access to feature data. ++ """ ++ ++ # Reference to its Collection ++ cdef collection ++ cdef encoding ++ cdef int next_index ++ cdef stop ++ cdef start ++ cdef step ++ cdef fastindex ++ cdef stepsign ++ ++ def __cinit__(self, collection, start=None, stop=None, step=None, ++ bbox=None, mask=None): ++ if collection.session is None: ++ raise ValueError("I/O operation on closed collection") ++ self.collection = collection ++ cdef Session session ++ cdef void *cogr_geometry ++ session = self.collection.session ++ cdef void *cogr_layer = session.cogr_layer ++ if cogr_layer == NULL: ++ raise ValueError("Null layer") ++ OGR_L_ResetReading(cogr_layer) ++ ++ if bbox and mask: ++ raise ValueError("mask and bbox can not be set together") ++ ++ if bbox: ++ OGR_L_SetSpatialFilterRect( ++ cogr_layer, bbox[0], bbox[1], bbox[2], bbox[3]) ++ elif mask: ++ cogr_geometry = OGRGeomBuilder().build(mask) ++ OGR_L_SetSpatialFilter(cogr_layer, cogr_geometry) ++ OGR_G_DestroyGeometry(cogr_geometry) ++ ++ else: ++ OGR_L_SetSpatialFilter(cogr_layer, NULL) ++ ++ self.encoding = session._get_internal_encoding() ++ ++ self.fastindex = OGR_L_TestCapability( ++ session.cogr_layer, OLC_FASTSETNEXTBYINDEX) ++ ++ ftcount = OGR_L_GetFeatureCount(session.cogr_layer, 0) ++ if ftcount == -1 and ((start is not None and start < 0) or ++ (stop is not None and stop < 0)): ++ raise IndexError( ++ "collection's dataset does not support negative slice indexes") ++ ++ if stop is not None and stop < 0: ++ stop += ftcount ++ ++ if start is None: ++ start = 0 ++ if start is not None and start < 0: ++ start += ftcount ++ ++ # step size ++ if step is None: ++ step = 1 ++ if step == 0: ++ raise ValueError("slice step cannot be zero") ++ if step < 0 and not self.fastindex: ++ warnings.warn("Layer does not support" \ ++ "OLCFastSetNextByIndex, 
negative step size may" \ ++ " be slow", RuntimeWarning) ++ self.stepsign = int(math.copysign(1, step)) ++ self.stop = stop ++ self.start = start ++ self.step = step ++ ++ self.next_index = start ++ log.debug("Index: %d", self.next_index) ++ OGR_L_SetNextByIndex(session.cogr_layer, self.next_index) ++ ++ def __iter__(self): ++ return self ++ ++ def _next(self): ++ """Internal method to set read cursor to next item""" ++ ++ cdef Session session ++ session = self.collection.session ++ ++ # Check if next_index is valid ++ if self.next_index < 0: ++ raise StopIteration ++ ++ if self.stepsign == 1: ++ if self.next_index < self.start or (self.stop is not None and self.next_index >= self.stop): ++ raise StopIteration ++ else: ++ if self.next_index > self.start or (self.stop is not None and self.next_index <= self.stop): ++ raise StopIteration ++ ++ # Set read cursor to next_item position ++ if self.step > 1 and self.fastindex: ++ OGR_L_SetNextByIndex(session.cogr_layer, self.next_index) ++ ++ elif self.step > 1 and not self.fastindex and not self.next_index == self.start: ++ for _ in range(self.step - 1): ++ # TODO rbuffat add test -> OGR_L_GetNextFeature increments cursor by 1, therefore self.step - 1 as one increment was performed when feature is read ++ cogr_feature = OGR_L_GetNextFeature(session.cogr_layer) ++ if cogr_feature == NULL: ++ raise StopIteration ++ elif self.step > 1 and not self.fastindex and self.next_index == self.start: ++ OGR_L_SetNextByIndex(session.cogr_layer, self.next_index) ++ ++ elif self.step == 0: ++ # OGR_L_GetNextFeature increments read cursor by one ++ pass ++ elif self.step < 0: ++ OGR_L_SetNextByIndex(session.cogr_layer, self.next_index) ++ ++ # set the next index ++ self.next_index += self.step ++ ++ def __next__(self): ++ cdef OGRFeatureH cogr_feature = NULL ++ cdef OGRLayerH cogr_layer = NULL ++ cdef Session session ++ ++ session = self.collection.session ++ ++ if not session or not session.isactive: ++ raise FionaValueError("Session is inactive, dataset is closed or layer is unavailable.") ++ ++ # Update read cursor ++ self._next() ++ ++ # Get the next feature. ++ cogr_feature = OGR_L_GetNextFeature(session.cogr_layer) ++ if cogr_feature == NULL: ++ raise StopIteration ++ ++ try: ++ return FeatureBuilder().build( ++ cogr_feature, ++ encoding=self.collection.session._get_internal_encoding(), ++ bbox=False, ++ driver=self.collection.driver, ++ ignore_fields=self.collection.ignore_fields, ++ ignore_geometry=self.collection.ignore_geometry, ++ ) ++ finally: ++ _deleteOgrFeature(cogr_feature) ++ ++ ++cdef class ItemsIterator(Iterator): ++ ++ def __next__(self): ++ ++ cdef long fid ++ cdef void * cogr_feature ++ cdef Session session ++ session = self.collection.session ++ ++ #Update read cursor ++ self._next() ++ ++ # Get the next feature. ++ cogr_feature = OGR_L_GetNextFeature(session.cogr_layer) ++ if cogr_feature == NULL: ++ raise StopIteration ++ ++ fid = OGR_F_GetFID(cogr_feature) ++ feature = FeatureBuilder().build( ++ cogr_feature, ++ encoding=self.collection.session._get_internal_encoding(), ++ bbox=False, ++ driver=self.collection.driver, ++ ignore_fields=self.collection.ignore_fields, ++ ignore_geometry=self.collection.ignore_geometry, ++ ) ++ _deleteOgrFeature(cogr_feature) ++ ++ return fid, feature ++ ++ ++cdef class KeysIterator(Iterator): ++ ++ def __next__(self): ++ cdef long fid ++ cdef void * cogr_feature ++ cdef Session session ++ session = self.collection.session ++ ++ #Update read cursor ++ self._next() ++ ++ # Get the next feature. 
++ cogr_feature = OGR_L_GetNextFeature(session.cogr_layer) ++ if cogr_feature == NULL: ++ raise StopIteration ++ ++ fid = OGR_F_GetFID(cogr_feature) ++ _deleteOgrFeature(cogr_feature) ++ ++ return fid ++ ++ ++def _remove(path, driver=None): ++ """Deletes an OGR data source ++ """ ++ cdef void *cogr_driver ++ cdef void *cogr_ds ++ cdef int result ++ cdef char *driver_c ++ ++ if driver is None: ++ # attempt to identify the driver by opening the dataset ++ try: ++ cogr_ds = gdal_open_vector(path.encode("utf-8"), 0, None, {}) ++ except (DriverError, FionaNullPointerError): ++ raise DatasetDeleteError("Failed to remove data source {}".format(path)) ++ cogr_driver = GDALGetDatasetDriver(cogr_ds) ++ GDALClose(cogr_ds) ++ else: ++ cogr_driver = OGRGetDriverByName(driver.encode("utf-8")) ++ ++ if cogr_driver == NULL: ++ raise DatasetDeleteError("Null driver when attempting to delete {}".format(path)) ++ ++ if not OGR_Dr_TestCapability(cogr_driver, ODrCDeleteDataSource): ++ raise DatasetDeleteError("Driver does not support dataset removal operation") ++ ++ result = GDALDeleteDataset(cogr_driver, path.encode('utf-8')) ++ if result != OGRERR_NONE: ++ raise DatasetDeleteError("Failed to remove data source {}".format(path)) ++ ++ ++def _remove_layer(path, layer, driver=None): ++ cdef void *cogr_ds ++ cdef int layer_index ++ ++ if isinstance(layer, integer_types): ++ layer_index = layer ++ layer_str = str(layer_index) ++ else: ++ layer_names = _listlayers(path) ++ try: ++ layer_index = layer_names.index(layer) ++ except ValueError: ++ raise ValueError("Layer \"{}\" does not exist in datasource: {}".format(layer, path)) ++ layer_str = '"{}"'.format(layer) ++ ++ if layer_index < 0: ++ layer_names = _listlayers(path) ++ layer_index = len(layer_names) + layer_index ++ ++ try: ++ cogr_ds = gdal_open_vector(path.encode("utf-8"), 1, None, {}) ++ except (DriverError, FionaNullPointerError): ++ raise DatasetDeleteError("Failed to remove data source {}".format(path)) ++ ++ result = OGR_DS_DeleteLayer(cogr_ds, layer_index) ++ GDALClose(cogr_ds) ++ if result == OGRERR_UNSUPPORTED_OPERATION: ++ raise DatasetDeleteError("Removal of layer {} not supported by driver".format(layer_str)) ++ elif result != OGRERR_NONE: ++ raise DatasetDeleteError("Failed to remove layer {} from datasource: {}".format(layer_str, path)) ++ ++ ++def _listlayers(path, **kwargs): ++ ++ """Provides a list of the layers in an OGR data source. ++ """ ++ ++ cdef void *cogr_ds = NULL ++ cdef void *cogr_layer = NULL ++ cdef const char *path_c ++ cdef const char *name_c ++ ++ # Open OGR data source. ++ try: ++ path_b = path.encode('utf-8') ++ except UnicodeDecodeError: ++ path_b = path ++ path_c = path_b ++ cogr_ds = gdal_open_vector(path_c, 0, None, kwargs) ++ ++ # Loop over the layers to get their names. ++ layer_count = GDALDatasetGetLayerCount(cogr_ds) ++ layer_names = [] ++ for i in range(layer_count): ++ cogr_layer = GDALDatasetGetLayer(cogr_ds, i) ++ name_c = OGR_L_GetName(cogr_layer) ++ name_b = name_c ++ layer_names.append(name_b.decode('utf-8')) ++ ++ # Close up data source. ++ if cogr_ds != NULL: ++ GDALClose(cogr_ds) ++ cogr_ds = NULL ++ ++ return layer_names ++ ++ ++def buffer_to_virtual_file(bytesbuf, ext=''): ++ """Maps a bytes buffer to a virtual file. ++ ++ `ext` is empty or begins with a period and contains at most one period. 
++ """ ++ ++ vsi_filename = '/vsimem/{}'.format(uuid.uuid4().hex + ext) ++ vsi_cfilename = vsi_filename if not isinstance(vsi_filename, string_types) else vsi_filename.encode('utf-8') ++ ++ vsi_handle = VSIFileFromMemBuffer(vsi_cfilename, bytesbuf, len(bytesbuf), 0) ++ ++ if vsi_handle == NULL: ++ raise OSError('failed to map buffer to file') ++ if VSIFCloseL(vsi_handle) != 0: ++ raise OSError('failed to close mapped file handle') ++ ++ return vsi_filename ++ ++ ++def remove_virtual_file(vsi_filename): ++ vsi_cfilename = vsi_filename if not isinstance(vsi_filename, string_types) else vsi_filename.encode('utf-8') ++ return VSIUnlink(vsi_cfilename) ++ ++ ++cdef class MemoryFileBase(object): ++ """Base for a BytesIO-like class backed by an in-memory file.""" ++ ++ def __init__(self, file_or_bytes=None, filename=None, ext=''): ++ """A file in an in-memory filesystem. ++ ++ Parameters ++ ---------- ++ file_or_bytes : file or bytes ++ A file opened in binary mode or bytes or a bytearray ++ filename : str ++ A filename for the in-memory file under /vsimem ++ ext : str ++ A file extension for the in-memory file under /vsimem. Ignored if ++ filename was provided. ++ """ ++ cdef VSILFILE *vsi_handle = NULL ++ ++ if file_or_bytes: ++ if hasattr(file_or_bytes, 'read'): ++ initial_bytes = file_or_bytes.read() ++ else: ++ initial_bytes = file_or_bytes ++ if not isinstance(initial_bytes, (bytearray, bytes)): ++ raise TypeError( ++ "Constructor argument must be a file opened in binary " ++ "mode or bytes/bytearray.") ++ else: ++ initial_bytes = b'' ++ ++ if filename: ++ # GDAL's SRTMHGT driver requires the filename to be "correct" (match ++ # the bounds being written) ++ self.name = '/vsimem/{0}'.format(filename) ++ else: ++ # GDAL 2.1 requires a .zip extension for zipped files. ++ self.name = '/vsimem/{0}.{1}'.format(uuid.uuid4(), ext.lstrip('.')) ++ ++ self.path = self.name.encode('utf-8') ++ self._len = 0 ++ self._pos = 0 ++ self.closed = False ++ ++ self._initial_bytes = initial_bytes ++ cdef unsigned char *buffer = self._initial_bytes ++ ++ if self._initial_bytes: ++ ++ vsi_handle = VSIFileFromMemBuffer( ++ self.path, buffer, len(self._initial_bytes), 0) ++ self._len = len(self._initial_bytes) ++ ++ if vsi_handle == NULL: ++ raise IOError( ++ "Failed to create in-memory file using initial bytes.") ++ ++ if VSIFCloseL(vsi_handle) != 0: ++ raise IOError( ++ "Failed to properly close in-memory file.") ++ ++ def exists(self): ++ """Test if the in-memory file exists. ++ ++ Returns ++ ------- ++ bool ++ True if the in-memory file exists. ++ """ ++ cdef VSILFILE *fp = NULL ++ cdef const char *cypath = self.path ++ ++ with nogil: ++ fp = VSIFOpenL(cypath, 'r') ++ ++ if fp != NULL: ++ VSIFCloseL(fp) ++ return True ++ else: ++ return False ++ ++ def __len__(self): ++ """Length of the file's buffer in number of bytes. ++ ++ Returns ++ ------- ++ int ++ """ ++ cdef unsigned char *buff = NULL ++ cdef const char *cfilename = self.path ++ cdef vsi_l_offset buff_len = 0 ++ buff = VSIGetMemFileBuffer(self.path, &buff_len, 0) ++ return int(buff_len) ++ ++ def close(self): ++ """Close MemoryFile and release allocated memory.""" ++ VSIUnlink(self.path) ++ self._pos = 0 ++ self._initial_bytes = None ++ self.closed = True ++ ++ def read(self, size=-1): ++ """Read size bytes from MemoryFile.""" ++ cdef VSILFILE *fp = NULL ++ # Return no bytes immediately if the position is at or past the ++ # end of the file. 
++ length = len(self) ++ ++ if self._pos >= length: ++ self._pos = length ++ return b'' ++ ++ if size == -1: ++ size = length - self._pos ++ else: ++ size = min(size, length - self._pos) ++ ++ cdef unsigned char *buffer = CPLMalloc(size) ++ cdef bytes result ++ ++ fp = VSIFOpenL(self.path, 'r') ++ ++ try: ++ fp = exc_wrap_vsilfile(fp) ++ if VSIFSeekL(fp, self._pos, 0) < 0: ++ raise IOError( ++ "Failed to seek to offset %s in %s.", ++ self._pos, self.name) ++ ++ objects_read = VSIFReadL(buffer, 1, size, fp) ++ result = buffer[:objects_read] ++ ++ finally: ++ VSIFCloseL(fp) ++ CPLFree(buffer) ++ ++ self._pos += len(result) ++ return result ++ ++ def seek(self, offset, whence=0): ++ """Seek to position in MemoryFile.""" ++ if whence == 0: ++ pos = offset ++ elif whence == 1: ++ pos = self._pos + offset ++ elif whence == 2: ++ pos = len(self) - offset ++ if pos < 0: ++ raise ValueError("negative seek position: {}".format(pos)) ++ if pos > len(self): ++ raise ValueError("seek position past end of file: {}".format(pos)) ++ self._pos = pos ++ return self._pos ++ ++ def tell(self): ++ """Tell current position in MemoryFile.""" ++ return self._pos ++ ++ def write(self, data): ++ """Write data bytes to MemoryFile""" ++ cdef VSILFILE *fp = NULL ++ cdef const unsigned char *view = data ++ n = len(data) ++ ++ if not self.exists(): ++ fp = exc_wrap_vsilfile(VSIFOpenL(self.path, 'w')) ++ else: ++ fp = exc_wrap_vsilfile(VSIFOpenL(self.path, 'r+')) ++ if VSIFSeekL(fp, self._pos, 0) < 0: ++ raise IOError( ++ "Failed to seek to offset %s in %s.", self._pos, self.name) ++ ++ result = VSIFWriteL(view, 1, n, fp) ++ VSIFFlushL(fp) ++ VSIFCloseL(fp) ++ ++ self._pos += result ++ self._len = max(self._len, self._pos) ++ ++ return result +--- /dev/null ++++ b/fiona/_crs3.pxd +@@ -0,0 +1,21 @@ ++# Coordinate system and transform API functions. ++ ++cdef extern from "ogr_srs_api.h": ++ ++ ctypedef void * OGRSpatialReferenceH ++ ++ void OSRCleanup () ++ OGRSpatialReferenceH OSRClone (OGRSpatialReferenceH srs) ++ int OSRExportToProj4 (OGRSpatialReferenceH srs, char **params) ++ int OSRExportToWkt (OGRSpatialReferenceH srs, char **params) ++ int OSRImportFromEPSG (OGRSpatialReferenceH srs, int code) ++ int OSRImportFromProj4 (OGRSpatialReferenceH srs, char *proj) ++ int OSRSetFromUserInput (OGRSpatialReferenceH srs, char *input) ++ int OSRAutoIdentifyEPSG (OGRSpatialReferenceH srs) ++ const char * OSRGetAuthorityName (OGRSpatialReferenceH srs, const char *key) ++ const char * OSRGetAuthorityCode (OGRSpatialReferenceH srs, const char *key) ++ OGRSpatialReferenceH OSRNewSpatialReference (char *wkt) ++ void OSRRelease (OGRSpatialReferenceH srs) ++ void * OCTNewCoordinateTransformation (OGRSpatialReferenceH source, OGRSpatialReferenceH dest) ++ void OCTDestroyCoordinateTransformation (void *source) ++ int OCTTransform (void *ct, int nCount, double *x, double *y, double *z) +--- /dev/null ++++ b/fiona/_crs3.pyx +@@ -0,0 +1,67 @@ ++"""Extension module supporting crs.py. ++ ++Calls methods from GDAL's OSR module. ++""" ++ ++from __future__ import absolute_import ++ ++import logging ++ ++from six import string_types ++ ++from fiona cimport _cpl ++from fiona.errors import CRSError ++ ++ ++logger = logging.getLogger(__name__) ++ ++ ++# Export a WKT string from input crs. 
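++# Accepts either an authority string such as 'EPSG:3857' or a PROJ.4-style parameter dict.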
++def crs_to_wkt(crs): ++ """Convert a Fiona CRS object to WKT format""" ++ cdef void *cogr_srs = NULL ++ cdef char *proj_c = NULL ++ ++ cogr_srs = OSRNewSpatialReference(NULL) ++ if cogr_srs == NULL: ++ raise CRSError("NULL spatial reference") ++ ++ # First, check for CRS strings like "EPSG:3857". ++ if isinstance(crs, string_types): ++ proj_b = crs.encode('utf-8') ++ proj_c = proj_b ++ OSRSetFromUserInput(cogr_srs, proj_c) ++ elif isinstance(crs, dict): ++ # EPSG is a special case. ++ init = crs.get('init') ++ if init: ++ logger.debug("Init: %s", init) ++ auth, val = init.split(':') ++ if auth.upper() == 'EPSG': ++ logger.debug("Setting EPSG: %s", val) ++ OSRImportFromEPSG(cogr_srs, int(val)) ++ else: ++ params = [] ++ crs['wktext'] = True ++ for k, v in crs.items(): ++ if v is True or (k in ('no_defs', 'wktext') and v): ++ params.append("+%s" % k) ++ else: ++ params.append("+%s=%s" % (k, v)) ++ proj = " ".join(params) ++ logger.debug("PROJ.4 to be imported: %r", proj) ++ proj_b = proj.encode('utf-8') ++ proj_c = proj_b ++ OSRImportFromProj4(cogr_srs, proj_c) ++ else: ++ raise ValueError("Invalid CRS") ++ ++ OSRExportToWkt(cogr_srs, &proj_c) ++ ++ if proj_c == NULL: ++ raise CRSError("Null projection") ++ ++ proj_b = proj_c ++ _cpl.CPLFree(proj_c) ++ ++ return proj_b.decode('utf-8') +--- a/fiona/_env.pyx ++++ b/fiona/_env.pyx +@@ -295,18 +295,18 @@ class GDALDataFinder(object): + if prefix is None: + prefix = __file__ + datadir = os.path.abspath(os.path.join(os.path.dirname(prefix), "gdal_data")) +- return datadir if os.path.exists(os.path.join(datadir, 'pcs.csv')) else None ++ return datadir if os.path.exists(os.path.join(datadir, 'header.dxf')) else None + + def search_prefix(self, prefix=sys.prefix): + """Check sys.prefix location""" + datadir = os.path.join(prefix, 'share', 'gdal') +- return datadir if os.path.exists(os.path.join(datadir, 'pcs.csv')) else None ++ return datadir if os.path.exists(os.path.join(datadir, 'header.dxf')) else None + + def search_debian(self, prefix=sys.prefix): + """Check Debian locations""" + gdal_release_name = GDALVersionInfo("RELEASE_NAME") + datadir = os.path.join(prefix, 'share', 'gdal', '{}.{}'.format(*gdal_release_name.split('.')[:2])) +- return datadir if os.path.exists(os.path.join(datadir, 'pcs.csv')) else None ++ return datadir if os.path.exists(os.path.join(datadir, 'header.dxf')) else None + + + class PROJDataFinder(object): +--- a/fiona/_drivers.pyx ++++ b/fiona/_drivers.pyx +@@ -128,15 +128,15 @@ cdef class GDALEnv(object): + + # If we find GDAL data at the well-known paths, we will + # add a GDAL_DATA key to the config options dict. 
+- if os.path.exists(os.path.join(whl_datadir, 'pcs.csv')): ++ if os.path.exists(os.path.join(whl_datadir, 'header.dxf')): + log.debug("Set GDAL_DATA = %r", whl_datadir) + self.options['GDAL_DATA'] = whl_datadir + +- elif os.path.exists(os.path.join(deb_share_datadir, 'pcs.csv')): ++ elif os.path.exists(os.path.join(deb_share_datadir, 'header.dxf')): + log.debug("Set GDAL_DATA = %r", deb_share_datadir) + self.options['GDAL_DATA'] = deb_share_datadir + +- elif os.path.exists(os.path.join(fhs_share_datadir, 'pcs.csv')): ++ elif os.path.exists(os.path.join(fhs_share_datadir, 'header.dxf')): + log.debug("Set GDAL_DATA = %r", fhs_share_datadir) + self.options['GDAL_DATA'] = fhs_share_datadir + +--- a/tests/test__env.py ++++ b/tests/test__env.py +@@ -1,5 +1,7 @@ + """Tests of _env util module""" + ++import os ++ + import pytest + try: + from unittest import mock +@@ -17,29 +19,29 @@ def mock_wheel(tmpdir): + moduledir = tmpdir.mkdir("rasterio") + moduledir.ensure("__init__.py") + moduledir.ensure("_env.py") +- moduledir.ensure("gdal_data/pcs.csv") +- moduledir.ensure("proj_data/epsg") ++ moduledir.ensure("gdal_data/header.dxf") ++ moduledir.ensure("proj_data/null") + return moduledir + + + @pytest.fixture + def mock_fhs(tmpdir): + """A fake FHS system""" +- tmpdir.ensure("share/gdal/pcs.csv") +- tmpdir.ensure("share/proj/epsg") ++ tmpdir.ensure("share/gdal/header.dxf") ++ tmpdir.ensure("share/proj/null") + return tmpdir + + + @pytest.fixture + def mock_debian(tmpdir): + """A fake Debian multi-install system""" +- tmpdir.ensure("share/gdal/1.11/pcs.csv") +- tmpdir.ensure("share/gdal/2.0/pcs.csv") +- tmpdir.ensure("share/gdal/2.1/pcs.csv") +- tmpdir.ensure("share/gdal/2.2/pcs.csv") +- tmpdir.ensure("share/gdal/2.3/pcs.csv") +- tmpdir.ensure("share/gdal/2.4/pcs.csv") +- tmpdir.ensure("share/proj/epsg") ++ tmpdir.ensure("share/gdal/1.11/header.dxf") ++ tmpdir.ensure("share/gdal/2.0/header.dxf") ++ tmpdir.ensure("share/gdal/2.1/header.dxf") ++ tmpdir.ensure("share/gdal/2.2/header.dxf") ++ tmpdir.ensure("share/gdal/2.3/header.dxf") ++ tmpdir.ensure("share/gdal/2.4/header.dxf") ++ tmpdir.ensure("share/proj/null") + return tmpdir + + +@@ -73,6 +75,7 @@ def test_search_debian_gdal_data_failure + assert not finder.search_debian(str(tmpdir)) + + ++ at pytest.mark.skipif(not os.path.exists('/usr/share/gdal/pcs.csv'), reason='GDAL 3') + def test_search_debian_gdal_data(mock_debian): + """Find GDAL data under Debian locations""" + finder = GDALDataFinder() +@@ -89,6 +92,7 @@ def test_search_gdal_data_fhs(mock_fhs): + assert finder.search(str(mock_fhs)) == str(mock_fhs.join("share").join("gdal")) + + ++ at pytest.mark.skipif(not os.path.exists('/usr/share/gdal/pcs.csv'), reason='GDAL 3') + def test_search_gdal_data_debian(mock_debian): + """Find GDAL data under Debian locations""" + finder = GDALDataFinder() +--- a/tests/test_env.py ++++ b/tests/test_env.py +@@ -58,7 +58,7 @@ def test_ensure_env_decorator_sets_gdal_ + return getenv()['GDAL_DATA'] + + find_file.return_value = None +- tmpdir.ensure("share/gdal/pcs.csv") ++ tmpdir.ensure("share/gdal/header.dxf") + monkeypatch.delenv('GDAL_DATA', raising=False) + monkeypatch.setattr(_env, '__file__', str(tmpdir.join("fake.py"))) + monkeypatch.setattr(sys, 'prefix', str(tmpdir)) +@@ -74,7 +74,7 @@ def test_ensure_env_decorator_sets_gdal_ + return getenv()['GDAL_DATA'] + + find_file.return_value = None +- tmpdir.ensure("gdal_data/pcs.csv") ++ tmpdir.ensure("gdal_data/header.dxf") + monkeypatch.delenv('GDAL_DATA', raising=False) + monkeypatch.setattr(_env, 
'__file__', str(tmpdir.join(os.path.basename(_env.__file__)))) + +@@ -89,7 +89,7 @@ def test_ensure_env_with_decorator_sets_ + return getenv()['GDAL_DATA'] + + find_file.return_value = None +- tmpdir.ensure("gdal_data/pcs.csv") ++ tmpdir.ensure("gdal_data/header.dxf") + monkeypatch.delenv('GDAL_DATA', raising=False) + monkeypatch.setattr(_env, '__file__', str(tmpdir.join(os.path.basename(_env.__file__)))) + +--- a/tests/test_crs.py ++++ b/tests/test_crs.py +@@ -1,5 +1,7 @@ + from fiona import crs, _crs + ++import os, pytest ++ + + def test_proj_keys(): + assert len(crs.all_proj_keys) == 87 +@@ -106,6 +108,7 @@ def test_towgs84(): + assert 'towgs84' in crs.from_string(proj4) + + ++ at pytest.mark.skipif(os.path.exists('/usr/share/proj/proj.db'), reason='PROJ 6') + def test_towgs84_wkt(): + """+towgs84 +wktext are preserved in WKT""" + proj4 = ('+proj=lcc +lat_1=49 +lat_2=46 +lat_0=47.5 ' ===================================== debian/patches/series ===================================== @@ -1,3 +1,4 @@ 0001-Rename-fio-command-to-fiona-to-avoid-name-clash.patch 0002-Remove-outside-reference-possible-privacy-breach.patch 0006-Remove-unknown-distribution-options.patch +gdal3.patch View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/5dcc496400d674b5a47cb5a0dd463e485b3d23f5...e285e491122b975197eaf60bc682cfa67c0f72c4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/5dcc496400d674b5a47cb5a0dd463e485b3d23f5...e285e491122b975197eaf60bc682cfa67c0f72c4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Wed Sep 11 09:08:39 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 11 Sep 2019 08:08:39 +0000 Subject: Processing of fiona_1.8.6-3_source.changes Message-ID: fiona_1.8.6-3_source.changes uploaded successfully to localhost along with the files: fiona_1.8.6-3.dsc fiona_1.8.6-3.debian.tar.xz fiona_1.8.6-3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 11 09:41:38 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 11 Sep 2019 08:41:38 +0000 Subject: fiona_1.8.6-3_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 11 Sep 2019 09:39:15 +0200 Source: fiona Architecture: source Version: 1.8.6-3 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 939872 Changes: fiona (1.8.6-3) unstable; urgency=medium . * Team upload. * Add patch to fix FTBFS with GDAL 3. 
(closes: #939872) Checksums-Sha1: b8aca2dc9bf81d5bca37b5fbea3f53f9637d8667 2237 fiona_1.8.6-3.dsc 874dbc878e5a1958ca0d5f40a269b6419e9c333c 29664 fiona_1.8.6-3.debian.tar.xz a3a4b7d9442a211d34b81ce116c0f0aa1a980232 14317 fiona_1.8.6-3_amd64.buildinfo Checksums-Sha256: d67c381583afba389042f79873cbe425b53ea76e33a1751ae97698b0cae191e4 2237 fiona_1.8.6-3.dsc deb777846be6425fae597fdfafe5474803d3262689f87f161334fdbc3519cefa 29664 fiona_1.8.6-3.debian.tar.xz 18d4a5c396bc20c456b1e1f35afdaba313997ca2c259405b930981cb487f2228 14317 fiona_1.8.6-3_amd64.buildinfo Files: d61bf9a2394781335e1f6225969b04db 2237 python optional fiona_1.8.6-3.dsc 5ab42ce7b5310234a92304ad9b772317 29664 python optional fiona_1.8.6-3.debian.tar.xz 53df7f13b9138ac741aeaab3ffda2306 14317 python optional fiona_1.8.6-3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl14qRAACgkQZ1DxCuiN SvHtOhAAiT8iMvDelF+2xRA+FfwtAVbndbWw7RSNgFvDaI2orakM+E8yaIGJQjN2 DgfN0E+bqoLhU1fTL/j9sfDZoFmuPaf25bASCiI52iTlgGvq/NpfEfVYiM5xzdpq csrqwHGTWEGllWeks7yeXlqhy8eUnxTrMPPGFGTXX2/jwiq3En+vr3fyoponWysv 9xqtPzkuqHwE/kMXyZNH2oB30r+D9RJCs2Kldltp+eeVcL9aWkU9N0VW8+9ilT2d ZOsLDRd6sOhXwGgSWkrZ3ICnNrEOchgkxpWTNy2aOrS3xigosfecx8C4Qhbu2EZn njH5w/M+qqta31tJxaAylaRHtgArWBmxSs/cRNoZ7dINcmlKjfbuTdaR4ULzHK4Z 2ZzhNaZR4TvgEvElDo9GsWY5+KXvr8FhUDH+mRX6ioSC8/7UVRS32QEAf269sZn1 +rLbmp3Of6dd1OhGvnxrwqnpKMLLZaHbbiMmcrMBoh7YDOEhSdp6/rAF8SjpEBGt p3ZTjjk6mdvF+hTgV6I3lFR7RzvL829JT3hMnD16MSSKMC/X/cLdbJocRmFs+Pf9 dYdQBEGDBQT9UXxyIYQBAvCr505PhxhBgH6+8oJ2+4A1EdWpy3Xzr8qiNGoVDWey B7LMY/DF+afsuBD85TyDb5fnw4PQqL62/5erbahVZQaVg37Ju5E= =LDW+ -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Wed Sep 11 09:45:07 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Wed, 11 Sep 2019 08:45:07 +0000 Subject: Bug#939872: marked as done (fiona: FTBFS with GDAL 3) References: <156804651982.23140.198675418827472311.reportbug@osiris.linuxminded.xs4all.nl> Message-ID: Your message dated Wed, 11 Sep 2019 08:41:38 +0000 with message-id and subject line Bug#939872: fixed in fiona 1.8.6-3 has caused the Debian Bug report #939872, regarding fiona: FTBFS with GDAL 3 to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 939872: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939872 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Bas Couwenberg Subject: fiona: FTBFS with GDAL 3 Date: Mon, 09 Sep 2019 18:28:39 +0200 Size: 2424 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#939872: fixed in fiona 1.8.6-3 Date: Wed, 11 Sep 2019 08:41:38 +0000 Size: 5055 URL: From gitlab at salsa.debian.org Wed Sep 11 10:45:13 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Wed, 11 Sep 2019 09:45:13 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag ubuntu/3.4.11+dfsg-2.bionic1 Message-ID: <5d78c229eadcf_73483fbbb4349a18463537@godard.mail> Martin Landa pushed new tag ubuntu/3.4.11+dfsg-2.bionic1 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/ubuntu/3.4.11+dfsg-2.bionic1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 11 10:45:21 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Wed, 11 Sep 2019 09:45:21 +0000 Subject: [Git][debian-gis-team/qgis][ubuntu/bionic] 56 commits: Update symbols for other architectures. Message-ID: <5d78c2315757e_73482ad95d7dabac4637d2@godard.mail> Martin Landa pushed to branch ubuntu/bionic at Debian GIS Project / qgis Commits: addd2fa2 by Bas Couwenberg at 2019-03-29T20:33:27Z Update symbols for other architectures. - - - - - fd51fd2f by Bas Couwenberg at 2019-03-29T20:33:27Z Set distribution to unstable. - - - - - afe3263b by Bas Couwenberg at 2019-04-14T07:29:17Z Update symbols for arm* & {mips,ppc}64el. - - - - - 14d31813 by Bas Couwenberg at 2019-04-19T12:44:57Z New upstream version 3.4.7+dfsg - - - - - bfccaf64 by Bas Couwenberg at 2019-04-19T12:53:37Z Merge tag 'upstream/3.4.7+dfsg' Upstream version 3.4.7+dfsg - - - - - 592f36ee by Bas Couwenberg at 2019-04-19T17:26:17Z New upstream release. - - - - - 02849a59 by Bas Couwenberg at 2019-04-19T17:26:35Z Don't build apidoc target separately. - - - - - da8eaafd by Bas Couwenberg at 2019-04-19T17:26:39Z Update copyright file. Changes: - Update copyright years for Matthias Kuhn - Add Daniele Viganò to copyright holders - - - - - c2648ec2 by Bas Couwenberg at 2019-04-19T17:26:41Z Update symbols for amd64. - - - - - 40f012d7 by Bas Couwenberg at 2019-04-19T17:26:43Z Set distribution to experimental. - - - - - 952050a0 by Bas Couwenberg at 2019-04-24T04:28:28Z Update symbols for other architectures. - - - - - 38cda454 by Bas Couwenberg at 2019-04-24T07:49:11Z Update symbols for amd64. - - - - - e900d0a6 by Bas Couwenberg at 2019-04-24T07:49:11Z Set distribution to unstable. - - - - - 7cf95676 by Bas Couwenberg at 2019-05-17T13:59:52Z New upstream version 3.4.8+dfsg - - - - - 24ef05f3 by Bas Couwenberg at 2019-05-17T14:03:32Z Merge tag 'upstream/3.4.8+dfsg' Upstream version 3.4.8+dfsg - - - - - e285c179 by Bas Couwenberg at 2019-05-17T14:05:57Z New upstream release. - - - - - 478d1f77 by Bas Couwenberg at 2019-05-17T17:15:14Z Update symbols for amd64. - - - - - 6c935b45 by Bas Couwenberg at 2019-05-17T20:07:53Z Set distribution to experimental. - - - - - 9c11a610 by Bas Couwenberg at 2019-05-23T07:33:30Z Update symbols for other architectures. - - - - - 9b443fe3 by Bas Couwenberg at 2019-05-23T07:33:52Z Set distribution to unstable. - - - - - 3ca590f5 by Bas Couwenberg at 2019-06-21T13:38:04Z New upstream version 3.4.9+dfsg - - - - - 04616117 by Bas Couwenberg at 2019-06-21T13:42:57Z Merge tag 'upstream/3.4.9+dfsg' Upstream version 3.4.9+dfsg - - - - - 33f021d5 by Bas Couwenberg at 2019-06-21T13:45:30Z New upstream release. - - - - - c3f08f1a by Bas Couwenberg at 2019-06-21T13:59:57Z Update copyright years for Marco Hugentobler & Alessandro Pasotti. 
- - - - - bca68a74 by Bas Couwenberg at 2019-06-21T16:03:32Z Update symbols for amd64. - - - - - ae73aad7 by Bas Couwenberg at 2019-06-21T16:14:26Z Include CSV resource files in qgis-common package. - - - - - c06c7f26 by Bas Couwenberg at 2019-06-21T16:14:27Z Set distribution to experimental. - - - - - 259e5776 by Bas Couwenberg at 2019-07-07T08:04:47Z Update gbp.conf to use --source-only-changes by default. - - - - - 29fe4895 by Bas Couwenberg at 2019-07-07T11:43:08Z Bump minimum GRASS build dependency version to 7.6.1. - - - - - 89e33413 by Bas Couwenberg at 2019-07-09T16:44:06Z Update symbols for other architectures. - - - - - e72ae810 by Bas Couwenberg at 2019-07-09T16:44:38Z Set distribution to unstable. - - - - - 96fde907 by Bas Couwenberg at 2019-07-10T17:14:02Z Bump Standards-Version to 4.4.0, no changes. - - - - - 80de3634 by Bas Couwenberg at 2019-07-19T14:05:00Z New upstream version 3.4.10+dfsg - - - - - 91783ea0 by Bas Couwenberg at 2019-07-19T14:08:00Z Update upstream source from tag 'upstream/3.4.10+dfsg' Update to upstream version '3.4.10+dfsg' with Debian dir 52633d1921e937f29ea26e03377bc90caa0b8c65 - - - - - 7eea93d9 by Bas Couwenberg at 2019-07-19T14:10:35Z New upstream release. - - - - - 01ae5091 by Bas Couwenberg at 2019-07-19T14:14:21Z Move CSV resources files to qgis-providers-common package. - - - - - c53e33ab by Bas Couwenberg at 2019-07-19T14:16:45Z Add qt5-image-formats-plugins to qgis dependencies. - - - - - 7e58aeb1 by Bas Couwenberg at 2019-07-19T14:20:22Z Update copyright years for René-Luc D'Hont. - - - - - e0e6d793 by Bas Couwenberg at 2019-07-19T18:40:21Z Update symbols for amd64. - - - - - 312168e7 by Bas Couwenberg at 2019-07-19T18:40:21Z Set distribution to experimental. - - - - - 4f3aad6e by Bas Couwenberg at 2019-08-02T13:20:48Z Update symbols for other architectures. - - - - - 26050545 by Bas Couwenberg at 2019-08-02T13:22:22Z Set distribution to unstable. - - - - - 3126e21f by Bas Couwenberg at 2019-08-17T06:57:45Z New upstream version 3.4.11+dfsg - - - - - 337c4de7 by Bas Couwenberg at 2019-08-17T07:01:16Z Update upstream source from tag 'upstream/3.4.11+dfsg' Update to upstream version '3.4.11+dfsg' with Debian dir e3569b03a04939623d767040381c704af1c1e136 - - - - - 90b584a2 by Bas Couwenberg at 2019-08-17T07:12:10Z New upstream release. - - - - - 5d0a43bf by Bas Couwenberg at 2019-08-17T07:24:32Z Merge upstream packaging changes. - - - - - a2ce174f by Bas Couwenberg at 2019-08-17T09:44:35Z Update symbols for amd64. - - - - - ab6173c0 by Bas Couwenberg at 2019-08-17T09:59:11Z Add lintian override for spelling-error-in-binary false positive. - - - - - 934144ad by Bas Couwenberg at 2019-08-17T09:59:11Z Set distribution to experimental. - - - - - 2566b063 by Bas Couwenberg at 2019-08-24T14:10:17Z Add Breaks/Replaces to fix upgrade from 2.18.18. (closes: #935613) - - - - - 4377fa51 by Bas Couwenberg at 2019-08-30T12:13:31Z Update symbols for other architectures. - - - - - cd8c6be3 by Bas Couwenberg at 2019-08-30T12:13:58Z Set distribution to unstable. - - - - - ad75e62c by Bas Couwenberg at 2019-09-07T07:28:18Z Update packaging for GRASS 7.8.0. - - - - - f00f865d by Bas Couwenberg at 2019-09-07T08:08:12Z Update symbols for other architectures. - - - - - 7da12bde by Bas Couwenberg at 2019-09-07T08:08:27Z Set distribution to unstable. 
- - - - - b1bd1110 by Martin Landa at 2019-09-11T09:44:25Z Rebuild 3.4.11+dfsg for bionic - - - - - 30 changed files: - .ci/travis/scripts/ctest2travis.py - .docker/qgis_resources/test_runner/qgis_setup.sh - .travis.yml - CMakeLists.txt - CTestConfig.cmake - ChangeLog - INSTALL - cmake/FindGRASS.cmake - debian/changelog - debian/control - debian/copyright - debian/libqgis-3d3.4.6.install → debian/libqgis-3d3.4.11.install - debian/libqgis-3d3.4.6.symbols → debian/libqgis-3d3.4.11.symbols - debian/libqgis-analysis3.4.6.install → debian/libqgis-analysis3.4.11.install - debian/libqgis-analysis3.4.6.symbols → debian/libqgis-analysis3.4.11.symbols - debian/libqgis-app3.4.6.install → debian/libqgis-app3.4.11.install - debian/libqgis-core3.4.6.lintian-overrides → debian/libqgis-app3.4.11.lintian-overrides - debian/libqgis-app3.4.6.symbols → debian/libqgis-app3.4.11.symbols - debian/libqgis-core3.4.6.install → debian/libqgis-core3.4.11.install - debian/libqgis-app3.4.6.lintian-overrides → debian/libqgis-core3.4.11.lintian-overrides - debian/libqgis-core3.4.6.symbols → debian/libqgis-core3.4.11.symbols - debian/libqgis-gui3.4.6.install → debian/libqgis-gui3.4.11.install - debian/libqgis-gui3.4.6.symbols → debian/libqgis-gui3.4.11.symbols - debian/libqgis-native3.4.6.install → debian/libqgis-native3.4.11.install - debian/libqgis-native3.4.6.symbols → debian/libqgis-native3.4.11.symbols - debian/libqgis-server3.4.6.install → debian/libqgis-server3.4.11.install - debian/libqgis-server3.4.6.symbols → debian/libqgis-server3.4.11.symbols - debian/libqgisgrass7-3.4.6.install → debian/libqgisgrass7-3.4.11.install - debian/libqgisgrass7-3.4.6.lintian-overrides → debian/libqgisgrass7-3.4.11.lintian-overrides - debian/libqgisgrass7-3.4.6.symbols → debian/libqgisgrass7-3.4.11.symbols The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/6bb8320976943911700279f75e044c1350860a7f...b1bd11106b177a9152a10f286a8492e37710ee4b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/6bb8320976943911700279f75e044c1350860a7f...b1bd11106b177a9152a10f286a8492e37710ee4b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastic at xs4all.nl Wed Sep 11 10:54:19 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Wed, 11 Sep 2019 11:54:19 +0200 Subject: [Git][debian-gis-team/qgis][ubuntu/bionic] 56 commits: Update symbols for other architectures. In-Reply-To: <5d78c2315757e_73482ad95d7dabac4637d2@godard.mail> References: <5d78c2315757e_73482ad95d7dabac4637d2@godard.mail> Message-ID: <2af91d79-e518-5b71-051c-337426da642c@xs4all.nl> On 9/11/19 11:45 AM, Martin Landa wrote: > 259e5776 by Bas Couwenberg at 2019-07-07T08:04:47Z > Update gbp.conf to use --source-only-changes by default. 
You did not merge this change correctly, see: git diff master ubuntu/bionic Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From gitlab at salsa.debian.org Wed Sep 11 11:00:26 2019 From: gitlab at salsa.debian.org (Martin Landa) Date: Wed, 11 Sep 2019 10:00:26 +0000 Subject: [Git][debian-gis-team/qgis][ubuntu/bionic] Update gbp.conf to use --source-only-changes by default Message-ID: <5d78c5badf3e0_73482ad963a2c9b8463958@godard.mail> Martin Landa pushed to branch ubuntu/bionic at Debian GIS Project / qgis Commits: ad10ee93 by Martin Landa at 2019-09-11T10:00:07Z Update gbp.conf to use --source-only-changes by default - - - - - 1 changed file: - debian/gbp.conf Changes: ===================================== debian/gbp.conf ===================================== @@ -17,3 +17,6 @@ pristine-tar = True [import-orig] rollback = False + +[buildpackage] +pbuilder-options = --source-only-changes View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/commit/ad10ee93ca74ab82f9dbd61bb07f419b0fb5dc9f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/commit/ad10ee93ca74ab82f9dbd61bb07f419b0fb5dc9f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From landa.martin at gmail.com Wed Sep 11 11:01:03 2019 From: landa.martin at gmail.com (Martin Landa) Date: Wed, 11 Sep 2019 12:01:03 +0200 Subject: [Git][debian-gis-team/qgis][ubuntu/bionic] 56 commits: Update symbols for other architectures. In-Reply-To: <2af91d79-e518-5b71-051c-337426da642c@xs4all.nl> References: <5d78c2315757e_73482ad95d7dabac4637d2@godard.mail> <2af91d79-e518-5b71-051c-337426da642c@xs4all.nl> Message-ID: Hi, On Wed, 11 Sep 2019 at 11:54, Sebastiaan Couwenberg wrote: > > Update gbp.conf to use --source-only-changes by default. > You did not merge this change correctly, see: > > git diff master ubuntu/bionic Oops, fixed in ad10ee93c. Thanks for the info, Ma -- Martin Landa http://geo.fsv.cvut.cz/gwiki/Landa http://gismentors.cz/mentors/landa From gitlab at salsa.debian.org Wed Sep 11 18:26:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 17:26:43 +0000 Subject: [Git][debian-gis-team/qmapshack][master] 5 commits: New upstream version 1.13.2 Message-ID: <5d792e53a3a38_73482ad95ff1ed08522342@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / qmapshack Commits: d737bcde by Bas Couwenberg at 2019-09-11T16:30:12Z New upstream version 1.13.2 - - - - - ee63ad48 by Bas Couwenberg at 2019-09-11T16:31:05Z Update upstream source from tag 'upstream/1.13.2' Update to upstream version '1.13.2' with Debian dir 3cf573f1763a390486c1343bfd5e27429d5d75c3 - - - - - 9d3449e7 by Bas Couwenberg at 2019-09-11T16:31:22Z New upstream release. - - - - - b4a993b3 by Bas Couwenberg at 2019-09-11T17:05:58Z Add patch to fix spelling errors. - - - - - 6411d982 by Bas Couwenberg at 2019-09-11T17:05:58Z Set distribution to unstable.
- - - - - 30 changed files: - CMakeLists.txt - CMakeLists.txt.user - MacOSX/HowtoBuildOSX.txt - MacOSX/bundle-qmaptool.sh - changelog.txt - debian/changelog - debian/patches/series - + debian/patches/spelling-errors.patch - msvc_64/QMapShack_Installer.nsi - msvc_64/cmake/FindPROJ4.cmake - msvc_64/copyfiles.bat - + src/icons/32x32/Attention.png - + src/icons/32x32/EnergyCycling.png - + src/icons/32x32/Hint.png - + src/icons/48x48/Attention.png - + src/icons/48x48/EnergyCycling.png - + src/icons/48x48/Hint.png - + src/icons/Attention.svg - + src/icons/EnergyCycling.svg - + src/icons/Hint.svg - src/qmapshack/CMainWindow.cpp - src/qmapshack/CMakeLists.txt - src/qmapshack/canvas/CCanvas.cpp - src/qmapshack/canvas/CCanvas.h - src/qmapshack/gis/CGisWorkspace.cpp - src/qmapshack/gis/CGisWorkspace.h - src/qmapshack/gis/IGisItem.h - src/qmapshack/gis/IGisWorkspace.ui - src/qmapshack/gis/ovl/CGisItemOvlArea.cpp - src/qmapshack/gis/ovl/CGisItemOvlArea.h The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/compare/a2ce6f73ed6e573c8a270879b9c5ddf1302e9518...6411d9821a76315ade77128aba9d60a13051c5bb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/compare/a2ce6f73ed6e573c8a270879b9c5ddf1302e9518...6411d9821a76315ade77128aba9d60a13051c5bb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 11 18:26:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 17:26:45 +0000 Subject: [Git][debian-gis-team/qmapshack][pristine-tar] pristine-tar data for qmapshack_1.13.2.orig.tar.gz Message-ID: <5d792e5552b0c_73482ad95ff1ed08522590@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / qmapshack Commits: 4a06f356 by Bas Couwenberg at 2019-09-11T16:31:05Z pristine-tar data for qmapshack_1.13.2.orig.tar.gz - - - - - 2 changed files: - + qmapshack_1.13.2.orig.tar.gz.delta - + qmapshack_1.13.2.orig.tar.gz.id Changes: ===================================== qmapshack_1.13.2.orig.tar.gz.delta ===================================== Binary files /dev/null and b/qmapshack_1.13.2.orig.tar.gz.delta differ ===================================== qmapshack_1.13.2.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +7caed8f3fca161fd9017cc0f6d53a67a0ea94eb5 View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/4a06f356d17bd2c532df90dce48fa7a42c75ffd1 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/4a06f356d17bd2c532df90dce48fa7a42c75ffd1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Wed Sep 11 18:26:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 17:26:46 +0000 Subject: [Git][debian-gis-team/qmapshack][upstream] New upstream version 1.13.2 Message-ID: <5d792e56b9aec_73482ad96391917052274c@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / qmapshack Commits: d737bcde by Bas Couwenberg at 2019-09-11T16:30:12Z New upstream version 1.13.2 - - - - - 30 changed files: - CMakeLists.txt - CMakeLists.txt.user - MacOSX/HowtoBuildOSX.txt - MacOSX/bundle-qmaptool.sh - changelog.txt - msvc_64/QMapShack_Installer.nsi - msvc_64/cmake/FindPROJ4.cmake - msvc_64/copyfiles.bat - + src/icons/32x32/Attention.png - + src/icons/32x32/EnergyCycling.png - + src/icons/32x32/Hint.png - + src/icons/48x48/Attention.png - + src/icons/48x48/EnergyCycling.png - + src/icons/48x48/Hint.png - + src/icons/Attention.svg - + src/icons/EnergyCycling.svg - + src/icons/Hint.svg - src/qmapshack/CMainWindow.cpp - src/qmapshack/CMakeLists.txt - src/qmapshack/canvas/CCanvas.cpp - src/qmapshack/canvas/CCanvas.h - src/qmapshack/gis/CGisWorkspace.cpp - src/qmapshack/gis/CGisWorkspace.h - src/qmapshack/gis/IGisItem.h - src/qmapshack/gis/IGisWorkspace.ui - src/qmapshack/gis/ovl/CGisItemOvlArea.cpp - src/qmapshack/gis/ovl/CGisItemOvlArea.h - src/qmapshack/gis/prj/IGisProject.cpp - src/qmapshack/gis/prj/IGisProject.h - src/qmapshack/gis/qms/serialization.cpp The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/d737bcde0e0379719d330167b7f7c09018da6966 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/d737bcde0e0379719d330167b7f7c09018da6966 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 11 18:27:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 17:27:09 +0000 Subject: [Git][debian-gis-team/qmapshack] Pushed new tag debian/1.13.2-1 Message-ID: <5d792e6d35f27_73482ad95ff1ed0852295f@godard.mail> Bas Couwenberg pushed new tag debian/1.13.2-1 at Debian GIS Project / qmapshack -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/tree/debian/1.13.2-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Wed Sep 11 18:27:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Wed, 11 Sep 2019 17:27:10 +0000 Subject: [Git][debian-gis-team/qmapshack] Pushed new tag upstream/1.13.2 Message-ID: <5d792e6e167a5_73482ad95fbd453052318e@godard.mail> Bas Couwenberg pushed new tag upstream/1.13.2 at Debian GIS Project / qmapshack -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/tree/upstream/1.13.2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Wed Sep 11 18:35:34 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 11 Sep 2019 17:35:34 +0000 Subject: Processing of qmapshack_1.13.2-1_source.changes Message-ID: qmapshack_1.13.2-1_source.changes uploaded successfully to localhost along with the files: qmapshack_1.13.2-1.dsc qmapshack_1.13.2.orig.tar.gz qmapshack_1.13.2-1.debian.tar.xz qmapshack_1.13.2-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Wed Sep 11 18:49:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Wed, 11 Sep 2019 17:49:25 +0000 Subject: qmapshack_1.13.2-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Wed, 11 Sep 2019 18:31:46 +0200 Source: qmapshack Architecture: source Version: 1.13.2-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: qmapshack (1.13.2-1) unstable; urgency=medium . * New upstream release. * Add patch to fix spelling errors. Checksums-Sha1: 5edad892f0feeed214cc1826795f1127015262a8 2210 qmapshack_1.13.2-1.dsc abc9ff47d1c13d7e4fea8f5539e1d59bf6b8f16a 12725064 qmapshack_1.13.2.orig.tar.gz 97b62616b250a3a7a2f6bedd7328015c56f69204 12220 qmapshack_1.13.2-1.debian.tar.xz a68aff6cf93b286455cc2fcb98c0d689b366c7ed 19036 qmapshack_1.13.2-1_amd64.buildinfo Checksums-Sha256: 4b55b97bdf1f6702f2a4e0a5bc04a18e7d3ad5b5f0c0658b134b738cc81050d9 2210 qmapshack_1.13.2-1.dsc 03663d4bd4ab35892a95bcebae9994aac34e92a4293ece9e9ec417fd184dbdd5 12725064 qmapshack_1.13.2.orig.tar.gz d7ed6d9d89114655d26f4d9f7e7a82ede9ef646a61e8e2aaa657e3830a456a32 12220 qmapshack_1.13.2-1.debian.tar.xz f99d870896790f8d4f3c1757b135ed8e1670454ce9963b8b2cd4571bf0665f7f 19036 qmapshack_1.13.2-1_amd64.buildinfo Files: da8e2dc9b89833ba9cb1a2d629f9f1cf 2210 science optional qmapshack_1.13.2-1.dsc e1ab2103852a45282d767090df69f14a 12725064 science optional qmapshack_1.13.2.orig.tar.gz af13c4981fd03cfe7e57404e7a1b9005 12220 science optional qmapshack_1.13.2-1.debian.tar.xz 39c25800899744d057f80ec9d7ff03fe 19036 science optional qmapshack_1.13.2-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl15Lh8ACgkQZ1DxCuiN SvH4Mg//XKWz2qXL0jGh5flRst+3utCHDxebsfjrVBproH30czL6/1qLvty0vNhE JoMuvfM4yaDKX4Evu6GXPkz1moY8QhNQhmIWgOCXRAr721rNfaERVB8DMpU/wcHV QUhTtB1iGTEj/7aCojewLh3MjSP/o6xn4Fyr9ik076QLZxamzowV3rhYyMOkVsCu 9qyFOXQtSlA6eBR9rQNSSK891J9t3H1qfXE4+MoUqrgutiizyehb54dEqf2wgvVS iC5ryFdb5PLTGmAd0I7oI8Pe1zBCGV7mWB7McilIezUOlEVgrdMPUIWYgupkZDfW hfk2eX7O8eRjkBheYZtueAXtXQTB+JIN7Lqc33rQFyNwL0xktJ6OwiGNg9X9t9tI EAs8nptgxNhEKuII3UJKj27C/FCiguP53M7oRMvN4CvqWMjNIeKkzpukkGHrZz9S N3GgItDvU2PXfnIPznJeO/iLXBDNnF36eUDEAwgapOiM9zpFM6GxpMYnE377PRSR w7Jo5dhMcf9MyS/oR/ZjeY+Q/Alt/ZjXNtMD2MKGyQYo4WHONGbcEsENwF8KWXqA VAZ4OPRPkxI4PvjLbXEGSz+FdO6TRL5HDSIhI7xSEnNdhWg+3zOoDcODieLCvgFm 2QbGdYViHnUAtGoXBQbB2LlDWmlVMzy+s3Q09tWNbCrN+0RcIxY= =8EfP -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From noreply at release.debian.org Thu Sep 12 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 12 Sep 2019 04:39:21 +0000 Subject: ossim 2.9.1-1 MIGRATED to testing Message-ID: FYI: The status of the ossim source package in Debian's testing distribution has changed. Previous version: 2.9.0-1 Current version: 2.9.1-1 -- This email is automatically generated once a day. 
As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Thu Sep 12 05:39:20 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 12 Sep 2019 04:39:20 +0000 Subject: netcdf4-python 1.5.2-1 MIGRATED to testing Message-ID: FYI: The status of the netcdf4-python source package in Debian's testing distribution has changed. Previous version: 1.5.1.2-4 Current version: 1.5.2-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From plugwash at p10link.net Thu Sep 12 17:44:28 2019 From: plugwash at p10link.net (peter green) Date: Thu, 12 Sep 2019 17:44:28 +0100 Subject: Bug#937226: State of pywps upload. References: Message-ID: <9603f2ef-0531-b1b6-e7ee-1e0ea250b35e@p10link.net> Hi Over 5 months ago an upload of pywps was made to switch it from python 2 to python 3 (and hence unblock removal of at least two other python 2 packages). This upload went to NEW as it changed the list of binary packages. It seems it has been there ever since. Is there a problem with the upload, or did it just slip through the cracks? From gitlab at salsa.debian.org Fri Sep 13 05:28:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 04:28:27 +0000 Subject: [Git][debian-gis-team/netcdf-cxx][pristine-tar] pristine-tar data for netcdf-cxx_4.3.1.orig.tar.gz Message-ID: <5d7b1aeba4b4b_73483fbbba458e4470547c@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / netcdf-cxx Commits: 9bb31d45 by Bas Couwenberg at 2019-09-13T03:59:08Z pristine-tar data for netcdf-cxx_4.3.1.orig.tar.gz - - - - - 2 changed files: - + netcdf-cxx_4.3.1.orig.tar.gz.delta - + netcdf-cxx_4.3.1.orig.tar.gz.id Changes: ===================================== netcdf-cxx_4.3.1.orig.tar.gz.delta ===================================== Binary files /dev/null and b/netcdf-cxx_4.3.1.orig.tar.gz.delta differ ===================================== netcdf-cxx_4.3.1.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +7cc60a7dc9fbc87c061c768b9274c958751350bd View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/commit/9bb31d4575ad06f12ab7a0a93da8b7674e61172a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/commit/9bb31d4575ad06f12ab7a0a93da8b7674e61172a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From gitlab at salsa.debian.org Fri Sep 13 05:28:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 04:28:29 +0000 Subject: [Git][debian-gis-team/netcdf-cxx][upstream] New upstream version 4.3.1 Message-ID: <5d7b1aed92f96_73482ad96140591070562c@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf-cxx Commits: 66016b3c by Bas Couwenberg at 2019-09-13T03:59:04Z New upstream version 4.3.1 - - - - - 26 changed files: - .gitignore - .travis.yml - CMakeLists.txt - Makefile.am - Makefile.in - README.md - RELEASE_NOTES.md - aclocal.m4 - compile - config.guess - config.h.in - config.sub - configure - configure.ac - cxx4/CMakeLists.txt - cxx4/Makefile.am - cxx4/Makefile.in - + cxx4/findplugin.in - cxx4/ncCheck.h - cxx4/ncDim.cpp - cxx4/ncDim.h - cxx4/ncFile.cpp - cxx4/ncFile.h - + cxx4/ncFill.cpp - + cxx4/ncFill.h - + cxx4/ncFilter.cpp The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/commit/66016b3ccf42a45a1071097c8139c878c4cd283b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/commit/66016b3ccf42a45a1071097c8139c878c4cd283b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 05:28:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 04:28:30 +0000 Subject: [Git][debian-gis-team/netcdf-cxx] Pushed new branch experimental Message-ID: <5d7b1aeee16b7_73482ad96140591070589@godard.mail> Bas Couwenberg pushed new branch experimental at Debian GIS Project / netcdf-cxx -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/tree/experimental You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 05:28:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 04:28:34 +0000 Subject: [Git][debian-gis-team/netcdf-cxx] Pushed new tag upstream/4.3.1 Message-ID: <5d7b1af2bdc16_73483fbbba9687e0706044@godard.mail> Bas Couwenberg pushed new tag upstream/4.3.1 at Debian GIS Project / netcdf-cxx -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/tree/upstream/4.3.1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From noreply at release.debian.org Fri Sep 13 05:39:10 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Fri, 13 Sep 2019 04:39:10 +0000 Subject: satpy is marked for autoremoval from testing Message-ID: satpy 0.16.1-2 is marked for autoremoval from testing on 2019-10-13 It (build-)depends on packages with these RC bugs: 938494: skimage: Python2 removal in sid/bullseye From noreply at release.debian.org Fri Sep 13 05:39:11 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Fri, 13 Sep 2019 04:39:11 +0000 Subject: glymur is marked for autoremoval from testing Message-ID: glymur 0.8.18+ds-1 is marked for autoremoval from testing on 2019-10-13 It (build-)depends on packages with these RC bugs: 938494: skimage: Python2 removal in sid/bullseye From noreply at release.debian.org Fri Sep 13 05:39:12 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Fri, 13 Sep 2019 04:39:12 +0000 Subject: python-geopandas is marked for autoremoval from testing Message-ID: python-geopandas 0.5.1-2 is marked for autoremoval from testing on 2019-10-13 It (build-)depends on packages with these RC bugs: 938494: skimage: Python2 removal in sid/bullseye From noreply at release.debian.org Fri Sep 13 05:39:13 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Fri, 13 Sep 2019 04:39:13 +0000 Subject: pysal is marked for autoremoval from testing Message-ID: pysal 2.0.0-1 is marked for autoremoval from testing on 2019-10-13 It (build-)depends on packages with these RC bugs: 938494: skimage: Python2 removal in sid/bullseye From noreply at release.debian.org Fri Sep 13 05:39:16 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 13 Sep 2019 04:39:16 +0000 Subject: debian-gis 0.0.18 MIGRATED to testing Message-ID: FYI: The status of the debian-gis source package in Debian's testing distribution has changed. Previous version: 0.0.17 Current version: 0.0.18 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Fri Sep 13 05:39:22 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 13 Sep 2019 04:39:22 +0000 Subject: python-pdal 2.2.1+ds-1 MIGRATED to testing Message-ID: FYI: The status of the python-pdal source package in Debian's testing distribution has changed. Previous version: 2.1.8+ds-2 Current version: 2.2.1+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Fri Sep 13 05:39:23 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 13 Sep 2019 04:39:23 +0000 Subject: snaphu 2.0.1-1 MIGRATED to testing Message-ID: FYI: The status of the snaphu source package in Debian's testing distribution has changed. Previous version: 2.0.0-1 Current version: 2.0.1-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
From debian at kitterman.com Fri Sep 13 16:23:01 2019 From: debian at kitterman.com (Scott Kitterman) Date: Fri, 13 Sep 2019 11:23:01 -0400 Subject: Bug#940185: src:pywps: Debian/copyright needs update Message-ID: <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> Package: src:pywps Version: 4.2.1-1 Severity: serious Justification: Policy 2.3 One of our ftp-trainees reviewed your package and made the following observations. They exist in the current version of the package, so I'm not rejecting as a result, but they should be fixed in the next upload: Compiled works are present in tests/ and it does not appear this data can be rebuilt from the source package provided. Please remove and repack the tarball to remove any such artifacts. Data in pywps/schemas/geojson/ has no licensing information. Who holds an actual copyright is confusing: - LICENSE.txt claims copyright by "PyWPS Development Team" - All source claims copyright by "Open Source Geospatial Foundation" - Source also (incorrectly) uses "and others" as a copyright holder - d/copyright claims "PyWPS Project Steering Committee" is a copyright holder, but is not represented in source Files provided in d/patches have a copyright holder that is not present in d/copyright. This appears to be the new maintainer; they should be included in the debian/* paragraph. Scott K From debian at kitterman.com Fri Sep 13 16:29:58 2019 From: debian at kitterman.com (Scott Kitterman) Date: Fri, 13 Sep 2019 11:29:58 -0400 Subject: Bug#937226: State of pywps upload. In-Reply-To: <9603f2ef-0531-b1b6-e7ee-1e0ea250b35e@p10link.net> References: <9603f2ef-0531-b1b6-e7ee-1e0ea250b35e@p10link.net> Message-ID: <2006695.RWXqIFo6N8@l5580> On Thursday, September 12, 2019 12:44:28 PM EDT peter green wrote: > Hi > > Over 5 months ago an upload of pywps was made to switch it from python 2 to > python 3 (and hence unblock removal of at least two other python 2 > packages). This upload went to new as it changed the list of binary > packages. It seems it has been there ever since. > > Is there a problem with the upload? or did it just slip through the cracks? It was caught in a locked state in our internal database and required manual intervention to resolve. There are some significant issues in the package, but none of them new, so I've accepted it (See #940185). Scott K From ftpmaster at ftp-master.debian.org Fri Sep 13 17:00:11 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 16:00:11 +0000 Subject: pywps_4.2.1-2~exp1_amd64.changes ACCEPTED into experimental, experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 30 Mar 2019 20:59:32 +0100 Source: pywps Binary: python3-pywps pywps pywps-doc pywps-wsgi Architecture: source all Version: 4.2.1-2~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: python3-pywps - Implementation of OGC's Web Processing Service - Python module pywps - Implementation of OGC's Web Processing Service pywps-doc - Implementation of OGC's Web Processing Service - Documentation pywps-wsgi - Implementation of OGC's Web Processing Service - WSGI example Changes: pywps (4.2.1-2~exp1) experimental; urgency=medium . * Bump Standards-Version to 4.3.0, no changes. * Remove package name from lintian overrides. * Switch to Python 3. * Add patch for Python 3.7 compatibility.
Checksums-Sha1: dc711e445d9d037b0bb9a82b67d1587ddbfaf712 2376 pywps_4.2.1-2~exp1.dsc c0656126ddc99d7cd8bd279c5545d2d212dfef9e 10008 pywps_4.2.1-2~exp1.debian.tar.xz d3f16ab7c92b624a59149040579266f6e2725bcb 56028 python3-pywps_4.2.1-2~exp1_all.deb 6a4129a1b38874fd8b19d23fa9a78fd691af9461 322652 pywps-doc_4.2.1-2~exp1_all.deb e881a91eb04e511d640e0478d763724095a642aa 7276 pywps-wsgi_4.2.1-2~exp1_all.deb d8c9375a1261f62c909fe4349e3fe516f0aea67d 5420 pywps_4.2.1-2~exp1_all.deb 39e38eaa752cebb4ea0d4dfc4713b594052ed471 11497 pywps_4.2.1-2~exp1_amd64.buildinfo Checksums-Sha256: 5709847611a8e9f714150e12fa037dc9a360a3c1ee22c77aab3db9c2a51f9bc4 2376 pywps_4.2.1-2~exp1.dsc 46d30121549ee6706f4c22f0358ccfe529646a25abc7328c6019979238e84ad7 10008 pywps_4.2.1-2~exp1.debian.tar.xz 94e511e0a80f9b7d694dd5daa4b4dea52c0f279dd9d8ffaecc75a65242840451 56028 python3-pywps_4.2.1-2~exp1_all.deb d70b09c2adc1bbc3a5568f6d6459aa9f5ced2ba2537a00d7c1554cab5a38c30b 322652 pywps-doc_4.2.1-2~exp1_all.deb b92c19e25eda51539ac6dc56ecdbd0695c250f15ffea42ed18c90e7e9bb253bf 7276 pywps-wsgi_4.2.1-2~exp1_all.deb 9eeacaa5b948d2f51224599596cef247bdd2680877d0d69fca1d0ed0d7649558 5420 pywps_4.2.1-2~exp1_all.deb 5e931e89f426050bf637a7ecc5a70e704baa73898cda11f31b1cd930f1c33fdc 11497 pywps_4.2.1-2~exp1_amd64.buildinfo Files: 1c42e2fb76863ce7eebabfa59ce0927a 2376 python optional pywps_4.2.1-2~exp1.dsc 51ab0dcd742030d5532ce2502871ab2e 10008 python optional pywps_4.2.1-2~exp1.debian.tar.xz ff4db078516f46a0de6849990968b263 56028 python optional python3-pywps_4.2.1-2~exp1_all.deb e021d54f4e782684d9b31f4b8180f65b 322652 doc optional pywps-doc_4.2.1-2~exp1_all.deb 90a60d740122956f2cf484be87150e3a 7276 web optional pywps-wsgi_4.2.1-2~exp1_all.deb 4364abb70d18966c13e7d3448ee6984a 5420 metapackages optional pywps_4.2.1-2~exp1_all.deb 5d1d164c17c871e80197f9668487b027 11497 python optional pywps_4.2.1-2~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAlyf05kACgkQZ1DxCuiN SvEP6g/9H5aMG9fzUmDsj/Ie8hruWWwTls1iPQFaNaeNWf91fcJG+4NG/RtC/Mwn BMpCSDgSGbSgIW55sQNOIOHX7R1l+txqgxpeWjmSyw9KFqfwk7D85MTN5+OoyDub 2ykZrRwIA7ZAmIuFZJBllPmZVmV4MPR5T6MfQYsvzuE643CDyla+YW36gMy2RoOF bu7FB5kX8HuQgajMLa3I56H+TKN3JUj90vJyKmUb0pr9xdB/xBgYE4DVIBO970LN YRimenht3kX+gMu7hzwuYWSLjtFfcgtkAa9aBh7UtebhrkuaR/d1bM6gycF9SZmp dNgJlxQOyBRdpF27B6vYa0yHNogRPw3SXvZyXBABo0Gk0darbKHkq2eksM0zKLPJ aY9Y02kf0ikB0dbVXBUtuvsAfCRWGylpBa99lOrnCtbOUA/Cft6CksQUfwd08uXX 6hmq9yIdYmaWRx/3yf2hSrV36A0tbplfr3ZXVe0/mOzuezCFj4wTRibUgc5Mr/Rr dUO+pX19AvoF8AMTBHV7/j2ctCj+4CI7wHFeE4xgCfYYubZaABrGHhuCCL+amaTx vewknWlYhHA1IoGOi40bEqg5wWA0KDMs/fUAaFlKYPEkLLqYdSxXQ7ed7brfvSCK Kq4nzUbFQYbIoTB3Q+7LKvlS6iHVCloLH8jx7zqR6P4xd7QuoPE= =SOaN -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Fri Sep 13 17:03:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Fri, 13 Sep 2019 16:03:03 +0000 Subject: Processed: tagging 940185 References: <1568390488-1740-bts-sebastic@debian.org> Message-ID: Processing commands for control at bugs.debian.org: > tags 940185 + moreinfo Bug #940185 [src:pywps] src:pywps: Debian/copyright needs update Added tag(s) moreinfo. > thanks Stopping processing here. Please contact me if you need assistance. 
-- 940185: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=940185 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From sebastic at xs4all.nl Fri Sep 13 16:56:30 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Fri, 13 Sep 2019 17:56:30 +0200 Subject: Bug#940185: src:pywps: Debian/copyright needs update In-Reply-To: <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> References: <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> Message-ID: <5a93dc41-c6e7-e678-0748-8c2cb5ac3777@xs4all.nl> Hi Scott, Thanks for finally reviewing pywps. On 9/13/19 5:23 PM, Scott Kitterman wrote: > One of our ftp-trainees reviewed your package and made the following > observations. It seems that the process is broken. This is far from the first time that an anonymous ftp-trainee has commented on a package in NEW without an ftp-master acting on it. What process does ftp-master use to review ftp-trainee comments? > Compiled works are present in tests/ and it does not appear this data can be rebuilt > from the source package provided. Please remove and repack the tarball > to remove any such artifacts. Can you or the anonymous ftp-trainee clarify which files they consider to be "Compiled works"? There are data files under tests/data & tests/requests used in various tests. Why should these need to be rebuilt if those are the files in question? > Data in pywps/schemas/geojson/ has no licensing information. Its upstream states: "[...] either of the AFL or BSD license", but not which version. I've contacted the author to request clarification. > Who holds an actual copyright is confusing: > - LICENSE.txt claims copyright by "PyWPS Development Team" > - All source claims copyright by "Open Source Geospatial Foundation" Why does this matter? debian/copyright includes the holders as listed in LICENSE.txt and the sources. Are you saying that's wrong? > - Source also (incorrectly) uses "and others" as a copyright holder Why is this incorrect? The sources have copyright statements like this: Copyright 2018 Open Source Geospatial Foundation and others licensed under MIT, Please consult LICENSE.txt for details That's what's reflected in debian/copyright. > - d/copyright claims "PyWPS Project Steering Committee" is a copyright holder, but is > not represented in source It used to be a copyright holder, see: debian/share/pywps/processes/sayhello.py > Files provided in d/patches have a copyright holder that is not present in d/copyright > This appears to be the new maintainer, they should be included in the debian/* paragraph. The patches are trivial and cannot be copyrighted in my understanding. I waive any copyright claims on them if they can be. Depending on how strict ftp-master is on the data files issue, it may be better to just remove this package from Debian as I don't use it myself and just co-maintain it because it's also included in OSGeoLive. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From gitlab at salsa.debian.org Fri Sep 13 17:26:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 16:26:00 +0000 Subject: [Git][debian-gis-team/pywps][master] 2 commits: Update copyright file.
Message-ID: <5d7bc318ab0ad_73482ad95e237eec8212f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pywps Commits: 7dbeae6b by Bas Couwenberg at 2019-09-13T15:58:53Z Update copyright file. Changes: - Fix license & copyright for sayhello.py. - - - - - 7c300c0f by Bas Couwenberg at 2019-09-13T16:19:35Z Set distribution to unstable. - - - - - 2 changed files: - debian/changelog - debian/copyright Changes: ===================================== debian/changelog ===================================== @@ -1,10 +1,13 @@ -pywps (4.2.1-2~exp2) UNRELEASED; urgency=medium +pywps (4.2.1-2) unstable; urgency=medium * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. * Update PIE hardening conditional, trusty is EOL. + * Update copyright file, changes: + - Fix license & copyright for sayhello.py. + * Move from experimental to unstable. - -- Bas Couwenberg Sun, 07 Jul 2019 10:04:01 +0200 + -- Bas Couwenberg Fri, 13 Sep 2019 18:19:15 +0200 pywps (4.2.1-2~exp1) experimental; urgency=medium ===================================== debian/copyright ===================================== @@ -6,13 +6,16 @@ Source: https://github.com/geopython/pywps/releases Files: * Copyright: 2018, Open Source Geospatial Foundation and others 2014-2016, PyWPS Development Team, represented by PyWPS Project Steering Committee - 2016, PyWPS Project Steering Committee License: Expat Files: debian/* Copyright: 2006, Jáchym Čepický License: GPL-2+ +Files: debian/share/pywps/processes/sayhello.py +Copyright: 2016, PyWPS Project Steering Committee +License: Expat + License: Expat Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/0ff2b4f43531fe12ab93605e941f87d7152b6345...7c300c0f30cbc1bc0086a2e8f03e051409660a5f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/0ff2b4f43531fe12ab93605e941f87d7152b6345...7c300c0f30cbc1bc0086a2e8f03e051409660a5f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 17:26:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 16:26:05 +0000 Subject: [Git][debian-gis-team/pywps] Pushed new tag debian/4.2.1-2 Message-ID: <5d7bc31dc8bda_73482ad95f330820821431@godard.mail> Bas Couwenberg pushed new tag debian/4.2.1-2 at Debian GIS Project / pywps -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/tree/debian/4.2.1-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From debian at kitterman.com Fri Sep 13 17:24:54 2019 From: debian at kitterman.com (Scott Kitterman) Date: Fri, 13 Sep 2019 12:24:54 -0400 Subject: Bug#940185: src:pywps: Debian/copyright needs update In-Reply-To: <5a93dc41-c6e7-e678-0748-8c2cb5ac3777@xs4all.nl> References: <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> <5a93dc41-c6e7-e678-0748-8c2cb5ac3777@xs4all.nl> <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> Message-ID: <2082747.z90QcWvzqv@l5580> On Friday, September 13, 2019 11:56:30 AM EDT Sebastiaan Couwenberg wrote: > Hi Scott, > > Thanks for finally reviewing pywps. 
> > On 9/13/19 5:23 PM, Scott Kitterman wrote: > > One of our ftp-trainees reviewed your package and made the following > > observations. > > It seems that the process is broken. > > This is far from the first time where an anonymous ftp-trainee commented > on a package in NEW, but no ftp-master acted on this. > > What process does ftp-master use to review ftp-trainee comments? The delay this time was caused by it being in an incorrectly locked state, not because there was a pending trainee comment. The stale lock problem happens rarely enough that it took me some time to remember how to resolve it. > > Compiled works are present in tests/ and it does not appear this data can > > be rebuilt from the source package provided. Please remove and repack > > the tarball to remove any such artifacts. > > Can you or or the anonymous ftp-trainee clarify which files they > consider to be "Compiled works"? I'll ask, but unfortunately the note didn't include it. If you don't see anything there, then I wouldn't worry about it. > There are data files under tests/data & tests/requests used in various > tests. Why should these need to be rebuilt if those are the files in > question? There's no need to actually rebuild them. We do generally require that it be possible to rebuild them from tools in Debian. Of course the best way to know that you actually can rebuild them is to do so during the package build, but it's not required. > > Data in pywps/schemas/geojson/ has no licensing information. > > Its upstream states: "[...] either of the AFL or BSD license", but not > which version. I've contacted the author to request clarification. Thanks. I took a look and those files seem to be based on https://geojson.org/ geojson-spec.html which is CC BY 3.0 US, so I have doubts. > > Who holds an actual copyright is confusing: > > - LICENSE.txt claims copyright by "PyWPS Development Team" > > - All source claims copyright by "Open Source Geospatial Foundation" > > Why does this matter? > > debian/copyright includes the holders as listed in LICENSE.txt and the > sources. Are you saying that's wrong? > > > - Source also (incorrectly) uses "and others" as a copyright holder > > Why is this incorrect? > > The sources have copyright statements like this: > Copyright 2018 Open Source Geospatial Foundation and others > licensed under MIT, Please consult LICENSE.txt for details > > That's what's reflected in debian/copyright. I've reviewed it more carefully now and I agree with you that it's fine as is. > > - d/copyright claims "PyWPS Project Steering Committee" is a copyright > > holder, but is> > > not represented in source > > It used to be a copyright holder, see: > > debian/share/pywps/processes/sayhello.py > > > Files provided in d/patches have a copyright holder that is not present in > > d/copyright This appears to be the new maintainer, they should be > > included in the debian/* paragraph. > The patches are trivial and cannot be copyrighted in my understanding. I > wave any copyright claims on them if they can. I think that's fine. > Depending on how strict ftp-master is on the data files issue, it may be > better to just remove this package from Debian as I don't use it myself > and just co-maintain it because it's also included in OSGeoLive. I should have checked more carefully before passing on all the note as it's not all correct. The missing license for the schema is correctly serious and should be resolved. 
I can't tell you if it's worth keeping in Debian or not, but I don't think (now that I've looked harder) there's a lot of work to do to resolve this. Scott K From gitlab at salsa.debian.org Fri Sep 13 17:36:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 16:36:04 +0000 Subject: [Git][debian-gis-team/owslib][master] 2 commits: Drop Python 2 support. (closes: #937226) Message-ID: <5d7bc5742a8ce_73482ad95e2d84a082303c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / owslib Commits: 6f7269b7 by Bas Couwenberg at 2019-09-12T17:05:13Z Drop Python 2 support. (closes: #937226) - - - - - ae2aed64 by Bas Couwenberg at 2019-09-13T16:31:25Z Set distribution to unstable. - - - - - 3 changed files: - debian/changelog - debian/control - debian/rules Changes: ===================================== debian/changelog ===================================== @@ -1,8 +1,11 @@ -owslib (0.18.0-2) UNRELEASED; urgency=medium +owslib (0.18.0-2) unstable; urgency=medium + * Team upload. * Bump Standards-Version to 4.4.0, no changes. + * Drop Python 2 support. + (closes: #937226) - -- Bas Couwenberg Wed, 10 Jul 2019 18:47:53 +0200 + -- Bas Couwenberg Fri, 13 Sep 2019 18:26:23 +0200 owslib (0.18.0-1) unstable; urgency=medium ===================================== debian/control ===================================== @@ -6,47 +6,23 @@ Section: python Priority: optional Build-Depends: debhelper (>= 9), dh-python, - python-setuptools, - python-dateutil, - python-pytest, - python-tz, - python-all, - python-requests, - python3-setuptools, + python3-all, python3-dateutil, python3-pytest, - python3-tz, - python3-all, python3-requests, - python3-sphinx + python3-setuptools, + python3-sphinx, + python3-tz Standards-Version: 4.4.0 Vcs-Browser: https://salsa.debian.org/debian-gis-team/owslib Vcs-Git: https://salsa.debian.org/debian-gis-team/owslib.git Homepage: https://geopython.github.com/OWSLib/ -Package: python-owslib -Architecture: all -Depends: ${python:Depends}, - ${misc:Depends}, - python-lxml -Suggests: owslib-doc -Description: Client library for Open Geospatial (OGC) web services (Python 2) - OWSLib is a Python package for client programming with Open Geospatial - Consortium (OGC) web service (hence OWS) interface standards, and their - related content models. - . - Full documentation is available at https://geopython.github.io/OWSLib - . - OWSLib provides a common API for accessing service metadata and wrappers - for numerous OGC Web Service interfaces. - . - This package provides the Python 2 version of the library. 
- Package: python3-owslib Architecture: all -Depends: ${python3:Depends}, - ${misc:Depends}, - python3-lxml +Depends: python3-lxml, + ${python3:Depends}, + ${misc:Depends} Suggests: owslib-doc Description: Client library for Open Geospatial (OGC) web services (Python 3) OWSLib is a Python package for client programming with Open Geospatial ===================================== debian/rules ===================================== @@ -2,7 +2,6 @@ # -*- makefile -*- export PYBUILD_NAME=owslib -export PYBUILD_AFTER_BUILD_python3 = cd docs;make html include /usr/share/dpkg/pkg-info.mk @@ -10,10 +9,15 @@ BUILD_DATE = $(shell LC_ALL=C date -u "+%B %d, %Y" -d "@$(SOURCE_DATE_EPOCH)") SPHINXOPTS = -D today="$(BUILD_DATE)" %: - dh $@ --with python2,python3,sphinxdoc --buildsystem pybuild --parallel - -override_dh_auto_test: -#skipping tests as they require internet access + dh $@ --with python3,sphinxdoc --buildsystem pybuild --parallel override_dh_auto_clean: rm -rf docs/build + +override_dh_auto_build: + dh_auto_build + + cd docs && make html + +override_dh_auto_test: +#skipping tests as they require internet access View it on GitLab: https://salsa.debian.org/debian-gis-team/owslib/compare/3757e6ae8825f9467b3bd0de8b5d4ba63773c32d...ae2aed64ace2d0725e15edc5c271ae209eaa42ff -- View it on GitLab: https://salsa.debian.org/debian-gis-team/owslib/compare/3757e6ae8825f9467b3bd0de8b5d4ba63773c32d...ae2aed64ace2d0725e15edc5c271ae209eaa42ff You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 17:36:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 16:36:08 +0000 Subject: [Git][debian-gis-team/owslib] Pushed new tag debian/0.18.0-2 Message-ID: <5d7bc578d0da8_73482ad95e2d84a082323d@godard.mail> Bas Couwenberg pushed new tag debian/0.18.0-2 at Debian GIS Project / owslib -- View it on GitLab: https://salsa.debian.org/debian-gis-team/owslib/tree/debian/0.18.0-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Fri Sep 13 17:38:21 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 16:38:21 +0000 Subject: Processing of pywps_4.2.1-2_source.changes Message-ID: pywps_4.2.1-2_source.changes uploaded successfully to localhost along with the files: pywps_4.2.1-2.dsc pywps_4.2.1-2.debian.tar.xz pywps_4.2.1-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 13 17:48:40 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 16:48:40 +0000 Subject: Processing of owslib_0.18.0-2_source.changes Message-ID: owslib_0.18.0-2_source.changes uploaded successfully to localhost along with the files: owslib_0.18.0-2.dsc owslib_0.18.0-2.debian.tar.xz owslib_0.18.0-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 13 17:51:34 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 16:51:34 +0000 Subject: owslib_0.18.0-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 13 Sep 2019 18:26:23 +0200 Source: owslib Architecture: source Version: 0.18.0-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 937226 Changes: owslib (0.18.0-2) unstable; urgency=medium . * Team upload. * Bump Standards-Version to 4.4.0, no changes. * Drop Python 2 support. (closes: #937226) Checksums-Sha1: c6172f97b85d1743f692386b4fc6382e29904988 2125 owslib_0.18.0-2.dsc bf57f2300c120dd088e60b431fcf361ef47932e0 5160 owslib_0.18.0-2.debian.tar.xz 6bb7048fe5372f7e7280262c9ea1cb7fa9121d43 8333 owslib_0.18.0-2_amd64.buildinfo Checksums-Sha256: 5f7bfb258026a814787eefc59f7f762170f05d2de1fec03c71e0494d1bf8c63e 2125 owslib_0.18.0-2.dsc 5b3d15f6fbda5a6396c6a560eb032369eea293a7d1b86f14b0e2f4bf11e9b667 5160 owslib_0.18.0-2.debian.tar.xz 5cbc662061242e86356c8ea85ca78ffc0ec75c0cd76d4473f3730097c9411101 8333 owslib_0.18.0-2_amd64.buildinfo Files: 18a023d240bedba1845ee91b0314c0ef 2125 python optional owslib_0.18.0-2.dsc b4135ec4f8124099e3864989cd57e52b 5160 python optional owslib_0.18.0-2.debian.tar.xz 81ccee17900f1424e58319d01c475b59 8333 python optional owslib_0.18.0-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIyBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl17xVcACgkQZ1DxCuiN SvHhXw/2L4SB7Vuoc7DpySXwGbDnS08kcDidua6EO+u1hEfv61uFsG2soHA2fpHC 8nQmA+huZj8uuns9htoengQN30JWYYPk4xHw7ILLo5Ltey9FbAB8Vo0Dxjn/DvEl IrZp9r4vvw2LioTCdnZNnp2pRP6D5o8XmbUeS2HuL/VYMiRDJ8Ag2RFMAbkJxsKs udq5SfgTUD9o4dkVfX8K0ucjUrKyxYMNwxB+JwSopGU8WvhXVp2V+WJK07HPkYY4 fyukSoRgLBTmr/rv02PQqBA0Rllq312D6aibg3S6mwAykfgYHJAvA3hpjdZWp2PJ eKu9MU399w1kr0QI9B9L9REOLGY90YAKrM+OjEb+Iq1nzqTzdoxGaDq7EMS4554B gwM5OQlFpFxs5YYleJmYh8crQXPWR5+miGp/F9D728LID9sDC/rLconNbdkLpVNI WWS33MdMyRJpLwJu1gFJT14H8bf0xsoTpDJjDIWNEIa/9jYCAJE7eJG1NrxMGb34 t1vfp2i6HgB4dAAzFsLVmXLmDDENvtXuHPwE7J6b9igGhcWMK1S53Qk1vfoWwOeW SEKZfX9I+K1esXFmBQo9qjNCgUCz4tEr2fYpOdUsfFmJLVqu3/4KW03t3iDGvz3W 6E3UYx8lAC2TLnzm/K14OeZkEpembRdXYoIBvD/D5w4PLaYeyQ== =rwmu -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
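For context on the sayhello.py file discussed in the copyright review above: it is a small demonstration WPS process shipped under debian/share/pywps/processes/. A minimal process of that kind, written against the PyWPS 4.x API, looks roughly like the sketch below; the class name, identifier and strings are illustrative, not a copy of the packaged file.

    from pywps import Process, LiteralInput, LiteralOutput


    class SayHello(Process):
        def __init__(self):
            # One literal string input and one literal string output.
            inputs = [LiteralInput('name', 'Input name', data_type='string')]
            outputs = [LiteralOutput('response', 'Output response', data_type='string')]
            super(SayHello, self).__init__(
                self._handler,
                identifier='say_hello',
                title='Process Say Hello',
                inputs=inputs,
                outputs=outputs,
                store_supported=True,
                status_supported=True)

        def _handler(self, request, response):
            # Greet the caller using the submitted 'name' input.
            response.outputs['response'].data = 'Hello ' + request.inputs['name'][0].data
            return response

A PyWPS service is then typically instantiated with a list of such process objects, which is why a ready-made example process is shipped alongside the service configuration.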
From ftpmaster at ftp-master.debian.org Fri Sep 13 17:51:43 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 16:51:43 +0000 Subject: pywps_4.2.1-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 13 Sep 2019 18:19:15 +0200 Source: pywps Architecture: source Version: 4.2.1-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: pywps (4.2.1-2) unstable; urgency=medium . * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. * Update PIE hardening conditional, trusty is EOL. * Update copyright file, changes: - Fix license & copyright for sayhello.py. * Move from experimental to unstable. Checksums-Sha1: 2f360c378b36362d7457ecaf5153aa7bb3d52fe0 2356 pywps_4.2.1-2.dsc 72dc9ffe91d60ed87f1d3c31248e98a0234f7fb0 10148 pywps_4.2.1-2.debian.tar.xz d19e5e73adb561d5a40bd71af970fe016eebe937 11638 pywps_4.2.1-2_amd64.buildinfo Checksums-Sha256: 540a18e7d84580b4fb3e73187e49d122ce8e8459bf6f04e47c48f2a023801958 2356 pywps_4.2.1-2.dsc 76e9c87c91f3debdb783643031a79963a4f5d68f0d6fa5d880b5bd06fe0dc837 10148 pywps_4.2.1-2.debian.tar.xz b4e815a9f50a89da7654f376cdf95d72188d85aed254a96caa55f1e1bb5622bb 11638 pywps_4.2.1-2_amd64.buildinfo Files: e0a130a66020b39b6cebb362796f488a 2356 python optional pywps_4.2.1-2.dsc 68026dc009524287de930e0920c658f3 10148 python optional pywps_4.2.1-2.debian.tar.xz e58bc56184825c4667bbb23c2e773fa6 11638 python optional pywps_4.2.1-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl17wwUACgkQZ1DxCuiN SvFQdQ/8CoZFiOmB/Qak+4pPDNsN3CSO0MNpi5spnuxcdE3eGestzosqgr6dITpy SAtk+b7OlgBOMYHsPAv7/bTrbf/onlWzsH5qcEQ9U/iW1aqhUK+1hUAXhiPqjUrx AoOoKnSxDAS3gE53m3SaIVoYXBFEYF+Qe87B97IuiGt5p1ERtz2szT2lLhX5MZwL RGM9r8SJQPH1JfNrKEsynpPhRucANYqJeODo9DQd4vNNIDdqpMYzGaO+24yn8ABj jhyd/mQNQc6TUMw9wJ/XP2dkaeXjTKEDtI01QndstB5oIyZ8MdzlqUinZOzYVNvw BgstD9NOUgtKVi7PYjx7DZhd5Q2w6S5ZWjLe4LxpOuBBjP7PVXv9TxYVI1ACbamV lRbTttc0ctFonoJb7bvoFwQRAmjbnEbtIpo/u6OXNMleu8mntY+TEKBRJU/EJ6T2 nvk/srAMT1qebtVbTTRkS/hit1bRoqYCMJWqP+dvAs+Z6aJPJ3qA2HCJ5DIEex4a pGc+zpvGPj29OVd4RzWDQ+wGmzcL2bjEs26ymsIziGElghAsogUP9iLwP6NUfIS2 fYr3vhiYjGojXCJeq5Rh0aqYeWlpsvcztzATQaQil4P7mcyMc/S2odB41fRxDLGZ 9TxcwUXrDtiy4V+XJsr+NUPT/gYqlnUykQCZ/6yXOMaSVyBLoNg= =6/d7 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Fri Sep 13 17:54:06 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Fri, 13 Sep 2019 16:54:06 +0000 Subject: Bug#937226: marked as done (owslib: Python2 removal in sid/bullseye) References: Message-ID: Your message dated Fri, 13 Sep 2019 16:51:34 +0000 with message-id and subject line Bug#937226: fixed in owslib 0.18.0-2 has caused the Debian Bug report #937226, regarding owslib: Python2 removal in sid/bullseye to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) 
-- 937226: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=937226 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Matthias Klose Subject: owslib: Python2 removal in sid/bullseye Date: Fri, 30 Aug 2019 07:29:52 +0000 Size: 4465 URL: -------------- next part -------------- An embedded message was scrubbed... From: Bas Couwenberg Subject: Bug#937226: fixed in owslib 0.18.0-2 Date: Fri, 13 Sep 2019 16:51:34 +0000 Size: 5088 URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:03 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:03 +0000 Subject: [Git][debian-gis-team/qgis][pristine-tar] 2 commits: pristine-tar data for qgis_3.4.12+dfsg.orig.tar.xz Message-ID: <5d7bd4375657b_73482ad95f330820826216@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / qgis Commits: 4be46b79 by Bas Couwenberg at 2019-09-13T12:40:20Z pristine-tar data for qgis_3.4.12+dfsg.orig.tar.xz - - - - - a956f1f5 by Bas Couwenberg at 2019-09-13T12:50:17Z pristine-tar data for qgis_3.8.3+dfsg.orig.tar.xz - - - - - 4 changed files: - + qgis_3.4.12+dfsg.orig.tar.xz.delta - + qgis_3.4.12+dfsg.orig.tar.xz.id - + qgis_3.8.3+dfsg.orig.tar.xz.delta - + qgis_3.8.3+dfsg.orig.tar.xz.id Changes: ===================================== qgis_3.4.12+dfsg.orig.tar.xz.delta ===================================== Binary files /dev/null and b/qgis_3.4.12+dfsg.orig.tar.xz.delta differ ===================================== qgis_3.4.12+dfsg.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +d375460cb78c69dd0f6d2fd64591d54e850882b5 ===================================== qgis_3.8.3+dfsg.orig.tar.xz.delta ===================================== Binary files /dev/null and b/qgis_3.8.3+dfsg.orig.tar.xz.delta differ ===================================== qgis_3.8.3+dfsg.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +016b03bd812edbbe5458bce5a4fe18c8c75294ca View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/a674f14a08bc9a16126b8cf7b45c637b0732c764...a956f1f5eaf867f88bfee9d28a5db2730d3d9be7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/a674f14a08bc9a16126b8cf7b45c637b0732c764...a956f1f5eaf867f88bfee9d28a5db2730d3d9be7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:10 +0000 Subject: [Git][debian-gis-team/qgis][master] 6 commits: Update watch file to use query string for cache avoidance. Message-ID: <5d7bd43e8539c_73482ad95e2d84a0826417@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / qgis Commits: 593fa849 by Bas Couwenberg at 2019-09-13T12:36:50Z Update watch file to use query string for cache avoidance. - - - - - 187dbc06 by Bas Couwenberg at 2019-09-13T12:37:12Z New upstream version 3.4.12+dfsg - - - - - 4dcd4765 by Bas Couwenberg at 2019-09-13T12:40:21Z Update upstream source from tag 'upstream/3.4.12+dfsg' Update to upstream version '3.4.12+dfsg' with Debian dir 042d9c15d9568095f568e449adbc93eb7ef0ece6 - - - - - c547b448 by Bas Couwenberg at 2019-09-13T12:46:31Z New upstream release. - - - - - 1058415e by Bas Couwenberg at 2019-09-13T15:07:52Z Update symbols for amd64. 
- - - - - 4ba0089a by Bas Couwenberg at 2019-09-13T15:07:52Z Set distribution to experimental. - - - - - 30 changed files: - .ci/travis/scripts/ctest2travis.py - CMakeLists.txt - ChangeLog - debian/changelog - debian/control - debian/libqgis-3d3.4.11.install → debian/libqgis-3d3.4.12.install - debian/libqgis-3d3.4.11.symbols → debian/libqgis-3d3.4.12.symbols - debian/libqgis-analysis3.4.11.install → debian/libqgis-analysis3.4.12.install - debian/libqgis-analysis3.4.11.symbols → debian/libqgis-analysis3.4.12.symbols - debian/libqgis-app3.4.11.install → debian/libqgis-app3.4.12.install - debian/libqgis-app3.4.11.lintian-overrides → debian/libqgis-app3.4.12.lintian-overrides - debian/libqgis-app3.4.11.symbols → debian/libqgis-app3.4.12.symbols - debian/libqgis-core3.4.11.install → debian/libqgis-core3.4.12.install - debian/libqgis-core3.4.11.lintian-overrides → debian/libqgis-core3.4.12.lintian-overrides - debian/libqgis-core3.4.11.symbols → debian/libqgis-core3.4.12.symbols - debian/libqgis-gui3.4.11.install → debian/libqgis-gui3.4.12.install - debian/libqgis-gui3.4.11.symbols → debian/libqgis-gui3.4.12.symbols - debian/libqgis-native3.4.11.install → debian/libqgis-native3.4.12.install - debian/libqgis-native3.4.11.symbols → debian/libqgis-native3.4.12.symbols - debian/libqgis-server3.4.11.install → debian/libqgis-server3.4.12.install - debian/libqgis-server3.4.11.symbols → debian/libqgis-server3.4.12.symbols - debian/libqgisgrass7-3.4.11.install → debian/libqgisgrass7-3.4.12.install - debian/libqgisgrass7-3.4.11.lintian-overrides → debian/libqgisgrass7-3.4.12.lintian-overrides - debian/libqgisgrass7-3.4.11.symbols → debian/libqgisgrass7-3.4.12.symbols - debian/libqgispython3.4.11.install → debian/libqgispython3.4.12.install - debian/libqgispython3.4.11.symbols → debian/libqgispython3.4.12.symbols - debian/watch - doc/TRANSLATORS - i18n/CMakeLists.txt - i18n/qgis_ar.ts The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/7da12bde8ec22bf8a8ffaba54b5d72ec5267313b...4ba0089acd3836deedc48cbaf188315a41e5cbb7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/7da12bde8ec22bf8a8ffaba54b5d72ec5267313b...4ba0089acd3836deedc48cbaf188315a41e5cbb7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:13 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:13 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag debian/3.4.12+dfsg-1_exp1 Message-ID: <5d7bd44137c3e_73482ad95e2d84a0826685@godard.mail> Bas Couwenberg pushed new tag debian/3.4.12+dfsg-1_exp1 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/debian/3.4.12+dfsg-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:14 +0000 Subject: [Git][debian-gis-team/qgis][upstream-ltr] New upstream version 3.4.12+dfsg Message-ID: <5d7bd4425d98d_73482ad95e237eec82687c@godard.mail> Bas Couwenberg pushed to branch upstream-ltr at Debian GIS Project / qgis Commits: 187dbc06 by Bas Couwenberg at 2019-09-13T12:37:12Z New upstream version 3.4.12+dfsg - - - - - 30 changed files: - .ci/travis/scripts/ctest2travis.py - CMakeLists.txt - ChangeLog - debian/changelog - doc/TRANSLATORS - i18n/CMakeLists.txt - i18n/qgis_ar.ts - i18n/qgis_bg.ts - i18n/qgis_bs.ts - i18n/qgis_ca.ts - i18n/qgis_cs.ts - i18n/qgis_da.ts - i18n/qgis_de.ts - i18n/qgis_el.ts - i18n/qgis_en.ts - i18n/qgis_lv.ts → i18n/qgis_eo.ts - i18n/qgis_es.ts - i18n/qgis_eu.ts - i18n/qgis_fi.ts - i18n/qgis_fr.ts - i18n/qgis_gl.ts - i18n/qgis_nb.ts → i18n/qgis_hi.ts - i18n/qgis_hu.ts - i18n/qgis_id.ts - i18n/qgis_is.ts - i18n/qgis_it.ts - i18n/qgis_ja.ts - i18n/qgis_km.ts - i18n/qgis_ko.ts - i18n/qgis_ky.ts The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/commit/187dbc06be9ebb75519bbe86fb9ad06a605b479a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/commit/187dbc06be9ebb75519bbe86fb9ad06a605b479a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:14 +0000 Subject: [Git][debian-gis-team/qgis][upstream] New upstream version 3.8.3+dfsg Message-ID: <5d7bd442838f2_73482ad95f28d42c8270f5@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / qgis Commits: a8c1a861 by Bas Couwenberg at 2019-09-13T12:47:03Z New upstream version 3.8.3+dfsg - - - - - 30 changed files: - .ci/travis/scripts/ctest2travis.py - CMakeLists.txt - ChangeLog - debian/changelog - doc/TRANSLATORS - external/o2/src/o2.cpp - i18n/CMakeLists.txt - i18n/qgis_ar.ts - i18n/qgis_bg.ts - i18n/qgis_bs.ts - i18n/qgis_ca.ts - i18n/qgis_cs.ts - i18n/qgis_da.ts - i18n/qgis_de.ts - i18n/qgis_el.ts - i18n/qgis_en.ts - i18n/qgis_es.ts - i18n/qgis_et.ts - i18n/qgis_eu.ts - i18n/qgis_fi.ts - i18n/qgis_fr.ts - + i18n/qgis_gl.ts - i18n/qgis_hi.ts - i18n/qgis_hu.ts - i18n/qgis_id.ts - i18n/qgis_is.ts - i18n/qgis_it.ts - i18n/qgis_ja.ts - i18n/qgis_ko.ts - i18n/qgis_ky.ts The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/commit/a8c1a861e1c882853d3f978adcb28a2ab7fc3ad8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/commit/a8c1a861e1c882853d3f978adcb28a2ab7fc3ad8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:16 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag upstream/3.4.12+dfsg Message-ID: <5d7bd444b51b_73482ad95f28d42c82725e@godard.mail> Bas Couwenberg pushed new tag upstream/3.4.12+dfsg at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/upstream/3.4.12+dfsg You're receiving this email because of your account on salsa.debian.org. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 13 18:39:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 13 Sep 2019 17:39:16 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag upstream/3.8.3+dfsg Message-ID: <5d7bd4441a3cc_73482ad95f3308208274de@godard.mail> Bas Couwenberg pushed new tag upstream/3.8.3+dfsg at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/upstream/3.8.3+dfsg You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 13 19:06:45 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 18:06:45 +0000 Subject: Processing of qgis_3.4.12+dfsg-1~exp1_amd64.changes Message-ID: qgis_3.4.12+dfsg-1~exp1_amd64.changes uploaded successfully to localhost along with the files: qgis_3.4.12+dfsg-1~exp1.dsc qgis_3.4.12+dfsg.orig.tar.xz qgis_3.4.12+dfsg-1~exp1.debian.tar.xz libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-3d3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-analysis3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-app3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-core3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-customwidgets_3.4.12+dfsg-1~exp1_amd64.deb libqgis-dev_3.4.12+dfsg-1~exp1_amd64.deb libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-gui3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-native3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgis-server3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgisgrass7-3.4.12_3.4.12+dfsg-1~exp1_amd64.deb libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb libqgispython3.4.12_3.4.12+dfsg-1~exp1_amd64.deb python3-qgis-common_3.4.12+dfsg-1~exp1_all.deb python3-qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb python3-qgis_3.4.12+dfsg-1~exp1_amd64.deb qgis-api-doc_3.4.12+dfsg-1~exp1_all.deb qgis-common_3.4.12+dfsg-1~exp1_all.deb qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb qgis-plugin-grass-common_3.4.12+dfsg-1~exp1_all.deb qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb qgis-plugin-grass_3.4.12+dfsg-1~exp1_amd64.deb qgis-provider-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb qgis-provider-grass_3.4.12+dfsg-1~exp1_amd64.deb qgis-providers-common_3.4.12+dfsg-1~exp1_all.deb qgis-providers-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb qgis-providers_3.4.12+dfsg-1~exp1_amd64.deb qgis-server-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb qgis-server_3.4.12+dfsg-1~exp1_amd64.deb qgis_3.4.12+dfsg-1~exp1_amd64.buildinfo qgis_3.4.12+dfsg-1~exp1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 13 19:12:45 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 13 Sep 2019 18:12:45 +0000 Subject: qgis_3.4.12+dfsg-1~exp1_amd64.changes is NEW Message-ID: binary:libqgis-3d3.4.12 is NEW. binary:libqgis-analysis3.4.12 is NEW. binary:libqgis-app3.4.12 is NEW. binary:libqgis-core3.4.12 is NEW. binary:libqgis-gui3.4.12 is NEW. binary:libqgis-native3.4.12 is NEW. 
binary:libqgis-server3.4.12 is NEW. binary:libqgisgrass7-3.4.12 is NEW. binary:libqgispython3.4.12 is NEW. binary:libqgis-app3.4.12 is NEW. binary:libqgis-3d3.4.12 is NEW. binary:libqgispython3.4.12 is NEW. binary:libqgis-core3.4.12 is NEW. binary:libqgis-native3.4.12 is NEW. binary:libqgis-analysis3.4.12 is NEW. binary:libqgis-server3.4.12 is NEW. binary:libqgisgrass7-3.4.12 is NEW. binary:libqgis-gui3.4.12 is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. [1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From noreply at release.debian.org Sat Sep 14 05:39:17 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 14 Sep 2019 04:39:17 +0000 Subject: grass 7.8.0-1 MIGRATED to testing Message-ID: FYI: The status of the grass source package in Debian's testing distribution has changed. Previous version: 7.6.1-3 Current version: 7.8.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sat Sep 14 05:39:17 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 14 Sep 2019 04:39:17 +0000 Subject: libgdal-grass 2.4.2-3 MIGRATED to testing Message-ID: FYI: The status of the libgdal-grass source package in Debian's testing distribution has changed. Previous version: 2.4.2-1 Current version: 2.4.2-3 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sat Sep 14 05:39:19 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 14 Sep 2019 04:39:19 +0000 Subject: qgis 3.4.11+dfsg-2 MIGRATED to testing Message-ID: FYI: The status of the qgis source package in Debian's testing distribution has changed. Previous version: 3.4.11+dfsg-1 Current version: 3.4.11+dfsg-2 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sat Sep 14 05:39:20 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 14 Sep 2019 04:39:20 +0000 Subject: routino 3.3.1-1 MIGRATED to testing Message-ID: FYI: The status of the routino source package in Debian's testing distribution has changed. Previous version: 3.2-5 Current version: 3.3.1-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
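The qgis upload above sits in NEW because its shared-library binary packages carry the upstream version in their names (libqgis-core3.4.11 becomes libqgis-core3.4.12, and so on), so every new upstream release introduces previously unseen binary package names. The debian/*.symbols files renamed along with them, and the "Update symbols for amd64" commit, concern the symbols tracking that dpkg-shlibdeps uses to compute versioned library dependencies. As a rough illustration of the deb-symbols(5) layout only, with library, package and symbol names made up rather than taken from the qgis packaging:

    libexample.so.1 libexample1 #MINVER#
     example_init@Base 1.0.0
     example_render@Base 1.2.0
     (c++)"Example::reset()@Base" 1.2.0

Each entry records the first package version that exported the symbol; when a new upstream release adds or removes symbols, dpkg-gensymbols reports the difference at build time and the maintainer folds it back into the symbols file, which is what the "Update symbols" commits in these uploads do.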
From noreply at release.debian.org Sat Sep 14 05:39:19 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 14 Sep 2019 04:39:19 +0000 Subject: pyepr 1.0.0-1 MIGRATED to testing Message-ID: FYI: The status of the pyepr source package in Debian's testing distribution has changed. Previous version: 0.9.5-3 Current version: 1.0.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Sat Sep 14 07:04:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:04:57 +0000 Subject: [Git][debian-gis-team/pywps][master] 2 commits: Add Breaks/Replaces to fix upgrade to python3-pywps. Message-ID: <5d7c83094a422_73483fbbb287965c88549d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pywps Commits: 33709111 by Bas Couwenberg at 2019-09-14T05:50:41Z Add Breaks/Replaces to fix upgrade to python3-pywps. - - - - - de8f0a46 by Bas Couwenberg at 2019-09-14T05:50:41Z Set distribution to unstable. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pywps (4.2.1-3) unstable; urgency=medium + + * Add Breaks/Replaces to fix upgrade to python3-pywps. + + -- Bas Couwenberg Sat, 14 Sep 2019 07:47:13 +0200 + pywps (4.2.1-2) unstable; urgency=medium * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -47,6 +47,8 @@ Recommends: python3-mapscript, python3-pyproj Suggests: grass-core, r-base +Breaks: python-pywps (<< 4.2.1-2~) +Replaces: python-pywps (<< 4.2.1-2~) Description: Implementation of OGC's Web Processing Service - Python module PyWPS is implementation of Web Processing Service from Open Geospatial Consortium Inc.(R) with help of Python Programming View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/7c300c0f30cbc1bc0086a2e8f03e051409660a5f...de8f0a46a53f1d66ff18e53f542d04a22fb4cceb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/7c300c0f30cbc1bc0086a2e8f03e051409660a5f...de8f0a46a53f1d66ff18e53f542d04a22fb4cceb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 07:05:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:05:02 +0000 Subject: [Git][debian-gis-team/pywps] Pushed new tag debian/4.2.1-3 Message-ID: <5d7c830ea558d_73482ad95deb8ce4885634@godard.mail> Bas Couwenberg pushed new tag debian/4.2.1-3 at Debian GIS Project / pywps -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/tree/debian/4.2.1-3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sat Sep 14 07:13:55 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 06:13:55 +0000 Subject: Processing of pywps_4.2.1-3_source.changes Message-ID: pywps_4.2.1-3_source.changes uploaded successfully to localhost along with the files: pywps_4.2.1-3.dsc pywps_4.2.1-3.debian.tar.xz pywps_4.2.1-3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 14 07:19:59 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 06:19:59 +0000 Subject: pywps_4.2.1-3_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 14 Sep 2019 07:47:13 +0200 Source: pywps Architecture: source Version: 4.2.1-3 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: pywps (4.2.1-3) unstable; urgency=medium . * Add Breaks/Replaces to fix upgrade to python3-pywps. Checksums-Sha1: d90f03955350724dc9e7cac25f014d0015bd80f0 2356 pywps_4.2.1-3.dsc da9ec5cc64c0dd5d08f82b14bc7d8f730e30121e 10184 pywps_4.2.1-3.debian.tar.xz a45e3d0755ee765fb8410d3e4782d74a872d2445 11638 pywps_4.2.1-3_amd64.buildinfo Checksums-Sha256: b4a845e0ad78b180864acd8158ce5eb073ccc013d145c9cd1c726d594aac6438 2356 pywps_4.2.1-3.dsc a5016eaef7a9d94ad0ec4846577896cefa346c41e6b2e33ecb17aee24ea9af65 10184 pywps_4.2.1-3.debian.tar.xz 77e0e5afd3bbf8fe9de9020bd2a6664a3b2394353069ef85d0702e29b7bf0ef4 11638 pywps_4.2.1-3_amd64.buildinfo Files: 57a562ba209423705e32854cd38ce9eb 2356 python optional pywps_4.2.1-3.dsc 696bf696bcb44e13d0a9058af472f10c 10184 python optional pywps_4.2.1-3.debian.tar.xz 27602a92b1bafc2a4518c9e4e274bf02 11638 python optional pywps_4.2.1-3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl18gvQACgkQZ1DxCuiN SvHrwRAAuc05iKXjtlmuWenRe2nBhtbFmeBMPBvQahMbWlarKEZ3rPQBfUQcDqsW 5s8VMLtjwXzK+1QccB9jigrYKUpwaurInJnniRQkUfFaXInrFUsSWTFwgs+4P7m1 YcczLwfc08zf/QKF896EVYImba/g5hUDIXROJ/YmYKvVJsfTq37a601XDbSDSg2r vmXZDMJWOi0cr7s2UdbE4Ck+hzoD8hZggARkK+JirRBx9L/ZuJQ+/t6fPxykhuy9 6v948BoeLs6lZ//OlLX737E9Gubh2dXg6Pfco5iqN30Xp+Mg3D0SRMXQDLP67MGz gaBwA8w5VJRnDfpHGXwEWqUm/I5119xp4Ko0/tpU2tW0hGAdRnmwDoUdXp5pwtBt TikQN+R0JLh+76HNw5wYprg5LkOwJQONbZn3ScD83aQxmjwoMhmsjXx/US/9xfU1 7GtmfnciuBX7z2gl/VUMY0noCRnURHkjebRvdrAFli/zr/PgjnMFZh0FobkFzNrd wcB+3Cw05MBPNlQ0+rRRb1SagFmcfmRQM2Pz+YR8NCVXP8JFxJVJlWUbWrfJwj8R nzj2GzO4isPiI0ebsq36G7O2sgWLGXFD/FWXzeli37E8w0A7zZSUsXxm5vaclwJf vLkTAhvwd913AUa48sax8mJnq/A2kMTv3gIpWa/ypD3fyP94LkU= =9SmV -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 14 07:36:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:36:26 +0000 Subject: [Git][debian-gis-team/mapserver][master] 6 commits: New upstream version 7.4.2 Message-ID: <5d7c8a6addd35_73482ad95deb8ce4886332@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapserver Commits: 96263bc5 by Bas Couwenberg at 2019-09-14T05:49:01Z New upstream version 7.4.2 - - - - - a87066ed by Bas Couwenberg at 2019-09-14T05:52:32Z Merge tag 'upstream/7.4.2' Upstream version 7.4.2 - - - - - bb4491ca by Bas Couwenberg at 2019-09-14T05:52:47Z New upstream release. - - - - - 2a559851 by Bas Couwenberg at 2019-09-14T06:10:29Z Update 7.4.1 symbols for other architectures. 
- - - - - b0b74dbe by Bas Couwenberg at 2019-09-14T06:23:34Z Update 7.4.2 symbols for amd64. - - - - - 7b8bd6d8 by Bas Couwenberg at 2019-09-14T06:23:34Z Set distribution to unstable. - - - - - 30 changed files: - CMakeLists.txt - HISTORY.TXT - Makefile - + ci/travis/after_success.sh - + ci/travis/before_install.sh - + ci/travis/script.sh - debian/changelog - debian/libmapserver2.symbols - mapcopy.c - mapfile.c - mapgdal.c - maplabel.c - maplayer.c - maplexer.c - maplexer.l - mapmetadata.c - mapmssql2008.c - mapogcsld.c - mapogcsld.h - mapogcsos.c - mapogr.cpp - mapogroutput.c - mapows.c - mapparser.c - mapparser.h - mapparser.y - mapquery.c - mapresample.c - mapscript/phpng/CMakeLists.txt - mapsymbol.c The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/compare/53253bf6c627fd6b1a57ef304a9a346b1ed91f01...7b8bd6d84e4d713e35c25c4d95652469ab928b13 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/compare/53253bf6c627fd6b1a57ef304a9a346b1ed91f01...7b8bd6d84e4d713e35c25c4d95652469ab928b13 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 07:36:28 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:36:28 +0000 Subject: [Git][debian-gis-team/mapserver][pristine-tar] pristine-tar data for mapserver_7.4.2.orig.tar.gz Message-ID: <5d7c8a6c1ec13_73482ad95f8486c8886596@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / mapserver Commits: 1543d5e1 by Bas Couwenberg at 2019-09-14T05:49:20Z pristine-tar data for mapserver_7.4.2.orig.tar.gz - - - - - 2 changed files: - + mapserver_7.4.2.orig.tar.gz.delta - + mapserver_7.4.2.orig.tar.gz.id Changes: ===================================== mapserver_7.4.2.orig.tar.gz.delta ===================================== Binary files /dev/null and b/mapserver_7.4.2.orig.tar.gz.delta differ ===================================== mapserver_7.4.2.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +b1dea045d8b9c44934a79944ba21ca97422b2332 View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/commit/1543d5e1b4261d0b4c78c543937d7f614bec0851 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/commit/1543d5e1b4261d0b4c78c543937d7f614bec0851 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 14 07:36:28 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:36:28 +0000 Subject: [Git][debian-gis-team/mapserver][upstream] New upstream version 7.4.2 Message-ID: <5d7c8a6c40de0_73482ad95deb8ce48867ea@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / mapserver Commits: 96263bc5 by Bas Couwenberg at 2019-09-14T05:49:01Z New upstream version 7.4.2 - - - - - 30 changed files: - CMakeLists.txt - HISTORY.TXT - Makefile - + ci/travis/after_success.sh - + ci/travis/before_install.sh - + ci/travis/script.sh - mapcopy.c - mapfile.c - mapgdal.c - maplabel.c - maplayer.c - maplexer.c - maplexer.l - mapmetadata.c - mapmssql2008.c - mapogcsld.c - mapogcsld.h - mapogcsos.c - mapogr.cpp - mapogroutput.c - mapows.c - mapparser.c - mapparser.h - mapparser.y - mapquery.c - mapresample.c - mapscript/phpng/CMakeLists.txt - mapsymbol.c - maputil.c - mapwcs.c The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/commit/96263bc52f3a085383f989834204c5b801894c3b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/commit/96263bc52f3a085383f989834204c5b801894c3b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 07:36:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:36:31 +0000 Subject: [Git][debian-gis-team/mapserver] Pushed new tag debian/7.4.2-1 Message-ID: <5d7c8a6fde583_73483fbbb287965c8869f0@godard.mail> Bas Couwenberg pushed new tag debian/7.4.2-1 at Debian GIS Project / mapserver -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/tree/debian/7.4.2-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 07:36:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 06:36:32 +0000 Subject: [Git][debian-gis-team/mapserver] Pushed new tag upstream/7.4.2 Message-ID: <5d7c8a70c97f5_73483fbbb287965c887144@godard.mail> Bas Couwenberg pushed new tag upstream/7.4.2 at Debian GIS Project / mapserver -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/tree/upstream/7.4.2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sat Sep 14 07:45:30 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 06:45:30 +0000 Subject: Processing of mapserver_7.4.2-1_source.changes Message-ID: mapserver_7.4.2-1_source.changes uploaded successfully to localhost along with the files: mapserver_7.4.2-1.dsc mapserver_7.4.2.orig.tar.gz mapserver_7.4.2-1.debian.tar.xz mapserver_7.4.2-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 14 08:06:19 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 07:06:19 +0000 Subject: mapserver_7.4.2-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 14 Sep 2019 07:55:23 +0200 Source: mapserver Architecture: source Version: 7.4.2-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: mapserver (7.4.2-1) unstable; urgency=medium . * New upstream release. * Update 7.4.1 symbols for other architectures. * Update 7.4.2 symbols for amd64. Checksums-Sha1: d48c4da141a45bd77f337e2963e68740c49dd369 3231 mapserver_7.4.2-1.dsc e389fab4de9f09a1155c90c1e9848e07fec13967 2688497 mapserver_7.4.2.orig.tar.gz 1e04840f65ff725357f7acf8ecab8bcf572a61ba 47004 mapserver_7.4.2-1.debian.tar.xz 297c40321b345b1ed27b17125087b8aa3ffb197b 22808 mapserver_7.4.2-1_amd64.buildinfo Checksums-Sha256: c5a8cfe7598d679585333282a4218ddf88145ba313fe4e04316617706532e71a 3231 mapserver_7.4.2-1.dsc e15497eca57768932822d9f8524fef0b5e328df05b6a1c30bd73f5e5b3e4125d 2688497 mapserver_7.4.2.orig.tar.gz 9815baa7d892d6700a0757ae31f3ad8c5e6b88309ff783a2022103509f753e71 47004 mapserver_7.4.2-1.debian.tar.xz eb426f3610cc053c550ea98dd69ac3715de8ed07b3bd56873a015e93d4f57baf 22808 mapserver_7.4.2-1_amd64.buildinfo Files: 33a61f1cee1299c524ac19feaa9fad0f 3231 devel optional mapserver_7.4.2-1.dsc 500c44fa1954af0c6a201ff4cae5235b 2688497 devel optional mapserver_7.4.2.orig.tar.gz b97a641aef4d21942fabba51dfaf548b 47004 devel optional mapserver_7.4.2-1.debian.tar.xz 7d6ca59ab8615f076b372d24c4951f86 22808 devel optional mapserver_7.4.2-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl18ilIACgkQZ1DxCuiN SvF9Sw/+IyY3giJJFieE4XLGY+zMAIxkA56do4m07BDzraDFV3y5kIuSJAosd9AH vTaRs1x9JyppEKtE39f4gAYIdmpQXcJipl85fZfcf81RuWjiH+rmdGqMMyDadzEd 0uK26SfuN1G1Oc3FLuwmAnrBqlw6ayXmklqawkzxvN6nIrC7S49AFAmP6FqrNasd mVKKpgt8LmM9AFj+wAvr5S0R20Ab1oqfA+tRgP0QxyY4kZS7kWqzixYAE0I6EeEp 1/hvMERsdPjWAaJ8trzZhGGoAhGZ2Cxqb3Gpz/V4elb5PsP4PE7FZSiCzHGJbWb9 vEWCrVgezAnGaYZHhcbP6rMTt+YDs+rXSMTuup3LYh+6p+HpKRDYLOMmDRF81D++ FTdjJEiuopk+VVe6PW02S/2O6BE9PG3TWsJMsO53+3un6KvvyvN05Lt8XiPBmGFU r9lgmp+e6wju0DWqlcflL4qhiK+UCVYwaAzDzzmAoGCxPn4XjCP6L9egoubR5o2G SvNikL+HfGlcOwr4iuKfzjPA7WoUuNuZNAoZ84nm/Z5G+lLwg/v5slSrIwNMuhxQ teLJS2ilTVj8ldf736fUGZdNY/W3rOB9MsvcrKZVUAsHij+tbM0MSl3SPyvreflE UuuogsvyCRBumQEbCTG/UyUfpew1IUxdOTWznVxgI+mfPA0QUZA= =JSPE -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
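One detail of the pywps 4.2.1-3 upload above that is easy to miss: the only change is the addition of Breaks/Replaces against the old python-pywps package. When files that used to be shipped by one binary package (here the dropped Python 2 package) are taken over by another (python3-pywps), Replaces allows dpkg to overwrite those files during the upgrade instead of aborting, and Breaks makes apt upgrade or remove the obsolete package rather than leave both installed. The versioned form from the diff,

    Breaks: python-pywps (<< 4.2.1-2~)
    Replaces: python-pywps (<< 4.2.1-2~)

only applies to versions older than the one in which the files moved, so a hypothetical future python-pywps rebuilt without the overlapping files would not be affected.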
From gitlab at salsa.debian.org Sat Sep 14 18:43:29 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 14 Sep 2019 17:43:29 +0000 Subject: [Git][debian-gis-team/pyresample][pristine-tar] pristine-tar data for pyresample_1.13.0.orig.tar.gz Message-ID: <5d7d26c11cff2_73482ad9614b78909290c3@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / pyresample Commits: 0163f0db by Antonio Valentino at 2019-09-14T06:53:26Z pristine-tar data for pyresample_1.13.0.orig.tar.gz - - - - - 2 changed files: - + pyresample_1.13.0.orig.tar.gz.delta - + pyresample_1.13.0.orig.tar.gz.id Changes: ===================================== pyresample_1.13.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/pyresample_1.13.0.orig.tar.gz.delta differ ===================================== pyresample_1.13.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +ab5728fe151d9bdb49416333439cdb5246abde59 View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/0163f0db738536fe7f267c2cf922eb2136b03ec5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/0163f0db738536fe7f267c2cf922eb2136b03ec5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 18:43:27 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 14 Sep 2019 17:43:27 +0000 Subject: [Git][debian-gis-team/pyresample][master] 6 commits: New upstream version 1.13.0 Message-ID: <5d7d26bfeeabd_73482ad9614b789092888e@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyresample Commits: b62672cc by Antonio Valentino at 2019-09-14T06:53:12Z New upstream version 1.13.0 - - - - - 3dfd5f34 by Antonio Valentino at 2019-09-14T06:53:26Z Update upstream source from tag 'upstream/1.13.0' Update to upstream version '1.13.0' with Debian dir e2ef1b13da07dee3aa8ff9496c239c82cf2304b4 - - - - - 6f4ca2b5 by Antonio Valentino at 2019-09-14T07:00:48Z New upstream release - - - - - ea5ccbaf by Antonio Valentino at 2019-09-14T07:12:50Z Refresh all patches - - - - - e614a152 by Antonio Valentino at 2019-09-14T17:36:03Z Fix compatibility with dask and basemap - - - - - 44d53d73 by Antonio Valentino at 2019-09-14T17:36:50Z Set distribution to unstable - - - - - 30 changed files: - .travis.yml - CHANGELOG.md - appveyor.yml - debian/changelog - debian/patches/0001-fix-proj4-initialization.patch - debian/patches/0002-Skip-dask-related-tests-if-dask-is-not-available.patch - debian/patches/0003-Make-xarray-optional-for-testing.patch - + debian/patches/0005-Comapt-with-dask-1.0.patch - + debian/patches/0006-Skip-test-on-deprecatet-basemap.patch - debian/patches/series - docs/source/API.rst - docs/source/conf.py - docs/source/geo_def.rst - docs/source/geometry_utils.rst - docs/source/index.rst - docs/source/plot.rst - docs/source/swath.rst - pyresample/_spatial_mp.py - pyresample/area_config.py - pyresample/bilinear/xarr.py - + pyresample/bucket/__init__.py - pyresample/ewa/_fornav.cpp - pyresample/ewa/_fornav_templates.cpp - pyresample/ewa/_ll2cr.c - pyresample/geometry.py - pyresample/test/__init__.py - pyresample/test/test_bilinear.py - + pyresample/test/test_bucket.py - pyresample/test/test_geometry.py - pyresample/test/test_grid.py The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/compare/3a94bc4893e6beab6b3f3de4b36c9b7ebe33a167...44d53d73f4a56ca27fed046c8baff194f4b4b3d6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/compare/3a94bc4893e6beab6b3f3de4b36c9b7ebe33a167...44d53d73f4a56ca27fed046c8baff194f4b4b3d6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 18:44:15 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 14 Sep 2019 17:44:15 +0000 Subject: [Git][debian-gis-team/pyresample] Pushed new tag upstream/1.13.0 Message-ID: <5d7d26ef78705_73483fbbb4a7cdbc9293ca@godard.mail> Antonio Valentino pushed new tag upstream/1.13.0 at Debian GIS Project / pyresample -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/tree/upstream/1.13.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 14 18:44:27 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 14 Sep 2019 17:44:27 +0000 Subject: [Git][debian-gis-team/pyresample][upstream] New upstream version 1.13.0 Message-ID: <5d7d26fb2c14d_73482ad9614b789092941f@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / pyresample Commits: b62672cc by Antonio Valentino at 2019-09-14T06:53:12Z New upstream version 1.13.0 - - - - - 29 changed files: - .travis.yml - CHANGELOG.md - appveyor.yml - docs/source/API.rst - docs/source/conf.py - docs/source/geo_def.rst - docs/source/geometry_utils.rst - docs/source/index.rst - docs/source/plot.rst - docs/source/swath.rst - pyresample/_spatial_mp.py - pyresample/area_config.py - pyresample/bilinear/xarr.py - + pyresample/bucket/__init__.py - pyresample/ewa/_fornav.cpp - pyresample/ewa/_fornav_templates.cpp - pyresample/ewa/_ll2cr.c - pyresample/geometry.py - pyresample/test/__init__.py - pyresample/test/test_bilinear.py - + pyresample/test/test_bucket.py - pyresample/test/test_geometry.py - pyresample/test/test_grid.py - pyresample/test/test_spatial_mp.py - pyresample/test/test_utils.py - pyresample/test/utils.py - pyresample/utils/_proj4.py - pyresample/version.py - setup.py Changes: ===================================== .travis.yml ===================================== @@ -1,7 +1,7 @@ language: python env: global: - - PYTHON_VERSION=$PYTHON_VERSION + - PYTHON_VERSION=$TRAVIS_PYTHON_VERSION - NUMPY_VERSION=stable - MAIN_CMD='python setup.py' - CONDA_DEPENDENCIES='xarray dask toolz Cython pykdtree sphinx cartopy rasterio pillow matplotlib @@ -10,7 +10,7 @@ env: - EVENT_TYPE='push pull_request' - SETUP_CMD='test' - CONDA_CHANNELS='conda-forge' - - CONDA_CHANNEL_PRIORITY='True' + - CONDA_CHANNEL_PRIORITY='strict' matrix: include: - env: PYTHON_VERSION=2.7 @@ -18,19 +18,25 @@ matrix: - env: PYTHON_VERSION=2.7 os: osx language: generic - - env: PYTHON_VERSION=3.6 + - env: PYTHON_VERSION=3.7 os: linux - - env: PYTHON_VERSION=3.6 + - env: PYTHON_VERSION=3.7 os: osx language: generic + - env: PYTHON_VERSION=3.7 + os: windows + language: c + allow_failures: + - os: windows install: -- git clone --depth 1 git://github.com/astropy/ci-helpers.git +# - git clone --depth 1 git://github.com/astropy/ci-helpers.git +- git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git - source ci-helpers/travis/setup_conda.sh script: - 
coverage run --source=pyresample setup.py test - cd docs && mkdir doctest && sphinx-build -E -n -b doctest ./source ./doctest && cd .. after_success: -- if [[ $PYTHON_VERSION == 3.6 ]]; then coveralls; codecov; fi +- if [[ $PYTHON_VERSION == 3.7 ]]; then coveralls; codecov; fi deploy: - provider: pypi user: dhoese ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,40 @@ +## Version 1.13.0 (2019/09/13) + +### Issues Closed + +* [Issue 210](https://github.com/pytroll/pyresample/issues/210) - Incompatibility with new proj/pyproj versions + +In this release 1 issue was closed. + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 213](https://github.com/pytroll/pyresample/pull/213) - Remove extra conversion to dask array +* [PR 208](https://github.com/pytroll/pyresample/pull/208) - Bugfix bilinear resampler masking ([735](https://github.com/pytroll/satpy/issues/735)) +* [PR 207](https://github.com/pytroll/pyresample/pull/207) - Make output index tiling in bilinear interpolation work with dask +* [PR 205](https://github.com/pytroll/pyresample/pull/205) - Exclude NaNs from Bucket Average +* [PR 197](https://github.com/pytroll/pyresample/pull/197) - Fix to_cartopy_crs for latlong projections +* [PR 196](https://github.com/pytroll/pyresample/pull/196) - Improve handling of EPSG codes with pyproj 2.0+ + +#### Features added + +* [PR 212](https://github.com/pytroll/pyresample/pull/212) - Use slices in bilinear resampler +* [PR 203](https://github.com/pytroll/pyresample/pull/203) - Add Numpy version limitation for Python 2 +* [PR 198](https://github.com/pytroll/pyresample/pull/198) - Clarify warning if no overlap data and projection +* [PR 196](https://github.com/pytroll/pyresample/pull/196) - Improve handling of EPSG codes with pyproj 2.0+ +* [PR 192](https://github.com/pytroll/pyresample/pull/192) - Add bucket resampling + +#### Documentation changes + +* [PR 204](https://github.com/pytroll/pyresample/pull/204) - Add Example for Regular Lat-Lon Grid +* [PR 201](https://github.com/pytroll/pyresample/pull/201) - fix bug in plot example code +* [PR 198](https://github.com/pytroll/pyresample/pull/198) - Clarify warning if no overlap data and projection +* [PR 195](https://github.com/pytroll/pyresample/pull/195) - Update docs for create_area_def and improve AreaDefinition property consistency + +In this release 15 pull requests were closed. + + ## Version 1.12.3 (2019/05/17) ### Pull Requests Merged ===================================== appveyor.yml ===================================== @@ -14,10 +14,9 @@ environment: NUMPY_VERSION: "stable" install: - - "git clone --depth 1 git://github.com/astropy/ci-helpers.git" + - "git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git" - "powershell ci-helpers/appveyor/install-miniconda.ps1" - - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%" - - "activate test" + - "conda activate test" build: false # Not a C# project, build stuff at the test step instead. ===================================== docs/source/API.rst ===================================== @@ -31,6 +31,11 @@ pyresample.utils .. automodule:: pyresample.utils :members: +pyresample.area_config +---------------------- +.. automodule:: pyresample.area_config + :members: + pyresample.data_reduce ---------------------- .. automodule:: pyresample.data_reduce @@ -45,3 +50,8 @@ pyresample.ewa -------------- .. automodule:: pyresample.ewa :members: + +pyresample.bucket +----------------- +.. 
automodule:: pyresample.bucket + :members: ===================================== docs/source/conf.py ===================================== @@ -237,5 +237,5 @@ intersphinx_mapping = { 'pyresample': ('https://pyresample.readthedocs.io/en/stable', None), 'trollsift': ('https://trollsift.readthedocs.io/en/stable', None), 'trollimage': ('https://trollimage.readthedocs.io/en/stable', None), - 'proj4': ('https://proj4.org', None), + 'proj4': ('https://proj.org', None), } ===================================== docs/source/geo_def.rst ===================================== @@ -51,60 +51,48 @@ where * **upper_right_x**: projection x coordinate of upper right corner of upper right pixel * **upper_right_y**: projection y coordinate of upper right corner of upper right pixel -Below are three examples of creating an ``AreaDefinition``: +Example: .. doctest:: >>> from pyresample.geometry import AreaDefinition - - >>> # a) Using a projection dictionary >>> area_id = 'ease_sh' >>> description = 'Antarctic EASE grid' >>> proj_id = 'ease_sh' - >>> proj_dict = {'proj': 'laea', 'lat_0': -90, 'lon_0': 0, 'a': 6371228.0, 'units': 'm'} + >>> projection = {'proj': 'laea', 'lat_0': -90, 'lon_0': 0, 'a': 6371228.0, 'units': 'm'} >>> width = 425 >>> height = 425 >>> area_extent = (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) - >>> area_def = AreaDefinition(area_id, description, proj_id, proj_dict, + >>> area_def = AreaDefinition(area_id, description, proj_id, projection, ... width, height, area_extent) - >>> print(area_def) + >>> area_def Area ID: ease_sh Description: Antarctic EASE grid Projection ID: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) - >>> # b) Using an explicit proj4 string - >>> proj_string = '+proj=laea +lat_0=-90 +lon_0=0 +a=6371228.0 +units=m' - >>> area_def = AreaDefinition(area_id, description, proj_id, proj_string, +You can also specify the projection using a PROJ.4 string + +.. doctest:: + + >>> projection = '+proj=laea +lat_0=-90 +lon_0=0 +a=6371228.0 +units=m' + >>> area_def = AreaDefinition(area_id, description, proj_id, projection, ... width, height, area_extent) - >>> print(area_def) - Area ID: ease_sh - Description: Antarctic EASE grid - Projection ID: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} - Number of columns: 425 - Number of rows: 425 - Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) - >>> # c) Using an EPSG code in a proj4 string - >>> proj_string = '+init=EPSG:3409' # Use 'EPSG:3409' with pyproj 2.0+ - >>> area_def = AreaDefinition(area_id, description, proj_id, proj_string, +or an `EPSG code `_: + +.. doctest:: + + >>> projection = '+init=EPSG:3409' # Use 'EPSG:3409' with pyproj 2.0+ + >>> area_def = AreaDefinition(area_id, description, proj_id, projection, ... width, height, area_extent) - >>> print(area_def) - Area ID: ease_sh - Description: Antarctic EASE grid - Projection ID: ease_sh - Projection: {'init': 'EPSG:3409'} - Number of columns: 425 - Number of rows: 425 - Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) .. 
note:: - When using pyproj 2.0+, please use the new ``'EPSG:XXXX'`` syntax + With pyproj 2.0+ please use the new ``'EPSG:XXXX'`` syntax as the old ``'+init=EPSG:XXXX'`` is no longer supported. Creating an ``AreaDefinition`` can be complex if you don't know everything ===================================== docs/source/geometry_utils.rst ===================================== @@ -11,7 +11,7 @@ AreaDefinition Creation The main utility function for creating :class:`~pyresample.geometry.AreaDefinition` objects is the -:func:`~pyresample.utils.create_area_def` function. This function will take +:func:`~pyresample.area_config.create_area_def` function. This function will take whatever information can be provided to describe a geographic region and create a valid ``AreaDefinition`` object if possible. If it can't make a fully specified ``AreaDefinition`` then it will provide a @@ -46,17 +46,17 @@ and optional arguments: .. doctest:: - >>> from pyresample import utils + >>> from pyresample import create_area_def >>> area_id = 'ease_sh' >>> proj_dict = {'proj': 'laea', 'lat_0': -90, 'lon_0': 0, 'a': 6371228.0, 'units': 'm'} >>> center = (0, 0) >>> radius = (5326849.0625, 5326849.0625) >>> resolution = (25067.525, 25067.525) - >>> area_def = utils.create_area_def(area_id, proj_dict, center=center, radius=radius, resolution=resolution) + >>> area_def = create_area_def(area_id, proj_dict, center=center, radius=radius, resolution=resolution) >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -68,12 +68,12 @@ keyword arguments can be specified with one value if ``dx == dy``: .. doctest:: >>> proj_string = '+proj=laea +lat_0=-90 +lon_0=0 +a=6371228.0 +units=m' - >>> area_def = utils.create_area_def(area_id, proj_string, center=center, + >>> area_def = create_area_def(area_id, proj_string, center=center, ... radius=5326849.0625, resolution=25067.525) >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -85,25 +85,44 @@ the mercator projection with radius and resolution defined in degrees. .. doctest:: >>> proj_dict = {'proj': 'merc', 'lat_0': 0, 'lon_0': 0, 'a': 6371228.0, 'units': 'm'} - >>> area_def = utils.create_area_def(area_id, proj_dict, center=(0, 0), + >>> area_def = create_area_def(area_id, proj_dict, center=(0, 0), ... radius=(47.90379019311, 43.1355420077), ... resolution=(0.22542960090875294, 0.22542901929487608), ... 
units='degrees', description='Antarctic EASE grid') >>> print(area_def) Area ID: ease_sh Description: Antarctic EASE grid - Projection: {'a': '6371228.0', 'lat_0': '0.0', 'lon_0': '0.0', 'proj': 'merc', 'units': 'm'} + Projection: {'a': '6371228.0', 'lat_0': '0', 'lon_0': '0', 'proj': 'merc', 'type': 'crs', 'units': 'm'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) +The area definition corresponding to a given lat-lon grid (defined by area extent and resolution) +can be obtained as follows: + +.. doctest:: + + >>> area_def = create_area_def('my_area', + ... {'proj': 'latlong', 'lon_0': 0}, + ... area_extent=[-180, -90, 180, 90], + ... resolution=1, + ... units='degrees', + ... description='Global 1x1 degree lat-lon grid') + >>> print(area_def) + Area ID: my_area + Description: Global 1x1 degree lat-lon grid + Projection: {'lon_0': '0', 'proj': 'latlong', 'type': 'crs'} + Number of columns: 360 + Number of rows: 180 + Area extent: (-180.0, -90.0, 180.0, 90.0) + If only one of **area_extent** or **shape** can be computed from the information provided by the user, a :class:`~pyresample.geometry.DynamicAreaDefinition` object is returned: .. doctest:: - >>> area_def = utils.create_area_def(area_id, proj_string, radius=radius, resolution=resolution) + >>> area_def = create_area_def(area_id, proj_string, radius=radius, resolution=resolution) >>> print(type(area_def)) @@ -118,7 +137,7 @@ AreaDefinition Class Methods There are four class methods available on the :class:`~pyresample.geometry.AreaDefinition` class utilizing -:func:`~pyresample.utils.create_area_def` providing a simpler interface to the +:func:`~pyresample.area_config.create_area_def` providing a simpler interface to the functionality described in the previous section. Hence each argument used below is the same as the ``create_area_def`` arguments described above and can be used in the same way (i.e. units). The following @@ -132,7 +151,6 @@ from_extent .. 
doctest:: - >>> from pyresample import utils >>> from pyresample.geometry import AreaDefinition >>> area_id = 'ease_sh' >>> proj_string = '+proj=laea +lat_0=-90 +lon_0=0 +a=6371228.0 +units=m' @@ -142,7 +160,7 @@ from_extent >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -161,7 +179,7 @@ from_circle >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -173,7 +191,7 @@ from_circle >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -189,7 +207,7 @@ from_area_of_interest >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -204,7 +222,7 @@ from_ul_corner >>> print(area_def) Area ID: ease_sh Description: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -374,7 +392,7 @@ read a single ``AreaDefinition`` named ``corner`` by doing: >>> print(area_def) Area ID: corner Description: Example of making an area definition using shape, upper_left_extent, and resolution - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -388,7 +406,7 @@ series of arguments: >>> print(boundary) Area ID: ease_sh Description: Example of making an area definition using shape and area_extent - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ 
-429,7 +447,7 @@ An area definition dict can be read using Area ID: ease_nh Description: Arctic EASE grid Projection ID: ease_nh - Projection: {'a': '6371228.0', 'lat_0': '90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) @@ -445,7 +463,7 @@ Several area definitions can be read at once using the region names in an argume Area ID: ease_sh Description: Antarctic EASE grid Projection ID: ease_sh - Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} + Projection: {'R': '6371228', 'lat_0': '-90', 'lon_0': '0', 'no_defs': 'None', 'proj': 'laea', 'type': 'crs', 'units': 'm', 'x_0': '0', 'y_0': '0'} Number of columns: 425 Number of rows: 425 Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625) ===================================== docs/source/index.rst ===================================== @@ -16,6 +16,7 @@ Pyresample offers multiple resampling algorithms including: - Nearest Neighbor - Elliptical Weighted Average (EWA) - Bilinear +- Bucket resampling (count hits per bin, averaging, ratios) For nearest neighbor and bilinear interpolation pyresample uses a kd-tree approach by using the fast KDTree implementation provided by the ===================================== docs/source/plot.rst ===================================== @@ -130,7 +130,7 @@ Cartopy CRS object. >>> ax = plt.axes(projection=crs) >>> ax.coastlines() >>> ax.set_global() - >>> plt.imshow(data, transform=crs, extent=crs.bounds, origin='upper') + >>> plt.imshow(result, transform=crs, extent=crs.bounds, origin='upper') >>> plt.colorbar() >>> plt.savefig('viirs_i04_cartopy.png') ===================================== docs/source/swath.rst ===================================== @@ -459,3 +459,12 @@ Example >>> rows_per_scan = 5 >>> # fornav resamples the swath data to the gridded area >>> num_valid_points, gridded_data = fornav(cols, rows, area_def, data, rows_per_scan=rows_per_scan) + +pyresample.bucket +----------------- + +.. autoclass:: pyresample.bucket.BucketResampler + :noindex: + +See :class:`~pyresample.bucket.BucketResampler` API documentation for +the details of method parameters. ===================================== pyresample/_spatial_mp.py ===================================== @@ -110,36 +110,15 @@ class cKDTree_MP(object): class BaseProj(pyproj.Proj): + """Helper class for easier backwards compatibility.""" def __init__(self, projparams=None, preserve_units=True, **kwargs): - # Copy dict-type arguments as they will be modified in-place - if isinstance(projparams, dict): - projparams = projparams.copy() - - # Pyproj<2 uses __new__ to initiate data and does not define its own __init__ method. if is_pyproj2(): - # If init is found in any of the data, override any other area parameters. - if 'init' in kwargs: - warnings.warn('init="EPSG:XXXX" is no longer supported. Use "EPSG:XXXX" as a proj string instead') - projparams = kwargs.pop('init') - # Proj takes params in projparams over the params in kwargs. - if isinstance(projparams, (dict, str)) and 'init' in projparams: - warn_msg = '{"init": "EPSG:XXXX"} is no longer supported. Use "EPSG:XXXX" as a proj string instead' - if isinstance(projparams, str): - warn_msg = '"+init=EPSG:XXXX" is no longer supported. 
Use "EPSG:XXXX" as a proj string instead' - # Proj-dicts are cleaner to parse than strings. - projparams = proj4_str_to_dict(projparams) - warnings.warn(warn_msg) - projparams = projparams.pop('init') - # New syntax 'EPSG:XXXX' - if 'EPSG' in kwargs or (isinstance(projparams, dict) and 'EPSG' in projparams): - if 'EPSG' in kwargs: - epsg_code = kwargs.pop('EPSG') - else: - epsg_code = projparams.pop('EPSG') - projparams = 'EPSG:{}'.format(epsg_code) - - super(BaseProj, self).__init__(projparams=projparams, preserve_units=preserve_units, **kwargs) + # have to have this because pyproj uses __new__ + # subclasses would fail when calling __init__ otherwise + super(BaseProj, self).__init__(projparams=projparams, + preserve_units=preserve_units, + **kwargs) def is_latlong(self): if is_pyproj2(): @@ -148,6 +127,7 @@ class BaseProj(pyproj.Proj): class Proj(BaseProj): + """Helper class to skip transforming lon/lat projection coordinates.""" def __call__(self, data1, data2, inverse=False, radians=False, errcheck=False, nprocs=1): ===================================== pyresample/area_config.py ===================================== @@ -335,10 +335,11 @@ def _get_proj4_args(proj4_args): """Create dict from proj4 args.""" from pyresample.utils._proj4 import convert_proj_floats if isinstance(proj4_args, (str, six.text_type)): - proj_config = proj4_str_to_dict(str(proj4_args)) - else: - from configobj import ConfigObj - proj_config = ConfigObj(proj4_args) + # float conversion is done in `proj4_str_to_dict` already + return proj4_str_to_dict(str(proj4_args)) + + from configobj import ConfigObj + proj_config = ConfigObj(proj4_args) return convert_proj_floats(proj_config.items()) @@ -417,12 +418,17 @@ def create_area_def(area_id, projection, width=None, height=None, area_extent=No description = kwargs.pop('description', area_id) proj_id = kwargs.pop('proj_id', None) + # convert EPSG dictionaries to projection string + # (hold on to EPSG code as much as possible) + if isinstance(projection, dict) and 'EPSG' in projection: + projection = "EPSG:{}".format(projection['EPSG']) + # Get a proj4_dict from either a proj4_dict or a proj4_string. proj_dict = _get_proj_data(projection) try: - p = Proj(proj_dict, preserve_units=True) + p = Proj(projection, preserve_units=True) except RuntimeError: - return _make_area(area_id, description, proj_id, proj_dict, shape, area_extent, **kwargs) + return _make_area(area_id, description, proj_id, projection, shape, area_extent, **kwargs) # If no units are provided, try to get units used in proj_dict. If still none are provided, use meters. if units is None: @@ -457,7 +463,7 @@ def create_area_def(area_id, projection, width=None, height=None, area_extent=No _extrapolate_information(area_extent, shape, center, radius, resolution, upper_left_extent, units, p, proj_dict) - return _make_area(area_id, description, proj_id, proj_dict, shape, + return _make_area(area_id, description, proj_id, projection, shape, area_extent, resolution=resolution, **kwargs) @@ -482,7 +488,23 @@ def _make_area(area_id, description, proj_id, proj_dict, shape, area_extent, **k def _get_proj_data(projection): - """Takes a proj4_dict or proj4_string and returns a proj4_dict and a Proj function.""" + """Takes a proj4_dict or proj4_string and returns a proj4_dict and a Proj function. + + There is special handling for the "EPSG:XXXX" case where "XXXX" is an + EPSG number code. It can be provided as a string `"EPSG:XXXX"` or as a + dictionary (when provided via YAML) as `{'EPSG': XXXX}`. 
+ If it is passed as a string ("EPSG:XXXX") then the rules of + :func:`~pyresample.utils._proj.proj4_str_to_dict` are followed. + If a dictionary and pyproj 2.0+ is installed then the string + `"EPSG:XXXX"` is passed to ``proj4_str_to_dict``. If pyproj<2.0 + is installed then the string ``+init=EPSG:XXXX`` is passed to + ``proj4_str_to_dict`` which provides limited information to area + config operations. + + """ + if isinstance(projection, dict) and 'EPSG' in projection: + projection = "EPSG:{}".format(projection['EPSG']) + if isinstance(projection, str): proj_dict = proj4_str_to_dict(projection) elif isinstance(projection, dict): ===================================== pyresample/bilinear/xarr.py ===================================== @@ -1,3 +1,4 @@ +"""XArray version of bilinear interpolation.""" import warnings @@ -15,8 +16,17 @@ from pyresample._spatial_mp import Proj from pykdtree.kdtree import KDTree from pyresample import data_reduce, geometry, CHUNK_SIZE +CACHE_INDICES = ['bilinear_s', + 'bilinear_t', + 'slices_x', + 'slices_y', + 'mask_slices', + 'out_coords_x', + 'out_coords_y'] + class XArrayResamplerBilinear(object): + """Bilinear interpolation using XArray.""" def __init__(self, source_geo_def, @@ -24,10 +34,10 @@ class XArrayResamplerBilinear(object): radius_of_influence, neighbours=32, epsilon=0, - reduce_data=True, - nprocs=1, - segments=None): + reduce_data=True): """ + Initialize resampler. + Parameters ---------- source_geo_def : object @@ -44,11 +54,7 @@ class XArrayResamplerBilinear(object): reduce_data : bool, optional Perform initial coarse reduction of source dataset in order to reduce execution time - nprocs : int, optional - Number of processor cores to be used - segments : int or None - Number of segments to use when resampling. 
- If set to None an estimate will be calculated + """ if da is None: raise ImportError("Missing 'xarray' and 'dask' dependencies") @@ -59,11 +65,16 @@ class XArrayResamplerBilinear(object): self.distance_array = None self.bilinear_t = None self.bilinear_s = None + self.slices_x = None + self.slices_y = None + self.slices = {'x': self.slices_x, 'y': self.slices_y} + self.mask_slices = None + self.out_coords_x = None + self.out_coords_y = None + self.out_coords = {'x': self.out_coords_x, 'y': self.out_coords_y} self.neighbours = neighbours self.epsilon = epsilon self.reduce_data = reduce_data - self.nprocs = nprocs - self.segments = segments self.source_geo_def = source_geo_def self.target_geo_def = target_geo_def self.radius_of_influence = radius_of_influence @@ -77,9 +88,9 @@ class XArrayResamplerBilinear(object): Vertical fractional distances from corner to the new points s__ : numpy array Horizontal fractional distances from corner to the new points - input_idxs : numpy array + valid_input_index : numpy array Valid indices in the input data - idx_arr : numpy array + index_array : numpy array Mapping array from valid source points to target points """ @@ -88,9 +99,9 @@ class XArrayResamplerBilinear(object): (self.neighbours, self.source_geo_def.size)) # Create kd-tree - valid_input_idx, resample_kdtree = self._create_resample_kdtree() + valid_input_index, resample_kdtree = self._create_resample_kdtree() # This is a numpy array - self.valid_input_index = valid_input_idx + self.valid_input_index = valid_input_index if resample_kdtree.n == 0: # Handle if all input data is reduced away @@ -99,7 +110,7 @@ class XArrayResamplerBilinear(object): self.target_geo_def) self.bilinear_t = bilinear_t self.bilinear_s = bilinear_s - self.valid_input_index = valid_input_idx + self.valid_input_index = valid_input_index self.index_array = index_array return bilinear_t, bilinear_s, valid_input_index, index_array @@ -120,7 +131,9 @@ class XArrayResamplerBilinear(object): proj = Proj(self.target_geo_def.proj_str) # Get output x/y coordinates - out_x, out_y = _get_output_xy_dask(self.target_geo_def, proj) + out_x, out_y = self.target_geo_def.get_proj_coords(chunks=CHUNK_SIZE) + out_x = da.ravel(out_x) + out_y = da.ravel(out_y) # Get input x/y coordinates in_x, in_y = _get_input_xy_dask(self.source_geo_def, proj, @@ -139,81 +152,23 @@ class XArrayResamplerBilinear(object): self.index_array = index_array self.distance_array = distance_array - return (self.bilinear_t, self.bilinear_s, self.valid_input_index, - self.index_array) + self._get_slices() - def get_sample_from_bil_info(self, data, fill_value=np.nan, + return (self.bilinear_t, self.bilinear_s, + self.slices, self.mask_slices, + self.out_coords) + + def get_sample_from_bil_info(self, data, fill_value=None, output_shape=None): + """Resample using pre-computed resampling LUTs.""" + del output_shape if fill_value is None: - fill_value = np.nan - # FIXME: can be this made into a dask construct ? 
- cols, lines = np.meshgrid(np.arange(data['x'].size), - np.arange(data['y'].size)) - cols = da.ravel(cols) - lines = da.ravel(lines) - try: - self.valid_input_index = self.valid_input_index.compute() - except AttributeError: - pass - vii = self.valid_input_index.squeeze() - try: - self.index_array = self.index_array.compute() - except AttributeError: - pass - - # ia contains reduced (valid) indices of the source array, and has the - # shape of the destination array - ia = self.index_array - rlines = lines[vii][ia] - rcols = cols[vii][ia] - - slices = [] - mask_slices = [] - mask_2d_added = False - coords = {} - try: - # FIXME: Use same chunk size as input data - coord_x, coord_y = self.target_geo_def.get_proj_vectors_dask() - except AttributeError: - coord_x, coord_y = None, None - - for _, dim in enumerate(data.dims): - if dim == 'y': - slices.append(rlines) - if not mask_2d_added: - mask_slices.append(ia >= self.target_geo_def.size) - mask_2d_added = True - if coord_y is not None: - coords[dim] = coord_y - elif dim == 'x': - slices.append(rcols) - if not mask_2d_added: - mask_slices.append(ia >= self.target_geo_def.size) - mask_2d_added = True - if coord_x is not None: - coords[dim] = coord_x + if np.issubdtype(data.dtype, np.integer): + fill_value = 0 else: - slices.append(slice(None)) - mask_slices.append(slice(None)) - try: - coords[dim] = data.coords[dim] - except KeyError: - pass - - res = data.values[slices] - res[mask_slices] = fill_value - - try: - p_1 = res[:, :, 0] - p_2 = res[:, :, 1] - p_3 = res[:, :, 2] - p_4 = res[:, :, 3] - except IndexError: - p_1 = res[:, 0] - p_2 = res[:, 1] - p_3 = res[:, 2] - p_4 = res[:, 3] + fill_value = np.nan + p_1, p_2, p_3, p_4 = self._slice_data(data, fill_value) s__, t__ = self.bilinear_s, self.bilinear_t res = (p_1 * (1 - s__) * (1 - t__) + @@ -227,25 +182,107 @@ class XArrayResamplerBilinear(object): idxs = (res > data_max) | (res < data_min) res = da.where(idxs, fill_value, res) + res = da.where(np.isnan(res), fill_value, res) shp = self.target_geo_def.shape if data.ndim == 3: res = da.reshape(res, (res.shape[0], shp[0], shp[1])) else: res = da.reshape(res, (shp[0], shp[1])) - res = DataArray(da.from_array(res, chunks=CHUNK_SIZE), - dims=data.dims, coords=coords) + + # Add missing coordinates + self._add_missing_coordinates(data) + + res = DataArray(res, dims=data.dims, coords=self.out_coords) return res + def _compute_indices(self): + for idx in CACHE_INDICES: + var = getattr(self, idx) + try: + var = var.compute() + setattr(self, idx, var) + except AttributeError: + continue + + def _add_missing_coordinates(self, data): + if self.out_coords['x'] is None and self.out_coords_x is not None: + self.out_coords['x'] = self.out_coords_x + self.out_coords['y'] = self.out_coords_y + for _, dim in enumerate(data.dims): + if dim not in self.out_coords: + try: + self.out_coords[dim] = data.coords[dim] + except KeyError: + pass + + def _slice_data(self, data, fill_value): + + def _slicer(values, sl_x, sl_y, mask, fill_value): + if values.ndim == 2: + arr = values[(sl_y, sl_x)] + arr[(mask, )] = fill_value + p_1 = arr[:, 0] + p_2 = arr[:, 1] + p_3 = arr[:, 2] + p_4 = arr[:, 3] + elif values.ndim == 3: + arr = values[(slice(None), sl_y, sl_x)] + arr[(slice(None), mask)] = fill_value + p_1 = arr[:, :, 0] + p_2 = arr[:, :, 1] + p_3 = arr[:, :, 2] + p_4 = arr[:, :, 3] + else: + raise ValueError + + return p_1, p_2, p_3, p_4 + + values = data.values + sl_y = self.slices_y + sl_x = self.slices_x + mask = self.mask_slices + + return _slicer(values, sl_x, 
sl_y, mask, fill_value) + + def _get_slices(self): + shp = self.source_geo_def.shape + cols, lines = np.meshgrid(np.arange(shp[1]), + np.arange(shp[0])) + cols = np.ravel(cols) + lines = np.ravel(lines) + + vii = self.valid_input_index + ia_ = self.index_array + + # ia_ contains reduced (valid) indices of the source array, and has the + # shape of the destination array + rlines = lines[vii][ia_] + rcols = cols[vii][ia_] + + try: + coord_x, coord_y = self.target_geo_def.get_proj_vectors() + self.out_coords['y'] = coord_y + self.out_coords['x'] = coord_x + self.out_coords_y = self.out_coords['y'] + self.out_coords_x = self.out_coords['x'] + except AttributeError: + pass + + self.mask_slices = ia_ >= self.source_geo_def.size + self.slices['y'] = rlines + self.slices['x'] = rcols + self.slices_y = self.slices['y'] + self.slices_x = self.slices['x'] + def _create_resample_kdtree(self): - """Set up kd tree on input""" + """Set up kd tree on input.""" # Get input information valid_input_index, source_lons, source_lats = \ _get_valid_input_index_dask(self.source_geo_def, self.target_geo_def, self.reduce_data, - self.radius_of_influence, - nprocs=self.nprocs) + self.radius_of_influence) # FIXME: Is dask smart enough to only compute the pixels we end up # using even with this complicated indexing @@ -266,11 +303,6 @@ class XArrayResamplerBilinear(object): valid_oi, reduce_data=True): """Query kd-tree on slice of target coordinates.""" - -# res = da.map_blocks(query_no_distance, tlons, tlats, -# valid_oi, dtype=np.int, kdtree=resample_kdtree, -# neighbours=self.neighbours, epsilon=self.epsilon, -# radius=self.radius_of_influence) res = query_no_distance(tlons, tlats, valid_oi, resample_kdtree, self.neighbours, self.epsilon, @@ -278,41 +310,9 @@ class XArrayResamplerBilinear(object): return res, None -def _get_fill_mask_value(data_dtype): - """Returns the maximum value of dtype""" - if issubclass(data_dtype.type, np.floating): - fill_value = np.finfo(data_dtype.type).max - elif issubclass(data_dtype.type, np.integer): - fill_value = np.iinfo(data_dtype.type).max - else: - raise TypeError('Type %s is unsupported for masked fill values' % - data_dtype.type) - return fill_value - - -def _get_output_xy_dask(target_geo_def, proj): - """Get x/y coordinates of the target grid.""" - # Read output coordinates - out_lons, out_lats = target_geo_def.get_lonlats_dask() - - # Mask invalid coordinates - out_lons, out_lats = _mask_coordinates_dask(out_lons, out_lats) - - # Convert coordinates to output projection x/y space - res = da.dstack(proj(out_lons.compute(), out_lats.compute())) - # _run_proj(proj, out_lons, out_lats) - #, - # chunks=(out_lons.chunks[0], out_lons.chunks[1], 2), - # new_axis=[2]) - out_x = da.ravel(res[:, :, 0]) - out_y = da.ravel(res[:, :, 1]) - - return out_x, out_y - - -def _get_input_xy_dask(source_geo_def, proj, input_idxs, idx_ref): +def _get_input_xy_dask(source_geo_def, proj, valid_input_index, index_array): """Get x/y coordinates for the input area and reduce the data.""" - in_lons, in_lats = source_geo_def.get_lonlats_dask() + in_lons, in_lats = source_geo_def.get_lonlats(chunks=CHUNK_SIZE) # Mask invalid values in_lons, in_lats = _mask_coordinates_dask(in_lons, in_lats) @@ -323,16 +323,15 @@ def _get_input_xy_dask(source_geo_def, proj, input_idxs, idx_ref): in_lons = da.ravel(in_lons) in_lons = in_lons.compute() - in_lons = in_lons[input_idxs] + in_lons = in_lons[valid_input_index] in_lats = da.ravel(in_lats) in_lats = in_lats.compute() - in_lats = in_lats[input_idxs] + in_lats = 
in_lats[valid_input_index] + index_array = index_array.compute() # Expand input coordinates for each output location - # in_lons = in_lons.compute() - in_lons = in_lons[idx_ref] - # in_lats = in_lats.compute() - in_lats = in_lats[idx_ref] + in_lons = in_lons[index_array] + in_lats = in_lats[index_array] # Convert coordinates to output projection x/y space in_x, in_y = proj(in_lons, in_lats) @@ -340,14 +339,8 @@ def _get_input_xy_dask(source_geo_def, proj, input_idxs, idx_ref): return in_x, in_y -def _run_proj(proj, lons, lats): - return da.dstack(proj(lons, lats)) - - def _mask_coordinates_dask(lons, lats): - """Mask invalid coordinate values""" - # lons = da.ravel(lons) - # lats = da.ravel(lats) + """Mask invalid coordinate values.""" idxs = ((lons < -180.) | (lons > 180.) | (lats < -90.) | (lats > 90.)) lons = da.where(idxs, np.nan, lons) @@ -356,19 +349,23 @@ def _mask_coordinates_dask(lons, lats): return lons, lats -def _get_bounding_corners_dask(in_x, in_y, out_x, out_y, neighbours, idx_ref): - """Get four closest locations from (in_x, in_y) so that they form a +def _get_bounding_corners_dask(in_x, in_y, out_x, out_y, neighbours, index_array): + """Get bounding corners. + + Get four closest locations from (in_x, in_y) so that they form a bounding rectangle around the requested location given by (out_x, out_y). - """ + """ # Find four closest pixels around the target location # FIXME: how to daskify? # Tile output coordinates to same shape as neighbour info # Replacing with da.transpose and da.tile doesn't work - out_x_tile = np.transpose(np.tile(out_x, (neighbours, 1))) - out_y_tile = np.transpose(np.tile(out_y, (neighbours, 1))) + out_x_tile = np.reshape(np.tile(out_x, neighbours), + (neighbours, out_x.size)).T + out_y_tile = np.reshape(np.tile(out_y, neighbours), + (neighbours, out_y.size)).T # Get differences in both directions x_diff = out_x_tile - in_x @@ -378,57 +375,65 @@ def _get_bounding_corners_dask(in_x, in_y, out_x, out_y, neighbours, idx_ref): # Upper left source pixel valid = (x_diff > 0) & (y_diff < 0) - x_1, y_1, idx_1 = _get_corner_dask(stride, valid, in_x, in_y, idx_ref) + x_1, y_1, idx_1 = _get_corner_dask(stride, valid, in_x, in_y, index_array) # Upper right source pixel valid = (x_diff < 0) & (y_diff < 0) - x_2, y_2, idx_2 = _get_corner_dask(stride, valid, in_x, in_y, idx_ref) + x_2, y_2, idx_2 = _get_corner_dask(stride, valid, in_x, in_y, index_array) # Lower left source pixel valid = (x_diff > 0) & (y_diff > 0) - x_3, y_3, idx_3 = _get_corner_dask(stride, valid, in_x, in_y, idx_ref) + x_3, y_3, idx_3 = _get_corner_dask(stride, valid, in_x, in_y, index_array) # Lower right source pixel valid = (x_diff < 0) & (y_diff > 0) - x_4, y_4, idx_4 = _get_corner_dask(stride, valid, in_x, in_y, idx_ref) + x_4, y_4, idx_4 = _get_corner_dask(stride, valid, in_x, in_y, index_array) - # Combine sorted indices to idx_ref - idx_ref = np.transpose(np.vstack((idx_1, idx_2, idx_3, idx_4))) + # Combine sorted indices to index_array + index_array = np.transpose(np.vstack((idx_1, idx_2, idx_3, idx_4))) return (np.transpose(np.vstack((x_1, y_1))), np.transpose(np.vstack((x_2, y_2))), np.transpose(np.vstack((x_3, y_3))), np.transpose(np.vstack((x_4, y_4))), - idx_ref) + index_array) -def _get_corner_dask(stride, valid, in_x, in_y, idx_ref): - """Get closest set of coordinates from the *valid* locations""" +def _get_corner_dask(stride, valid, in_x, in_y, index_array): + """Get closest set of coordinates from the *valid* locations.""" # Find the closest valid pixels, if any idxs = 
np.argmax(valid, axis=1) # Check which of these were actually valid invalid = np.invert(np.max(valid, axis=1)) # idxs = idxs.compute() - idx_ref = idx_ref.compute() + index_array = index_array.compute() # Replace invalid points with np.nan x__ = in_x[stride, idxs] # TODO: daskify - x__ = np.where(invalid, np.nan, x__) + x__ = da.where(invalid, np.nan, x__) y__ = in_y[stride, idxs] # TODO: daskify - y__ = np.where(invalid, np.nan, y__) + y__ = da.where(invalid, np.nan, y__) - idx = idx_ref[stride, idxs] # TODO: daskify + idx = index_array[stride, idxs] # TODO: daskify return x__, y__, idx def _get_ts_dask(pt_1, pt_2, pt_3, pt_4, out_x, out_y): - """Calculate vertical and horizontal fractional distances t and s""" + """Calculate vertical and horizontal fractional distances t and s.""" + def invalid_to_nan(t__, s__): + idxs = (t__ < 0) | (t__ > 1) | (s__ < 0) | (s__ > 1) + t__ = da.where(idxs, np.nan, t__) + s__ = da.where(idxs, np.nan, s__) + return t__, s__ # General case, ie. where the the corners form an irregular rectangle t__, s__ = _get_ts_irregular_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x) + # Replace invalid values with NaNs + t__, s__ = invalid_to_nan(t__, s__) + # Cases where verticals are parallel idxs = da.isnan(t__) | da.isnan(s__) # Remove extra dimensions @@ -438,10 +443,12 @@ def _get_ts_dask(pt_1, pt_2, pt_3, pt_4, out_x, out_y): t_new, s_new = _get_ts_uprights_parallel_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x) - t__ = da.where(idxs, t_new, t__) s__ = da.where(idxs, s_new, s__) + # Replace invalid values with NaNs + t__, s__ = invalid_to_nan(t__, s__) + # Cases where both verticals and horizontals are parallel idxs = da.isnan(t__) | da.isnan(s__) # Remove extra dimensions @@ -452,16 +459,14 @@ def _get_ts_dask(pt_1, pt_2, pt_3, pt_4, out_x, out_y): t__ = da.where(idxs, t_new, t__) s__ = da.where(idxs, s_new, s__) - idxs = (t__ < 0) | (t__ > 1) | (s__ < 0) | (s__ > 1) - t__ = da.where(idxs, np.nan, t__) - s__ = da.where(idxs, np.nan, s__) + # Replace invalid values with NaNs + t__, s__ = invalid_to_nan(t__, s__) return t__, s__ def _get_ts_irregular_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x): """Get parameters for the case where none of the sides are parallel.""" - # Get parameters for the quadratic equation # TODO: check if needs daskifying a__, b__, c__ = _calc_abc_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x) @@ -478,9 +483,12 @@ def _get_ts_irregular_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x): # Might not need daskifying def _calc_abc_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x): - """Calculate coefficients for quadratic equation for - _get_ts_irregular() and _get_ts_uprights(). For _get_ts_uprights - switch order of pt_2 and pt_3. + """Calculate coefficients for quadratic equation. + + In this order of arguments used for _get_ts_irregular() and + _get_ts_uprights(). For _get_ts_uprights switch order of pt_2 and + pt_3. + """ # Pairwise longitudal separations between reference points x_21 = pt_2[:, 0] - pt_1[:, 0] @@ -503,11 +511,12 @@ def _calc_abc_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x): def _solve_quadratic_dask(a__, b__, c__, min_val=0.0, max_val=1.0): - """Solve quadratic equation and return the valid roots from interval - [*min_val*, *max_val*] + """Solve quadratic equation. - """ + Solve quadratic equation and return the valid roots from interval + [*min_val*, *max_val*]. 
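
For readers following the bilinear math here: `_solve_quadratic_dask` picks, for each output pixel, the root of a*x**2 + b*x + c = 0 that falls inside [min_val, max_val], since the fractional distances t and s must end up in [0, 1]. A scalar NumPy sketch of that root-selection rule, written only for illustration (it assumes a != 0 and is not the dask implementation used above):

    import numpy as np

    def solve_quadratic(a, b, c, min_val=0.0, max_val=1.0):
        """Return the root of a*x**2 + b*x + c = 0 inside [min_val, max_val], else NaN."""
        discriminant = b * b - 4 * a * c
        if discriminant < 0:
            return np.nan
        sqrt_disc = np.sqrt(discriminant)
        for root in ((-b + sqrt_disc) / (2 * a), (-b - sqrt_disc) / (2 * a)):
            if min_val <= root <= max_val:
                return root
        return np.nan

    # x**2 - x + 0.21 = 0 has roots 0.7 and 0.3; both are valid, the first one found is returned
    print(solve_quadratic(1.0, -1.0, 0.21))  # 0.7
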
+ """ discriminant = b__ * b__ - 4 * a__ * c__ # Solve the quadratic polynomial @@ -525,8 +534,10 @@ def _solve_quadratic_dask(a__, b__, c__, min_val=0.0, max_val=1.0): def _solve_another_fractional_distance_dask(f__, y_1, y_2, y_3, y_4, out_y): - """Solve parameter t__ from s__, or vice versa. For solving s__, - switch order of y_2 and y_3.""" + """Solve parameter t__ from s__, or vice versa. + + For solving s__, switch order of y_2 and y_3. + """ y_21 = y_2 - y_1 y_43 = y_4 - y_3 @@ -541,8 +552,7 @@ def _solve_another_fractional_distance_dask(f__, y_1, y_2, y_3, y_4, out_y): def _get_ts_uprights_parallel_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x): - """Get parameters for the case where uprights are parallel""" - + """Get parameters for the case where uprights are parallel.""" # Get parameters for the quadratic equation a__, b__, c__ = _calc_abc_dask(pt_1, pt_3, pt_2, pt_4, out_y, out_x) @@ -557,8 +567,7 @@ def _get_ts_uprights_parallel_dask(pt_1, pt_2, pt_3, pt_4, out_y, out_x): def _get_ts_parallellogram_dask(pt_1, pt_2, pt_3, out_y, out_x): - """Get parameters for the case where uprights are parallel""" - + """Get parameters for the case where uprights are parallel.""" # Pairwise longitudal separations between reference points x_21 = pt_2[:, 0] - pt_1[:, 0] x_31 = pt_3[:, 0] - pt_1[:, 0] @@ -579,27 +588,10 @@ def _get_ts_parallellogram_dask(pt_1, pt_2, pt_3, out_y, out_x): return t__, s__ -def _check_data_shape_dask(data, input_idxs): - """Check data shape and adjust if necessary.""" - # Handle multiple datasets - if data.ndim > 2 and data.shape[0] * data.shape[1] == input_idxs.shape[0]: - data = da.reshape(data, data.shape[0] * data.shape[1], data.shape[2]) - # Also ravel single dataset - elif data.shape[0] != input_idxs.size: - data = da.ravel(data) - - # Ensure two dimensions - if data.ndim == 1: - data = da.reshape(data, (data.size, 1)) - - return data - - def query_no_distance(target_lons, target_lats, valid_output_index, kdtree, neighbours, epsilon, radius): """Query the kdtree. 
No distances are returned.""" voi = valid_output_index - shape = voi.shape voir = da.ravel(voi) target_lons_valid = da.ravel(target_lons)[voir] target_lats_valid = da.ravel(target_lats)[voir] @@ -617,11 +609,9 @@ def query_no_distance(target_lons, target_lats, def _get_valid_input_index_dask(source_geo_def, target_geo_def, reduce_data, - radius_of_influence, - nprocs=1): - """Find indices of reduced inputput data""" - - source_lons, source_lats = source_geo_def.get_lonlats_dask() + radius_of_influence): + """Find indices of reduced input data.""" + source_lons, source_lats = source_geo_def.get_lonlats(chunks=CHUNK_SIZE) source_lons = da.ravel(source_lons) source_lats = da.ravel(source_lats) @@ -663,7 +653,7 @@ def _get_valid_input_index_dask(source_geo_def, def lonlat2xyz(lons, lats): - + """Convert geographic coordinates to cartesian 3D coordinates.""" R = 6370997.0 x_coords = R * da.cos(da.deg2rad(lats)) * da.cos(da.deg2rad(lons)) y_coords = R * da.cos(da.deg2rad(lats)) * da.sin(da.deg2rad(lons)) @@ -674,8 +664,7 @@ def lonlat2xyz(lons, lats): def _create_empty_bil_info(source_geo_def, target_geo_def): - """Creates dummy info for empty result set""" - + """Create dummy info for empty result set.""" valid_input_index = np.ones(source_geo_def.size, dtype=np.bool) index_array = np.ones((target_geo_def.size, 4), dtype=np.int32) bilinear_s = np.nan * np.zeros(target_geo_def.size) ===================================== pyresample/bucket/__init__.py ===================================== @@ -0,0 +1,303 @@ +# pyresample, Resampling of remote sensing image data in python +# +# Copyright (C) 2019 Pyresample developers +# +# This program is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the Free +# Software Foundation, either version 3 of the License, or (at your option) any +# later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License along +# with this program. If not, see . + +"""Code for resampling using bucket resampling.""" + +import dask.array as da +import xarray as xr +import numpy as np +import logging +from pyresample._spatial_mp import Proj + +LOG = logging.getLogger(__name__) + + +class BucketResampler(object): + + """Class for bucket resampling. + + Bucket resampling is useful for calculating averages and hit-counts + when aggregating data to coarser scale grids. + + Below are examples how to use the resampler. + + Read data using Satpy. The resampling can also be done (apart from + fractions) directly from Satpy, but this demonstrates the direct + low-level usage. 
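
Before the docstring's own Satpy-based example just below, a self-contained sketch of the drop-in-a-bucket idea the class implements: project each source point, turn it into an integer bin index on the target grid, then accumulate per-bin sums and counts. The function name `bucket_average` and the synthetic data are assumptions made for this aside; the resampler itself projects lon/lat with pyproj and uses dask histograms rather than `np.bincount`.

    import numpy as np

    def bucket_average(x, y, data, extent, shape):
        """Average `data` into a (rows, cols) grid covering extent (x0, y0, x1, y1)."""
        x0, y0, x1, y1 = extent
        rows, cols = shape
        x_res = (x1 - x0) / cols
        y_res = (y1 - y0) / rows
        x_idx = ((x - x0) / x_res).astype(int)
        y_idx = ((y1 - y) / y_res).astype(int)      # row 0 is the top of the grid
        valid = (x_idx >= 0) & (x_idx < cols) & (y_idx >= 0) & (y_idx < rows)
        flat = y_idx[valid] * cols + x_idx[valid]   # ravelled bin index
        sums = np.bincount(flat, weights=data[valid], minlength=rows * cols)
        counts = np.bincount(flat, minlength=rows * cols)
        avg = sums / np.where(counts == 0, np.nan, counts)
        return avg.reshape(shape)

    # toy data: 1000 random points averaged onto a 4x4 grid spanning 0..1 in x and y
    rng = np.random.default_rng(0)
    x, y, vals = rng.random(1000), rng.random(1000), rng.random(1000)
    print(bucket_average(x, y, vals, (0.0, 0.0, 1.0, 1.0), (4, 4)))
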
+ + >>> from pyresample.bucket import BucketResampler + >>> from satpy import Scene + >>> from satpy.resample import get_area_def + >>> fname = "hrpt_noaa19_20170519_1214_42635.l1b" + >>> glbl = Scene(filenames=[fname]) + >>> glbl.load(['4']) + >>> data = glbl['4'] + >>> lons, lats = data.area.get_lonlats() + >>> target_area = get_area_def('euro4') + + Initialize the resampler + + >>> resampler = BucketResampler(adef, lons, lats) + + Calculate the sum of all the data in each grid location: + + >>> sums = resampler.get_sum(data) + + Calculate how many values were collected at each grid location: + + >>> counts = resampler.get_count() + + The average can be calculated from the above two results, or directly + using the helper method: + + >>> average = resampler.get_average(data) + + Calculate fractions of occurrences of different values in each grid + location. The data needs to be categorical (in integers), so + we'll create some categorical data from the brightness temperature + data that were read earlier. The data are returned in a + dictionary with the categories as keys. + + >>> data = da.where(data > 250, 1, 0) + >>> fractions = resampler.get_fractions(data, categories=[0, 1]) + >>> import matplotlib.pyplot as plt + >>> plt.imshow(fractions[0]); plt.show() + """ + + def __init__(self, target_area, source_lons, source_lats): + + self.target_area = target_area + self.source_lons = source_lons + self.source_lats = source_lats + self.prj = Proj(self.target_area.proj_dict) + self.x_idxs = None + self.y_idxs = None + self.idxs = None + self._get_indices() + self.counts = None + + def _get_proj_coordinates(self, lons, lats, x_res, y_res): + """Calculate projection coordinates and round to resolution unit. + + Parameters + ---------- + lons : Numpy or Dask array + Longitude coordinates + lats : Numpy or Dask array + Latitude coordinates + x_res : float + Resolution of the output in X direction + y_res : float + Resolution of the output in Y direction + """ + proj_x, proj_y = self.prj(lons, lats) + proj_x = round_to_resolution(proj_x, x_res) + proj_y = round_to_resolution(proj_y, y_res) + + return np.stack((proj_x, proj_y)) + + def _get_indices(self): + """Calculate projection indices. 
+ + Returns + ------- + x_idxs : Dask array + X indices of the target grid where the data are put + y_idxs : Dask array + Y indices of the target grid where the data are put + """ + LOG.info("Determine bucket resampling indices") + adef = self.target_area + + lons = self.source_lons.ravel() + lats = self.source_lats.ravel() + + # Create output grid coordinates in projection units + x_res = (adef.area_extent[2] - adef.area_extent[0]) / adef.width + y_res = (adef.area_extent[3] - adef.area_extent[1]) / adef.height + x_vect = da.arange(adef.area_extent[0] + x_res / 2., + adef.area_extent[2] - x_res / 2., x_res) + # Orient so that 0-meridian is pointing down + y_vect = da.arange(adef.area_extent[3] - y_res / 2., + adef.area_extent[1] + y_res / 2., + -y_res) + + result = da.map_blocks(self._get_proj_coordinates, lons, + lats, x_res, y_res, + new_axis=0, chunks=(2,) + lons.chunks) + proj_x = result[0, :] + proj_y = result[1, :] + + # Calculate array indices + x_idxs = ((proj_x - np.min(x_vect)) / x_res).astype(np.int) + y_idxs = ((np.max(y_vect) - proj_y) / y_res).astype(np.int) + + # Get valid index locations + mask = ((x_idxs >= 0) & (x_idxs < adef.width) & + (y_idxs >= 0) & (y_idxs < adef.height)) + self.y_idxs = da.where(mask, y_idxs, -1) + self.x_idxs = da.where(mask, x_idxs, -1) + + # Convert X- and Y-indices to raveled indexing + target_shape = self.target_area.shape + self.idxs = self.y_idxs * target_shape[1] + self.x_idxs + + def get_sum(self, data, mask_all_nan=False): + """Calculate sums for each bin with drop-in-a-bucket resampling. + + Parameters + ---------- + data : Numpy or Dask array + mask_all_nan : boolean (optional) + Mask bins that have only NaN results, default: False + + Returns + ------- + data : Numpy or Dask array + Bin-wise sums in the target grid + """ + LOG.info("Get sum of values in each location") + if isinstance(data, xr.DataArray): + data = data.data + data = data.ravel() + # Remove NaN values from the data when used as weights + weights = da.where(np.isnan(data), 0, data) + + # Rechunk indices to match the data chunking + if weights.chunks != self.idxs.chunks: + self.idxs = da.rechunk(self.idxs, weights.chunks) + + # Calculate the sum of the data falling to each bin + out_size = self.target_area.size + sums, _ = da.histogram(self.idxs, bins=out_size, range=(0, out_size), + weights=weights, density=False) + + if mask_all_nan: + nans = np.isnan(data) + nan_sums, _ = da.histogram(self.idxs[nans], bins=out_size, + range=(0, out_size)) + counts = self.get_count().ravel() + sums = da.where(nan_sums == counts, np.nan, sums) + + return sums.reshape(self.target_area.shape) + + def get_count(self): + """Count the number of occurrences for each bin using drop-in-a-bucket + resampling. + + Returns + ------- + data : Dask array + Bin-wise count of hits for each target grid location + """ + LOG.info("Get number of values in each location") + + out_size = self.target_area.size + + # Calculate the sum of the data falling to each bin + if self.counts is None: + counts, _ = da.histogram(self.idxs, bins=out_size, + range=(0, out_size)) + self.counts = counts.reshape(self.target_area.shape) + + return self.counts + + def get_average(self, data, fill_value=np.nan, mask_all_nan=False): + """Calculate bin-averages using bucket resampling. + + Parameters + ---------- + data : Numpy or Dask array + Data to be binned and averaged + fill_value : float + Fill value to replace missing values. Default: np.nan + + Returns + ------- + average : Dask array + Binned and averaged data. 
+ """ + LOG.info("Get average value for each location") + + sums = self.get_sum(data, mask_all_nan=mask_all_nan) + counts = self.get_sum(np.logical_not(np.isnan(data)).astype(int), + mask_all_nan=False) + + average = sums / da.where(counts == 0, np.nan, counts) + average = da.where(np.isnan(average), fill_value, average) + + return average + + def get_fractions(self, data, categories=None, fill_value=np.nan): + """Get fraction of occurrences for each given categorical value. + + Parameters + ---------- + data : Numpy or Dask array + Categorical data to be processed + categories : iterable or None + One dimensional list of categories in the data, or None. If None, + categories are determined from the data by fully processing the + data and finding the unique category values. + fill_value : float + Fill value to replace missing values. Default: np.nan + """ + if categories is None: + LOG.warning("No categories given, need to compute the data.") + # compute any dask arrays by converting to numpy + categories = np.asarray(np.unique(data)) + try: + num = categories.size + except AttributeError: + num = len(categories) + LOG.info("Get fractions for %d categories", num) + results = {} + counts = self.get_count() + counts = counts.astype(float) + # Disable logging for calls to get_sum() + LOG.disabled = True + for cat in categories: + cat_data = da.where(data == cat, 1.0, 0.0) + + sums = self.get_sum(cat_data) + result = sums.astype(float) / counts + result = da.where(counts == 0.0, fill_value, result) + results[cat] = result + # Re-enable logging + LOG.disabled = False + + return results + + +def round_to_resolution(arr, resolution): + """Round the values in *arr* to closest resolution element. + + Parameters + ---------- + arr : list, tuple, Numpy or Dask array + Array to be rounded + resolution : float + Resolution unit to which data are rounded + + Returns + ------- + data : Numpy or Dask array + Source data rounded to the closest resolution unit + """ + if isinstance(arr, (list, tuple)): + arr = np.array(arr) + return resolution * np.round(arr / resolution) ===================================== pyresample/ewa/_fornav.cpp ===================================== The diff for this file was not included because it is too large. ===================================== pyresample/ewa/_fornav_templates.cpp ===================================== @@ -234,7 +234,7 @@ int compute_ewa(size_t chan_count, int maximum_weight_mode, u0 = uimg[swath_offset]; v0 = vimg[swath_offset]; - if (u0 < 0.0 || v0 < 0.0 || __isnan(u0) || npy_isnan(v0)) { + if (u0 < 0.0 || v0 < 0.0 || __isnan(u0) || __isnan(v0)) { continue; } ===================================== pyresample/ewa/_ll2cr.c ===================================== The diff for this file was not included because it is too large. ===================================== pyresample/geometry.py ===================================== @@ -22,7 +22,7 @@ # You should have received a copy of the GNU Lesser General Public License # along with this program. If not, see . 
-"""Classes for geometry operations""" +"""Classes for geometry operations.""" import hashlib import warnings @@ -33,7 +33,7 @@ import numpy as np import yaml from pyproj import Geod, transform -from pyresample import CHUNK_SIZE, utils +from pyresample import CHUNK_SIZE from pyresample._spatial_mp import Cartesian, Cartesian_MP, Proj, Proj_MP from pyresample.boundary import AreaDefBoundary, Boundary, SimpleBoundary from pyresample.utils import proj4_str_to_dict, proj4_dict_to_str, convert_proj_floats @@ -44,16 +44,25 @@ try: except ImportError: DataArray = np.ndarray +try: + from pyproj import CRS +except ImportError: + CRS = None + logger = getLogger(__name__) class DimensionError(ValueError): + """Wrap ValueError.""" + pass class IncompatibleAreas(ValueError): """Error when the areas to combine are not compatible.""" + pass + class BaseDefinition(object): """Base class for geometry definitions. @@ -69,6 +78,7 @@ class BaseDefinition(object): """ def __init__(self, lons=None, lats=None, nprocs=1): + """Initialize BaseDefinition.""" if type(lons) != type(lats): raise TypeError('lons and lats must be of same type') elif lons is not None: @@ -101,7 +111,7 @@ class BaseDefinition(object): return self.hash def __eq__(self, other): - """Test for approximate equality""" + """Test for approximate equality.""" if self is other: return True if other.lons is None or other.lats is None: @@ -135,12 +145,11 @@ class BaseDefinition(object): return False def __ne__(self, other): - """Test for approximate equality""" - + """Test for approximate equality.""" return not self.__eq__(other) def get_area_extent_for_subset(self, row_LR, col_LR, row_UL, col_UL): - """Calculate extent for a subdomain of this area + """Calculate extent for a subdomain of this area. Rows are counted from upper left to lower left and columns are counted from upper left to upper right. @@ -159,7 +168,6 @@ class BaseDefinition(object): Ulrich Hamann """ - (a, b) = self.get_proj_coords(data_slice=(row_LR, col_LR)) a = a - 0.5 * self.pixel_size_x b = b - 0.5 * self.pixel_size_y @@ -170,7 +178,7 @@ class BaseDefinition(object): return a, b, c, d def get_lonlat(self, row, col): - """Retrieve lon and lat of single pixel + """Retrieve lon and lat of single pixel. Parameters ---------- @@ -180,8 +188,8 @@ class BaseDefinition(object): Returns ------- (lon, lat) : tuple of floats - """ + """ if self.ndim != 2: raise DimensionError(('operation undefined ' 'for %sD geometry ') % self.ndim) @@ -242,8 +250,7 @@ class BaseDefinition(object): SimpleBoundary(s1_lat.squeeze(), s2_lat.squeeze(), s3_lat.squeeze(), s4_lat.squeeze())) def get_bbox_lonlats(self): - """Returns the bounding box lons and lats""" - + """Return the bounding box lons and lats.""" s1_lon, s1_lat = self.get_lonlats(data_slice=(0, slice(None))) s2_lon, s2_lat = self.get_lonlats(data_slice=(slice(None), -1)) s3_lon, s3_lat = self.get_lonlats(data_slice=(-1, slice(None, None, -1))) @@ -254,7 +261,7 @@ class BaseDefinition(object): (s4_lon.squeeze(), s4_lat.squeeze())]) def get_cartesian_coords(self, nprocs=None, data_slice=None, cache=False): - """Retrieve cartesian coordinates of geometry definition + """Retrieve cartesian coordinates of geometry definition. Parameters ---------- @@ -269,6 +276,7 @@ class BaseDefinition(object): Returns ------- cartesian_coords : numpy array + """ if cache: warnings.warn("'cache' keyword argument will be removed in the " @@ -308,8 +316,7 @@ class BaseDefinition(object): @property def corners(self): - """Returns the corners of the current area. 
- """ + """Return the corners of the current area.""" from pyresample.spherical_geometry import Coordinate return [Coordinate(*self.get_lonlat(0, 0)), Coordinate(*self.get_lonlat(0, -1)), @@ -317,8 +324,10 @@ class BaseDefinition(object): Coordinate(*self.get_lonlat(-1, 0))] def __contains__(self, point): - """Is a point inside the 4 corners of the current area? This uses - great circle arcs as area boundaries. + """Check if a point is inside the 4 corners of the current area. + + This uses great circle arcs as area boundaries. + """ from pyresample.spherical_geometry import point_inside, Coordinate corners = self.corners @@ -329,9 +338,10 @@ class BaseDefinition(object): return point_inside(point, corners) def overlaps(self, other): - """Tests if the current area overlaps the *other* area. This is based - solely on the corners of areas, assuming the boundaries to be great - circles. + """Test if the current area overlaps the *other* area. + + This is based solely on the corners of areas, assuming the + boundaries to be great circles. Parameters ---------- @@ -341,8 +351,8 @@ class BaseDefinition(object): Returns ------- overlaps : bool - """ + """ from pyresample.spherical_geometry import Arc self_corners = self.corners @@ -373,16 +383,13 @@ class BaseDefinition(object): return False def get_area(self): - """Get the area of the convex area defined by the corners of the current - area. - """ + """Get the area of the convex area defined by the corners of the curren area.""" from pyresample.spherical_geometry import get_polygon_area return get_polygon_area(self.corners) def intersection(self, other): - """Returns the corners of the intersection polygon of the current area - with *other*. + """Return the corners of the intersection polygon of the current area with *other*. 
Parameters ---------- @@ -392,6 +399,7 @@ class BaseDefinition(object): Returns ------- (corner1, corner2, corner3, corner4) : tuple of points + """ from pyresample.spherical_geometry import intersection_polygon return intersection_polygon(self.corners, other.corners) @@ -407,8 +415,8 @@ class BaseDefinition(object): Returns ------- overlap_rate : float - """ + """ from pyresample.spherical_geometry import get_polygon_area other_area = other.get_area() inter_area = get_polygon_area(self.intersection(other)) @@ -420,9 +428,10 @@ class BaseDefinition(object): class CoordinateDefinition(BaseDefinition): - """Base class for geometry definitions defined by lons and lats only""" + """Base class for geometry definitions defined by lons and lats only.""" def __init__(self, lons, lats, nprocs=1): + """Initialize CoordinateDefinition.""" if not isinstance(lons, (np.ndarray, DataArray)): lons = np.asanyarray(lons) lats = np.asanyarray(lats) @@ -438,6 +447,7 @@ class CoordinateDefinition(BaseDefinition): self.__class__.__name__) def concatenate(self, other): + """Concatenate coordinate definitions.""" if self.ndim != other.ndim: raise DimensionError(('Unable to concatenate %sD and %sD ' 'geometries') % (self.ndim, other.ndim)) @@ -448,6 +458,7 @@ class CoordinateDefinition(BaseDefinition): return klass(lons, lats, nprocs=nprocs) def append(self, other): + """Append another coordinate definition to existing one.""" if self.ndim != other.ndim: raise DimensionError(('Unable to append %sD and %sD ' 'geometries') % (self.ndim, other.ndim)) @@ -457,6 +468,7 @@ class CoordinateDefinition(BaseDefinition): self.size = self.lons.size def __str__(self): + """Return string representation of the coordinate definition.""" # Rely on numpy's object printing return ('Shape: %s\nLons: %s\nLats: %s') % (str(self.shape), str(self.lons), @@ -464,7 +476,7 @@ class CoordinateDefinition(BaseDefinition): class GridDefinition(CoordinateDefinition): - """Grid defined by lons and lats + """Grid defined by lons and lats. 
Parameters ---------- @@ -485,9 +497,11 @@ class GridDefinition(CoordinateDefinition): Grid lats cartesian_coords : object Grid cartesian coordinates + """ def __init__(self, lons, lats, nprocs=1): + """Initialize GridDefinition.""" super(GridDefinition, self).__init__(lons, lats, nprocs) if lons.shape != lats.shape: raise ValueError('lon and lat grid must have same shape') @@ -538,6 +552,7 @@ class SwathDefinition(CoordinateDefinition): """ def __init__(self, lons, lats, nprocs=1): + """Initialize SwathDefinition.""" if not isinstance(lons, (np.ndarray, DataArray)): lons = np.asanyarray(lons) lats = np.asanyarray(lats) @@ -553,7 +568,7 @@ class SwathDefinition(CoordinateDefinition): @staticmethod def _do_transform(src, dst, lons, lats, alt): - """Helper for 'aggregate' method.""" + """Run pyproj.transform and stack the results.""" x, y, z = transform(src, dst, lons, lats, alt) return np.dstack((x, y, z)) @@ -597,6 +612,7 @@ class SwathDefinition(CoordinateDefinition): return self.hash def update_hash(self, the_hash=None): + """Update the hash.""" if the_hash is None: the_hash = hashlib.sha1() the_hash.update(get_array_hashable(self.lons)) @@ -668,6 +684,7 @@ class SwathDefinition(CoordinateDefinition): return blons, blats def compute_bb_proj_params(self, proj_dict): + """Compute BB projection parameters.""" projection = proj_dict['proj'] ellipsoid = proj_dict.get('ellps', 'WGS84') if projection == 'omerc': @@ -741,46 +758,101 @@ class DynamicAreaDefinition(object): resolution=None, optimize_projection=False, rotation=None): """Initialize the DynamicAreaDefinition. + Attributes + ---------- area_id: The name of the area. description: The description of the area. projection: - The dictionary or string of projection parameters. Doesn't have to be complete. - height, width: - The shape of the resulting area. + The dictionary or string of projection parameters. Doesn't have to + be complete. If not complete, ``proj_info`` must be provided to + ``freeze`` to "fill in" any missing parameters. + width: + x dimension in number of pixels, aka number of grid columns + height: + y dimension in number of pixels, aka number of grid rows + shape: + Corresponding array shape as (height, width) area_extent: The area extent of the area. + pixel_size_x: + Pixel width in projection units + pixel_size_y: + Pixel height in projection units resolution: - the resolution of the resulting area. + Resolution of the resulting area as (pixel_size_x, pixel_size_y) or a scalar if pixel_size_x == pixel_size_y. optimize_projection: Whether the projection parameters have to be optimized. rotation: Rotation in degrees (negative is cw) """ - if isinstance(projection, str): - proj_dict = proj4_str_to_dict(projection) - elif isinstance(projection, dict): - proj_dict = projection - else: - raise TypeError('Wrong type for projection: {0}. 
Expected dict or string.'.format(type(projection))) - self.area_id = area_id self.description = description - self.proj_dict = proj_dict - self.width = self.width = width - self.height = self.height = height + self.width = width + self.height = height + self.shape = (self.height, self.width) self.area_extent = area_extent self.optimize_projection = optimize_projection + if isinstance(resolution, (int, float)): + resolution = (resolution, resolution) self.resolution = resolution self.rotation = rotation + self._projection = projection + + # check if non-dict projections are valid + # dicts may be updated later + if not isinstance(self._projection, dict): + Proj(projection) + + def _get_proj_dict(self): + projection = self._projection + + if CRS is not None: + try: + crs = CRS(projection) + except RuntimeError: + # could be incomplete dictionary + return projection + if hasattr(crs, 'to_dict'): + # pyproj 2.2+ + proj_dict = crs.to_dict() + else: + proj_dict = proj4_str_to_dict(crs.to_proj4()) + else: + if isinstance(projection, str): + proj_dict = proj4_str_to_dict(projection) + elif isinstance(projection, dict): + proj_dict = projection.copy() + else: + raise TypeError('Wrong type for projection: {0}. Expected ' + 'dict or string.'.format(type(projection))) + + return proj_dict + + @property + def pixel_size_x(self): + """Return pixel size in X direction.""" + if self.resolution is None: + return None + return self.resolution[0] + + @property + def pixel_size_y(self): + """Return pixel size in Y direction.""" + if self.resolution is None: + return None + return self.resolution[1] - # size = (x_size, y_size) and shape = (y_size, x_size) def compute_domain(self, corners, resolution=None, shape=None): """Compute shape and area_extent from corners and [shape or resolution] info. Corners represents the center of pixels, while area_extent represents the edge of pixels. + + Note that ``shape`` is (rows, columns) and ``resolution`` is + (x_size, y_size); the dimensions are flipped. + """ if resolution is not None and shape is not None: raise ValueError("Both resolution and shape can't be provided.") @@ -792,10 +864,9 @@ class DynamicAreaDefinition(object): x_resolution = (corners[2] - corners[0]) * 1.0 / (width - 1) y_resolution = (corners[3] - corners[1]) * 1.0 / (height - 1) else: - try: - x_resolution, y_resolution = resolution - except TypeError: - x_resolution = y_resolution = resolution + if isinstance(resolution, (int, float)): + resolution = (resolution, resolution) + x_resolution, y_resolution = resolution width = int(np.rint((corners[2] - corners[0]) * 1.0 / x_resolution + 1)) height = int(np.rint((corners[3] - corners[1]) * 1.0 @@ -810,8 +881,11 @@ class DynamicAreaDefinition(object): def freeze(self, lonslats=None, resolution=None, shape=None, proj_info=None): """Create an AreaDefinition from this area with help of some extra info. - lonlats: - the geographical coordinates to contain in the resulting area. + Parameters + ---------- + lonlats : SwathDefinition or tuple + The geographical coordinates to contain in the resulting area. + A tuple should be ``(lons, lats)``. resolution: the resolution of the resulting area. shape: @@ -821,16 +895,26 @@ class DynamicAreaDefinition(object): Resolution and shape parameters are ignored if the instance is created with the `optimize_projection` flag set to True. 
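As a usage sketch of the reworked DynamicAreaDefinition touched by this hunk: the class and method names come from the patch itself, while the projection parameters and swath coordinates below are invented for illustration (any complete lon/lat swath would do).

    import numpy as np
    from pyresample.geometry import DynamicAreaDefinition, SwathDefinition

    # An area with a projection but no extent or shape yet; a scalar
    # resolution is normalized to (dx, dy) by this change.
    dyn = DynamicAreaDefinition(area_id='laea_fi', description='on-the-fly area',
                                projection={'proj': 'laea', 'lat_0': 60.0,
                                            'lon_0': 25.0, 'ellps': 'WGS84'},
                                resolution=1000)

    lons, lats = np.meshgrid(np.linspace(20., 30., 200),
                             np.linspace(58., 64., 200))
    swath = SwathDefinition(lons=lons, lats=lats)

    # freeze() computes an area_extent and shape that contain the swath
    # and returns a regular AreaDefinition.
    area = dyn.freeze(swath)
    print(area.shape, area.area_extent)
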
+ """ + proj_dict = self._get_proj_dict() + projection = self._projection if proj_info is not None: - self.proj_dict.update(proj_info) + # this is now our complete projection information + proj_dict.update(proj_info) + projection = proj_dict if self.optimize_projection: - return lonslats.compute_optimal_bb_area(self.proj_dict) + return lonslats.compute_optimal_bb_area(proj_dict) if resolution is None: resolution = self.resolution - if not self.area_extent or not self.width or not self.height: - proj4 = Proj(**self.proj_dict) + if shape is None: + shape = self.shape + height, width = shape + shape = None if None in shape else shape + area_extent = self.area_extent + if not area_extent or not width or not height: + proj4 = Proj(proj_dict) try: lons, lats = lonslats except (TypeError, ValueError): @@ -840,12 +924,10 @@ class DynamicAreaDefinition(object): yarr[yarr > 9e29] = np.nan corners = [np.nanmin(xarr), np.nanmin(yarr), np.nanmax(xarr), np.nanmax(yarr)] - # Note: size=(width, height) was changed to shape=(height, width). - domain = self.compute_domain(corners, resolution, shape) - self.area_extent, self.width, self.height = domain + area_extent, width, height = self.compute_domain(corners, resolution, shape) return AreaDefinition(self.area_id, self.description, '', - self.proj_dict, self.width, self.height, - self.area_extent, self.rotation) + projection, width, height, + area_extent, self.rotation) def invproj(data_x, data_y, proj_dict): @@ -907,6 +989,8 @@ class AreaDefinition(BaseDefinition): Pixel width in projection units pixel_size_y : float Pixel height in projection units + resolution : tuple + the resolution of the resulting area as (pixel_size_x, pixel_size_y). upper_left_extent : tuple Coordinates (x, y) of upper left corner of upper left pixel in projection units pixel_upper_left : tuple @@ -917,7 +1001,7 @@ class AreaDefinition(BaseDefinition): pixel_offset_y : float y offset between projection center and upper left corner of upper left pixel in units of pixels.. - proj4_string : str + proj_str : str Projection defined as Proj.4 string cartesian_coords : object Grid cartesian coordinates @@ -931,13 +1015,7 @@ class AreaDefinition(BaseDefinition): def __init__(self, area_id, description, proj_id, projection, width, height, area_extent, rotation=None, nprocs=1, lons=None, lats=None, dtype=np.float64): - if isinstance(projection, str): - proj_dict = proj4_str_to_dict(projection) - elif isinstance(projection, dict): - proj_dict = projection - else: - raise TypeError('Wrong type for projection: {0}. Expected dict or string.'.format(type(projection))) - + """Initialize AreaDefinition.""" super(AreaDefinition, self).__init__(lons, lats, nprocs) self.area_id = area_id self.description = description @@ -957,11 +1035,23 @@ class AreaDefinition(BaseDefinition): self.ndim = 2 self.pixel_size_x = (area_extent[2] - area_extent[0]) / float(width) self.pixel_size_y = (area_extent[3] - area_extent[1]) / float(height) - self.proj_dict = convert_proj_floats(proj_dict.items()) self.area_extent = tuple(area_extent) + if CRS is not None: + self.crs = CRS(projection) + self._proj_dict = None + else: + if isinstance(projection, str): + proj_dict = proj4_str_to_dict(projection) + elif isinstance(projection, dict): + # use the float-converted dict to pass to Proj + projection = convert_proj_floats(projection.items()) + proj_dict = projection + else: + raise TypeError('Wrong type for projection: {0}. 
Expected dict or string.'.format(type(projection))) + self._proj_dict = proj_dict # Calculate area_extent in lon lat - proj = Proj(**proj_dict) + proj = Proj(projection) corner_lons, corner_lats = proj((area_extent[0], area_extent[2]), (area_extent[1], area_extent[3]), inverse=True) @@ -983,6 +1073,17 @@ class AreaDefinition(BaseDefinition): self.dtype = dtype + @property + def proj_dict(self): + """Return the projection dictionary.""" + if self._proj_dict is None and hasattr(self, 'crs'): + if hasattr(self.crs, 'to_dict'): + # pyproj 2.2+ + self._proj_dict = self.crs.to_dict() + else: + self._proj_dict = proj4_str_to_dict(self.crs.to_proj4()) + return self._proj_dict + def copy(self, **override_kwargs): """Make a copy of the current area. @@ -1007,26 +1108,35 @@ class AreaDefinition(BaseDefinition): @property def shape(self): + """Return area shape.""" return self.height, self.width + @property + def resolution(self): + """Return area resolution in X and Y direction.""" + return self.pixel_size_x, self.pixel_size_y + @property def name(self): + """Return area name.""" warnings.warn("'name' is deprecated, use 'description' instead.", PendingDeprecationWarning) return self.description @property def x_size(self): + """Return area width.""" warnings.warn("'x_size' is deprecated, use 'width' instead.", PendingDeprecationWarning) return self.width @property def y_size(self): + """Return area height.""" warnings.warn("'y_size' is deprecated, use 'height' instead.", PendingDeprecationWarning) return self.height @classmethod def from_extent(cls, area_id, projection, shape, area_extent, units=None, **kwargs): - """Creates an AreaDefinition object from area_extent and shape. + """Create an AreaDefinition object from area_extent and shape. Parameters ---------- @@ -1066,12 +1176,13 @@ class AreaDefinition(BaseDefinition): Returns ------- AreaDefinition : AreaDefinition + """ return create_area_def(area_id, projection, shape=shape, area_extent=area_extent, units=units, **kwargs) @classmethod def from_circle(cls, area_id, projection, center, radius, shape=None, resolution=None, units=None, **kwargs): - """Creates an AreaDefinition object from center, radius, and shape or from center, radius, and resolution. + """Create an AreaDefinition from center, radius, and shape or from center, radius, and resolution. Parameters ---------- @@ -1123,13 +1234,14 @@ class AreaDefinition(BaseDefinition): Notes ----- * ``resolution`` and ``radius`` can be specified with one value if dx == dy + """ return create_area_def(area_id, projection, shape=shape, center=center, radius=radius, resolution=resolution, units=units, **kwargs) @classmethod def from_area_of_interest(cls, area_id, projection, shape, center, resolution, units=None, **kwargs): - """Creates an AreaDefinition object from center, resolution, and shape. + """Create an AreaDefinition from center, resolution, and shape. Parameters ---------- @@ -1171,13 +1283,14 @@ class AreaDefinition(BaseDefinition): Returns ------- AreaDefinition : AreaDefinition + """ return create_area_def(area_id, projection, shape=shape, center=center, resolution=resolution, units=units, **kwargs) @classmethod def from_ul_corner(cls, area_id, projection, shape, upper_left_extent, resolution, units=None, **kwargs): - """Creates an AreaDefinition object from upper_left_extent, resolution, and shape. + """Create an AreaDefinition object from upper_left_extent, resolution, and shape. 
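To make the new resolution property and the lazily built proj_dict shown above concrete, a small sketch; the projection parameters are borrowed from the stereographic test area used later in this patch, and the 800x800 shape is arbitrary.

    from pyresample.geometry import AreaDefinition

    area = AreaDefinition('areaD', 'Europe (3km, HRV, VTC)', 'areaD',
                          {'a': '6378144.0', 'b': '6356759.0',
                           'lat_0': '50.00', 'lat_ts': '50.00',
                           'lon_0': '8.00', 'proj': 'stere'},
                          800, 800,
                          [-1370912.72, -909968.64, 1029087.28, 1490031.36])

    print(area.shape)       # (height, width) == (800, 800)
    print(area.resolution)  # (pixel_size_x, pixel_size_y), added by this patch
    print(area.proj_dict)   # built on first access from area.crs when pyproj 2+ is available
    print(area.proj_str)    # sorted PROJ.4 string
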
Parameters ---------- @@ -1219,6 +1332,7 @@ class AreaDefinition(BaseDefinition): Returns ------- AreaDefinition : AreaDefinition + """ return create_area_def(area_id, projection, shape=shape, upper_left_extent=upper_left_extent, resolution=resolution, units=units, **kwargs) @@ -1231,9 +1345,11 @@ class AreaDefinition(BaseDefinition): @property def proj_str(self): + """Return PROJ projection string.""" return proj4_dict_to_str(self.proj_dict, sort=True) def __str__(self): + """Return string representation of the AreaDefinition.""" # We need a sorted dictionary for a unique hash of str(self) proj_dict = self.proj_dict proj_str = ('{' + @@ -1253,18 +1369,33 @@ class AreaDefinition(BaseDefinition): __repr__ = __str__ def to_cartopy_crs(self): + """Convert projection to cartopy CRS object.""" from pyresample._cartopy import from_proj bounds = (self.area_extent[0], self.area_extent[2], self.area_extent[1], self.area_extent[3]) - crs = from_proj(self.proj_str, bounds=bounds) + if hasattr(self, 'crs') and self.crs.to_epsg() is not None: + proj_params = "EPSG:{}".format(self.crs.to_epsg()) + else: + proj_params = self.proj_str + if Proj(proj_params).is_latlong(): + # Convert area extent from degrees to radians + bounds = np.deg2rad(bounds) + crs = from_proj(proj_params, bounds=bounds) return crs def create_areas_def(self): + """Generate YAML formatted representation of this area.""" + if hasattr(self, 'crs') and self.crs.to_epsg() is not None: + proj_dict = {'EPSG': self.crs.to_epsg()} + else: + proj_dict = self.proj_dict + # pyproj 2.0+ adds a '+type=crs' parameter + proj_dict.pop('type', None) res = OrderedDict(description=self.description, - projection=OrderedDict(self.proj_dict), + projection=OrderedDict(proj_dict), shape=OrderedDict([('height', self.height), ('width', self.width)])) units = res['projection'].pop('units', None) extent = OrderedDict([('lower_left_xy', list(self.area_extent[:2])), @@ -1276,6 +1407,7 @@ class AreaDefinition(BaseDefinition): return ordered_dump(OrderedDict([(self.area_id, res)])) def create_areas_def_legacy(self): + """Create area definition in legacy format.""" proj_dict = self.proj_dict proj_str = ','.join(["%s=%s" % (str(k), str(proj_dict[k])) for k in sorted(proj_dict.keys())]) @@ -1295,8 +1427,7 @@ class AreaDefinition(BaseDefinition): return area_def_str def __eq__(self, other): - """Test for equality""" - + """Test for equality.""" try: return ((self.proj_str == other.proj_str) and (self.shape == other.shape) and @@ -1305,8 +1436,7 @@ class AreaDefinition(BaseDefinition): return super(AreaDefinition, self).__eq__(other) def __ne__(self, other): - """Test for equality""" - + """Test for equality.""" return not self.__eq__(other) def update_hash(self, the_hash=None): @@ -1319,11 +1449,11 @@ class AreaDefinition(BaseDefinition): return the_hash def colrow2lonlat(self, cols, rows): - """ - Return longitudes and latitudes for the given image columns - and rows. Both scalars and arrays are supported. - To be used with scarse data points instead of slices - (see get_lonlats). + """Return lons and lats for the given image columns and rows. + + Both scalars and arrays are supported. To be used with scarse + data points instead of slices (see get_lonlats). + """ p = Proj(self.proj_str) x = self.projection_x_coords @@ -1331,15 +1461,18 @@ class AreaDefinition(BaseDefinition): return p(y[y.size - cols], x[x.size - rows], inverse=True) def lonlat2colrow(self, lons, lats): - """ - Return image columns and rows for the given longitudes - and latitudes. 
Both scalars and arrays are supported. - Same as get_xy_from_lonlat, renamed for convenience. + """Return image columns and rows for the given lons and lats. + + Both scalars and arrays are supported. Same as + get_xy_from_lonlat, renamed for convenience. + """ return self.get_xy_from_lonlat(lons, lats) def get_xy_from_lonlat(self, lon, lat): - """Retrieve closest x and y coordinates (column, row indices) for the + """Retrieve closest x and y coordinates. + + Retrieve closest x and y coordinates (column, row indices) for the specified geolocation (lon,lat) if inside area. If lon,lat is a point a ValueError is raised if the return point is outside the area domain. If lon,lat is a tuple of sequences of longitudes and latitudes, a tuple of @@ -1353,8 +1486,8 @@ class AreaDefinition(BaseDefinition): :Returns: (x, y) : tuple of integer points/arrays - """ + """ if isinstance(lon, list): lon = np.array(lon) if isinstance(lat, list): @@ -1393,7 +1526,6 @@ class AreaDefinition(BaseDefinition): ValueError: if the return point is outside the area domain """ - if isinstance(xm, list): xm = np.array(xm) if isinstance(ym, list): @@ -1434,7 +1566,7 @@ class AreaDefinition(BaseDefinition): return int(x__), int(y__) def get_lonlat(self, row, col): - """Retrieves lon and lat values of single point in area grid + """Retrieve lon and lat values of single point in area grid. Parameters ---------- @@ -1444,14 +1576,14 @@ class AreaDefinition(BaseDefinition): Returns ------- (lon, lat) : tuple of floats - """ + """ lon, lat = self.get_lonlats(nprocs=None, data_slice=(row, col)) return np.asscalar(lon), np.asscalar(lat) @staticmethod def _do_rotation(xspan, yspan, rot_deg=0): - """Helper method to apply a rotation factor to a matrix of points.""" + """Apply a rotation factor to a matrix of points.""" if hasattr(xspan, 'chunks'): # we were given dask arrays, use dask functions import dask.array as numpy @@ -1463,6 +1595,7 @@ class AreaDefinition(BaseDefinition): return numpy.einsum('ji, mni -> jmn', rot_mat, numpy.dstack([x, y])) def get_proj_vectors_dask(self, chunks=None, dtype=None): + """Get projection vectors.""" warnings.warn("'get_proj_vectors_dask' is deprecated, please use " "'get_proj_vectors' with the 'chunks' keyword argument specified.", DeprecationWarning) if chunks is None: @@ -1470,7 +1603,7 @@ class AreaDefinition(BaseDefinition): return self.get_proj_vectors(dtype=dtype, chunks=chunks) def _get_proj_vectors(self, dtype=None, check_rotation=True, chunks=None): - """Helper for getting 1D projection coordinates.""" + """Get 1D projection coordinates.""" x_kwargs = {} y_kwargs = {} @@ -1522,6 +1655,7 @@ class AreaDefinition(BaseDefinition): return self._get_proj_vectors(dtype=dtype, chunks=chunks) def get_proj_coords_dask(self, chunks=None, dtype=None): + """Get projection coordinates.""" warnings.warn("'get_proj_coords_dask' is deprecated, please use " "'get_proj_coords' with the 'chunks' keyword argument specified.", DeprecationWarning) if chunks is None: @@ -1571,6 +1705,7 @@ class AreaDefinition(BaseDefinition): @property def projection_x_coords(self): + """Return projection X coordinates.""" if self.rotation != 0: # rotation is only supported in 'get_proj_coords' right now return self.get_proj_coords(data_slice=(0, slice(None)))[0].squeeze() @@ -1578,6 +1713,7 @@ class AreaDefinition(BaseDefinition): @property def projection_y_coords(self): + """Return projection Y coordinates.""" if self.rotation != 0: # rotation is only supported in 'get_proj_coords' right now return 
self.get_proj_coords(data_slice=(slice(None), 0))[1].squeeze() @@ -1600,6 +1736,7 @@ class AreaDefinition(BaseDefinition): Coordinate(corner_lons[3], corner_lats[3])] def get_lonlats_dask(self, chunks=None, dtype=None): + """Get longitudes and latitudes.""" warnings.warn("'get_lonlats_dask' is deprecated, please use " "'get_lonlats' with the 'chunks' keyword argument specified.", DeprecationWarning) if chunks is None: @@ -1627,8 +1764,8 @@ class AreaDefinition(BaseDefinition): ------- (lons, lats) : tuple of numpy arrays Grids of area lons and and lats - """ + """ if cache: warnings.warn("'cache' keyword argument will be removed in the " "future and data will not be cached.", PendingDeprecationWarning) @@ -1722,7 +1859,8 @@ class AreaDefinition(BaseDefinition): intersection = data_boundary.contour_poly.intersection( area_boundary.contour_poly) if intersection is None: - logger.debug('Cannot determine appropriate slicing.') + logger.debug('Cannot determine appropriate slicing. ' + "Data and projection area do not overlap.") raise NotImplementedError x, y = self.get_xy_from_lonlat(np.rad2deg(intersection.lon), np.rad2deg(intersection.lat)) @@ -1765,11 +1903,10 @@ class AreaDefinition(BaseDefinition): def get_geostationary_angle_extent(geos_area): """Get the max earth (vs space) viewing angles in x and y.""" - # get some projection parameters - req = geos_area.proj_dict['a'] / 1000 - rp = geos_area.proj_dict['b'] / 1000 - h = geos_area.proj_dict['h'] / 1000 + req + req = geos_area.proj_dict['a'] / 1000.0 + rp = geos_area.proj_dict['b'] / 1000.0 + h = geos_area.proj_dict['h'] / 1000.0 + req # compute some constants aeq = 1 - req ** 2 / (h ** 2) @@ -1787,13 +1924,14 @@ def get_geostationary_bounding_box(geos_area, nb_points=50): Args: nb_points: Number of points on the polygon + """ xmax, ymax = get_geostationary_angle_extent(geos_area) # generate points around the north hemisphere in satellite projection # make it a bit smaller so that we stay inside the valid area - x = np.cos(np.linspace(-np.pi, 0, int(nb_points / 2))) * (xmax - 0.0001) - y = -np.sin(np.linspace(-np.pi, 0, int(nb_points / 2))) * (ymax - 0.0001) + x = np.cos(np.linspace(-np.pi, 0, int(nb_points / 2.0))) * (xmax - 0.0001) + y = -np.sin(np.linspace(-np.pi, 0, int(nb_points / 2.0))) * (ymax - 0.0001) ll_x, ll_y, ur_x, ur_y = geos_area.area_extent @@ -1855,9 +1993,10 @@ class StackedAreaDefinition(BaseDefinition): """Definition based on muliple vertically stacked AreaDefinitions.""" def __init__(self, *definitions, **kwargs): - """Base this instance on *definitions*. + """Initialize StackedAreaDefinition based on *definitions*. *kwargs* used here are `nprocs` and `dtype` (see AreaDefinition). 
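The deprecations in this hunk replace the *_dask methods with a chunks keyword on the regular getters. A sketch of the new call pattern, assuming dask is installed; the area definition is the same illustrative stereographic one as in the earlier sketch.

    from pyresample.geometry import AreaDefinition

    area = AreaDefinition('areaD', 'Europe (3km, HRV, VTC)', 'areaD',
                          {'a': '6378144.0', 'b': '6356759.0',
                           'lat_0': '50.00', 'lat_ts': '50.00',
                           'lon_0': '8.00', 'proj': 'stere'},
                          800, 800,
                          [-1370912.72, -909968.64, 1029087.28, 1490031.36])

    # Previously: area.get_lonlats_dask(...) / area.get_proj_vectors_dask(...)
    lons, lats = area.get_lonlats(chunks=2048)          # dask arrays
    xcoords, ycoords = area.get_proj_vectors(chunks=2048)
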
+ """ nprocs = kwargs.get('nprocs', 1) super(StackedAreaDefinition, self).__init__(nprocs=nprocs) @@ -1869,26 +2008,36 @@ class StackedAreaDefinition(BaseDefinition): @property def width(self): + """Return width of the area definition.""" return self.defs[0].width @property def x_size(self): + """Return width of the area definition.""" warnings.warn("'x_size' is deprecated, use 'width' instead.", PendingDeprecationWarning) return self.width @property def height(self): + """Return height of the area definition.""" return sum(definition.height for definition in self.defs) @property def y_size(self): + """Return height of the area definition.""" warnings.warn("'y_size' is deprecated, use 'height' instead.", PendingDeprecationWarning) return self.height @property def size(self): + """Return size of the area definition.""" return self.height * self.width + @property + def shape(self): + """Return shape of the area definition.""" + return (self.height, self.width) + def append(self, definition): """Append another definition to the area.""" if isinstance(definition, StackedAreaDefinition): @@ -1909,7 +2058,6 @@ class StackedAreaDefinition(BaseDefinition): def get_lonlats(self, nprocs=None, data_slice=None, cache=False, dtype=None, chunks=None): """Return lon and lat arrays of the area.""" - if chunks is not None: from dask.array import vstack else: @@ -1940,9 +2088,10 @@ class StackedAreaDefinition(BaseDefinition): return self.lons, self.lats def get_lonlats_dask(self, chunks=None, dtype=None): - """"Return lon and lat dask arrays of the area.""" + """Return lon and lat dask arrays of the area.""" warnings.warn("'get_lonlats_dask' is deprecated, please use " - "'get_lonlats' with the 'chunks' keyword argument specified.", DeprecationWarning) + "'get_lonlats' with the 'chunks' keyword argument specified.", + DeprecationWarning) if chunks is None: chunks = CHUNK_SIZE # FUTURE: Use a global config object instead return self.get_lonlats(chunks=chunks, dtype=dtype) @@ -1956,25 +2105,25 @@ class StackedAreaDefinition(BaseDefinition): @property def proj4_string(self): - """Returns projection definition as Proj.4 string""" + """Return projection definition as Proj.4 string.""" warnings.warn("'proj4_string' is deprecated, please use 'proj_str' " "instead.", DeprecationWarning) return self.defs[0].proj_str @property def proj_str(self): - """Returns projection definition as Proj.4 string""" + """Return projection definition as Proj.4 string.""" return self.defs[0].proj_str def update_hash(self, the_hash=None): + """Update the hash.""" for areadef in self.defs: the_hash = areadef.update_hash(the_hash) return the_hash def _get_slice(segments, shape): - """Generator for segmenting a 1D or 2D array""" - + """Segment a 1D or 2D array.""" if not (1 <= len(shape) <= 2): raise ValueError('Cannot segment array of shape: %s' % str(shape)) else: @@ -1992,8 +2141,7 @@ def _get_slice(segments, shape): def _flatten_cartesian_coords(cartesian_coords): - """Flatten array to (n, 3) shape""" - + """Flatten array to (n, 3) shape.""" shape = cartesian_coords.shape if len(shape) > 2: cartesian_coords = cartesian_coords.reshape(shape[0] * @@ -2017,6 +2165,7 @@ def _get_highest_level_class(obj1, obj2): def ordered_dump(data, stream=None, Dumper=yaml.Dumper, **kwds): + """Dump the data to YAML in ordered fashion.""" class OrderedDumper(Dumper): pass ===================================== pyresample/test/__init__.py ===================================== @@ -37,7 +37,8 @@ from pyresample.test import ( test_ewa_fornav, test_bilinear, 
test_data_reduce, - test_spatial_mp + test_spatial_mp, + test_bucket ) import unittest @@ -61,6 +62,7 @@ def suite(): mysuite.addTests(test_bilinear.suite()) mysuite.addTests(test_data_reduce.suite()) mysuite.addTests(test_spatial_mp.suite()) + mysuite.addTests(test_bucket.suite()) return mysuite ===================================== pyresample/test/test_bilinear.py ===================================== @@ -1,16 +1,20 @@ +"""Test bilinear interpolation.""" import unittest import numpy as np +try: + from unittest import mock +except ImportError: + import mock -from pyresample._spatial_mp import Proj -import pyresample.bilinear as bil -from pyresample import geometry, utils, kd_tree - - -class Test(unittest.TestCase): +class TestNumpyBilinear(unittest.TestCase): + """Test Numpy-based bilinear interpolation.""" @classmethod def setUpClass(cls): + """Do some setup for the test class.""" + from pyresample import geometry, kd_tree + cls.pts_irregular = (np.array([[-1., 1.], ]), np.array([[1., 2.], ]), np.array([[-2., -1.], ]), @@ -64,114 +68,147 @@ class Test(unittest.TestCase): cls.idx_ref = idx_ref def test_calc_abc(self): + """Test calculation of quadratic coefficients.""" + from pyresample.bilinear import _calc_abc + # No np.nan inputs pt_1, pt_2, pt_3, pt_4 = self.pts_irregular - res = bil._calc_abc(pt_1, pt_2, pt_3, pt_4, 0.0, 0.0) + res = _calc_abc(pt_1, pt_2, pt_3, pt_4, 0.0, 0.0) self.assertFalse(np.isnan(res[0])) self.assertFalse(np.isnan(res[1])) self.assertFalse(np.isnan(res[2])) # np.nan input -> np.nan output - res = bil._calc_abc(np.array([[np.nan, np.nan]]), - pt_2, pt_3, pt_4, 0.0, 0.0) + res = _calc_abc(np.array([[np.nan, np.nan]]), + pt_2, pt_3, pt_4, 0.0, 0.0) self.assertTrue(np.isnan(res[0])) self.assertTrue(np.isnan(res[1])) self.assertTrue(np.isnan(res[2])) def test_get_ts_irregular(self): - res = bil._get_ts_irregular(self.pts_irregular[0], - self.pts_irregular[1], - self.pts_irregular[2], - self.pts_irregular[3], - 0., 0.) + """Test calculations for irregular corner locations.""" + from pyresample.bilinear import _get_ts_irregular + + res = _get_ts_irregular(self.pts_irregular[0], + self.pts_irregular[1], + self.pts_irregular[2], + self.pts_irregular[3], + 0., 0.) self.assertEqual(res[0], 0.375) self.assertEqual(res[1], 0.5) - res = bil._get_ts_irregular(self.pts_vert_parallel[0], - self.pts_vert_parallel[1], - self.pts_vert_parallel[2], - self.pts_vert_parallel[3], - 0., 0.) + res = _get_ts_irregular(self.pts_vert_parallel[0], + self.pts_vert_parallel[1], + self.pts_vert_parallel[2], + self.pts_vert_parallel[3], + 0., 0.) self.assertTrue(np.isnan(res[0])) self.assertTrue(np.isnan(res[1])) def test_get_ts_uprights_parallel(self): - res = bil._get_ts_uprights_parallel(self.pts_vert_parallel[0], - self.pts_vert_parallel[1], - self.pts_vert_parallel[2], - self.pts_vert_parallel[3], - 0., 0.) + """Test calculation when uprights are parallel.""" + from pyresample.bilinear import _get_ts_uprights_parallel + + res = _get_ts_uprights_parallel(self.pts_vert_parallel[0], + self.pts_vert_parallel[1], + self.pts_vert_parallel[2], + self.pts_vert_parallel[3], + 0., 0.) self.assertEqual(res[0], 0.5) self.assertEqual(res[1], 0.5) def test_get_ts_parallellogram(self): - res = bil._get_ts_parallellogram(self.pts_both_parallel[0], - self.pts_both_parallel[1], - self.pts_both_parallel[2], - 0., 0.) 
+ """Test calculation when the corners form a parallellogram.""" + from pyresample.bilinear import _get_ts_parallellogram + + res = _get_ts_parallellogram(self.pts_both_parallel[0], + self.pts_both_parallel[1], + self.pts_both_parallel[2], + 0., 0.) self.assertEqual(res[0], 0.5) self.assertEqual(res[1], 0.5) def test_get_ts(self): + """Test get_ts().""" + from pyresample.bilinear import _get_ts + out_x = np.array([[0.]]) out_y = np.array([[0.]]) - res = bil._get_ts(self.pts_irregular[0], - self.pts_irregular[1], - self.pts_irregular[2], - self.pts_irregular[3], - out_x, out_y) + res = _get_ts(self.pts_irregular[0], + self.pts_irregular[1], + self.pts_irregular[2], + self.pts_irregular[3], + out_x, out_y) self.assertEqual(res[0], 0.375) self.assertEqual(res[1], 0.5) - res = bil._get_ts(self.pts_both_parallel[0], - self.pts_both_parallel[1], - self.pts_both_parallel[2], - self.pts_both_parallel[3], - out_x, out_y) + res = _get_ts(self.pts_both_parallel[0], + self.pts_both_parallel[1], + self.pts_both_parallel[2], + self.pts_both_parallel[3], + out_x, out_y) self.assertEqual(res[0], 0.5) self.assertEqual(res[1], 0.5) - res = bil._get_ts(self.pts_vert_parallel[0], - self.pts_vert_parallel[1], - self.pts_vert_parallel[2], - self.pts_vert_parallel[3], - out_x, out_y) + res = _get_ts(self.pts_vert_parallel[0], + self.pts_vert_parallel[1], + self.pts_vert_parallel[2], + self.pts_vert_parallel[3], + out_x, out_y) self.assertEqual(res[0], 0.5) self.assertEqual(res[1], 0.5) def test_solve_quadratic(self): - res = bil._solve_quadratic(1, 0, 0) + """Test solving quadratic equation.""" + from pyresample.bilinear import (_solve_quadratic, _calc_abc) + + res = _solve_quadratic(1, 0, 0) self.assertEqual(res[0], 0.0) - res = bil._solve_quadratic(1, 2, 1) + res = _solve_quadratic(1, 2, 1) self.assertTrue(np.isnan(res[0])) - res = bil._solve_quadratic(1, 2, 1, min_val=-2.) + res = _solve_quadratic(1, 2, 1, min_val=-2.) 
self.assertEqual(res[0], -1.0) # Test that small adjustments work pt_1, pt_2, pt_3, pt_4 = self.pts_vert_parallel pt_1 = self.pts_vert_parallel[0].copy() pt_1[0][0] += 1e-7 - res = bil._calc_abc(pt_1, pt_2, pt_3, pt_4, 0.0, 0.0) - res = bil._solve_quadratic(res[0], res[1], res[2]) + res = _calc_abc(pt_1, pt_2, pt_3, pt_4, 0.0, 0.0) + res = _solve_quadratic(res[0], res[1], res[2]) self.assertAlmostEqual(res[0], 0.5, 5) - res = bil._calc_abc(pt_1, pt_3, pt_2, pt_4, 0.0, 0.0) - res = bil._solve_quadratic(res[0], res[1], res[2]) + res = _calc_abc(pt_1, pt_3, pt_2, pt_4, 0.0, 0.0) + res = _solve_quadratic(res[0], res[1], res[2]) self.assertAlmostEqual(res[0], 0.5, 5) def test_get_output_xy(self): + """Test calculation of output xy-coordinates.""" + from pyresample.bilinear import _get_output_xy + from pyresample._spatial_mp import Proj + proj = Proj(self.target_def.proj_str) - out_x, out_y = bil._get_output_xy(self.target_def, proj) + out_x, out_y = _get_output_xy(self.target_def, proj) self.assertTrue(out_x.all()) self.assertTrue(out_y.all()) def test_get_input_xy(self): + """Test calculation of input xy-coordinates.""" + from pyresample.bilinear import _get_input_xy + from pyresample._spatial_mp import Proj + proj = Proj(self.target_def.proj_str) - in_x, in_y = bil._get_output_xy(self.swath_def, proj) + in_x, in_y = _get_input_xy(self.swath_def, proj, + self.input_idxs, self.idx_ref) self.assertTrue(in_x.all()) self.assertTrue(in_y.all()) def test_get_bounding_corners(self): + """Test calculation of bounding corners.""" + from pyresample.bilinear import (_get_output_xy, + _get_input_xy, + _get_bounding_corners) + from pyresample._spatial_mp import Proj + proj = Proj(self.target_def.proj_str) - out_x, out_y = bil._get_output_xy(self.target_def, proj) - in_x, in_y = bil._get_input_xy(self.swath_def, proj, - self.input_idxs, self.idx_ref) - res = bil._get_bounding_corners(in_x, in_y, out_x, out_y, - self.neighbours, self.idx_ref) + out_x, out_y = _get_output_xy(self.target_def, proj) + in_x, in_y = _get_input_xy(self.swath_def, proj, + self.input_idxs, self.idx_ref) + res = _get_bounding_corners(in_x, in_y, out_x, out_y, + self.neighbours, self.idx_ref) for i in range(len(res) - 1): pt_ = res[i] for j in range(2): @@ -179,6 +216,9 @@ class Test(unittest.TestCase): self.assertTrue(np.isfinite(pt_[5, j])) def test_get_bil_info(self): + """Test calculation of bilinear resampling indices.""" + from pyresample.bilinear import get_bil_info + def _check_ts(t__, s__): for i in range(len(t__)): # Just check the exact value for one pixel @@ -196,85 +236,764 @@ class Test(unittest.TestCase): self.assertTrue(t__[i] <= 1.0) self.assertTrue(s__[i] <= 1.0) - t__, s__, input_idxs, idx_arr = bil.get_bil_info(self.swath_def, - self.target_def, - 50e5, neighbours=32, - nprocs=1, - reduce_data=False) + t__, s__, input_idxs, idx_arr = get_bil_info(self.swath_def, + self.target_def, + 50e5, neighbours=32, + nprocs=1, + reduce_data=False) _check_ts(t__, s__) - t__, s__, input_idxs, idx_arr = bil.get_bil_info(self.swath_def, - self.target_def, - 50e5, neighbours=32, - nprocs=1, - reduce_data=True) + t__, s__, input_idxs, idx_arr = get_bil_info(self.swath_def, + self.target_def, + 50e5, neighbours=32, + nprocs=1, + reduce_data=True) _check_ts(t__, s__) def test_get_sample_from_bil_info(self): - t__, s__, input_idxs, idx_arr = bil.get_bil_info(self.swath_def, - self.target_def, - 50e5, neighbours=32, - nprocs=1) + """Test resampling using resampling indices.""" + from pyresample.bilinear import get_bil_info, 
get_sample_from_bil_info + + t__, s__, input_idxs, idx_arr = get_bil_info(self.swath_def, + self.target_def, + 50e5, neighbours=32, + nprocs=1) # Sample from data1 - res = bil.get_sample_from_bil_info(self.data1.ravel(), t__, s__, - input_idxs, idx_arr) + res = get_sample_from_bil_info(self.data1.ravel(), t__, s__, + input_idxs, idx_arr) self.assertEqual(res[5], 1.) # Sample from data2 - res = bil.get_sample_from_bil_info(self.data2.ravel(), t__, s__, - input_idxs, idx_arr) + res = get_sample_from_bil_info(self.data2.ravel(), t__, s__, + input_idxs, idx_arr) self.assertEqual(res[5], 2.) # Reshaping - res = bil.get_sample_from_bil_info(self.data2.ravel(), t__, s__, - input_idxs, idx_arr, - output_shape=self.target_def.shape) + res = get_sample_from_bil_info(self.data2.ravel(), t__, s__, + input_idxs, idx_arr, + output_shape=self.target_def.shape) res = res.shape self.assertEqual(res[0], self.target_def.shape[0]) self.assertEqual(res[1], self.target_def.shape[1]) # Test rounding that is happening for certain values - res = bil.get_sample_from_bil_info(self.data3.ravel(), t__, s__, - input_idxs, idx_arr, - output_shape=self.target_def.shape) + res = get_sample_from_bil_info(self.data3.ravel(), t__, s__, + input_idxs, idx_arr, + output_shape=self.target_def.shape) # Four pixels are outside of the data self.assertEqual(np.isnan(res).sum(), 4) def test_resample_bilinear(self): + """Test whole bilinear resampling.""" + from pyresample.bilinear import resample_bilinear + # Single array - res = bil.resample_bilinear(self.data1, - self.swath_def, - self.target_def, - 50e5, neighbours=32, - nprocs=1) + res = resample_bilinear(self.data1, + self.swath_def, + self.target_def, + 50e5, neighbours=32, + nprocs=1) self.assertEqual(res.shape, self.target_def.shape) # There are 12 pixels with value 1, all others are zero self.assertEqual(res.sum(), 12) self.assertEqual((res == 0).sum(), 4) # Single array with masked output - res = bil.resample_bilinear(self.data1, - self.swath_def, - self.target_def, - 50e5, neighbours=32, - nprocs=1, fill_value=None) + res = resample_bilinear(self.data1, + self.swath_def, + self.target_def, + 50e5, neighbours=32, + nprocs=1, fill_value=None) self.assertTrue(hasattr(res, 'mask')) # There should be 12 valid pixels self.assertEqual(self.target_def.size - res.mask.sum(), 12) # Two stacked arrays data = np.dstack((self.data1, self.data2)) - res = bil.resample_bilinear(data, - self.swath_def, - self.target_def) + res = resample_bilinear(data, + self.swath_def, + self.target_def) shp = res.shape self.assertEqual(shp[0:2], self.target_def.shape) self.assertEqual(shp[-1], 2) +class TestXarrayBilinear(unittest.TestCase): + """Test Xarra/Dask -based bilinear interpolation.""" + + def setUp(self): + """Do some setup for common things.""" + import dask.array as da + from xarray import DataArray + from pyresample import geometry, kd_tree + + self.pts_irregular = (np.array([[-1., 1.], ]), + np.array([[1., 2.], ]), + np.array([[-2., -1.], ]), + np.array([[2., -4.], ])) + self.pts_vert_parallel = (np.array([[-1., 1.], ]), + np.array([[1., 2.], ]), + np.array([[-1., -1.], ]), + np.array([[1., -2.], ])) + self.pts_both_parallel = (np.array([[-1., 1.], ]), + np.array([[1., 1.], ]), + np.array([[-1., -1.], ]), + np.array([[1., -1.], ])) + + # Area definition with four pixels + self.target_def = geometry.AreaDefinition('areaD', + 'Europe (3km, HRV, VTC)', + 'areaD', + {'a': '6378144.0', + 'b': '6356759.0', + 'lat_0': '50.00', + 'lat_ts': '50.00', + 'lon_0': '8.00', + 'proj': 'stere'}, + 4, 4, + 
[-1370912.72, + -909968.64000000001, + 1029087.28, + 1490031.3600000001]) + + # Input data around the target pixel at 0.63388324, 55.08234642, + in_shape = (100, 100) + self.data1 = DataArray(da.ones((in_shape[0], in_shape[1])), dims=('y', 'x')) + self.data2 = 2. * self.data1 + self.data3 = self.data1 + 9.5 + lons, lats = np.meshgrid(np.linspace(-25., 40., num=in_shape[0]), + np.linspace(45., 75., num=in_shape[1])) + self.source_def = geometry.SwathDefinition(lons=lons, lats=lats) + + self.radius = 50e3 + self.neighbours = 32 + valid_input_index, output_idxs, index_array, dists = \ + kd_tree.get_neighbour_info(self.source_def, self.target_def, + self.radius, neighbours=self.neighbours, + nprocs=1) + input_size = valid_input_index.sum() + index_mask = (index_array == input_size) + index_array = np.where(index_mask, 0, index_array) + + self.valid_input_index = valid_input_index + self.index_array = index_array + + shp = self.source_def.shape + self.cols, self.lines = np.meshgrid(np.arange(shp[1]), + np.arange(shp[0])) + + def test_init(self): + """Test that the resampler has been initialized correctly.""" + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + # With defaults + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + self.assertTrue(resampler.source_geo_def == self.source_def) + self.assertTrue(resampler.target_geo_def == self.target_def) + self.assertEqual(resampler.radius_of_influence, self.radius) + self.assertEqual(resampler.neighbours, 32) + self.assertEqual(resampler.epsilon, 0) + self.assertTrue(resampler.reduce_data) + # These should be None + self.assertIsNone(resampler.valid_input_index) + self.assertIsNone(resampler.valid_output_index) + self.assertIsNone(resampler.index_array) + self.assertIsNone(resampler.distance_array) + self.assertIsNone(resampler.bilinear_t) + self.assertIsNone(resampler.bilinear_s) + self.assertIsNone(resampler.slices_x) + self.assertIsNone(resampler.slices_y) + self.assertIsNone(resampler.mask_slices) + self.assertIsNone(resampler.out_coords_x) + self.assertIsNone(resampler.out_coords_y) + # self.slices_{x,y} are used in self.slices dict + self.assertTrue(resampler.slices['x'] is resampler.slices_x) + self.assertTrue(resampler.slices['y'] is resampler.slices_y) + # self.out_coords_{x,y} are used in self.out_coords dict + self.assertTrue(resampler.out_coords['x'] is resampler.out_coords_x) + self.assertTrue(resampler.out_coords['y'] is resampler.out_coords_y) + + # Override defaults + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius, neighbours=16, + epsilon=0.1, reduce_data=False) + self.assertEqual(resampler.neighbours, 16) + self.assertEqual(resampler.epsilon, 0.1) + self.assertFalse(resampler.reduce_data) + + def test_get_bil_info(self): + """Test calculation of bilinear info.""" + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + def _check_ts(t__, s__, nans): + for i, _ in enumerate(t__): + # Just check the exact value for one pixel + if i == 5: + self.assertAlmostEqual(t__[i], 0.730659147133, 5) + self.assertAlmostEqual(s__[i], 0.310314173004, 5) + # These pixels are outside the area + elif i in nans: + self.assertTrue(np.isnan(t__[i])) + self.assertTrue(np.isnan(s__[i])) + # All the others should have values between 0.0 and 1.0 + else: + self.assertTrue(t__[i] >= 0.0) + self.assertTrue(s__[i] >= 0.0) + self.assertTrue(t__[i] <= 1.0) + self.assertTrue(s__[i] <= 1.0) + + # Data reduction enabled (default) + resampler = 
XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius, reduce_data=True) + (t__, s__, slices, mask_slices, out_coords) = resampler.get_bil_info() + _check_ts(t__.compute(), s__.compute(), [3, 10, 12, 13, 14, 15]) + + # Nothing should be masked based on coordinates + self.assertTrue(np.all(~mask_slices)) + # Four values per output location + self.assertEqual(mask_slices.shape, (self.target_def.size, 4)) + + # self.slices_{x,y} are used in self.slices dict so they + # should be the same (object) + self.assertTrue(isinstance(slices, dict)) + self.assertTrue(resampler.slices['x'] is resampler.slices_x) + self.assertTrue(np.all(resampler.slices['x'] == slices['x'])) + self.assertTrue(resampler.slices['y'] is resampler.slices_y) + self.assertTrue(np.all(resampler.slices['y'] == slices['y'])) + + # self.slices_{x,y} are used in self.slices dict so they + # should be the same (object) + self.assertTrue(isinstance(out_coords, dict)) + self.assertTrue(resampler.out_coords['x'] is resampler.out_coords_x) + self.assertTrue(np.all(resampler.out_coords['x'] == out_coords['x'])) + self.assertTrue(resampler.out_coords['y'] is resampler.out_coords_y) + self.assertTrue(np.all(resampler.out_coords['y'] == out_coords['y'])) + + # Also some other attributes should have been set + self.assertTrue(t__ is resampler.bilinear_t) + self.assertTrue(s__ is resampler.bilinear_s) + self.assertIsNotNone(resampler.valid_output_index) + self.assertIsNotNone(resampler.index_array) + self.assertIsNotNone(resampler.valid_input_index) + + # Data reduction disabled + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius, reduce_data=False) + (t__, s__, slices, mask_slices, out_coords) = resampler.get_bil_info() + _check_ts(t__.compute(), s__.compute(), [10, 12, 13, 14, 15]) + + def test_get_sample_from_bil_info(self): + """Test bilinear interpolation as a whole.""" + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + _ = resampler.get_bil_info() + + # Sample from data1 + res = resampler.get_sample_from_bil_info(self.data1) + res = res.compute() + # Check couple of values + self.assertEqual(res.values[1, 1], 1.) + self.assertTrue(np.isnan(res.values[0, 3])) + # Check that the values haven't gone down or up a lot + self.assertAlmostEqual(np.nanmin(res.values), 1.) + self.assertAlmostEqual(np.nanmax(res.values), 1.) + # Check that dimensions are the same + self.assertEqual(res.dims, self.data1.dims) + + # Sample from data1, custom fill value + res = resampler.get_sample_from_bil_info(self.data1, fill_value=-1.0) + res = res.compute() + self.assertEqual(np.nanmin(res.values), -1.) 
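Following the interface these tests establish (get_bil_info() to build and cache the resampling indices, then get_sample_from_bil_info() per dataset), a minimal dask/xarray sketch; the swath, target area and radius mirror the setUp above, and the data array is invented:

    import dask.array as da
    import numpy as np
    import xarray as xr
    from pyresample import geometry
    from pyresample.bilinear.xarr import XArrayResamplerBilinear

    lons, lats = np.meshgrid(np.linspace(-25., 40., 100),
                             np.linspace(45., 75., 100))
    source = geometry.SwathDefinition(lons=lons, lats=lats)
    target = geometry.AreaDefinition('areaD', 'Europe (3km, HRV, VTC)', 'areaD',
                                     {'a': '6378144.0', 'b': '6356759.0',
                                      'lat_0': '50.00', 'lat_ts': '50.00',
                                      'lon_0': '8.00', 'proj': 'stere'},
                                     4, 4,
                                     [-1370912.72, -909968.64,
                                      1029087.28, 1490031.36])
    data = xr.DataArray(da.ones((100, 100)), dims=('y', 'x'))

    resampler = XArrayResamplerBilinear(source, target, 50e3)
    resampler.get_bil_info()                           # compute and cache indices
    result = resampler.get_sample_from_bil_info(data)  # lazy DataArray, NaN outside coverage
    print(result.compute().values)
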
+ + # Sample from integer data + res = resampler.get_sample_from_bil_info(self.data1.astype(np.uint8), + fill_value=None) + res = res.compute() + # Five values should be filled with zeros, which is the + # default fill_value for integer data + self.assertEqual(np.sum(res == 0), 6) + + @mock.patch('pyresample.bilinear.xarr.setattr') + def test_compute_indices(self, mock_setattr): + """Test running .compute() for indices.""" + from pyresample.bilinear.xarr import (XArrayResamplerBilinear, + CACHE_INDICES) + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + + # Set indices to Numpy arrays + for idx in CACHE_INDICES: + setattr(resampler, idx, np.array([])) + resampler._compute_indices() + # None of the indices shouldn't have been reassigned + mock_setattr.assert_not_called() + + # Set indices to a Mock object + arr = mock.MagicMock() + for idx in CACHE_INDICES: + setattr(resampler, idx, arr) + resampler._compute_indices() + # All the indices should have been reassigned + self.assertEqual(mock_setattr.call_count, len(CACHE_INDICES)) + # The compute should have been called the same amount of times + self.assertEqual(arr.compute.call_count, len(CACHE_INDICES)) + + def test_add_missing_coordinates(self): + """Test coordinate updating.""" + import dask.array as da + from xarray import DataArray + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + bands = ['R', 'G', 'B'] + data = DataArray(da.ones((3, 10, 10)), dims=('bands', 'y', 'x'), + coords={'bands': bands, + 'y': np.arange(10), 'x': np.arange(10)}) + resampler._add_missing_coordinates(data) + # X and Y coordinates should not change + self.assertIsNone(resampler.out_coords_x) + self.assertIsNone(resampler.out_coords_y) + self.assertIsNone(resampler.out_coords['x']) + self.assertIsNone(resampler.out_coords['y']) + self.assertTrue('bands' in resampler.out_coords) + self.assertTrue(np.all(resampler.out_coords['bands'] == bands)) + + def test_slice_data(self): + """Test slicing the data.""" + import dask.array as da + from xarray import DataArray + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + + # Too many dimensions + data = DataArray(da.ones((1, 3, 10, 10))) + with self.assertRaises(ValueError): + _ = resampler._slice_data(data, np.nan) + + # 2D data + data = DataArray(da.ones((10, 10))) + resampler.slices_x = np.random.randint(0, 10, (100, 4)) + resampler.slices_y = np.random.randint(0, 10, (100, 4)) + resampler.mask_slices = np.zeros((100, 4), dtype=np.bool) + p_1, p_2, p_3, p_4 = resampler._slice_data(data, np.nan) + self.assertEqual(p_1.shape, (100, )) + self.assertTrue(p_1.shape == p_2.shape == p_3.shape == p_4.shape) + self.assertTrue(np.all(p_1 == 1.0) and np.all(p_2 == 1.0) and + np.all(p_3 == 1.0) and np.all(p_4 == 1.0)) + + # 2D data with masking + resampler.mask_slices = np.ones((100, 4), dtype=np.bool) + p_1, p_2, p_3, p_4 = resampler._slice_data(data, np.nan) + self.assertTrue(np.all(np.isnan(p_1)) and np.all(np.isnan(p_2)) and + np.all(np.isnan(p_3)) and np.all(np.isnan(p_4))) + + # 3D data + data = DataArray(da.ones((3, 10, 10))) + resampler.slices_x = np.random.randint(0, 10, (100, 4)) + resampler.slices_y = np.random.randint(0, 10, (100, 4)) + resampler.mask_slices = np.zeros((100, 4), dtype=np.bool) + p_1, p_2, p_3, p_4 = resampler._slice_data(data, np.nan) + self.assertEqual(p_1.shape, 
(3, 100)) + self.assertTrue(p_1.shape == p_2.shape == p_3.shape == p_4.shape) + + # 3D data with masking + resampler.mask_slices = np.ones((100, 4), dtype=np.bool) + p_1, p_2, p_3, p_4 = resampler._slice_data(data, np.nan) + self.assertTrue(np.all(np.isnan(p_1)) and np.all(np.isnan(p_2)) and + np.all(np.isnan(p_3)) and np.all(np.isnan(p_4))) + + @mock.patch('pyresample.bilinear.xarr.np.meshgrid') + def test_get_slices(self, meshgrid): + """Test slice array creation.""" + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + meshgrid.return_value = (self.cols, self.lines) + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + resampler.valid_input_index = self.valid_input_index + resampler.index_array = self.index_array + + resampler._get_slices() + self.assertIsNotNone(resampler.out_coords_x) + self.assertIsNotNone(resampler.out_coords_y) + self.assertTrue(resampler.out_coords_x is resampler.out_coords['x']) + self.assertTrue(resampler.out_coords_y is resampler.out_coords['y']) + self.assertTrue(np.allclose( + resampler.out_coords_x, + [-1070912.72, -470912.72, 129087.28, 729087.28])) + self.assertTrue(np.allclose( + resampler.out_coords_y, + [1190031.36, 590031.36, -9968.64, -609968.64])) + + self.assertIsNotNone(resampler.slices_x) + self.assertIsNotNone(resampler.slices_y) + self.assertTrue(resampler.slices_x is resampler.slices['x']) + self.assertTrue(resampler.slices_y is resampler.slices['y']) + self.assertTrue(resampler.slices_x.shape == (self.target_def.size, 32)) + self.assertTrue(resampler.slices_y.shape == (self.target_def.size, 32)) + self.assertEqual(np.sum(resampler.slices_x), 12471) + self.assertEqual(np.sum(resampler.slices_y), 2223) + + self.assertFalse(np.any(resampler.mask_slices)) + + # Ensure that source geo def is used in masking + # Setting target_geo_def to 0-size shouldn't cause any masked values + resampler.target_geo_def = np.array([]) + resampler._get_slices() + self.assertFalse(np.any(resampler.mask_slices)) + # Setting source area def to 0-size should mask all values + resampler.source_geo_def = np.array([[]]) + resampler._get_slices() + self.assertTrue(np.all(resampler.mask_slices)) + + @mock.patch('pyresample.bilinear.xarr.KDTree') + def test_create_resample_kdtree(self, KDTree): + """Test that KDTree creation is called.""" + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + + vii, kdtree = resampler._create_resample_kdtree() + self.assertEqual(np.sum(vii), 2700) + self.assertEqual(vii.size, self.source_def.size) + KDTree.assert_called_once() + + @mock.patch('pyresample.bilinear.xarr.query_no_distance') + def test_query_resample_kdtree(self, qnd): + """Test that query_no_distance is called in _query_resample_kdtree().""" + from pyresample.bilinear.xarr import XArrayResamplerBilinear + + resampler = XArrayResamplerBilinear(self.source_def, self.target_def, + self.radius) + res, none = resampler._query_resample_kdtree(1, 2, 3, 4, + reduce_data=5) + qnd.assert_called_with(2, 3, 4, 1, resampler.neighbours, + resampler.epsilon, + resampler.radius_of_influence) + + def test_get_input_xy_dask(self): + """Test computation of input X and Y coordinates in target proj.""" + import dask.array as da + from pyresample.bilinear.xarr import _get_input_xy_dask + from pyresample._spatial_mp import Proj + + proj = Proj(self.target_def.proj_str) + in_x, in_y = _get_input_xy_dask(self.source_def, proj, + 
da.from_array(self.valid_input_index), + da.from_array(self.index_array)) + + self.assertTrue(in_x.shape, (self.target_def.size, 32)) + self.assertTrue(in_y.shape, (self.target_def.size, 32)) + self.assertTrue(in_x.all()) + self.assertTrue(in_y.all()) + + def test_mask_coordinates_dask(self): + """Test masking of invalid coordinates.""" + import dask.array as da + from pyresample.bilinear.xarr import _mask_coordinates_dask + + lons, lats = _mask_coordinates_dask( + da.from_array([-200., 0., 0., 0., 200.]), + da.from_array([0., -100., 0, 100., 0.])) + lons, lats = da.compute(lons, lats) + self.assertTrue(lons[2] == lats[2] == 0.0) + self.assertEqual(np.sum(np.isnan(lons)), 4) + self.assertEqual(np.sum(np.isnan(lats)), 4) + + def test_get_bounding_corners_dask(self): + """Test finding surrounding bounding corners.""" + import dask.array as da + from pyresample.bilinear.xarr import (_get_input_xy_dask, + _get_bounding_corners_dask) + from pyresample._spatial_mp import Proj + from pyresample import CHUNK_SIZE + + proj = Proj(self.target_def.proj_str) + out_x, out_y = self.target_def.get_proj_coords(chunks=CHUNK_SIZE) + out_x = da.ravel(out_x) + out_y = da.ravel(out_y) + in_x, in_y = _get_input_xy_dask(self.source_def, proj, + da.from_array(self.valid_input_index), + da.from_array(self.index_array)) + pt_1, pt_2, pt_3, pt_4, ia_ = _get_bounding_corners_dask( + in_x, in_y, out_x, out_y, + self.neighbours, + da.from_array(self.index_array)) + + self.assertTrue(pt_1.shape == pt_2.shape == + pt_3.shape == pt_4.shape == + (self.target_def.size, 2)) + self.assertTrue(ia_.shape == (self.target_def.size, 4)) + + # Check which of the locations has four valid X/Y pairs by + # finding where there are non-NaN values + res = da.sum(pt_1 + pt_2 + pt_3 + pt_4, axis=1).compute() + self.assertEqual(np.sum(~np.isnan(res)), 10) + + def test_get_corner_dask(self): + """Test finding the closest corners.""" + import dask.array as da + from pyresample.bilinear.xarr import (_get_corner_dask, + _get_input_xy_dask) + from pyresample import CHUNK_SIZE + from pyresample._spatial_mp import Proj + + proj = Proj(self.target_def.proj_str) + in_x, in_y = _get_input_xy_dask(self.source_def, proj, + da.from_array(self.valid_input_index), + da.from_array(self.index_array)) + out_x, out_y = self.target_def.get_proj_coords(chunks=CHUNK_SIZE) + out_x = da.ravel(out_x) + out_y = da.ravel(out_y) + + # Some copy&paste from the code to get the input + out_x_tile = np.reshape(np.tile(out_x, self.neighbours), + (self.neighbours, out_x.size)).T + out_y_tile = np.reshape(np.tile(out_y, self.neighbours), + (self.neighbours, out_y.size)).T + x_diff = out_x_tile - in_x + y_diff = out_y_tile - in_y + stride = np.arange(x_diff.shape[0]) + + # Use lower left source pixels for testing + valid = (x_diff > 0) & (y_diff > 0) + x_3, y_3, idx_3 = _get_corner_dask(stride, valid, in_x, in_y, + da.from_array(self.index_array)) + + self.assertTrue(x_3.shape == y_3.shape == idx_3.shape == + (self.target_def.size, )) + # Four locations have no data to the lower left of them (the + # bottom row of the area + self.assertEqual(np.sum(np.isnan(x_3.compute())), 4) + + @mock.patch('pyresample.bilinear.xarr._get_ts_parallellogram_dask') + @mock.patch('pyresample.bilinear.xarr._get_ts_uprights_parallel_dask') + @mock.patch('pyresample.bilinear.xarr._get_ts_irregular_dask') + def test_get_ts_dask(self, irregular, uprights, parallellogram): + """Test that the three separate functions are called.""" + from pyresample.bilinear.xarr import _get_ts_dask + + # All valid 
values + t_irr = np.array([0.1, 0.2, 0.3]) + s_irr = np.array([0.1, 0.2, 0.3]) + irregular.return_value = (t_irr, s_irr) + t__, s__ = _get_ts_dask(1, 2, 3, 4, 5, 6) + irregular.assert_called_once() + uprights.assert_not_called() + parallellogram.assert_not_called() + self.assertTrue(np.allclose(t__.compute(), t_irr)) + self.assertTrue(np.allclose(s__.compute(), s_irr)) + + # NaN in the first step, good value for that location from the + # second step + t_irr = np.array([0.1, 0.2, np.nan]) + s_irr = np.array([0.1, 0.2, np.nan]) + irregular.return_value = (t_irr, s_irr) + t_upr = np.array([3, 3, 0.3]) + s_upr = np.array([3, 3, 0.3]) + uprights.return_value = (t_upr, s_upr) + t__, s__ = _get_ts_dask(1, 2, 3, 4, 5, 6) + self.assertEqual(irregular.call_count, 2) + uprights.assert_called_once() + parallellogram.assert_not_called() + # Only the last value of the first step should have been replaced + t_res = np.array([0.1, 0.2, 0.3]) + s_res = np.array([0.1, 0.2, 0.3]) + self.assertTrue(np.allclose(t__.compute(), t_res)) + self.assertTrue(np.allclose(s__.compute(), s_res)) + + # Two NaNs in the first step, one of which are found by the + # second, and the last bad value is replaced by the third step + t_irr = np.array([0.1, np.nan, np.nan]) + s_irr = np.array([0.1, np.nan, np.nan]) + irregular.return_value = (t_irr, s_irr) + t_upr = np.array([3, np.nan, 0.3]) + s_upr = np.array([3, np.nan, 0.3]) + uprights.return_value = (t_upr, s_upr) + t_par = np.array([4, 0.2, 0.3]) + s_par = np.array([4, 0.2, 0.3]) + parallellogram.return_value = (t_par, s_par) + t__, s__ = _get_ts_dask(1, 2, 3, 4, 5, 6) + self.assertEqual(irregular.call_count, 3) + self.assertEqual(uprights.call_count, 2) + parallellogram.assert_called_once() + # Only the last two values should have been replaced + t_res = np.array([0.1, 0.2, 0.3]) + s_res = np.array([0.1, 0.2, 0.3]) + self.assertTrue(np.allclose(t__.compute(), t_res)) + self.assertTrue(np.allclose(s__.compute(), s_res)) + + # Too large and small values should be set to NaN + t_irr = np.array([1.00001, -0.00001, 1e6]) + s_irr = np.array([1.00001, -0.00001, -1e6]) + irregular.return_value = (t_irr, s_irr) + # Second step also returns invalid values + t_upr = np.array([1.00001, 0.2, np.nan]) + s_upr = np.array([-0.00001, 0.2, np.nan]) + uprights.return_value = (t_upr, s_upr) + # Third step has one new valid value, the last will stay invalid + t_par = np.array([0.1, 0.2, 4.0]) + s_par = np.array([0.1, 0.2, 4.0]) + parallellogram.return_value = (t_par, s_par) + t__, s__ = _get_ts_dask(1, 2, 3, 4, 5, 6) + + t_res = np.array([0.1, 0.2, np.nan]) + s_res = np.array([0.1, 0.2, np.nan]) + self.assertTrue(np.allclose(t__.compute(), t_res, equal_nan=True)) + self.assertTrue(np.allclose(s__.compute(), s_res, equal_nan=True)) + + def test_get_ts_irregular_dask(self): + """Test calculations for irregular corner locations.""" + from pyresample.bilinear.xarr import _get_ts_irregular_dask + + res = _get_ts_irregular_dask(self.pts_irregular[0], + self.pts_irregular[1], + self.pts_irregular[2], + self.pts_irregular[3], + 0., 0.) + self.assertEqual(res[0], 0.375) + self.assertEqual(res[1], 0.5) + res = _get_ts_irregular_dask(self.pts_vert_parallel[0], + self.pts_vert_parallel[1], + self.pts_vert_parallel[2], + self.pts_vert_parallel[3], + 0., 0.) 
+ self.assertTrue(np.isnan(res[0])) + self.assertTrue(np.isnan(res[1])) + + def test_get_ts_uprights_parallel(self): + """Test calculation when uprights are parallel.""" + from pyresample.bilinear import _get_ts_uprights_parallel + + res = _get_ts_uprights_parallel(self.pts_vert_parallel[0], + self.pts_vert_parallel[1], + self.pts_vert_parallel[2], + self.pts_vert_parallel[3], + 0., 0.) + self.assertEqual(res[0], 0.5) + self.assertEqual(res[1], 0.5) + + def test_get_ts_parallellogram(self): + """Test calculation when the corners form a parallellogram.""" + from pyresample.bilinear import _get_ts_parallellogram + + res = _get_ts_parallellogram(self.pts_both_parallel[0], + self.pts_both_parallel[1], + self.pts_both_parallel[2], + 0., 0.) + self.assertEqual(res[0], 0.5) + self.assertEqual(res[1], 0.5) + + def test_calc_abc(self): + """Test calculation of quadratic coefficients.""" + from pyresample.bilinear.xarr import _calc_abc_dask + + # No np.nan inputs + pt_1, pt_2, pt_3, pt_4 = self.pts_irregular + res = _calc_abc_dask(pt_1, pt_2, pt_3, pt_4, 0.0, 0.0) + self.assertFalse(np.isnan(res[0])) + self.assertFalse(np.isnan(res[1])) + self.assertFalse(np.isnan(res[2])) + # np.nan input -> np.nan output + res = _calc_abc_dask(np.array([[np.nan, np.nan]]), + pt_2, pt_3, pt_4, 0.0, 0.0) + self.assertTrue(np.isnan(res[0])) + self.assertTrue(np.isnan(res[1])) + self.assertTrue(np.isnan(res[2])) + + def test_solve_quadratic(self): + """Test solving quadratic equation.""" + from pyresample.bilinear.xarr import (_solve_quadratic_dask, + _calc_abc_dask) + + res = _solve_quadratic_dask(1, 0, 0).compute() + self.assertEqual(res, 0.0) + res = _solve_quadratic_dask(1, 2, 1).compute() + self.assertTrue(np.isnan(res)) + res = _solve_quadratic_dask(1, 2, 1, min_val=-2.).compute() + self.assertEqual(res, -1.0) + # Test that small adjustments work + pt_1, pt_2, pt_3, pt_4 = self.pts_vert_parallel + pt_1 = self.pts_vert_parallel[0].copy() + pt_1[0][0] += 1e-7 + res = _calc_abc_dask(pt_1, pt_2, pt_3, pt_4, 0.0, 0.0) + res = _solve_quadratic_dask(res[0], res[1], res[2]).compute() + self.assertAlmostEqual(res[0], 0.5, 5) + res = _calc_abc_dask(pt_1, pt_3, pt_2, pt_4, 0.0, 0.0) + res = _solve_quadratic_dask(res[0], res[1], res[2]).compute() + self.assertAlmostEqual(res[0], 0.5, 5) + + def test_query_no_distance(self): + """Test KDTree querying.""" + from pyresample.bilinear.xarr import query_no_distance + + kdtree = mock.MagicMock() + kdtree.query.return_value = (1, 2) + lons, lats = self.target_def.get_lonlats() + voi = (lons >= -180) & (lons <= 180) & (lats <= 90) & (lats >= -90) + res = query_no_distance(lons, lats, voi, kdtree, self.neighbours, + 0., self.radius) + # Only the second value from the query is returned + self.assertEqual(res, 2) + kdtree.query.assert_called_once() + + def test_get_valid_input_index_dask(self): + """Test finding valid indices for reduced input data.""" + from pyresample.bilinear.xarr import _get_valid_input_index_dask + + # Do not reduce data + vii, lons, lats = _get_valid_input_index_dask(self.source_def, + self.target_def, + False, self.radius) + self.assertEqual(vii.shape, (self.source_def.size, )) + self.assertTrue(vii.dtype == np.bool) + # No data has been reduced, whole input is used + self.assertTrue(vii.compute().all()) + + # Reduce data + vii, lons, lats = _get_valid_input_index_dask(self.source_def, + self.target_def, + True, self.radius) + # 2700 valid input points + self.assertEqual(vii.compute().sum(), 2700) + + def test_create_empty_bil_info(self): + """Test creation of 
empty bilinear info.""" + from pyresample.bilinear.xarr import _create_empty_bil_info + + t__, s__, vii, ia_ = _create_empty_bil_info(self.source_def, + self.target_def) + self.assertEqual(t__.shape, (self.target_def.size,)) + self.assertEqual(s__.shape, (self.target_def.size,)) + self.assertEqual(ia_.shape, (self.target_def.size, 4)) + self.assertTrue(ia_.dtype == np.int32) + self.assertEqual(vii.shape, (self.source_def.size,)) + self.assertTrue(vii.dtype == np.bool) + + def test_lonlat2xyz(self): + """Test conversion from geographic to cartesian 3D coordinates.""" + from pyresample.bilinear.xarr import lonlat2xyz + from pyresample import CHUNK_SIZE + + lons, lats = self.target_def.get_lonlats(chunks=CHUNK_SIZE) + res = lonlat2xyz(lons, lats) + self.assertEqual(res.shape, (self.target_def.size, 3)) + vals = [3188578.91069278, -612099.36103276, 5481596.63569999] + self.assertTrue(np.allclose(res.compute()[0, :], vals)) + + def suite(): - """The test suite. - """ + """Create the test suite.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() - mysuite.addTest(loader.loadTestsFromTestCase(Test)) + mysuite.addTest(loader.loadTestsFromTestCase(TestNumpyBilinear)) + mysuite.addTest(loader.loadTestsFromTestCase(TestXarrayBilinear)) return mysuite ===================================== pyresample/test/test_bucket.py ===================================== @@ -0,0 +1,219 @@ +import unittest +import numpy as np +import dask.array as da +import dask +import xarray as xr +try: + from unittest.mock import MagicMock, patch +except ImportError: + # separate mock package py<3.3 + from mock import MagicMock, patch + +from pyresample.geometry import AreaDefinition +from pyresample import bucket +from pyresample.test.utils import CustomScheduler + + +class Test(unittest.TestCase): + + adef = AreaDefinition('eurol', 'description', '', + {'ellps': 'WGS84', + 'lat_0': '90.0', + 'lat_ts': '60.0', + 'lon_0': '0.0', + 'proj': 'stere'}, 2560, 2048, + (-3780000.0, -7644000.0, 3900000.0, -1500000.0)) + + chunks = 2 + lons = da.from_array(np.array([[25., 25.], [25., 25.]]), + chunks=chunks) + lats = da.from_array(np.array([[60., 60.00001], [60.2, 60.3]]), + chunks=chunks) + + def setUp(self): + self.resampler = bucket.BucketResampler(self.adef, self.lons, self.lats) + + @patch('pyresample.bucket.Proj') + @patch('pyresample.bucket.BucketResampler._get_indices') + def test_init(self, get_indices, prj): + resampler = bucket.BucketResampler(self.adef, self.lons, self.lats) + get_indices.assert_called_once() + prj.assert_called_once_with(self.adef.proj_dict) + self.assertTrue(hasattr(resampler, 'target_area')) + self.assertTrue(hasattr(resampler, 'source_lons')) + self.assertTrue(hasattr(resampler, 'source_lats')) + self.assertTrue(hasattr(resampler, 'x_idxs')) + self.assertTrue(hasattr(resampler, 'y_idxs')) + self.assertTrue(hasattr(resampler, 'idxs')) + self.assertTrue(hasattr(resampler, 'get_sum')) + self.assertTrue(hasattr(resampler, 'get_count')) + self.assertTrue(hasattr(resampler, 'get_average')) + self.assertTrue(hasattr(resampler, 'get_fractions')) + self.assertIsNone(resampler.counts) + + def test_round_to_resolution(self): + """Test rounding to given resolution""" + # Scalar, integer resolution + self.assertEqual(bucket.round_to_resolution(5.5, 2.), 6) + # Scalar, non-integer resolution + self.assertEqual(bucket.round_to_resolution(5.5, 1.7), 5.1) + # List + self.assertTrue(np.all(bucket.round_to_resolution([4.2, 5.6], 2) == + np.array([4., 6.]))) + # Numpy array + 
self.assertTrue(np.all(bucket.round_to_resolution(np.array([4.2, 5.6]), 2) == + np.array([4., 6.]))) + # Dask array + self.assertTrue( + np.all(bucket.round_to_resolution(da.array([4.2, 5.6]), 2) == + np.array([4., 6.]))) + + def test_get_proj_coordinates(self): + """Test calculation of projection coordinates.""" + prj = MagicMock() + prj.return_value = ([3.1, 3.1, 3.1], [4.8, 4.8, 4.8]) + lons = [1., 1., 1.] + lats = [2., 2., 2.] + x_res, y_res = 0.5, 0.5 + self.resampler.prj = prj + result = self.resampler._get_proj_coordinates(lons, lats, x_res, y_res) + prj.assert_called_once_with(lons, lats) + self.assertTrue(isinstance(result, np.ndarray)) + self.assertEqual(result.shape, (2, 3)) + self.assertTrue(np.all(result == np.array([[3., 3., 3.], + [5., 5., 5.]]))) + + def test_get_bucket_indices(self): + """Test calculation of array indices.""" + # Ensure nothing is calculated + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + self.resampler._get_indices() + x_idxs, y_idxs = da.compute(self.resampler.x_idxs, + self.resampler.y_idxs) + self.assertTrue(np.all(x_idxs == np.array([1709, 1709, 1706, 1705]))) + self.assertTrue(np.all(y_idxs == np.array([465, 465, 458, 455]))) + + def test_get_sum(self): + """Test drop-in-a-bucket sum.""" + data = da.from_array(np.array([[2., 2.], [2., 2.]]), + chunks=self.chunks) + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_sum(data) + + result = result.compute() + # One bin with two hits, so max value is 2.0 + self.assertTrue(np.max(result) == 4.) + # Two bins with the same value + self.assertEqual(np.sum(result == 2.), 2) + # One bin with double the value + self.assertEqual(np.sum(result == 4.), 1) + self.assertEqual(result.shape, self.adef.shape) + + # Test that also Xarray.DataArrays work + data = xr.DataArray(data) + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_sum(data) + # One bin with two hits, so max value is 2.0 + self.assertTrue(np.max(result) == 4.) + # Two bins with the same value + self.assertEqual(np.sum(result == 2.), 2) + # One bin with double the value + self.assertEqual(np.sum(result == 4.), 1) + self.assertEqual(result.shape, self.adef.shape) + + # Test masking all-NaN bins + data = da.from_array(np.array([[np.nan, np.nan], [np.nan, np.nan]]), + chunks=self.chunks) + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_sum(data, mask_all_nan=True) + self.assertTrue(np.all(np.isnan(result))) + # By default all-NaN bins have a value of 0.0 + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_sum(data) + self.assertEqual(np.nanmax(result), 0.0) + + def test_get_count(self): + """Test drop-in-a-bucket sum.""" + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_count() + result = result.compute() + self.assertTrue(np.max(result) == 2) + self.assertEqual(np.sum(result == 1), 2) + self.assertEqual(np.sum(result == 2), 1) + self.assertTrue(self.resampler.counts is not None) + + def test_get_average(self): + """Test averaging bucket resampling.""" + data = da.from_array(np.array([[2., 4.], [3., np.nan]]), + chunks=self.chunks) + # Without pre-calculated indices + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_average(data) + result = result.compute() + self.assertEqual(np.nanmax(result), 3.) 
+ self.assertTrue(np.any(np.isnan(result))) + # Use a fill value other than np.nan + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_average(data, fill_value=-1) + result = result.compute() + self.assertEqual(np.max(result), 3.) + self.assertEqual(np.min(result), -1) + self.assertFalse(np.any(np.isnan(result))) + + # Test masking all-NaN bins + data = da.from_array(np.array([[np.nan, np.nan], [np.nan, np.nan]]), + chunks=self.chunks) + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_average(data, mask_all_nan=True) + self.assertTrue(np.all(np.isnan(result))) + # By default all-NaN bins have a value of NaN + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_average(data) + self.assertTrue(np.all(np.isnan(result))) + + def test_resample_bucket_fractions(self): + """Test fraction calculations for categorical data.""" + data = da.from_array(np.array([[2, 4], [2, 2]]), + chunks=self.chunks) + categories = [1, 2, 3, 4] + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_fractions(data, categories=categories) + self.assertEqual(set(categories), set(result.keys())) + res = result[1].compute() + self.assertTrue(np.nanmax(res) == 0.) + res = result[2].compute() + self.assertTrue(np.nanmax(res) == 1.) + self.assertTrue(np.nanmin(res) == 0.5) + res = result[3].compute() + self.assertTrue(np.nanmax(res) == 0.) + res = result[4].compute() + self.assertTrue(np.nanmax(res) == 0.5) + self.assertTrue(np.nanmin(res) == 0.) + # There should be NaN values + self.assertTrue(np.any(np.isnan(res))) + + # Use a fill value + with dask.config.set(scheduler=CustomScheduler(max_computes=0)): + result = self.resampler.get_fractions(data, categories=categories, + fill_value=-1) + + # There should not be any NaN values + for i in categories: + res = result[i].compute() + self.assertFalse(np.any(np.isnan(res))) + self.assertTrue(np.min(res) == -1) + + # No categories given, need to compute the data once to get + # the categories + with dask.config.set(scheduler=CustomScheduler(max_computes=1)): + result = self.resampler.get_fractions(data, categories=None) + + +def suite(): + """The test suite. 
+ """ + loader = unittest.TestLoader() + mysuite = unittest.TestSuite() + mysuite.addTest(loader.loadTestsFromTestCase(Test)) + + return mysuite ===================================== pyresample/test/test_geometry.py ===================================== @@ -129,7 +129,19 @@ class Test(unittest.TestCase): with patch('pyresample._cartopy.warnings.warn') as warn: # Test that user warning has been issued (EPSG to proj4 string is potentially lossy) area.to_cartopy_crs() - warn.assert_called() + if projection.startswith('EPSG'): + # we'll only get this for the new EPSG:XXXX syntax + warn.assert_called() + + # Bounds for latlong projection must be specified in radians + latlong_crs = geometry.AreaDefinition(area_id='latlong', + description='Global regular lat-lon grid', + proj_id='latlong', + projection={'proj': 'latlong', 'lon0': 0}, + width=360, + height=180, + area_extent=(-180, -90, 180, 90)).to_cartopy_crs() + self.assertTrue(np.allclose(latlong_crs.bounds, [-np.pi, np.pi, -np.pi/2, np.pi/2])) def test_create_areas_def(self): from pyresample import utils @@ -183,7 +195,7 @@ class Test(unittest.TestCase): ' area_extent:\n' ' lower_left_xy: [-49739, 5954123]\n' ' upper_right_xy: [1350361, 7354223]'.format(epsg=epsg_yaml))) - self.assertDictEqual(res, expected) + self.assertDictEqual(res, expected) def test_parse_area_file(self): from pyresample import utils @@ -1066,9 +1078,12 @@ class Test(unittest.TestCase): self.assertEqual(slice(3, 3709, None), slice_y) def test_proj_str(self): + """Test the 'proj_str' property of AreaDefinition.""" from collections import OrderedDict from pyresample import utils + # pyproj 2.0+ adds a +type=crs parameter + extra_params = ' +type=crs' if utils.is_pyproj2() else '' proj_dict = OrderedDict() proj_dict['proj'] = 'stere' proj_dict['a'] = 6378144.0 @@ -1081,20 +1096,34 @@ class Test(unittest.TestCase): [-1370912.72, -909968.64, 1029087.28, 1490031.36]) self.assertEqual(area.proj_str, - '+a=6378144.0 +b=6356759.0 +lat_0=50.0 +lat_ts=50.0 +lon_0=8.0 +proj=stere') + '+a=6378144.0 +b=6356759.0 +lat_0=50.0 +lat_ts=50.0 ' + '+lon_0=8.0 +proj=stere' + extra_params) + # try a omerc projection and no_rot parameters + proj_dict['proj'] = 'omerc' + proj_dict['alpha'] = proj_dict.pop('lat_ts') proj_dict['no_rot'] = '' area = geometry.AreaDefinition('areaD', 'Europe (3km, HRV, VTC)', 'areaD', proj_dict, 10, 10, [-1370912.72, -909968.64, 1029087.28, 1490031.36]) self.assertEqual(area.proj_str, - '+a=6378144.0 +b=6356759.0 +lat_0=50.0 +lat_ts=50.0 +lon_0=8.0 +no_rot +proj=stere') + '+a=6378144.0 +alpha=50.0 +b=6356759.0 +lat_0=50.0 ' + '+lon_0=8.0 +no_rot +proj=omerc' + extra_params) # EPSG - projections = ['+init=EPSG:6932'] if utils.is_pyproj2(): - projections.append('EPSG:6932') - for projection in projections: + # With pyproj 2.0+ we expand EPSG to full parameter list + full_proj = ('+datum=WGS84 +lat_0=-90 +lon_0=0 +no_defs ' + '+proj=laea +type=crs +units=m +x_0=0 +y_0=0') + projections = [ + ('+init=EPSG:6932', full_proj), + ('EPSG:6932', full_proj) + ] + else: + projections = [ + ('+init=EPSG:6932', '+init=EPSG:6932'), + ] + for projection, expected_proj in projections: area = geometry.AreaDefinition( area_id='ease-sh-2.0', description='25km EASE Grid 2.0 (Southern Hemisphere)', @@ -1102,7 +1131,7 @@ class Test(unittest.TestCase): projection=projection, width=123, height=123, area_extent=[-40000., -40000., 40000., 40000.]) - self.assertEqual(area.proj_str, projection) + self.assertEqual(area.proj_str, expected_proj) def test_striding(self): """Test striding 
AreaDefinitions.""" @@ -1367,6 +1396,7 @@ class TestSwathDefinition(unittest.TestCase): def test_compute_optimal_bb(self): """Test computing the bb area.""" + from pyresample.utils import is_pyproj2 import xarray as xr lats = np.array([[85.23900604248047, 62.256004333496094, 35.58000183105469], [80.84000396728516, 60.74200439453125, 34.08500289916992], @@ -1386,6 +1416,11 @@ class TestSwathDefinition(unittest.TestCase): proj_dict = {'gamma': 0.0, 'lonc': -11.391744043133668, 'ellps': 'WGS84', 'proj': 'omerc', 'alpha': 9.185764390923012, 'lat_0': -0.2821013754097188} + if is_pyproj2(): + # pyproj2 adds some extra defaults + proj_dict.update({'x_0': 0, 'y_0': 0, 'units': 'm', + 'k': 1, 'gamma': 0, + 'no_defs': None, 'type': 'crs'}) assert_np_dict_allclose(res.proj_dict, proj_dict) self.assertEqual(res.shape, (6, 3)) @@ -1705,21 +1740,35 @@ class TestDynamicAreaDefinition(unittest.TestCase): lats = [50, 66, 66, 50] result = area.freeze((lons, lats), resolution=3000, - proj_info={'lon0': 16, 'lat0': 58}) - + proj_info={'lon_0': 16, 'lat_0': 58}) + + np.testing.assert_allclose(result.area_extent, (-432079.38952, + -872594.690447, + 432079.38952, + 904633.303964)) + self.assertEqual(result.proj_dict['lon_0'], 16) + self.assertEqual(result.proj_dict['lat_0'], 58) + self.assertEqual(result.width, 288) + self.assertEqual(result.height, 592) + + # make sure that setting `proj_info` once doesn't + # set it in the dynamic area + result = area.freeze((lons, lats), + resolution=3000, + proj_info={'lon_0': 0}) np.testing.assert_allclose(result.area_extent, (538546.7274949469, 5380808.879250369, 1724415.6519203288, 6998895.701001488)) - self.assertEqual(result.proj_dict['lon0'], 16) - self.assertEqual(result.proj_dict['lat0'], 58) - self.assertEqual(result.x_size, 395) - self.assertEqual(result.y_size, 539) + self.assertEqual(result.proj_dict['lon_0'], 0) + # lat_0 could be provided or not depending on version of pyproj + self.assertEqual(result.proj_dict.get('lat_0', 0), 0) + self.assertEqual(result.width, 395) + self.assertEqual(result.height, 539) def test_freeze_with_bb(self): """Test freezing the area with bounding box computation.""" - area = geometry.DynamicAreaDefinition('test_area', 'A test area', - {'proj': 'omerc'}, + area = geometry.DynamicAreaDefinition('test_area', 'A test area', {'proj': 'omerc'}, optimize_projection=True) lons = [[10, 12.1, 14.2, 16.3], [10, 12, 14, 16], @@ -1729,13 +1778,23 @@ class TestDynamicAreaDefinition(unittest.TestCase): [50, 51, 52, 53]] import xarray as xr sdef = geometry.SwathDefinition(xr.DataArray(lons), xr.DataArray(lats)) - result = area.freeze(sdef, - resolution=1000) + result = area.freeze(sdef, resolution=1000) np.testing.assert_allclose(result.area_extent, [-336277.698941, 5513145.392745, 192456.651909, 7749649.63914]) self.assertEqual(result.x_size, 4) self.assertEqual(result.y_size, 18) + # Test for properties and shape usage in freeze. 
+ area = geometry.DynamicAreaDefinition('test_area', 'A test area', {'proj': 'merc'}, + width=4, height=18) + self.assertEqual((18, 4), area.shape) + result = area.freeze(sdef) + np.testing.assert_allclose(result.area_extent, + (996309.4426, 6287132.757981, 1931393.165263, 10837238.860543)) + area = geometry.DynamicAreaDefinition('test_area', 'A test area', {'proj': 'merc'}, + resolution=1000) + self.assertEqual(1000, area.pixel_size_x) + self.assertEqual(1000, area.pixel_size_y) def test_compute_domain(self): """Test computing size and area extent.""" ===================================== pyresample/test/test_grid.py ===================================== @@ -201,8 +201,12 @@ class Test(unittest.TestCase): lat, 52.566998432390619, msg='Resampling of single lat failed') def test_proj4_string(self): + """Test 'proj_str' property of AreaDefinition.""" + from pyresample.utils import is_pyproj2 proj4_string = self.area_def.proj_str expected_string = '+a=6378144.0 +b=6356759.0 +lat_ts=50.0 +lon_0=8.0 +proj=stere +lat_0=50.0' + if is_pyproj2(): + expected_string += ' +type=crs' self.assertEqual( frozenset(proj4_string.split()), frozenset(expected_string.split())) ===================================== pyresample/test/test_spatial_mp.py ===================================== @@ -30,31 +30,31 @@ import unittest from pyresample._spatial_mp import BaseProj -class SpatialMPTest(unittest.TestCase): - @mock.patch('pyresample._spatial_mp.pyproj.Proj.__init__', return_value=None) - def test_base_proj_epsg(self, proj_init): - """Test Proj creation with EPSG codes""" - if pyproj.__version__ < '2': - return self.skipTest(reason='pyproj 2+ only') - - args = [ - [None, {'init': 'EPSG:6932'}], - [{'init': 'EPSG:6932'}, {}], - [None, {'EPSG': '6932'}], - [{'EPSG': '6932'}, {}] - ] - for projparams, kwargs in args: - BaseProj(projparams, **kwargs) - proj_init.assert_called_with(projparams='EPSG:6932', preserve_units=mock.ANY) - proj_init.reset_mock() +# class SpatialMPTest(unittest.TestCase): +# @mock.patch('pyresample._spatial_mp.pyproj.Proj.__init__', return_value=None) +# def test_base_proj_epsg(self, proj_init): +# """Test Proj creation with EPSG codes""" +# if pyproj.__version__ < '2': +# return self.skipTest(reason='pyproj 2+ only') +# +# args = [ +# [None, {'init': 'EPSG:6932'}], +# [{'init': 'EPSG:6932'}, {}], +# [None, {'EPSG': '6932'}], +# [{'EPSG': '6932'}, {}] +# ] +# for projparams, kwargs in args: +# BaseProj(projparams, **kwargs) +# proj_init.assert_called_with(projparams='EPSG:6932', preserve_units=mock.ANY) +# proj_init.reset_mock() def suite(): """The test suite. 
""" - loader = unittest.TestLoader() + # loader = unittest.TestLoader() mysuite = unittest.TestSuite() - mysuite.addTest(loader.loadTestsFromTestCase(SpatialMPTest)) + # mysuite.addTest(loader.loadTestsFromTestCase(SpatialMPTest)) return mysuite ===================================== pyresample/test/test_utils.py ===================================== @@ -41,39 +41,64 @@ class TestLegacyAreaParser(unittest.TestCase): def test_area_parser_legacy(self): """Test legacy area parser.""" from pyresample import parse_area_file + from pyresample.utils import is_pyproj2 ease_nh, ease_sh = parse_area_file(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'), 'ease_nh', 'ease_sh') + if is_pyproj2(): + # pyproj 2.0+ adds some extra parameters + projection = ("{'R': '6371228', 'lat_0': '90', 'lon_0': '0', " + "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', " + "'units': 'm', 'x_0': '0', 'y_0': '0'}") + else: + projection = ("{'a': '6371228.0', 'lat_0': '90.0', " + "'lon_0': '0.0', 'proj': 'laea', 'units': 'm'}") nh_str = """Area ID: ease_nh Description: Arctic EASE grid Projection ID: ease_nh -Projection: {'a': '6371228.0', 'lat_0': '90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 425 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(ease_nh.__str__(), nh_str) - self.assertIsInstance(ease_nh.proj_dict['lat_0'], float) - + self.assertIsInstance(ease_nh.proj_dict['lat_0'], (int, float)) + + if is_pyproj2(): + projection = ("{'R': '6371228', 'lat_0': '-90', 'lon_0': '0', " + "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', " + "'units': 'm', 'x_0': '0', 'y_0': '0'}") + else: + projection = ("{'a': '6371228.0', 'lat_0': '-90.0', " + "'lon_0': '0.0', 'proj': 'laea', 'units': 'm'}") sh_str = """Area ID: ease_sh Description: Antarctic EASE grid Projection ID: ease_sh -Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 425 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(ease_sh.__str__(), sh_str) - self.assertIsInstance(ease_sh.proj_dict['lat_0'], float) + self.assertIsInstance(ease_sh.proj_dict['lat_0'], (int, float)) def test_load_area(self): from pyresample import load_area + from pyresample.utils import is_pyproj2 ease_nh = load_area(os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg'), 'ease_nh') + if is_pyproj2(): + # pyproj 2.0+ adds some extra parameters + projection = ("{'R': '6371228', 'lat_0': '90', 'lon_0': '0', " + "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', " + "'units': 'm', 'x_0': '0', 'y_0': '0'}") + else: + projection = ("{'a': '6371228.0', 'lat_0': '90.0', " + "'lon_0': '0.0', 'proj': 'laea', 'units': 'm'}") nh_str = """Area ID: ease_nh Description: Arctic EASE grid Projection ID: ease_nh -Projection: {'a': '6371228.0', 'lat_0': '90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 425 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(nh_str, ease_nh.__str__()) def test_not_found_exception(self): @@ -96,44 +121,61 @@ class 
TestYAMLAreaParser(unittest.TestCase): 'test_latlong') ease_nh, ease_sh, test_m, test_deg, test_latlong = test_areas + from pyresample.utils import is_pyproj2 + if is_pyproj2(): + # pyproj 2.0+ adds some extra parameters + projection = ("{'R': '6371228', 'lat_0': '-90', 'lon_0': '0', " + "'no_defs': 'None', 'proj': 'laea', 'type': 'crs', " + "'units': 'm', 'x_0': '0', 'y_0': '0'}") + else: + projection = ("{'a': '6371228.0', 'lat_0': '-90.0', " + "'lon_0': '0.0', 'proj': 'laea', 'units': 'm'}") nh_str = """Area ID: ease_nh Description: Arctic EASE grid -Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 425 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(ease_nh.__str__(), nh_str) sh_str = """Area ID: ease_sh Description: Antarctic EASE grid -Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 425 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(ease_sh.__str__(), sh_str) m_str = """Area ID: test_meters Description: test_meters -Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 850 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(test_m.__str__(), m_str) deg_str = """Area ID: test_degrees Description: test_degrees -Projection: {'a': '6371228.0', 'lat_0': '-90.0', 'lon_0': '0.0', 'proj': 'laea', 'units': 'm'} +Projection: {} Number of columns: 850 Number of rows: 425 -Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""" +Area extent: (-5326849.0625, -5326849.0625, 5326849.0625, 5326849.0625)""".format(projection) self.assertEqual(test_deg.__str__(), deg_str) + if is_pyproj2(): + # pyproj 2.0+ adds some extra parameters + projection = ("{'ellps': 'WGS84', 'lat_0': '27.12', " + "'lon_0': '-81.36', 'proj': 'longlat', " + "'type': 'crs'}") + else: + projection = ("{'ellps': 'WGS84', 'lat_0': '27.12', " + "'lon_0': '-81.36', 'proj': 'longlat'}") latlong_str = """Area ID: test_latlong Description: Basic latlong grid -Projection: {'ellps': 'WGS84', 'lat_0': '27.12', 'lon_0': '-81.36', 'proj': 'longlat'} +Projection: {} Number of columns: 3473 Number of rows: 4058 -Area extent: (-0.0812, 0.4039, 0.0812, 0.5428)""" +Area extent: (-0.0812, 0.4039, 0.0812, 0.5428)""".format(projection) self.assertEqual(test_latlong.__str__(), latlong_str) def test_dynamic_area_parser_yaml(self): @@ -338,19 +380,29 @@ class TestMisc(unittest.TestCase): self.assertIsInstance(proj_dict2['lon_0'], float) # EPSG - expected = {'+init=EPSG:4326': {'init': 'EPSG:4326'}, - 'EPSG:4326': {'EPSG': 4326}} + proj_str = '+init=EPSG:4326' + proj_dict_exp = {'init': 'EPSG:4326'} + proj_dict = utils._proj4.proj4_str_to_dict(proj_str) + self.assertEqual(proj_dict, proj_dict_exp) + self.assertEqual(utils._proj4.proj4_dict_to_str(proj_dict), proj_str) # round-trip - for proj_str, proj_dict_exp in expected.items(): - proj_dict = utils._proj4.proj4_str_to_dict(proj_str) + proj_str = 'EPSG:4326' + proj_dict_exp = 
{'init': 'EPSG:4326'} + proj_dict_exp2 = {'proj': 'longlat', 'datum': 'WGS84', 'no_defs': None, 'type': 'crs'} + proj_dict = utils._proj4.proj4_str_to_dict(proj_str) + if 'init' in proj_dict: + # pyproj <2.0 self.assertEqual(proj_dict, proj_dict_exp) - self.assertEqual(utils._proj4.proj4_dict_to_str(proj_dict), proj_str) # round-trip - - # Invalid EPSG code (pyproj-2 syntax only) - self.assertRaises(ValueError, utils._proj4.proj4_str_to_dict, 'EPSG:XXXX') + else: + # pyproj 2.0+ + self.assertEqual(proj_dict, proj_dict_exp2) + # input != output for this style of EPSG code + # EPSG to PROJ.4 can be lossy + # self.assertEqual(utils._proj4.proj4_dict_to_str(proj_dict), proj_str) # round-trip def test_def2yaml_converter(self): from pyresample import parse_area_file, convert_def_to_yaml + from pyresample.utils import is_pyproj2 import tempfile def_file = os.path.join(os.path.dirname(__file__), 'test_files', 'areas.cfg') filehandle, yaml_file = tempfile.mkstemp() @@ -360,8 +412,16 @@ class TestMisc(unittest.TestCase): areas_new = set(parse_area_file(yaml_file)) areas = parse_area_file(def_file) for area in areas: - area.proj_dict.pop('units', None) + if is_pyproj2(): + # pyproj 2.0 adds units back in + # pyproj <2 doesn't + continue + # initialize _proj_dict + area.proj_dict # noqa + area._proj_dict.pop('units', None) areas_old = set(areas) + areas_new = {area.area_id: area for area in areas_new} + areas_old = {area.area_id: area for area in areas_old} self.assertEqual(areas_new, areas_old) finally: os.remove(yaml_file) @@ -375,18 +435,24 @@ class TestMisc(unittest.TestCase): transform = Affine(300.0379266750948, 0.0, 101985.0, 0.0, -300.041782729805, 2826915.0) crs = CRS(init='epsg:3857') + if utils.is_pyproj2(): + # pyproj 2.0+ expands CRS parameters + from pyproj import CRS + proj_dict = CRS(3857).to_dict() + else: + proj_dict = crs.to_dict() source = tmptiff(x_size, y_size, transform, crs=crs) area_id = 'area_id' proj_id = 'proj_id' - name = 'name' + description = 'name' area_def = utils._rasterio.get_area_def_from_raster( - source, area_id=area_id, name=name, proj_id=proj_id) + source, area_id=area_id, name=description, proj_id=proj_id) self.assertEqual(area_def.area_id, area_id) self.assertEqual(area_def.proj_id, proj_id) - self.assertEqual(area_def.name, name) + self.assertEqual(area_def.description, description) self.assertEqual(area_def.width, x_size) self.assertEqual(area_def.height, y_size) - self.assertDictEqual(crs.to_dict(), area_def.proj_dict) + self.assertDictEqual(proj_dict, area_def.proj_dict) self.assertTupleEqual(area_def.area_extent, (transform.c, transform.f + transform.e * y_size, transform.c + transform.a * x_size, transform.f)) @@ -422,6 +488,9 @@ class TestMisc(unittest.TestCase): source = tmptiff(transform=transform) proj_dict = {'init': 'epsg:3857'} area_def = utils._rasterio.get_area_def_from_raster(source, proj_dict=proj_dict) + if utils.is_pyproj2(): + from pyproj import CRS + proj_dict = CRS(3857).to_dict() self.assertDictEqual(area_def.proj_dict, proj_dict) ===================================== pyresample/test/utils.py ===================================== @@ -178,3 +178,21 @@ def create_test_latitude(start, stop, shape, twist_factor=0.0, dtype=np.float32) lat_array = np.repeat(lat_col, shape[1], axis=1) lat_array += twist_array return lat_array + + +class CustomScheduler(object): + """Scheduler raising an exception if data are computed too many times.""" + + def __init__(self, max_computes=1): + """Set starting and maximum compute counts.""" + self.max_computes = 
max_computes + self.total_computes = 0 + + def __call__(self, dsk, keys, **kwargs): + """Compute dask task and keep track of number of times we do so.""" + import dask + self.total_computes += 1 + if self.total_computes > self.max_computes: + raise RuntimeError("Too many dask computations were scheduled: " + "{}".format(self.total_computes)) + return dask.get(dsk, keys, **kwargs) ===================================== pyresample/utils/_proj4.py ===================================== @@ -15,9 +15,15 @@ # # You should have received a copy of the GNU Lesser General Public License along # with this program. If not, see . + from collections import OrderedDict import six +try: + from pyproj.crs import CRS +except ImportError: + CRS = None + def convert_proj_floats(proj_pairs): """Convert PROJ.4 parameters to floats if possible.""" @@ -26,9 +32,6 @@ def convert_proj_floats(proj_pairs): if len(x) == 1 or x[1] is True: proj_dict[x[0]] = True continue - if x[0] == 'EPSG': - proj_dict[x[0]] = x[1] - continue try: proj_dict[x[0]] = float(x[1]) @@ -41,15 +44,24 @@ def convert_proj_floats(proj_pairs): def proj4_str_to_dict(proj4_str): """Convert PROJ.4 compatible string definition to dict + EPSG codes should be provided as "EPSG:XXXX" where "XXXX" + is the EPSG number code. It can also be provided as + ``"+init=EPSG:XXXX"`` as long as the underlying PROJ library + supports it (deprecated in PROJ 6.0+). + Note: Key only parameters will be assigned a value of `True`. """ - if proj4_str.startswith('EPSG:'): - try: - code = int(proj4_str.split(':', 1)[1]) - except ValueError as err: - six.raise_from(ValueError("Invalid EPSG code '{}': {}".format(proj4_str, err)), - None) # Suppresses original exception context in python 3 - return OrderedDict(EPSG=code) + # convert EPSG codes to equivalent PROJ4 string definition + if proj4_str.startswith('EPSG:') and CRS is not None: + crs = CRS(proj4_str) + if hasattr(crs, 'to_dict'): + # pyproj 2.2+ + return crs.to_dict() + proj4_str = crs.to_proj4() + elif proj4_str.startswith('EPSG:'): + # legacy +init= PROJ4 string and no pyproj 2.0+ to help convert + proj4_str = "+init={}".format(proj4_str) + pairs = (x.split('=', 1) for x in proj4_str.replace('+', '').split(" ")) return convert_proj_floats(pairs) @@ -61,11 +73,6 @@ def proj4_dict_to_str(proj4_dict, sort=False): items = sorted(items) params = [] for key, val in items: - if key == 'EPSG': - # If EPSG code is present, ignore other parameters - params = ['EPSG:{}'.format(val)] - break - key = str(key) if key.startswith('+') else '+' + str(key) if key in ['+no_defs', '+no_off', '+no_rot']: param = key ===================================== pyresample/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). 
- git_refnames = " (tag: v1.12.3)" - git_full = "6b4367ce09ca6040c6107d2fafc31524eb45d1f8" - git_date = "2019-05-17 10:58:40 -0500" + git_refnames = " (HEAD -> master, tag: v1.13.0)" + git_full = "790a0bae85cb243c17f0150011e30c834f244e04" + git_date = "2019-09-13 08:32:20 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== setup.py ===================================== @@ -25,13 +25,20 @@ import sys from setuptools import Extension, find_packages, setup from setuptools.command.build_ext import build_ext as _build_ext -requirements = ['setuptools>=3.2', 'pyproj>=1.9.5.1', 'numpy>=1.10.0', 'configobj', +requirements = ['setuptools>=3.2', 'pyproj>=1.9.5.1', 'configobj', 'pykdtree>=1.3.1', 'pyyaml', 'six'] extras_require = {'numexpr': ['numexpr'], 'quicklook': ['matplotlib', 'cartopy', 'pillow'], 'rasterio': ['rasterio'], 'dask': ['dask>=0.16.1']} +if sys.version_info.major > 2: + setup_requires = ['numpy>=1.10.0'] + requirements.append('numpy>=1.10.0') +else: + setup_requires = ['numpy>=1.10.0,<1.17.0'] + requirements.append('numpy>=1.10.0,<1.17.0') + test_requires = ['rasterio', 'dask', 'xarray', 'cartopy', 'pillow', 'matplotlib', 'scipy'] if sys.version_info < (3, 3): test_requires.append('mock') @@ -119,7 +126,7 @@ if __name__ == "__main__": package_dir={'pyresample': 'pyresample'}, packages=find_packages(), python_requires='>=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*', - setup_requires=['numpy'], + setup_requires=setup_requires, install_requires=requirements, extras_require=extras_require, tests_require=test_requires, View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/b62672cc83b9f001f5d291939a18c2fb03697005 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/b62672cc83b9f001f5d291939a18c2fb03697005 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sat Sep 14 19:00:43 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 18:00:43 +0000 Subject: qgis_3.4.12+dfsg-1~exp1_amd64.changes ACCEPTED into experimental, experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 13 Sep 2019 14:52:49 +0200 Source: qgis Binary: libqgis-3d3.4.12 libqgis-3d3.4.12-dbgsym libqgis-analysis3.4.12 libqgis-analysis3.4.12-dbgsym libqgis-app3.4.12 libqgis-app3.4.12-dbgsym libqgis-core3.4.12 libqgis-core3.4.12-dbgsym libqgis-customwidgets libqgis-customwidgets-dbgsym libqgis-dev libqgis-gui3.4.12 libqgis-gui3.4.12-dbgsym libqgis-native3.4.12 libqgis-native3.4.12-dbgsym libqgis-server3.4.12 libqgis-server3.4.12-dbgsym libqgisgrass7-3.4.12 libqgisgrass7-3.4.12-dbgsym libqgispython3.4.12 libqgispython3.4.12-dbgsym python3-qgis python3-qgis-common python3-qgis-dbgsym qgis qgis-api-doc qgis-common qgis-dbgsym qgis-plugin-grass qgis-plugin-grass-common qgis-plugin-grass-dbgsym qgis-provider-grass qgis-provider-grass-dbgsym qgis-providers qgis-providers-common qgis-providers-dbgsym qgis-server qgis-server-dbgsym Architecture: source amd64 all Version: 3.4.12+dfsg-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: libqgis-3d3.4.12 - QGIS - shared 3d library libqgis-analysis3.4.12 - QGIS - shared analysis library libqgis-app3.4.12 - QGIS - shared app library libqgis-core3.4.12 - QGIS - shared core library libqgis-customwidgets - QGIS custom widgets for Qt Designer libqgis-dev - QGIS - development files libqgis-gui3.4.12 - QGIS - shared gui library libqgis-native3.4.12 - QGIS - shared native gui library libqgis-server3.4.12 - QGIS - shared server library libqgisgrass7-3.4.12 - QGIS - shared grass library libqgispython3.4.12 - QGIS - shared Python library python3-qgis - Python bindings to QGIS python3-qgis-common - Python bindings to QGIS - architecture-independent files qgis - Geographic Information System (GIS) qgis-api-doc - QGIS API documentation qgis-common - QGIS - architecture-independent data qgis-plugin-grass - GRASS plugin for QGIS qgis-plugin-grass-common - GRASS plugin for QGIS - architecture-independent data qgis-provider-grass - GRASS provider for QGIS qgis-providers - collection of data providers to QGIS qgis-providers-common - collection of data providers to QGIS - architecture-independent f qgis-server - QGIS server providing various OGC services Changes: qgis (3.4.12+dfsg-1~exp1) experimental; urgency=medium . * New upstream release. * Update symbols for amd64. 
Checksums-Sha1: 30da3ec432014c3aaf6d34966c2868e75eef047a 4675 qgis_3.4.12+dfsg-1~exp1.dsc c61783cf6bef6bff6a842ada2588cbc3cd50553f 62544008 qgis_3.4.12+dfsg.orig.tar.xz 009853c8a46fbc72870c3c5d20e4f4f26f1699aa 266880 qgis_3.4.12+dfsg-1~exp1.debian.tar.xz 344b5f35d7ea4562309745e92c0f95a7c9d2f27c 10439588 libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb e7fa3d17f8a9cc9b746da7de86d03ffcb3f18e37 2140600 libqgis-3d3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 82815f203f767f1ee88733f496047f24509cbdcf 50820916 libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 96795e8e60c76af233ad757bbaa8bd0012ef0ce8 2721296 libqgis-analysis3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 342ce83ef948a70ebf1d269608b9df130bd5d2f5 114883100 libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 1f88fbb70233ae6f9eb2c0c48becbf6c2cde69f2 4739444 libqgis-app3.4.12_3.4.12+dfsg-1~exp1_amd64.deb b8690c6d8b65f841c3bf5c6ee05b94505daf5779 160270364 libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb a3fd6a9c856e66d8ffb7eb6d87912ecd41f00368 6351920 libqgis-core3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 8231d16b651b3ef922a15b80930ecc8bcbda27fa 5013428 libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb bc613596336645dfbeaddb18564ea4800b510a1f 5402264 libqgis-customwidgets_3.4.12+dfsg-1~exp1_amd64.deb 8dbf625dde036c074c00706d8b3928e67ad78b72 2931300 libqgis-dev_3.4.12+dfsg-1~exp1_amd64.deb 035ffdd25c72cc6a42f53589550128ee608533a9 149520232 libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb d74f003533883e461730ef21caf3b19c8163eec0 4840512 libqgis-gui3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 133186615abb8bdb80aadd0d713806918ca52167 602196 libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb de0b81be05cea7ad9aa84b9ba2524cebc49993ce 2001988 libqgis-native3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 1a5d06330b2bd7ff601d5e13cb62518697060c0f 6261072 libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb b08623a08f27c759726b14e1c1b6337e4ea2c285 2132472 libqgis-server3.4.12_3.4.12+dfsg-1~exp1_amd64.deb e8814de53b81fe74728858c791de6e470d7085d1 5028516 libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb dc52a051dbbc1155b205050305188796fcd59761 2188812 libqgisgrass7-3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 3dbf890228554798fe5a95195b83198f3bce898b 384996 libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 7b973f9b9212252cf306dd1eb77085d600de7617 2003096 libqgispython3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 37f596471a9a0d47f2928047692b065198796ef4 4313688 python3-qgis-common_3.4.12+dfsg-1~exp1_all.deb d94a203feac519f1a7e77cfe58d63a7a345ca17d 43099560 python3-qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb ee3ba2451c5f7446be48401e2a97d2132e2d7d33 9133968 python3-qgis_3.4.12+dfsg-1~exp1_amd64.deb c6a4d3551c770453b47495301f19a36901c6c404 996489112 qgis-api-doc_3.4.12+dfsg-1~exp1_all.deb 879f7ae200375c4f7a35258b0af50b697634763b 11835928 qgis-common_3.4.12+dfsg-1~exp1_all.deb dcaabf4396fa95822f20205eefece2b4ee8adb3f 24020668 qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 799bbc0d0931d84aa81c4fccb6163cd3798f17de 2463404 qgis-plugin-grass-common_3.4.12+dfsg-1~exp1_all.deb 3916322a67fb6c4501fce4349675dd4f208aee41 11410264 qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 3e31a32750b07354db6dbdc9fca0d541111a8709 2560116 qgis-plugin-grass_3.4.12+dfsg-1~exp1_amd64.deb 1979183ecde74f95ddbb24b573e70121498b4899 1698152 qgis-provider-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 4ea2deb0c5747965f3516966bd05eee2678f1667 2051748 qgis-provider-grass_3.4.12+dfsg-1~exp1_amd64.deb 753e7c921471c682622b1ce5884f68220ef2a4ac 2933632 qgis-providers-common_3.4.12+dfsg-1~exp1_all.deb 
56c19f712fb63cc0bf05b24e094405641e00841b 69266424 qgis-providers-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 3dd1892a7efe5723536bdfa76e8202b7904c1466 3895144 qgis-providers_3.4.12+dfsg-1~exp1_amd64.deb 2a7a2a96b5422b70840badde60581a9c917372a7 13231868 qgis-server-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 935f32db0e3a8a64ad4325cba09903264bc5e476 2486420 qgis-server_3.4.12+dfsg-1~exp1_amd64.deb 12648f60b18332c33955dc183af2eda0e5cba369 35454 qgis_3.4.12+dfsg-1~exp1_amd64.buildinfo dd4acc7f418ad37781d1910c965beca6d758450d 6820304 qgis_3.4.12+dfsg-1~exp1_amd64.deb Checksums-Sha256: a995aede2a4f115fd369f0daf6522f0035a4439b429078e562d5fadd63d3bf77 4675 qgis_3.4.12+dfsg-1~exp1.dsc 461c0b6ae478d8b335b800b7d8652f290ae1a47fe385c7d474d88e28fa32baab 62544008 qgis_3.4.12+dfsg.orig.tar.xz 052a747158373ce07ee66cc78526f2e66396fe74ca5fa5ef415a28d0ed409bf9 266880 qgis_3.4.12+dfsg-1~exp1.debian.tar.xz 71ea89cd9df93559cbeab07faaff78d4362daa9589fdf768108eef40720dd48a 10439588 libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 1c895e25940972dc52ab68dfc2a546c2afb6866d9210b5866779890ce6d362c2 2140600 libqgis-3d3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 4026b2ea29b08ea00145b95fa86b8254645ec51a950f216916448045b3c35364 50820916 libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 88c5912601e13004622b96cd7fb2aa680b5e1d7c8828220a0a2fef6042395064 2721296 libqgis-analysis3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 424db1d877aaa2708acf51e2996dce979468ca44135b2ca3602cdb01acbee21a 114883100 libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb dfd6290831db87f8dc1bec50e11c5d629196122a32d5a138caf47b6bf2c78ec5 4739444 libqgis-app3.4.12_3.4.12+dfsg-1~exp1_amd64.deb ede8fb45a9732d9240711a571ae0e4c0ad5c5650fa7e8375435c25dd81a2dc19 160270364 libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 21d992ea36080e793a39759453f27632513cbcd4f624c044fd5f273b9a34a980 6351920 libqgis-core3.4.12_3.4.12+dfsg-1~exp1_amd64.deb cdb7f4fa10c4b99ff3418fb8aaad04bfe74fd1c846a68dcc3df37ed3092349d6 5013428 libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb bf0a0030c60dc40a67a80dbb80aca8614c7419a1a56323ec61bfbf3aec8f75ff 5402264 libqgis-customwidgets_3.4.12+dfsg-1~exp1_amd64.deb cae8ca0abcb09d6d6ea821e48fe0f08efa5ca3362947cede6c9790f7eb7d5c42 2931300 libqgis-dev_3.4.12+dfsg-1~exp1_amd64.deb eb509b7545acf898bf57434165e2b1a4df17395b69d33725caa82268b9bda169 149520232 libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 758256a8e0ccaf1c273e48cb9dcc59f1916406952dcfbe5bed5cb3810fb48b54 4840512 libqgis-gui3.4.12_3.4.12+dfsg-1~exp1_amd64.deb b67e29e597321f0d08a4e97038d2021095275d7e9e45826622d473b290a12435 602196 libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 15d6b159aa20eb56e490dc3e4216e230bd388ad1f7b8ab4458ab83bfff0f24ca 2001988 libqgis-native3.4.12_3.4.12+dfsg-1~exp1_amd64.deb fbfb27c3dc1a7ac579e68265748c7bd45b87ee9e568edca2baec469eb3377b7d 6261072 libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 9221ed4b6b5e0a1fff244bbe0333cbb4ff9c8e4562ef2b1c22777d5f841b4bb4 2132472 libqgis-server3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 0fca60cad55dce51d57374f25ccca9b9de15749f20e81082d59b7c60123f3663 5028516 libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb f2dc4e88fddb451676e69d880b11ce577156d336cb56161a365add799472c5c8 2188812 libqgisgrass7-3.4.12_3.4.12+dfsg-1~exp1_amd64.deb e7049c734812f551dc03ab260ee654c3b7a5b7b5db4cbdfbf529643909a8fd28 384996 libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb a726b0bb4bd24324b6ff3d57026f8146b103a6bd451d839ab288aedf52a97b3a 2003096 libqgispython3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 
72afe8c83e8a6b8d2bcd002e10e0545bb136d4f0d6623125821bd186a84e8f4d 4313688 python3-qgis-common_3.4.12+dfsg-1~exp1_all.deb b91ee27440a401641b10e92d91fd700d0fb5e6d3e4a96da696f56147fa1f1abb 43099560 python3-qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb bc486fdfd4ee6f26a96f9a2dd3ff6da215e5cb5dd76539cf7d6a35f8fb8964e7 9133968 python3-qgis_3.4.12+dfsg-1~exp1_amd64.deb c633efe008093528ed6ea9d680bf4c39ffe4cfaebbd898857d5a0829859be1c8 996489112 qgis-api-doc_3.4.12+dfsg-1~exp1_all.deb 57892f822fab6ccfc4db6c2c8bee74d31b2f7f83ce0a67cc21a1772d7cf7b9cd 11835928 qgis-common_3.4.12+dfsg-1~exp1_all.deb 50a544b4a8fded5b3eafbb00728e8078c672b7c30ddec24eeb69f02308b5ce8d 24020668 qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 18af737585e8daf2e2d14ab26c68f976d70df2a1de74468097320d75b1dad91e 2463404 qgis-plugin-grass-common_3.4.12+dfsg-1~exp1_all.deb 275769d8a4c8626b37939a2cc2dac98b595fce97e5fc4341fa9301607e51ff6a 11410264 qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb c1843a09261f76ccf83c67b9646ae3156072b647bbfacce90e946379d3a22f6d 2560116 qgis-plugin-grass_3.4.12+dfsg-1~exp1_amd64.deb 0133abde9d1a5233d206850c970cfa9782dbd158c3ffa07f9bbcf7443f13c668 1698152 qgis-provider-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb f645c3f9d5394225c8c2eff71d1be11ac6a09a4363b8bbd6cf3baa59e27abdec 2051748 qgis-provider-grass_3.4.12+dfsg-1~exp1_amd64.deb 2bfd4b446b94f6be8fca0b3da1567209ef5e6274159066fc2a81a6275069afec 2933632 qgis-providers-common_3.4.12+dfsg-1~exp1_all.deb 49921bb9c5aaef117e62aa5b96601b70e2279930fb7fd757f86af9606804de6a 69266424 qgis-providers-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 5976dbd081dc160b7dad1edf0c797eeb1bf6f8dfca7bd5776f0b134a0d0c29f4 3895144 qgis-providers_3.4.12+dfsg-1~exp1_amd64.deb a0a2dda530a5c2037acab72ddb9709c9f80700d256e4c5a6b5818c71c897177b 13231868 qgis-server-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 8428815a82dcfb0b46261fd0c5839f858d39a4e9c4f7a341cf937d09f6df2a83 2486420 qgis-server_3.4.12+dfsg-1~exp1_amd64.deb e7f67110b96227e99c5da91689f7f8a62f9de517a5645850214f3bbc64561a16 35454 qgis_3.4.12+dfsg-1~exp1_amd64.buildinfo 4a01d988d6416e8d915bc1f6449a6ac4d7a19cb4eb525d883be416e959e0b4fe 6820304 qgis_3.4.12+dfsg-1~exp1_amd64.deb Files: 4e709a6db9c2eb4e2c46ab7f851afad6 4675 science optional qgis_3.4.12+dfsg-1~exp1.dsc 4bf705b65c596ba2fd04e8aec14bb3d6 62544008 science optional qgis_3.4.12+dfsg.orig.tar.xz 98abc509f390d715da91d30bff06cec7 266880 science optional qgis_3.4.12+dfsg-1~exp1.debian.tar.xz 65061d01b3b437914d67fc57cae734e5 10439588 debug optional libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 562e68acec3234cae8d624ad8057078d 2140600 libs optional libqgis-3d3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 5a9e7fe290466ced0d797b8f5d9f5937 50820916 debug optional libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb c565981f3871c5d726842ff0832ed9be 2721296 libs optional libqgis-analysis3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 39662e560a593348b47225eb57f7c125 114883100 debug optional libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb ec03ac5377eab608e9274cbe8fd5202b 4739444 libs optional libqgis-app3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 79f6c032e72e84557f6f44c09f39833f 160270364 debug optional libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb acd2e4865cfedfba60d1433efc69b9b6 6351920 libs optional libqgis-core3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 1c1f028a620fc9d6e6ce62970e92e21f 5013428 debug optional libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb fb86942ce8ef9a154550def538bbec69 5402264 science optional libqgis-customwidgets_3.4.12+dfsg-1~exp1_amd64.deb f99f083ca04b9cf01d736f932739a4ed 2931300 libdevel 
optional libqgis-dev_3.4.12+dfsg-1~exp1_amd64.deb ee5d682ea0962850e80f6a0d254e8d2e 149520232 debug optional libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb a97eed440fed6c28764475bac3616149 4840512 libs optional libqgis-gui3.4.12_3.4.12+dfsg-1~exp1_amd64.deb da4bc90225ac2ec35f29bdfcd2435f2f 602196 debug optional libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb c1cc45786e67890b7ec0e774c0ef5e75 2001988 libs optional libqgis-native3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 52857b11ed9b7cc57d308712bdb15972 6261072 debug optional libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb de85133091b624a38caab4436833701f 2132472 libs optional libqgis-server3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 594f7717568a0a720c04fa0e0849c586 5028516 debug optional libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb a170b001bf9420f8fb65ec03b8be3baf 2188812 libs optional libqgisgrass7-3.4.12_3.4.12+dfsg-1~exp1_amd64.deb 243212d6c98cd59cf1b5377509a1c65f 384996 debug optional libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb f1736e628c641d51ba3a49906cb39569 2003096 libs optional libqgispython3.4.12_3.4.12+dfsg-1~exp1_amd64.deb fac24402d88a464cfed5226d842a80b6 4313688 python optional python3-qgis-common_3.4.12+dfsg-1~exp1_all.deb 67b854e94b9e9230402674ba1f347594 43099560 debug optional python3-qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 43e1a36f5cb28f3325436adeb92acbca 9133968 python optional python3-qgis_3.4.12+dfsg-1~exp1_amd64.deb ef838ace06c98cc35d6a81f90ec4db22 996489112 doc optional qgis-api-doc_3.4.12+dfsg-1~exp1_all.deb 1fcef6517979573baa7d7d3e58ffbd33 11835928 science optional qgis-common_3.4.12+dfsg-1~exp1_all.deb b17d5dd9eed612b794dad8ad80be80c5 24020668 debug optional qgis-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb c311df73aea49f47c52e972db226082a 2463404 science optional qgis-plugin-grass-common_3.4.12+dfsg-1~exp1_all.deb 2a23ddf51d026037df9a0efa59779d18 11410264 debug optional qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 860d4fb4cfa6675c026b0d9870603724 2560116 science optional qgis-plugin-grass_3.4.12+dfsg-1~exp1_amd64.deb 2325aaff200ca29986a61b7e51e3579d 1698152 debug optional qgis-provider-grass-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb a1e8ed3095918926f7f77b4cabd399f8 2051748 science optional qgis-provider-grass_3.4.12+dfsg-1~exp1_amd64.deb cc324a0546b1e84bbba4bf716b6e25b2 2933632 science optional qgis-providers-common_3.4.12+dfsg-1~exp1_all.deb d38a0e9d74181f98e77418a670a2b350 69266424 debug optional qgis-providers-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 1c03d68bb5c69bd6bf639c18da83d13b 3895144 science optional qgis-providers_3.4.12+dfsg-1~exp1_amd64.deb a4b4f08648d22ea92a7b5714f91bed35 13231868 debug optional qgis-server-dbgsym_3.4.12+dfsg-1~exp1_amd64.deb 4c3e5de62b744d14363f2f41d9a916da 2486420 science optional qgis-server_3.4.12+dfsg-1~exp1_amd64.deb 737a06e6f5fe9808775e38005e0b9a5b 35454 science optional qgis_3.4.12+dfsg-1~exp1_amd64.buildinfo 11211aade05974c36b70e7b94ebb897d 6820304 science optional qgis_3.4.12+dfsg-1~exp1_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl170+0ACgkQZ1DxCuiN SvE1rRAAtnJ+CTggXPHjR31FwFxpdslBacjHZX7oIyGXP/1pEK7UBqfe95UU5McP T/f20ZeEC+pXMrxgb6HZ0dn38H+qgOY7XyEdRCbynmT+FzT23JaxUyixu71j17b/ vOBroCHC/B5gdEB4aaxGM283GLNKzLcmymOV/ehQQeFOa3AXWI5Jwr5U30HAZ5tS qD63MXAFqs2jdyYiIjeFXLvjdjZRUDFU5Ex6sdqhgCmqvFTMtKdhERVRZKn/aVNG JkBD62ZibZdFo3dUQ3dx6fccZnjTy7968JrfFG4C9rSEadnBl0C9E5QRxsTfT2Ne xSrYFne4GiLLcEav+RM7SPH10xW1Hgm7UIPUnzgOP3BztVaqCVsk4sti3bFpUdIn GqCUEdxJgh4PgG3fD4HldksOHUBrqbNkzPSy8zxb2gVd6shtTwYayYl6+cJ++iqa 
Oh0zQHNH8ihzBDSRs0R1eYJI0+DjHOj11w08Zd/GU/q8hhFK4n2czJZSLewGv/4B Bo8KQG8dhaAA/xwbmPHzKpz7MxvPiYVHECByY2IfzKpo3vjO5OLYpQxRGIiw2Vzn vcgJkPptWa3jRxk/mO0+uzGs1zqo96pdwso04a3W4ixSt07EyDlkqufolxGsP6PW mPaJ2/w3fLfWKooUVe6oS98/WK5Z01+wjXb+d5flF3A5rFCCBKE= =EyEH -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 14 19:42:28 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 14 Sep 2019 18:42:28 +0000 Subject: [Git][debian-gis-team/pyresample] Pushed new tag debian/1.13.0-1 Message-ID: <5d7d3494bcec6_73483fbbb295be58937786@godard.mail> Bas Couwenberg pushed new tag debian/1.13.0-1 at Debian GIS Project / pyresample -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/tree/debian/1.13.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 14 19:51:53 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 18:51:53 +0000 Subject: Processing of pyresample_1.13.0-1_source.changes Message-ID: pyresample_1.13.0-1_source.changes uploaded successfully to localhost along with the files: pyresample_1.13.0-1.dsc pyresample_1.13.0.orig.tar.gz pyresample_1.13.0-1.debian.tar.xz pyresample_1.13.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 14 20:06:20 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 14 Sep 2019 19:06:20 +0000 Subject: pyresample_1.13.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 14 Sep 2019 17:36:42 +0000 Source: pyresample Architecture: source Version: 1.13.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: pyresample (1.13.0-1) unstable; urgency=medium . * New upstream release. 
* debian/patches: - refresh all patches - new 0005-Comapt-with-dask-1.0.patch and 0006-Skip-test-on-deprecatet-basemap.patch Checksums-Sha1: e538c28da14bb08a9882ce6bb84e7335502b9ef1 2532 pyresample_1.13.0-1.dsc e1ee214be142c56aba2d700c2f66aab8f90af695 5782275 pyresample_1.13.0.orig.tar.gz 23730a5a0a43a40b481a6f62b81a38fce6dda85a 10720 pyresample_1.13.0-1.debian.tar.xz 593d0c682e03813b94b601a68b209ebca8c25dec 12638 pyresample_1.13.0-1_amd64.buildinfo Checksums-Sha256: 480197d3b7f7ab91af9d9e07b8a9cafb29ff016e5858a32e9228514620673ed2 2532 pyresample_1.13.0-1.dsc 0178abf2099f3e6e59928924382117812eea20bd6db19b601bda17f5ffff9ece 5782275 pyresample_1.13.0.orig.tar.gz 8a0ae679974accee1fb62da730ede867e50f91e1c90858728554f6bdc15e41ac 10720 pyresample_1.13.0-1.debian.tar.xz 400b78f1d1d432107bf5bf588509cb146715b1fb29df605f481e84013b0fd18a 12638 pyresample_1.13.0-1_amd64.buildinfo Files: be982e2fcdce85a45cd0a95852b3429e 2532 python optional pyresample_1.13.0-1.dsc 16e58833281132ae4786c8ce8d972ec5 5782275 python optional pyresample_1.13.0.orig.tar.gz 63535fe52943666bb219719eacf2412d 10720 python optional pyresample_1.13.0-1.debian.tar.xz 053d42acc4303a7227d6ea6efecdb07d 12638 python optional pyresample_1.13.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl19NFUACgkQZ1DxCuiN SvHVYxAAxk6mCkLXmwdFUH4QpNk/wkTUC+CVlbbfoN2+/ircUdF7cir2+JTVHvQW Cn+tDKxH6afgXhcuvgbuG+9bvt1dpNpTnByJwHOo+SCGkQgN7vxrJtuLntbDTrnl 7USA9f5+5NhysKR2e8OnWNy+PvfEMREp7YgJjeAr0Nb8hViPcstVMwZWzfM+re0J CIjF3k2qLwjRZBpXa19aICbnvNcwEFAKqRCW3yjZ1T4wBXhEWrtv4IELddjLwF4O gHFp903tXsSFhntbV60NooJtE8xJsLvZiTefEQT1YTDvEWV4CoqPh6U7x5aYpmqO Ufp+rNNnCDiGl4im7hH7Y2rf9FWmo+329dwkj+mn04hXRvgC8PEcEzNpuBGLAdof Hd9mBis3OugImk0pPpI4p4qNUDaYAf1Aa/ojHtvRoTj8NOJ9dhDqUdEf0Euc7w/B gUYrI0zleMIBWMTW1jQ0EloxLt2kKGkI4u8Ot7jfekXoVvLbwKoiuf4FAe1Oe3SY +6H6jB9QW1wGE/EJpAdKeZh2B7P22tLarhFg8pNr2Ium6Uuz1eRhwap4WO2gapjD te9Sg6KfMunlLOn3E2z65i/S2GsMANOJO4gY63nzE3CBgnxIXMuE7Vc+sDQeky4h mmgH0avPq70QpSMUd1Onua788WjKm9wR+wUeJGoOeu/dEbrc/nE= =U7G2 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From noreply at release.debian.org Sun Sep 15 05:39:20 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sun, 15 Sep 2019 04:39:20 +0000 Subject: rasterio 1.0.28-1 MIGRATED to testing Message-ID: FYI: The status of the rasterio source package in Debian's testing distribution has changed. Previous version: 1.0.26-1 Current version: 1.0.28-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Mon Sep 16 05:39:03 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Mon, 16 Sep 2019 04:39:03 +0000 Subject: otb is marked for autoremoval from testing Message-ID: otb 6.6.1+dfsg-3 is marked for autoremoval from testing on 2019-09-18 It (build-)depends on packages with these RC bugs: 875075: openscenegraph: [openscenegraph] Future Qt4 removal from Buster 935086: insighttoolkit4: FTBFS with GCC-9: use of undeclared identifier '__builtin_is_constant_evaluated' From noreply at release.debian.org Mon Sep 16 05:39:16 2019 From: noreply at release.debian.org (Debian testing watch) Date: Mon, 16 Sep 2019 04:39:16 +0000 Subject: fiona 1.8.6-3 MIGRATED to testing Message-ID: FYI: The status of the fiona source package in Debian's testing distribution has changed. 
Previous version: 1.8.6-2 Current version: 1.8.6-3 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Mon Sep 16 05:39:19 2019 From: noreply at release.debian.org (Debian testing watch) Date: Mon, 16 Sep 2019 04:39:19 +0000 Subject: libgeotiff-dfsg REMOVED from testing Message-ID: FYI: The status of the libgeotiff-dfsg source package in Debian's testing distribution has changed. Previous version: 1.4.3-1 Current version: (not in testing) Hint: Package not in unstable The script that generates this mail tries to extract removal reasons from comments in the britney hint files. Those comments were not originally meant to be machine readable, so if the reason for removing your package seems to be nonsense, it is probably the reporting script that got confused. Please check the actual hints file before you complain about meaningless removals. -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Mon Sep 16 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Mon, 16 Sep 2019 04:39:21 +0000 Subject: otb 6.6.1+dfsg-3 MIGRATED to testing Message-ID: FYI: The status of the otb source package in Debian's testing distribution has changed. Previous version: 6.6.1+dfsg-2 Current version: 6.6.1+dfsg-3 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Mon Sep 16 08:31:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 07:31:31 +0000 Subject: [Git][debian-gis-team/qgis][master] 2 commits: Update symbols for other architectures. Message-ID: <5d7f3a53c70f1_73483fbbb2b208881078635@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / qgis Commits: baa5486d by Bas Couwenberg at 2019-09-16T04:14:51Z Update symbols for other architectures. - - - - - 572f9e88 by Bas Couwenberg at 2019-09-16T04:15:22Z Set distribution to unstable. - - - - - 4 changed files: - debian/changelog - debian/libqgis-analysis3.4.12.symbols - debian/libqgis-core3.4.12.symbols - debian/libqgisgrass7-3.4.12.symbols Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +qgis (3.4.12+dfsg-1) unstable; urgency=medium + + * Update symbols for other architectures. + * Move from experimental to unstable. + + -- Bas Couwenberg Mon, 16 Sep 2019 06:14:58 +0200 + qgis (3.4.12+dfsg-1~exp1) experimental; urgency=medium * New upstream release. 
===================================== debian/libqgis-analysis3.4.12.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.12 amd64 +# SymbolsHelper-Confirmed: 3.4.12 amd64 armel armhf i386 powerpc libqgis_analysis.so.3.4.12 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev _ZN10QByteArrayD1Ev at Base 3.4.5 @@ -469,7 +469,9 @@ libqgis_analysis.so.3.4.12 #PACKAGE# #MINVER# _ZN24CloughTocherInterpolator4initEdd at Base 2.0.1 _ZN24CloughTocherInterpolator9calcPointEddR8QgsPoint at Base 3.4.5 _ZN24CloughTocherInterpolatorC1EP16NormVecDecorator at Base 2.8.0 + (arch=!amd64)_ZN24CloughTocherInterpolatorC1Ev at Base 3.4.12 _ZN24CloughTocherInterpolatorC2EP16NormVecDecorator at Base 2.8.0 + (arch=!amd64)_ZN24CloughTocherInterpolatorC2Ev at Base 3.4.12 _ZN24CloughTocherInterpolatorD0Ev at Base 2.0.1 _ZN24CloughTocherInterpolatorD1Ev at Base 2.0.1 _ZN24CloughTocherInterpolatorD2Ev at Base 2.0.1 ===================================== debian/libqgis-core3.4.12.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 amd64 arm64 armel armhf i386 m68k mips64el mipsel powerpc ppc64 ppc64el +# SymbolsHelper-Confirmed: 3.4.12 arm64 armel armhf libqgis_core.so.3.4.12 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev GEOPROJ4 at Base 2.0.1 @@ -40,7 +40,7 @@ libqgis_core.so.3.4.12 #PACKAGE# #MINVER# _ZN10QgsArchiveaSERKS_ at Base 3.4.5 _ZN10QgsClipper11clippedLineERK8QgsCurveRK12QgsRectangle at Base 3.4.5 _ZN10QgsClipper21connectSeparatedLinesEddddRK12QgsRectangleR9QPolygonF at Base 2.0.1 - (arch=!amd64)_ZN10QgsClipper21trimPolygonToBoundaryERK9QPolygonFRS0_RK12QgsRectangleNS_8BoundaryEd at Base 3.4.11 + (arch=!amd64 !arm64)_ZN10QgsClipper21trimPolygonToBoundaryERK9QPolygonFRS0_RK12QgsRectangleNS_8BoundaryEd at Base 3.4.11 _ZN10QgsClipper5MAX_XE at Base 2.0.1 _ZN10QgsClipper5MAX_YE at Base 2.0.1 _ZN10QgsClipper5MIN_XE at Base 2.0.1 ===================================== debian/libqgisgrass7-3.4.12.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 3.4.11 amd64 i386 powerpc +# SymbolsHelper-Confirmed: 3.4.12 arm64 ppc64 ppc64el s390x libqgisgrass7.so.3.4.12 #PACKAGE# #MINVER# * Build-Depends-Package: libqgis-dev _ZN10QByteArray6detachEv at Base 3.4.5 @@ -678,7 +678,6 @@ libqgisgrass7.so.3.4.12 #PACKAGE# #MINVER# _ZNK23QgsGrassFeatureIterator10metaObjectEv at Base 2.14.0 _ZNK26QgsAbstractFeatureIterator7isValidEv at Base 3.4.5 _ZNK8QgsGrass10metaObjectEv at Base 2.14.0 - (optional=templinst|arch=!amd64)_ZSt4swapIN8QVariant7PrivateEENSt9enable_ifIXsrSt6__and_IJSt6__not_ISt15__is_tuple_likeIT_EESt21is_move_constructibleIS6_ESt18is_move_assignableIS6_EEE5valueEvE4typeERS6_SG_ at Base 3.4.11 _ZTI12QgsGrassCopy at Base 2.14.0 _ZTI14QgsGrassImport at Base 2.14.0 _ZTI14QgsGrassVector at Base 2.14.0 View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/4ba0089acd3836deedc48cbaf188315a41e5cbb7...572f9e88c469040d27989f3e1952aa6603a6c3fa -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/4ba0089acd3836deedc48cbaf188315a41e5cbb7...572f9e88c469040d27989f3e1952aa6603a6c3fa You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 16 08:32:12 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 07:32:12 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag debian/3.4.12+dfsg-1 Message-ID: <5d7f3a7cc1a62_73482ad95c88bbc410788f6@godard.mail> Bas Couwenberg pushed new tag debian/3.4.12+dfsg-1 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/debian/3.4.12+dfsg-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 08:44:14 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 07:44:14 +0000 Subject: Processing of qgis_3.4.12+dfsg-1_source.changes Message-ID: qgis_3.4.12+dfsg-1_source.changes uploaded successfully to localhost along with the files: qgis_3.4.12+dfsg-1.dsc qgis_3.4.12+dfsg-1.debian.tar.xz qgis_3.4.12+dfsg-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 16 09:38:09 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 08:38:09 +0000 Subject: qgis_3.4.12+dfsg-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 06:14:58 +0200 Source: qgis Architecture: source Version: 3.4.12+dfsg-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: qgis (3.4.12+dfsg-1) unstable; urgency=medium . * Update symbols for other architectures. * Move from experimental to unstable. Checksums-Sha1: 1c6666cac8b7c2963e3f9e05da02946f50503834 4655 qgis_3.4.12+dfsg-1.dsc e73eaaae0ccfc5faf145a2ff88fb426d518cb882 266948 qgis_3.4.12+dfsg-1.debian.tar.xz 7380015ff6d6f9d13581e939b79feb0e331a503e 34891 qgis_3.4.12+dfsg-1_amd64.buildinfo Checksums-Sha256: 3dd6ccb3b936fc08c3ab334130fa6a6f00448a5cf806f564a07809de150ef00d 4655 qgis_3.4.12+dfsg-1.dsc ec8a163d927f27558e0ba4c222dd9ca4ca825b988e5eca52c8978e5d46a44ec5 266948 qgis_3.4.12+dfsg-1.debian.tar.xz c92b2425fab08802513c74fcd1ac35c0394c4b7154dd1ca7ec40eedf68437e63 34891 qgis_3.4.12+dfsg-1_amd64.buildinfo Files: eb19ecc10aaf9d7f6c5e9d88b627a164 4655 science optional qgis_3.4.12+dfsg-1.dsc f36c0c010a29630c7a595923b3d590d4 266948 science optional qgis_3.4.12+dfsg-1.debian.tar.xz edf4be5e376250e01a4d567d47d69b15 34891 science optional qgis_3.4.12+dfsg-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/OjIACgkQZ1DxCuiN SvGMWA//UB2jUYgzVrVAngr9l8lB34o9/8wU7jZCOu1zeqayQICYWbnsZxLDXxSu Fez77bdIMbIPitA3VHOKxF0RzT4dRFVPsZxso/eXFhTONgbMiw4A1QB4ciFUjFtu De0NGKfqSX3UK1VyUKXm2Y3n0tH6QT48HgcRfb6Op86UUqYYVGoYSroYDNVqGoFj iH2pqe5D7kSigahPuJnCSfRilJr5HgtBA0apqwUotkzFxIct02Zy7JVA6PGXo2Q8 2FP8546z/1vtwvsqtp6lzIPuXie/yqpYYhMN7oogcPgCp1qcw3AWdftLJ4BGU4Er tk2XYbkcqgb7lJqRoRHc1p6Jf+gwT0iuVPS8+mwmL206C1xMEHzPeBNDYobNyQ8F 1HwoGWTvOz+Orq4WnA/uvidm0q+f7Xde2Z+VvNAU5AnlcYz0zIVIeqHXZCqMWjvK 0ZOD8dd/PbFF8pvD0y56+BgaCfc6cDsbPUA1vMSWqFlZFjz4Hmk3oWr8j8x0lwrd BwpFy78e2nU/2JK21Oxavjqd89j9Et3EfuIdaUaPsDFNQxSgMj2n6cInd+7iwaMH jI+TIRe8XUBeuVEBx1fE1UjvMXyXOkW7pHV4QtMUDxSle1Pb/kz5LwknaUu9RDkI STPjRitwG7yYlVjSJ583qRF8gYoGbVuP82pS+y1WE0Yg8jaBtlM= =U/zf -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From gitlab at salsa.debian.org Mon Sep 16 12:23:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 11:23:49 +0000 Subject: [Git][debian-gis-team/saga][master] Set distribution to unstable. Message-ID: <5d7f70c5459db_73482ad961a5bcb0111111c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / saga Commits: d7f4ccbd by Bas Couwenberg at 2019-09-16T10:44:39Z Set distribution to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,11 +1,11 @@ -saga (7.3.0+dfsg-2) UNRELEASED; urgency=medium +saga (7.3.0+dfsg-2) unstable; urgency=medium * Team upload. * Switch to wxWidgets GTK 3 implementation. (closes: #933464) * Update lintian override for spelling-error-in-binary. - -- Bas Couwenberg Tue, 30 Jul 2019 17:02:03 +0200 + -- Bas Couwenberg Mon, 16 Sep 2019 12:44:30 +0200 saga (7.3.0+dfsg-1) unstable; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/d7f4ccbd7d7d3e34d0c503595b311b29f8464abe -- View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/d7f4ccbd7d7d3e34d0c503595b311b29f8464abe You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 12:23:54 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 11:23:54 +0000 Subject: [Git][debian-gis-team/saga] Pushed new tag debian/7.3.0+dfsg-2 Message-ID: <5d7f70ca15367_73482ad9619e40e8111133e@godard.mail> Bas Couwenberg pushed new tag debian/7.3.0+dfsg-2 at Debian GIS Project / saga -- View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/tree/debian/7.3.0+dfsg-2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 12:32:32 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 11:32:32 +0000 Subject: Processing of saga_7.3.0+dfsg-2_source.changes Message-ID: saga_7.3.0+dfsg-2_source.changes uploaded successfully to localhost along with the files: saga_7.3.0+dfsg-2.dsc saga_7.3.0+dfsg-2.debian.tar.xz saga_7.3.0+dfsg-2_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Mon Sep 16 12:42:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 11:42:26 +0000 Subject: [Git][debian-gis-team/spatialite-gui][master] Set distribution to unstable.
Message-ID: <5d7f752212297_73482ad96187f29811152a8@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / spatialite-gui Commits: 46afc1f4 by Bas Couwenberg at 2019-09-16T11:25:07Z Set distribution to unstable. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,9 +1,9 @@ -spatialite-gui (2.1.0~beta0+really2.0.0~devel2-5) UNRELEASED; urgency=medium +spatialite-gui (2.1.0~beta0+really2.0.0~devel2-5) unstable; urgency=medium * Switch to wxWidgets GTK 3 implementation. (closes: #933409) - -- Bas Couwenberg Tue, 30 Jul 2019 16:20:25 +0200 + -- Bas Couwenberg Mon, 16 Sep 2019 13:24:58 +0200 spatialite-gui (2.1.0~beta0+really2.0.0~devel2-4) unstable; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/commit/46afc1f408d840655c86d190b9ba249020b953ef -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/commit/46afc1f408d840655c86d190b9ba249020b953ef You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 12:42:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 11:42:30 +0000 Subject: [Git][debian-gis-team/spatialite-gui] Pushed new tag debian/2.1.0_beta0+really2.0.0_devel2-5 Message-ID: <5d7f7526c5a17_73482ad9619e40e81115412@godard.mail> Bas Couwenberg pushed new tag debian/2.1.0_beta0+really2.0.0_devel2-5 at Debian GIS Project / spatialite-gui -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/tree/debian/2.1.0_beta0+really2.0.0_devel2-5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 12:44:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 11:44:43 +0000 Subject: [Git][debian-gis-team/librasterlite2][experimental] Set distribution to experimental. Message-ID: <5d7f75abc4352_73482ad95d7a9c001115673@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / librasterlite2 Commits: e89fbadd by Bas Couwenberg at 2019-09-16T11:29:41Z Set distribution to experimental. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,10 +1,10 @@ -librasterlite2 (1.1.0~beta0+really1.1.0~beta0-1~exp3) UNRELEASED; urgency=medium +librasterlite2 (1.1.0~beta0+really1.1.0~beta0-1~exp3) experimental; urgency=medium * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. - -- Bas Couwenberg Fri, 15 Mar 2019 15:30:52 +0100 + -- Bas Couwenberg Mon, 16 Sep 2019 13:29:30 +0200 librasterlite2 (1.1.0~beta0+really1.1.0~beta0-1~exp2) experimental; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/librasterlite2/commit/e89fbaddae27a9e6294d6f6fa7bd3341d51c94e5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/librasterlite2/commit/e89fbaddae27a9e6294d6f6fa7bd3341d51c94e5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 16 12:44:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 11:44:48 +0000 Subject: [Git][debian-gis-team/librasterlite2] Pushed new tag debian/1.1.0_beta0+really1.1.0_beta0-1_exp3 Message-ID: <5d7f75b0a21c1_73482ad9619e40e811158cf@godard.mail> Bas Couwenberg pushed new tag debian/1.1.0_beta0+really1.1.0_beta0-1_exp3 at Debian GIS Project / librasterlite2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/librasterlite2/tree/debian/1.1.0_beta0+really1.1.0_beta0-1_exp3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 12:52:43 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 11:52:43 +0000 Subject: Processing of librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.changes Message-ID: librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.changes uploaded successfully to localhost along with the files: librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.dsc librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.debian.tar.xz librasterlite2-1-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb librasterlite2-1_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb librasterlite2-dev_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.buildinfo libsqlite3-mod-rasterlite2-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb libsqlite3-mod-rasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb rasterlite2-bin-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb rasterlite2-bin_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 16 12:52:44 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 11:52:44 +0000 Subject: Processing of spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_source.changes Message-ID: spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_source.changes uploaded successfully to localhost along with the files: spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.dsc spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.debian.tar.xz spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 16 12:52:57 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 11:52:57 +0000 Subject: saga_7.3.0+dfsg-2_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 12:44:30 +0200 Source: saga Architecture: source Version: 7.3.0+dfsg-2 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 933464 Changes: saga (7.3.0+dfsg-2) unstable; urgency=medium . * Team upload. * Switch to wxWidgets GTK 3 implementation. (closes: #933464) * Update lintian override for spelling-error-in-binary. 
Checksums-Sha1: 6a2466a1b0ae385c0741d4ec3ca0e54533011b0f 2647 saga_7.3.0+dfsg-2.dsc cd29f643735014ab08cca474927650438cd0626e 17404 saga_7.3.0+dfsg-2.debian.tar.xz e6a8c95c66df63f76e076e6be912b1a98c0767b9 24341 saga_7.3.0+dfsg-2_amd64.buildinfo Checksums-Sha256: abfb037a1e7e0cd0a935ba749a4d7e0b0045e3ce37e6d4a295ffc24a0c70d78f 2647 saga_7.3.0+dfsg-2.dsc bcad90fcb9eb4bf1e2c12cfb2d5de30eb54869fc9ca23a13e20bfef2f039afc2 17404 saga_7.3.0+dfsg-2.debian.tar.xz 6957d254b9004fcfcf11f19f6bbdc010746446cf3978b75c7160eb9bab3cedc7 24341 saga_7.3.0+dfsg-2_amd64.buildinfo Files: 2750c60349e37f021b3bb8167ba600a9 2647 science optional saga_7.3.0+dfsg-2.dsc 77c06938a4da2aa9f6dc88b2506955ee 17404 science optional saga_7.3.0+dfsg-2.debian.tar.xz 6a5945331271e774688aaa0503197d8c 24341 science optional saga_7.3.0+dfsg-2_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/cLAACgkQZ1DxCuiN SvGszg/9HuigaLbOqr5Nbl0sdr5r7aLkQuFH6Ix1KZLMZPnJldhHl7kyhIF2pB/j 8C0iHNeZK7CrffYMtE3MG3AaQBWo0hUhKvsbxvSonIlBaQhU/uvPoiaFnQJI39N8 zsLq8m5IX3U4CF6IANK2687oBVHA+gQFPVgu9w0BztO7tgdu/Nhrow4yvhoJspje XrBPfnT+l4GyzCnhOoh9rvr6pi5TPki4Ux3Td/o8rekWeLYhNSrw/mVdViJhlc0R QlYCRm/W/QTGWUH+q0jbo78/ryzBAsWBaYm5Rfw2RZuFIqV85MDu9+gYoZUQxMGJ 0dtytxFCjipkzqLyWHzGvivtUIRPSf9n36KIweyBip4DNLYzglgygPMlgzlxULJi 3p7yYDomKxj3rjCGBkFKvfQOiyAQKKqLl/CW/xsK6KuJ4Rp2Utaw0E7c4KSN3sBu N7KyPoKQjxBdpSmTRjJPEM3lotfzkva9QWHYroaIcUNzc5p3Y2smY8jS0bKaKlv5 BEC5zZoelE4p+R3KmG6wrE3AqZbOzVeWIWpef2RpuCKU+HZ7C2E1VJ/6GRx6rNxs bMR76RyxjwLuQtinhPQurS/yvDAiMvJ4Qc6Q9AK2Lee1971fmN57KMxVTmCBzn4Z itxYG/4e1uSqUIou0O17RgQItp6yOba8GpYPGBs8HBSabQQ3+iI= =G4/a -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Mon Sep 16 12:57:05 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 16 Sep 2019 11:57:05 +0000 Subject: Bug#933464: marked as done (saga: Please rebuild against wxWidgets GTK 3 package) References: <20190730140533.B0F9F22A09F6@bear.techie.net> Message-ID: Your message dated Mon, 16 Sep 2019 11:52:58 +0000 with message-id and subject line Bug#933464: fixed in saga 7.3.0+dfsg-2 has caused the Debian Bug report #933464, regarding saga: Please rebuild against wxWidgets GTK 3 package to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 933464: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933464 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: swt at techie.net Subject: saga: Please rebuild against wxWidgets GTK 3 package Date: Tue, 30 Jul 2019 10:05:32 -0400 Size: 3063 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#933464: fixed in saga 7.3.0+dfsg-2 Date: Mon, 16 Sep 2019 11:52:58 +0000 Size: 5145 URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 13:05:58 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 12:05:58 +0000 Subject: librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 13:29:30 +0200 Source: librasterlite2 Binary: librasterlite2-1 librasterlite2-1-dbgsym librasterlite2-dev libsqlite3-mod-rasterlite2 libsqlite3-mod-rasterlite2-dbgsym rasterlite2-bin rasterlite2-bin-dbgsym Architecture: source amd64 Version: 1.1.0~beta0+really1.1.0~beta0-1~exp3 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: librasterlite2-1 - library for huge raster coverages using a SpatiaLite DBMS librasterlite2-dev - library for huge raster coverages using a SpatiaLite DBMS - heade libsqlite3-mod-rasterlite2 - SQLite 3 module for huge raster coverages rasterlite2-bin - command line tools for librasterlite2 Changes: librasterlite2 (1.1.0~beta0+really1.1.0~beta0-1~exp3) experimental; urgency=medium . * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. Checksums-Sha1: e4f754e14824334ea26feb99be728897d87b5d4a 2770 librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.dsc b91b8fb434c301b64bb4bb265fe39645d132499c 24980 librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.debian.tar.xz bd79527dc1a0b1366e6dade90f4a0e06f99606fc 1373516 librasterlite2-1-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb dd7c989f37ca4d8f4e4cc417c1f5dfae1cbd4065 384780 librasterlite2-1_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 0504571ce132e747d3a5cda4ec2ae63505a172e4 433132 librasterlite2-dev_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 952577748dafe27e6bab38a0ee97ed4a5b20c1bc 14320 librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.buildinfo 27569532852b6706a47f44b2bde1a9daa05e3a64 1540392 libsqlite3-mod-rasterlite2-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 30f645ee8d54970d2ac80c7c093cabf9baa4a056 375252 libsqlite3-mod-rasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 856fefd7ae95854c1351702dab99552a9bdbb189 181644 rasterlite2-bin-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb dc9aff0f0728e80114804b52aeebc005448bbf9c 79004 rasterlite2-bin_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb Checksums-Sha256: b0e96151bfbdbf6b3d87aa800cce85956afde311c9b0b84e47e4c6c02030427d 2770 librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.dsc 8868f2a53ddf88f75882186d01b05e4fb96b85135d37b945566e3e13a384ae26 24980 librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.debian.tar.xz 9b20440dfe6a1418216be7303bd5398c51cf92254170ffd93c50248160b0fbca 1373516 librasterlite2-1-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb cf6617cca8f133281b5f1cdfd2dd85b881fb1570a4c41c9bda0bb00425170557 384780 librasterlite2-1_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 0b153d068b8cbcd0448b0c60ca32af85099f9399b4bdb5c9cee58799ccada173 433132 librasterlite2-dev_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 336afd5309f41988aeeb7cefea7ca17c25c36df325199c7db91ddf32a0dae2ac 14320 librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.buildinfo 9ceefff094a918cf6ccd191bf101e600be6cde0cd26ff3911ac1a3405f4d8a16 1540392 
libsqlite3-mod-rasterlite2-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb d27f32e03b9c37afa4772949dbe02a35607c8ec29c92fdae8fd8d74fc0e1e240 375252 libsqlite3-mod-rasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 9a5fe2fb7e8a2cbcd6e6ae2f95c2a336170df1d1a4151f809379d2074a0be03a 181644 rasterlite2-bin-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb e4574f1d41e40feaebed70a39343a7af61c0d38df7cd57c973a4e8be8f0b5ce9 79004 rasterlite2-bin_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb Files: c8c658e550f3cb93cc188349a9557e0b 2770 science optional librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.dsc 5fb3ab665e9ba59fa7d3130dcda882bb 24980 science optional librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3.debian.tar.xz 6cf84d163957c375337a6fe7f9c781f3 1373516 debug optional librasterlite2-1-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb b3eeca419232c6207c70fc7a1229b70e 384780 libs optional librasterlite2-1_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 6d0b0962a46d17fdfdaf9a5213d364bd 433132 libdevel optional librasterlite2-dev_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 0a9a89a41489c97e5142c387ce3c214c 14320 science optional librasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.buildinfo 53836a5c5e2fe546188ffcfc12f1ad5c 1540392 debug optional libsqlite3-mod-rasterlite2-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb c2107b5348992fca0f967e0986f6eded 375252 libs optional libsqlite3-mod-rasterlite2_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 2385e4749b05337ce113c4ac411e5275 181644 debug optional rasterlite2-bin-dbgsym_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb 0feadd4fd0f745d102dccc9b61234acf 79004 graphics optional rasterlite2-bin_1.1.0~beta0+really1.1.0~beta0-1~exp3_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/dRAACgkQZ1DxCuiN SvFecQ//YQiMqyumLUsnLujyYn3sbkPfSKt21J3BRAjoGf7MHr5/l7JaS95Ao31n RlCaGhSJ2Z6Db/I5b/PDDRY0qLxUTy5OnmvfQsf90ZWPkWRIIjQDoPap6E0tvAIG aGppa9EGQs75NgFI00HMT+UPaBXV1ARLn0nI7REODftbFr0rXNoJA1g3dNKkKan8 LuLTX4wStWcIT+rGlyqD4xmx7wThpg0rq7XAyKTVaE1ZZbrR7X7lA5NSnzMRnxTU 46OIL9gYD4DQXjQw74sF9XGjU1AL/+EZvf77iTmdoYhDvruxqPrMKkZ37NCoXcot VnJiYTaO4xG2jxBkZhoC0FqTA4Am0LihXU/5P73XHbI3CpnJGXodroZaLINPyCSX etiVuADD3rlvDFuLsnhvwT/MK6j6on1evYXyxFQozGxCqmPqlnI3H5s5JAAVSIvh kbOK2UBLEGlOpDBGAq1FHap3FkpNYuVP4bMbnX8ycaPK9r0LM5U56WwlHCTxYXoO 1ejIDY/gknpLrhMuYMVmqVikdl8sai98JN+idtSF0GhlPDUFj8Fi7BpnuX14x9M8 oSkrkCDo/sWRKg2qH8IJvtyONPxJGwzzAD44KQ6ByIywmColgNvs4sZ21OOe2qxT FJ1kZuzF0+Crn5HElhx5ML21gVQDhq45MpFsNp4GynIWRFNXpWE= =akL7 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Mon Sep 16 13:08:50 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 12:08:50 +0000 Subject: spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 13:24:58 +0200 Source: spatialite-gui Architecture: source Version: 2.1.0~beta0+really2.0.0~devel2-5 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 933409 Changes: spatialite-gui (2.1.0~beta0+really2.0.0~devel2-5) unstable; urgency=medium . * Switch to wxWidgets GTK 3 implementation. 
(closes: #933409) Checksums-Sha1: 077588f4c7df42f67462cc22d350108d9a3aaa4e 2454 spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.dsc cef2a10ca9c98fffed1fc49a6485595c507ef676 344180 spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.debian.tar.xz 454369aa243a74fae13d9402e85c3fc4a61d97ea 13355 spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_amd64.buildinfo Checksums-Sha256: e18cbdb72446888497b0680c5e2ab349fbf83281b0363ec10125fe653c87600e 2454 spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.dsc 2ebb1d312d6c14f5b973194efb9ce0f07186e519dd899bb6b349ec0a88d0a07d 344180 spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.debian.tar.xz d07c0f80bc2b125d2216a6669d362a4b7b94fc74c75c0af084a64fc0cd083782 13355 spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_amd64.buildinfo Files: 8845bb3d5ad80d8ddf147e4c1fed8e6d 2454 utils optional spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.dsc fb24ae76205c70229fe1affed5f832c0 344180 utils optional spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5.debian.tar.xz 635aef88b78d64ee6e692bcc035869ea 13355 utils optional spatialite-gui_2.1.0~beta0+really2.0.0~devel2-5_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/dQEACgkQZ1DxCuiN SvHNkxAApVcdUDOXNRqLWl7os+AT5T7cB48YZgfuYTTQSqMNuGCTRoUxhq6PYr8F QnSDxPgQrmY+chHNJIbs78ocT4yAzreWuoznlSgmVdk1cdwcbMUwaMAPDSCYc9vg pFKF2B1bq9SFSGYomHaWAaWD3rWkVVRYivBAs3ghoZ4RkePacCwdGyydh29q5VOn 3bG/0tkVVC+o5aN0ek26xMe3U7u+YGy5B64sMTJgS1mZidSserAgIW0TzEnfsWo5 bt+XqgLXngyKKOPFHHxcTDYe8t6BFdkL8X4Ym6NTKYdNXPcbIw6PoXtZjbw2YaZu fhV12xquKl4sJCoVIGxTxZIUuDMWaJujMh5brxetMjfOrrfaxedg23bBZV2syQA9 IjgpvdF7ziK6vRAS84D3HnbV0Te9DCl8DgRcVrRPVnk/nit2Hi4VDIMozV3CBY/f htKjUcZJojf6h0UyozmecybfNLmsLIVvk23OW/YYTQLKWwcnZSjRMqaPAxl4TX5w k/Le6wUvhkc6rJuc1GDddJ7SJ975ExUmtbQ9kBUkosW6gZZbcWFx2rHYALjEj4p4 8n5O/qzE3lPAIiItKMfqA00RLVZQgp4b2rn4pTASUOtr6iLkPb1C5wtDQ3oiXGqd w1j+h+44pVtyBn3KQskj+2LkmjrgVhXCbMwewdLXw1fFlnxmnWQ= =+gnN -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Mon Sep 16 13:12:23 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 16 Sep 2019 12:12:23 +0000 Subject: Bug#933409: marked as done (spatialite-gui: Please rebuild against wxWidgets GTK 3 package) References: <20190730140539.8EAA222A0A3D@bear.techie.net> Message-ID: Your message dated Mon, 16 Sep 2019 12:08:50 +0000 with message-id and subject line Bug#933409: fixed in spatialite-gui 2.1.0~beta0+really2.0.0~devel2-5 has caused the Debian Bug report #933409, regarding spatialite-gui: Please rebuild against wxWidgets GTK 3 package to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 933409: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933409 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: swt at techie.net Subject: spatialite-gui: Please rebuild against wxWidgets GTK 3 package Date: Tue, 30 Jul 2019 10:05:38 -0400 Size: 3083 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Bas Couwenberg Subject: Bug#933409: fixed in spatialite-gui 2.1.0~beta0+really2.0.0~devel2-5 Date: Mon, 16 Sep 2019 12:08:50 +0000 Size: 5497 URL: From gitlab at salsa.debian.org Mon Sep 16 13:54:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 12:54:44 +0000 Subject: [Git][debian-gis-team/spatialite][experimental] Set distribution to experimental. Message-ID: <5d7f861485ee8_73482ad95d7a9c0011316c6@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / spatialite Commits: 8146825e by Bas Couwenberg at 2019-09-16T11:45:21Z Set distribution to experimental. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,8 +1,8 @@ -spatialite (5.0.0~beta0-1~exp4) UNRELEASED; urgency=medium +spatialite (5.0.0~beta0-1~exp4) experimental; urgency=medium * Require at least librttopo-dev 1.1.0. - -- Bas Couwenberg Sat, 27 Jul 2019 11:04:24 +0200 + -- Bas Couwenberg Mon, 16 Sep 2019 13:45:00 +0200 spatialite (5.0.0~beta0-1~exp3) experimental; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite/commit/8146825ecc0986498b550b11c00152a5b182c65b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite/commit/8146825ecc0986498b550b11c00152a5b182c65b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 13:54:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 12:54:48 +0000 Subject: [Git][debian-gis-team/spatialite] Pushed new tag debian/5.0.0_beta0-1_exp4 Message-ID: <5d7f86189ae91_73482ad95d7a9c0011320eb@godard.mail> Bas Couwenberg pushed new tag debian/5.0.0_beta0-1_exp4 at Debian GIS Project / spatialite -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite/tree/debian/5.0.0_beta0-1_exp4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 14:03:32 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:03:32 +0000 Subject: Processing of spatialite_5.0.0~beta0-1~exp4_amd64.changes Message-ID: spatialite_5.0.0~beta0-1~exp4_amd64.changes uploaded successfully to localhost along with the files: spatialite_5.0.0~beta0-1~exp4.dsc spatialite_5.0.0~beta0-1~exp4.debian.tar.xz libspatialite-dev_5.0.0~beta0-1~exp4_amd64.deb libspatialite7-dbgsym_5.0.0~beta0-1~exp4_amd64.deb libspatialite7_5.0.0~beta0-1~exp4_amd64.deb libsqlite3-mod-spatialite-dbgsym_5.0.0~beta0-1~exp4_amd64.deb libsqlite3-mod-spatialite_5.0.0~beta0-1~exp4_amd64.deb spatialite_5.0.0~beta0-1~exp4_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Mon Sep 16 14:04:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 13:04:17 +0000 Subject: [Git][debian-gis-team/spatialite-tools][experimental] 2 commits: No change rebuild with PROJ 6. Message-ID: <5d7f8851a5a7_73482ad96187f298113286@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / spatialite-tools Commits: 3667c5ed by Bas Couwenberg at 2019-09-16T12:54:59Z No change rebuild with PROJ 6. - - - - - 70b30cec by Bas Couwenberg at 2019-09-16T12:55:11Z Set distribution to experimental. 
- - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +spatialite-tools (4.4.0~rc1-1~exp5) experimental; urgency=medium + + * No change rebuild with PROJ 6. + + -- Bas Couwenberg Mon, 16 Sep 2019 14:55:03 +0200 + spatialite-tools (4.4.0~rc1-1~exp4) experimental; urgency=medium * Bump Standards-Version to 4.4.0, no changes. View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-tools/compare/5eb9c22ba905eacee5fd9a8ea5be01c84982edf4...70b30cecacef995ec843b3f0e32228d56e8a896e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-tools/compare/5eb9c22ba905eacee5fd9a8ea5be01c84982edf4...70b30cecacef995ec843b3f0e32228d56e8a896e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 14:04:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 13:04:26 +0000 Subject: [Git][debian-gis-team/spatialite-tools] Pushed new tag debian/4.4.0_rc1-1_exp5 Message-ID: <5d7f885a77962_73482ad96187f29811330a0@godard.mail> Bas Couwenberg pushed new tag debian/4.4.0_rc1-1_exp5 at Debian GIS Project / spatialite-tools -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-tools/tree/debian/4.4.0_rc1-1_exp5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 14:07:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 13:07:20 +0000 Subject: [Git][debian-gis-team/spatialite-gui][experimental] Set distribution to experimental. Message-ID: <5d7f89083909d_73483fbbba63e34411337e3@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / spatialite-gui Commits: bb3a7b96 by Bas Couwenberg at 2019-09-16T11:24:43Z Set distribution to experimental. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,9 +1,9 @@ -spatialite-gui (2.1.0~beta0+really2.1.0~beta0-1~exp5) UNRELEASED; urgency=medium +spatialite-gui (2.1.0~beta0+really2.1.0~beta0-1~exp5) experimental; urgency=medium * Switch to wxWidgets GTK 3 implementation. (closes: #933409) - -- Bas Couwenberg Tue, 30 Jul 2019 16:20:25 +0200 + -- Bas Couwenberg Mon, 16 Sep 2019 13:24:23 +0200 spatialite-gui (2.1.0~beta0+really2.1.0~beta0-1~exp4) experimental; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/commit/bb3a7b965046a39113ddc2473d1004b54c7183df -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/commit/bb3a7b965046a39113ddc2473d1004b54c7183df You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 16 14:07:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 13:07:31 +0000 Subject: [Git][debian-gis-team/spatialite-gui] Pushed new tag debian/2.1.0_beta0+really2.1.0_beta0-1_exp5 Message-ID: <5d7f89131e38b_73482ad961a5bcb011339b0@godard.mail> Bas Couwenberg pushed new tag debian/2.1.0_beta0+really2.1.0_beta0-1_exp5 at Debian GIS Project / spatialite-gui -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/tree/debian/2.1.0_beta0+really2.1.0_beta0-1_exp5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 14:13:33 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:13:33 +0000 Subject: Processing of spatialite-tools_4.4.0~rc1-1~exp5_amd64.changes Message-ID: spatialite-tools_4.4.0~rc1-1~exp5_amd64.changes uploaded successfully to localhost along with the files: spatialite-tools_4.4.0~rc1-1~exp5.dsc spatialite-tools_4.4.0~rc1-1~exp5.debian.tar.xz spatialite-bin-dbgsym_4.4.0~rc1-1~exp5_amd64.deb spatialite-bin_4.4.0~rc1-1~exp5_amd64.deb spatialite-tools_4.4.0~rc1-1~exp5_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 16 14:14:37 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:14:37 +0000 Subject: spatialite_5.0.0~beta0-1~exp4_amd64.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 13:45:00 +0200 Source: spatialite Binary: libspatialite-dev libspatialite7 libspatialite7-dbgsym libsqlite3-mod-spatialite libsqlite3-mod-spatialite-dbgsym Architecture: source amd64 Version: 5.0.0~beta0-1~exp4 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: libspatialite-dev - Geospatial extension for SQLite - development files libspatialite7 - Geospatial extension for SQLite - libraries libsqlite3-mod-spatialite - Geospatial extension for SQLite - loadable module Changes: spatialite (5.0.0~beta0-1~exp4) experimental; urgency=medium . * Require at least librttopo-dev 1.1.0. 
Checksums-Sha1: 8103cce9c562357f41891a619f580885db605d8a 2465 spatialite_5.0.0~beta0-1~exp4.dsc 9bc1e61d48d9a5802e43f007776ae6fa05d5ee25 27200 spatialite_5.0.0~beta0-1~exp4.debian.tar.xz 65ae942bbdf79ad3423041c2222d052a608978b3 1755636 libspatialite-dev_5.0.0~beta0-1~exp4_amd64.deb a1efa0ecee69bcae038a1c5b53c8911324b62d0a 3751844 libspatialite7-dbgsym_5.0.0~beta0-1~exp4_amd64.deb 2145c44944df19d0ceccfb43313ceca8d5f8659c 1630636 libspatialite7_5.0.0~beta0-1~exp4_amd64.deb 1f7091a1ecd6276acd0d2662a01cfe43fd0b3723 4122600 libsqlite3-mod-spatialite-dbgsym_5.0.0~beta0-1~exp4_amd64.deb 3cc961e1fbd11974624b16247fca513e17514173 1591844 libsqlite3-mod-spatialite_5.0.0~beta0-1~exp4_amd64.deb 1c1ca8a75ab276c593eebf918ffebf295fb8b8b2 8425 spatialite_5.0.0~beta0-1~exp4_amd64.buildinfo Checksums-Sha256: 366cf5124cb445d3f303f7f32944caf7ac0fd4c2b1ad30e90d9d8457b49b90d9 2465 spatialite_5.0.0~beta0-1~exp4.dsc 0d51892494bf4a66aab395f19648dbc7e99bcb5b538d051d8888a946b12348f2 27200 spatialite_5.0.0~beta0-1~exp4.debian.tar.xz 494745daad12721bea9409505db278af481ff36db1d4627d86c52c122aecfd60 1755636 libspatialite-dev_5.0.0~beta0-1~exp4_amd64.deb 5b78f96ff6e089e24c09387d07e7a63bb02bde8889fd9bceac4f6fad26372eac 3751844 libspatialite7-dbgsym_5.0.0~beta0-1~exp4_amd64.deb 68d43c561fdcb1c0e86e06946f053720760a92d589e701e989d3447f11e9fca2 1630636 libspatialite7_5.0.0~beta0-1~exp4_amd64.deb e169d2149f73584567d27d815a5226893e4cbd73419344fbc09629e010962b25 4122600 libsqlite3-mod-spatialite-dbgsym_5.0.0~beta0-1~exp4_amd64.deb 723585c14ada724ac72472072a39cea0ed51b563fcbbeb4d57d0b18683624321 1591844 libsqlite3-mod-spatialite_5.0.0~beta0-1~exp4_amd64.deb f6d9fde90df44c721f60df5733774bfba8a54f88c8a0b554265f689b24d90a6b 8425 spatialite_5.0.0~beta0-1~exp4_amd64.buildinfo Files: e703417c043a97772d7326d7aec0e82a 2465 science optional spatialite_5.0.0~beta0-1~exp4.dsc 5862602ff6b1ff51c5c10455270e21bc 27200 science optional spatialite_5.0.0~beta0-1~exp4.debian.tar.xz e0a1013ae4e6d6cc2fb2d833892f43c6 1755636 libdevel optional libspatialite-dev_5.0.0~beta0-1~exp4_amd64.deb 1a2170a99dee9579871f193250e1099e 3751844 debug optional libspatialite7-dbgsym_5.0.0~beta0-1~exp4_amd64.deb fe14758b3e4e2127d7e9adcd6335ab03 1630636 libs optional libspatialite7_5.0.0~beta0-1~exp4_amd64.deb 09eda47da10ca970b1f3f73428a5e7ef 4122600 debug optional libsqlite3-mod-spatialite-dbgsym_5.0.0~beta0-1~exp4_amd64.deb e43caf355b157d564740523de7cc8d83 1591844 libs optional libsqlite3-mod-spatialite_5.0.0~beta0-1~exp4_amd64.deb bc3a9d53aa54c5bc2117ad1cc39459d7 8425 science optional spatialite_5.0.0~beta0-1~exp4_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/hd0ACgkQZ1DxCuiN SvFiXBAAzv7r0d6Oa17OzyEsLgGn9iVlleRVzSy4oAMJeC38/lXlo0d7srtmDdD7 VNYb38BMZdHQA8Na9qCLyUgChr+/msvbEwS08wVjZA9nahqUMFJqQDZj2I9fyV3V 2h43KGCugU+GKf4vL75tmBOhA8ZK/ist+AuFOCT6cBelEdUdCPNyTgRI33Q/BbZN zQd+eQNV8mrGqb+3t9pZhcyxibBKiYmuoJ2i2fsx75aJDm/eNVJOT60vG/UBbf0+ wPm3C0xaVRTjUz9M/9atD1yGLpg0AjriBt1a0jaTVbrPowqb5XU6vR0FURKa8LhQ dBY1zQuuaSf8UncvDOIZmCpvh45UXOjAHRXdlLc9B4sjefJP0zRCRB24cVbwDgPY 1xQ35zDHakvfg0TB7/kQqvfh9XxzYlWatLRNYV/yqzx4M4XqpP6bdFKxv789iLap q3oYJ3hZOYw620lImlhqe/F8QaeIsHMsVSN8CFOm68it75n5Vgn9Yzsm5rltYfnP lkTWjbEDgFYvmUUa249POZa2eQEn51nNSCnw/FmOjTK34qsR4nWCTIyVz+jqhHYe Fq0sXZCpJlBXPGPaxrA1BDcPbLfOwZ3RmZQS9LkoAYVpzBbvX5MQ4tjcuXPm2SBm 42P40jvblKA9t20NrjINFe3fRzYvi1JANz5De8ztQGAYdIzHW0k= =nERg -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From ftpmaster at ftp-master.debian.org Mon Sep 16 14:18:38 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:18:38 +0000 Subject: Processing of spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.changes Message-ID: spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.changes uploaded successfully to localhost along with the files: spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.dsc spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.debian.tar.xz spatialite-gui-dbgsym_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.buildinfo spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Mon Sep 16 14:45:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 13:45:23 +0000 Subject: [Git][debian-gis-team/postgis][experimental] Set distribution to experimental. Message-ID: <5d7f91f36870a_73482ad961a5bcb01141949@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / postgis Commits: 77486d24 by Bas Couwenberg at 2019-09-16T13:22:21Z Set distribution to experimental. - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,5 +1,6 @@ -postgis (3.0.0~alpha4+dfsg-2) UNRELEASED; urgency=medium +postgis (3.0.0~alpha4+dfsg-2~exp1) experimental; urgency=medium + [ Christoph Berg ] * debian/tests: Test postgis_raster extension as well. * debian/tests/regress: Makefile.in needs some variables from ./configure. * Instead of conflicting with older postgresql-*-postgis-*-scripts packages, @@ -8,7 +9,7 @@ postgis (3.0.0~alpha4+dfsg-2) UNRELEASED; urgency=medium extension package. * Remove unversioned and unused sql files from address standardizer. - -- Christoph Berg Mon, 12 Aug 2019 10:55:44 +0200 + -- Bas Couwenberg Mon, 16 Sep 2019 15:21:59 +0200 postgis (3.0.0~alpha4+dfsg-1) experimental; urgency=medium View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/77486d249421c96e94949a1f9915f0a5bbdc84fd -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/77486d249421c96e94949a1f9915f0a5bbdc84fd You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 16 14:45:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 16 Sep 2019 13:45:27 +0000 Subject: [Git][debian-gis-team/postgis] Pushed new tag debian/3.0.0_alpha4+dfsg-2_exp1 Message-ID: <5d7f91f7a0e71_73482ad9619e40e8114212e@godard.mail> Bas Couwenberg pushed new tag debian/3.0.0_alpha4+dfsg-2_exp1 at Debian GIS Project / postgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/tree/debian/3.0.0_alpha4+dfsg-2_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 14:52:59 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:52:59 +0000 Subject: spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 13:24:23 +0200 Source: spatialite-gui Binary: spatialite-gui spatialite-gui-dbgsym Architecture: source amd64 Version: 2.1.0~beta0+really2.1.0~beta0-1~exp5 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: spatialite-gui - user-friendly graphical user interface for SpatiaLite Closes: 933409 Changes: spatialite-gui (2.1.0~beta0+really2.1.0~beta0-1~exp5) experimental; urgency=medium . * Switch to wxWidgets GTK 3 implementation. (closes: #933409) Checksums-Sha1: 291f06da9a43d88b3c257b941ca94b5217bc565a 2558 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.dsc 1be220d8fdf1b432728022ac67f4b7ef6736bc17 14428 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.debian.tar.xz 5cc1d3215352b051266b60835e5dca140a07a575 19127508 spatialite-gui-dbgsym_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb f36e76d2da4ed34d6f3d0f9cdedf8a762f0b39dc 13606 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.buildinfo 685d7076eebbe7df269870eed3b72df5c574171d 1955776 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb Checksums-Sha256: f4a4a2bd9158723ba85ce22ff8ddb54e241c0cd228a5761824312eee1c7ea76e 2558 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.dsc 5165a15c7e170bb770bb57ca7484050d095cb6f9408ff67a55bffef2ce0fd07d 14428 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.debian.tar.xz a5114c52daa6a2588cccc56752e7cbe5f06cbc30a379bd05bbb396e7470f7d52 19127508 spatialite-gui-dbgsym_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb 1f6835fde8e2d412ec66acc06f5ea7688e39481d2cb33e5ad36eae0b5db31aae 13606 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.buildinfo d4fc8f1d4ff44fac83e62153a1cadc7ad6730820c12d0ccb7e190913a12ad35b 1955776 spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb Files: d9799386d7921e2549bf3b32f6f619ca 2558 utils optional spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.dsc 3d697dc527a865774e958e22231a1514 14428 utils optional spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5.debian.tar.xz 0e2dc1538a793b394425f09061c4617b 19127508 debug optional spatialite-gui-dbgsym_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb e88d4f783a1ed01e55330b3a4daa3527 13606 utils optional spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.buildinfo 5c95fb9dcc577d3978f8b0f28fa4378f 1955776 utils optional spatialite-gui_2.1.0~beta0+really2.1.0~beta0-1~exp5_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/iO8ACgkQZ1DxCuiN SvFCLQ//SlwLXEhvohWznuayzb4zd1h4Fi/tj7LqWUsBcqJPgXixr26+tBhWbwNk 2LdxGYY1jN9OS26ugquIS6UZJEaB3+aybGPVOTjdY8U127BtTaGIh2pRj8Nl1aF0 9earrJlHUkA4Jwl1x1Cdt0vqJkDY+TAP1AS2jhdIfJSjn0Z52ixNmeVg6DyiL1Na iXPaAR9tgIL87Ry5nazZRe8UAK4Div2j1RvU3crhdbI3XoHT/VBmClVB8lMdroFf yIXqO/YCOlq/J2SN6Obl0UTZ58KOrcacas78czb434PkJ3ie+ClyIZNSIgS0WhhE qobTSIzXIL2MK30kltZZIhVh82+W9eLnVXD+TIQjzJwQABVS8uKIMWe/L1ZCfyOr FD2YX6Qx7ZWLO2w8v7POZz4icaTvvkSHErtG5K5CvtqMpcApTpQN0E78bbxkcRSR yp9ViiprqMWazaJVHi58Wyu+E/JfsCADBDyPdioOmF4sTsPTZ4iYJ/azuwMShERv 6T2OVaAYxpzmJGmnT3V06ESBIJZQAUKkfC3aMsQKL/38rRHfmEdVWmUpLz6xD1aD 66r7EaNqXbf8AdT3Cl9ByoiAfZyYks29KjcIsu5gOx3Oq/yIoxKkSXNVMuRt8WKb 
6H3hU8XpBYqdiFXMthSvbc9tM1WFVd+Rno9OZHS6Dks51PK9wvk= =BTWh -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Mon Sep 16 14:53:10 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:53:10 +0000 Subject: spatialite-tools_4.4.0~rc1-1~exp5_amd64.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 14:55:03 +0200 Source: spatialite-tools Binary: spatialite-bin spatialite-bin-dbgsym Architecture: source amd64 Version: 4.4.0~rc1-1~exp5 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: spatialite-bin - Geospatial extension for SQLite - tools Changes: spatialite-tools (4.4.0~rc1-1~exp5) experimental; urgency=medium . * No change rebuild with PROJ 6. Checksums-Sha1: facfdc349e8e861afd014e079ffb58c5fc0685ad 2421 spatialite-tools_4.4.0~rc1-1~exp5.dsc 1de411bc5b383f2c5902e97b25cfa73c5a595aad 11700 spatialite-tools_4.4.0~rc1-1~exp5.debian.tar.xz 16240902b1d39fdd73f43807adbc0e6a28fb0b03 716260 spatialite-bin-dbgsym_4.4.0~rc1-1~exp5_amd64.deb 7be4e62492b2f14c44c7f7994b430b0f79866198 220748 spatialite-bin_4.4.0~rc1-1~exp5_amd64.deb ee8fb071ca84468861ec449d91bf77cd43b8d72c 8984 spatialite-tools_4.4.0~rc1-1~exp5_amd64.buildinfo Checksums-Sha256: 48179837a9919429d5eaa078028df92b6e96fcf612ade39a0e5cab86205df181 2421 spatialite-tools_4.4.0~rc1-1~exp5.dsc 1c477ef8a1d97d9773dbcb2c3e9564a6f4b02977bd32067cd449cc9ac2cb067c 11700 spatialite-tools_4.4.0~rc1-1~exp5.debian.tar.xz c212c627ed2b1615a84bddc3041f156d6f915259e63c7155a24a825c5c81e57e 716260 spatialite-bin-dbgsym_4.4.0~rc1-1~exp5_amd64.deb 4534d05efc59d3788b27bf4e768a3f40ad0f6e154458b1ae1f96981930053c48 220748 spatialite-bin_4.4.0~rc1-1~exp5_amd64.deb 98cb35ca300ff89afb7c742c7aa57fe16b3537658cbf5b53d06c81c296ee4de1 8984 spatialite-tools_4.4.0~rc1-1~exp5_amd64.buildinfo Files: 001360db3cd90cfbd12a5b1ef2ab8b9f 2421 science optional spatialite-tools_4.4.0~rc1-1~exp5.dsc 654e461d2b0d4ff29eb5ce35bc5f1c26 11700 science optional spatialite-tools_4.4.0~rc1-1~exp5.debian.tar.xz 83d5831b6d4c5091a3e2d02794c68bc3 716260 debug optional spatialite-bin-dbgsym_4.4.0~rc1-1~exp5_amd64.deb bc79eeed09254205d23d2ce04725c62f 220748 science optional spatialite-bin_4.4.0~rc1-1~exp5_amd64.deb a3c21f404baa9dfe4b7e6e7666d27a22 8984 science optional spatialite-tools_4.4.0~rc1-1~exp5_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/h8wACgkQZ1DxCuiN SvHo4hAAzDsVTZGn3QGLD6sx/cAGFcsE5F5dhQRsd/Ns2xQ+69JKZe0Dce+jpKyA YPDH2poFwkzhJyeNpYYYUsIh27ZWDNr3u7O1/PJ6BR45Qsm1CmVfokl1Vs/phDm6 RyECQE0lOoHY1AgY4vH+HJI2Qo6r3K7tYmwlwf5Ozxr3lFrNUtrEX2wbZSzX086u DFSGZabrn0GEih/AmiXvoqoh8XD1RSix8QFUUzVp0r9QiO+t7QgMQ0oAQPa9YwAn 1rzdKPM8dNhe6FoTxRUrWTuCA06YOR6f4LXudadc3GXui4cYED6W+uZ3LDgnal3C N99cbF1N5Nud4DaykXFjIEySu2Z/BZjaH2cZs9VNihki+AmdADxCOPAuI6gsSv5C l80eYiGdgLLZEhwogiAA+cxmh4e433Ywjf4vLy9znGdNs1bHhGk1uqOkYzwc7Ag0 IOwuxBaQONVI9cExDEngIumW7NkQq/R8Pb2DzN4W5mwMc61x7MOpm5xc0+cCiHek f0P9Arb5Gg5TYspTUR1imiIXveT0+uu1198UNyFR9kjGUC2EJ6YVLw3XS8AHu2rt lSfzjFQV0evAL447M4W9Co1Yx8ZzsPeBm5J3x+HQ1CmXzWoyhU0NM4EsNhRY7zgP Vph1jyR+hQXpZBtWFbsvdEOX7j/q0Zjer24ygzKyOP0xGMC83yA= =7IXe -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From ftpmaster at ftp-master.debian.org Mon Sep 16 14:53:49 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 13:53:49 +0000 Subject: Processing of postgis_3.0.0~alpha4+dfsg-2~exp1_source.changes Message-ID: postgis_3.0.0~alpha4+dfsg-2~exp1_source.changes uploaded successfully to localhost along with the files: postgis_3.0.0~alpha4+dfsg-2~exp1.dsc postgis_3.0.0~alpha4+dfsg-2~exp1.debian.tar.xz postgis_3.0.0~alpha4+dfsg-2~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From owner at bugs.debian.org Mon Sep 16 14:54:19 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 16 Sep 2019 13:54:19 +0000 Subject: Bug#933409: marked as done (spatialite-gui: Please rebuild against wxWidgets GTK 3 package) References: <20190730140539.8EAA222A0A3D@bear.techie.net> Message-ID: Your message dated Mon, 16 Sep 2019 13:52:59 +0000 with message-id and subject line Bug#933409: fixed in spatialite-gui 2.1.0~beta0+really2.1.0~beta0-1~exp5 has caused the Debian Bug report #933409, regarding spatialite-gui: Please rebuild against wxWidgets GTK 3 package to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 933409: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=933409 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: swt at techie.net Subject: spatialite-gui: Please rebuild against wxWidgets GTK 3 package Date: Tue, 30 Jul 2019 10:05:38 -0400 Size: 3083 URL: -------------- next part -------------- An embedded message was scrubbed... From: Bas Couwenberg Subject: Bug#933409: fixed in spatialite-gui 2.1.0~beta0+really2.1.0~beta0-1~exp5 Date: Mon, 16 Sep 2019 13:52:59 +0000 Size: 6449 URL: From ftpmaster at ftp-master.debian.org Mon Sep 16 15:47:17 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 16 Sep 2019 14:47:17 +0000 Subject: postgis_3.0.0~alpha4+dfsg-2~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 15:21:59 +0200 Source: postgis Architecture: source Version: 3.0.0~alpha4+dfsg-2~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: postgis (3.0.0~alpha4+dfsg-2~exp1) experimental; urgency=medium . [ Christoph Berg ] * debian/tests: Test postgis_raster extension as well. * debian/tests/regress: Makefile.in needs some variables from ./configure. * Instead of conflicting with older postgresql-*-postgis-*-scripts packages, use the alternatives system for managing the extension/*.control files. This finally allows us to depend on the -scripts package from the main extension package. * Remove unversioned and unused sql files from address standardizer. 
Checksums-Sha1: f3e66ee6f38adcb5aeaf6f690cda6bb2b0cce527 3038 postgis_3.0.0~alpha4+dfsg-2~exp1.dsc 20c8f17cd126fb643667862382d30a016f867669 37608 postgis_3.0.0~alpha4+dfsg-2~exp1.debian.tar.xz cd058edab7b8075efabc3285cc7f4c164fe2099c 23259 postgis_3.0.0~alpha4+dfsg-2~exp1_amd64.buildinfo Checksums-Sha256: a595f543da4c5dec5e3452ccbb1638da68863ffd97afd216491d9de72569a62e 3038 postgis_3.0.0~alpha4+dfsg-2~exp1.dsc ceb0ced087634247884c23b968e59eb844fb1886042b220e453fb3f5f029c534 37608 postgis_3.0.0~alpha4+dfsg-2~exp1.debian.tar.xz f4151f83f547662c9d164afd28714437e0babaa994d996babc93240a482ec58f 23259 postgis_3.0.0~alpha4+dfsg-2~exp1_amd64.buildinfo Files: ab36e4cc4772ab1e31694d60d9c010db 3038 misc optional postgis_3.0.0~alpha4+dfsg-2~exp1.dsc dee228aa63ff5597436c74d83c9c12aa 37608 misc optional postgis_3.0.0~alpha4+dfsg-2~exp1.debian.tar.xz ce3df006d9609401f4430b61f8d78ff1 23259 misc optional postgis_3.0.0~alpha4+dfsg-2~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl1/kdcACgkQZ1DxCuiN SvEg+BAAlez1R/d0U6JbgRk5h8eXR6tKVjdexr5jp5HondYe4zGHpjhdj+0Zc6uE L06R23o9z1MJVc4b7z7Kk6xm7GQFpwtAer0r9b6S4M7wX7+vlmY5nDXEdRPMiuYD TrXdw6IrPFjHelKaNZ0jkEg3IPVUnZEkSJkMARZPONXAefeD0/aeQ/fPBDOqdhVz Sezked48+/3Qm0OIZYQiLkQreKM3ecbgDfFlGuPTJiUBeO5KF4wXWVtpE3p2Pfd9 sbM2mOK8/uG5E+QbGFGqA8yAeRfAXtJ5chaHyVlF61NbTZCzyd14MGCd1uFxL7ph 2LCWwiHZIYAYnVIIv8o8lZdIX0aXiJMEgcvs2kYBoBLGmGHVa67TkRcK3DRlvZHM l1VShgluj2A040u0vwtR39eF3F2n7nROZwTGtXRYLNNQPJtfgFYAtDvamilqsd99 EFRiesk5XRgbhcRzJmFH5AK/oZ0WsuW2q7ALrK8JlQKHU8kLDqJnuy6EfDGrKLCf C6IuvdJ5JKcN+Otjx7wK6u3iW72cBVjl8NuVGvf9K49SEY5CTva6HBvKhCUXtWj7 HBru9m5wADSKj2tR96e95l+l0/vOj5a0Zmf1I0iBq6ErnXKn2D0X2gQgm6NdxQOo 2HiJ5DttGgpVOJJPQPV0gHHMRU5/3UQSyRSqKGj9gB559dHOxxQ= =xqVV -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Mon Sep 16 22:34:27 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 16 Sep 2019 21:34:27 +0000 Subject: [Git][debian-gis-team/satpy][master] 3 commits: Fix compatibility with latest proj Message-ID: <5d7fffe33df09_73482ad96080d4781217441@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / satpy Commits: 037d2f22 by Antonio Valentino at 2019-09-16T21:27:29Z Fix compatibility with latest proj - - - - - 9d079720 by Antonio Valentino at 2019-09-16T21:32:55Z Remove obsolete fields Name, Contact from debian/upstream/metadata. - - - - - 5b8e7bd9 by Antonio Valentino at 2019-09-16T21:33:47Z Set distribution to unstable - - - - - 4 changed files: - debian/changelog - + debian/patches/0006-Fix-compatibility-with-new-proj-version.patch - debian/patches/series - debian/upstream/metadata Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,12 @@ +satpy (0.16.1-3) unstable; urgency=medium + + * debian/patches: + - new 0006-Fix-compatibility-with-new-proj-version.patch + (backport form upstream) + * Remove obsolete fields Name, Contact from debian/upstream/metadata. + + -- Antonio Valentino Mon, 16 Sep 2019 21:33:09 +0000 + satpy (0.16.1-2) unstable; urgency=medium * Use debhelper-compat instead of debian/compat. ===================================== debian/patches/0006-Fix-compatibility-with-new-proj-version.patch ===================================== @@ -0,0 +1,202 @@ +From: Antonio Valentino +Date: Mon, 16 Sep 2019 21:23:46 +0000 +Subject: Fix compatibility with new proj version + +Backport form upstream master (13892290b02456b04432e4fb86d9e85dad8d2990). 
+--- + satpy/tests/reader_tests/test_ahi_hsd.py | 22 +++++++++++++-------- + satpy/tests/reader_tests/test_hrit_base.py | 13 ++++++------ + satpy/tests/reader_tests/test_seviri_l1b_hrit.py | 13 ++++++------ + satpy/tests/test_config.py | 25 +++++++++++++++++++++--- + satpy/tests/writer_tests/test_cf.py | 10 ++++++++-- + satpy/writers/mitiff.py | 14 +++++++++---- + 6 files changed, 68 insertions(+), 29 deletions(-) + +diff --git a/satpy/tests/reader_tests/test_ahi_hsd.py b/satpy/tests/reader_tests/test_ahi_hsd.py +index e74bbfe..2335bee 100644 +--- a/satpy/tests/reader_tests/test_ahi_hsd.py ++++ b/satpy/tests/reader_tests/test_ahi_hsd.py +@@ -71,10 +71,13 @@ class TestAHIHSDNavigation(unittest.TestCase): + 'spare': ''} + + area_def = fh.get_area_def(None) +- self.assertEqual(area_def.proj_dict, {'a': 6378137.0, 'b': 6356752.3, +- 'h': 35785863.0, 'lon_0': 140.7, +- 'proj': 'geos', 'units': 'm'}) +- ++ proj_dict = area_def.proj_dict ++ self.assertEqual(proj_dict['a'], 6378137.0) ++ self.assertEqual(proj_dict['b'], 6356752.3) ++ self.assertEqual(proj_dict['h'], 35785863.0) ++ self.assertEqual(proj_dict['lon_0'], 140.7) ++ self.assertEqual(proj_dict['proj'], 'geos') ++ self.assertEqual(proj_dict['units'], 'm') + self.assertEqual(area_def.area_extent, (592000.0038256244, 4132000.026701824, + 1592000.0102878278, 5132000.033164027)) + +@@ -113,10 +116,13 @@ class TestAHIHSDNavigation(unittest.TestCase): + 'spare': ''} + + area_def = fh.get_area_def(None) +- self.assertEqual(area_def.proj_dict, {'a': 6378137.0, 'b': 6356752.3, +- 'h': 35785863.0, 'lon_0': 140.7, +- 'proj': 'geos', 'units': 'm'}) +- ++ proj_dict = area_def.proj_dict ++ self.assertEqual(proj_dict['a'], 6378137.0) ++ self.assertEqual(proj_dict['b'], 6356752.3) ++ self.assertEqual(proj_dict['h'], 35785863.0) ++ self.assertEqual(proj_dict['lon_0'], 140.7) ++ self.assertEqual(proj_dict['proj'], 'geos') ++ self.assertEqual(proj_dict['units'], 'm') + self.assertEqual(area_def.area_extent, (-5500000.035542117, -3300000.021325271, + 5500000.035542117, -2200000.0142168473)) + +diff --git a/satpy/tests/reader_tests/test_hrit_base.py b/satpy/tests/reader_tests/test_hrit_base.py +index 50d3ae0..631228f 100644 +--- a/satpy/tests/reader_tests/test_hrit_base.py ++++ b/satpy/tests/reader_tests/test_hrit_base.py +@@ -142,12 +142,13 @@ class TestHRITFileHandler(unittest.TestCase): + + def test_get_area_def(self): + area = self.reader.get_area_def('VIS06') +- self.assertEqual(area.proj_dict, {'a': 6378169.0, +- 'b': 6356583.8, +- 'h': 35785831.0, +- 'lon_0': 44.0, +- 'proj': 'geos', +- 'units': 'm'}) ++ proj_dict = area.proj_dict ++ self.assertEqual(proj_dict['a'], 6378169.0) ++ self.assertEqual(proj_dict['b'], 6356583.8) ++ self.assertEqual(proj_dict['h'], 35785831.0) ++ self.assertEqual(proj_dict['lon_0'], 44.0) ++ self.assertEqual(proj_dict['proj'], 'geos') ++ self.assertEqual(proj_dict['units'], 'm') + self.assertEqual(area.area_extent, + (-77771774058.38356, -77771774058.38356, + 30310525626438.438, 3720765401003.719)) +diff --git a/satpy/tests/reader_tests/test_seviri_l1b_hrit.py b/satpy/tests/reader_tests/test_seviri_l1b_hrit.py +index c601c23..f34574c 100644 +--- a/satpy/tests/reader_tests/test_seviri_l1b_hrit.py ++++ b/satpy/tests/reader_tests/test_seviri_l1b_hrit.py +@@ -134,12 +134,13 @@ class TestHRITMSGFileHandler(unittest.TestCase): + + def test_get_area_def(self): + area = self.reader.get_area_def(DatasetID('VIS006')) +- self.assertEqual(area.proj_dict, {'a': 6378169.0, +- 'b': 6356583.8, +- 'h': 35785831.0, +- 'lon_0': 44.0, +- 'proj': 
'geos', +- 'units': 'm'}) ++ proj_dict = area.proj_dict ++ self.assertEqual(proj_dict['a'], 6378169.0) ++ self.assertEqual(proj_dict['b'], 6356583.8) ++ self.assertEqual(proj_dict['h'], 35785831.0) ++ self.assertEqual(proj_dict['lon_0'], 44.0) ++ self.assertEqual(proj_dict['proj'], 'geos') ++ self.assertEqual(proj_dict['units'], 'm') + self.assertEqual(area.area_extent, + (-77771774058.38356, -3720765401003.719, + 30310525626438.438, 77771774058.38356)) +diff --git a/satpy/tests/test_config.py b/satpy/tests/test_config.py +index 2b53836..b3b6d24 100644 +--- a/satpy/tests/test_config.py ++++ b/satpy/tests/test_config.py +@@ -79,13 +79,32 @@ class TestBuiltinAreas(unittest.TestCase): + return unittest.skip("RasterIO 1.0+ required") + + from pyresample import parse_area_file ++ from pyresample.geometry import SwathDefinition + from satpy.resample import get_area_file ++ import numpy as np ++ import xarray as xr ++ ++ lons = np.array([[0, 0.1, 0.2], [0.05, 0.15, 0.25]]) ++ lats = np.array([[0, 0.1, 0.2], [0.05, 0.15, 0.25]]) ++ lons = xr.DataArray(lons) ++ lats = xr.DataArray(lats) ++ swath_def = SwathDefinition(lons, lats) + all_areas = parse_area_file(get_area_file()) + for area_obj in all_areas: +- if getattr(area_obj, 'optimize_projection', False): +- # the PROJ.4 is known to not be valid on this DynamicAreaDef +- continue ++ if hasattr(area_obj, 'freeze'): ++ try: ++ area_obj = area_obj.freeze(lonslats=swath_def) ++ except RuntimeError: ++ # we didn't provide enough info to freeze, hard to guess ++ # in a generic test so just skip this area ++ continue + proj_dict = area_obj.proj_dict ++ if proj_dict.get('proj') in ('ob_tran', 'nsper') and \ ++ 'wktext' not in proj_dict: ++ # FIXME: rasterio doesn't understand ob_tran unless +wktext ++ # See: https://github.com/pyproj4/pyproj/issues/357 ++ # pyproj 2.0+ seems to drop wktext from PROJ dict ++ continue + _ = CRS.from_dict(proj_dict) + + +diff --git a/satpy/tests/writer_tests/test_cf.py b/satpy/tests/writer_tests/test_cf.py +index 2f7fc63..c4721d0 100644 +--- a/satpy/tests/writer_tests/test_cf.py ++++ b/satpy/tests/writer_tests/test_cf.py +@@ -712,8 +712,14 @@ class TestCFWriter(unittest.TestCase): + with mock.patch('satpy.writers.cf_writer.warnings.warn') as warn: + res, grid_mapping = area2gridmapping(ds) + warn.assert_called() +- self.assertDictEqual(dict(pyresample.geometry.proj4_str_to_dict(res.attrs['grid_proj4'])), +- dict(pyresample.geometry.proj4_str_to_dict(proj_str))) ++ proj_dict = pyresample.geometry.proj4_str_to_dict(res.attrs['grid_proj4']) ++ self.assertEqual(proj_dict['lon_0'], 4.535) ++ self.assertEqual(proj_dict['lat_0'], 46.0) ++ self.assertEqual(proj_dict['o_lon_p'], -5.465) ++ self.assertEqual(proj_dict['o_lat_p'], 90.0) ++ self.assertEqual(proj_dict['proj'], 'ob_tran') ++ self.assertEqual(proj_dict['o_proj'], 'stere') ++ self.assertEqual(proj_dict['ellps'], 'WGS84') + self.assertEqual(grid_mapping, cosmo_expected) + + def test_area2lonlat(self): +diff --git a/satpy/writers/mitiff.py b/satpy/writers/mitiff.py +index 47a32d0..da45738 100644 +--- a/satpy/writers/mitiff.py ++++ b/satpy/writers/mitiff.py +@@ -203,9 +203,15 @@ class MITIFFWriter(ImageWriter): + proj4_string = " Proj string: " + + if isinstance(datasets, list): +- proj4_string += first_dataset.attrs['area'].proj4_string ++ area = first_dataset.attrs['area'] + else: +- proj4_string += datasets.attrs['area'].proj4_string ++ area = datasets.attrs['area'] ++ # Use pyproj's CRS object to get a valid EPSG code if possible ++ # only in newer pyresample versions with 
pyproj 2.0+ installed ++ if hasattr(area, 'crs') and area.crs.to_epsg() is not None: ++ proj4_string += "+init=EPSG:{}".format(area.crs.to_epsg()) ++ else: ++ proj4_string += area.proj_str + + x_0 = 0 + y_0 = 0 +@@ -246,14 +252,14 @@ class MITIFFWriter(ImageWriter): + if 'units' not in proj4_string: + proj4_string += ' +units=km' + +- if isinstance(datasets, list): ++ if 'x_0' not in proj4_string and isinstance(datasets, list): + proj4_string += ' +x_0=%.6f' % ( + (-first_dataset.attrs['area'].area_extent[0] + + first_dataset.attrs['area'].pixel_size_x) + x_0) + proj4_string += ' +y_0=%.6f' % ( + (-first_dataset.attrs['area'].area_extent[1] + + first_dataset.attrs['area'].pixel_size_y) + y_0) +- else: ++ elif 'x_0' not in proj4_string: + proj4_string += ' +x_0=%.6f' % ( + (-datasets.attrs['area'].area_extent[0] + + datasets.attrs['area'].pixel_size_x) + x_0) ===================================== debian/patches/series ===================================== @@ -3,3 +3,4 @@ 0003-Explicitly-set-chunks-in-dask-arrays.patch 0004-Disable-tests-on-the-number-of-calls-to-ll2cr.patch 0005-Fix-test_gaclacfile.patch +0006-Fix-compatibility-with-new-proj-version.patch ===================================== debian/upstream/metadata ===================================== @@ -1,6 +1,4 @@ Bug-Database: https://github.com/pytroll/satpy/issues Bug-Submit: https://github.com/pytroll/satpy/issues/new -Contact: The Pytroll Team -Name: satpy Repository: https://github.com/pytroll/satpy.git Repository-Browse: https://github.com/pytroll/satpy View it on GitLab: https://salsa.debian.org/debian-gis-team/satpy/compare/828c14f16b8394c161c7a265c9c55029c2386aeb...5b8e7bd9d2eca2ddd5338db90acbbd7ca29c900f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/satpy/compare/828c14f16b8394c161c7a265c9c55029c2386aeb...5b8e7bd9d2eca2ddd5338db90acbbd7ca29c900f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From 1uk at bclevr.info Tue Sep 17 04:44:03 2019 From: 1uk at bclevr.info (Mary from BCLEVR) Date: Tue, 17 Sep 2019 06:44:03 +0300 Subject: Are You a Pro Or a Freelancer? Get Notified Whenever Someone Is Looking For a Service You Provide Message-ID: BCLEVR We Will Introduce You New Business Opportunities And New Customers Every Day. Sign Up Now. No Charges. Grow your business and get new customers and business opportunities every day without having to do anything for it. Welcome to BCLEVR! BCLEVR is a free marketplace for professional services that automatically matches users looking for services with the right professionals. It's simple, fast and completely free for both users and professionals. Perfect for freelancers, professionals, service providers and business owners. SO SIMPLE! 1. We bring you new customers Get real time notifications whenever someone is looking for the service you provide 2. You decide. You control Check all the service requests persons submit on BCLEVR and reply only to those of interest to you 3. Close the deal Send your quotations with personalized message and an estimated price No fees, no commissions, no subscriptions plans, unlimited usage. We work with more than 1,000 categories of services. Sign up now and select the categories you work with to receive alerts in real time whenever someone is looking for a service that you provide. 
OPEN BCLEVR [1] © 2019 BCLEVR Rua Prof Dr Carlos Lloyd 14, 2º Dto 4715-319 Braga, Portugal I want to stop receiving this newsletter [2] | Open this message on browser [3] Links: ------ [1] http://www.bclevr.info/uk.html [2] http://uk.bclevr.info/https://uk.bclevr.info/pommo/update.php?email=pkg-grass-devel at lists.alioth.debian.org&code=0be6867507975988f2c3de8ee511fb1c [3] http://www.bclevr.info/news/uk/2019/uk_gen_pro.html -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 05:08:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:08:07 +0000 Subject: [Git][debian-gis-team/libosmium][master] 4 commits: New upstream version 2.15.3 Message-ID: <5d805c273003d_73483fbbbed45e2c1239427@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libosmium Commits: 49940fd7 by Bas Couwenberg at 2019-09-17T03:54:01Z New upstream version 2.15.3 - - - - - 9a471f9d by Bas Couwenberg at 2019-09-17T03:54:04Z Update upstream source from tag 'upstream/2.15.3' Update to upstream version '2.15.3' with Debian dir 97b79a64cfc951fe5b65325cbf5727ab93e507b3 - - - - - 3f2183ca by Bas Couwenberg at 2019-09-17T03:54:14Z New upstream release. - - - - - 491ac059 by Bas Couwenberg at 2019-09-17T03:54:49Z Set distribution to unstable. - - - - - 7 changed files: - CHANGELOG.md - CMakeLists.txt - debian/changelog - include/osmium/io/detail/pbf_decoder.hpp - include/osmium/io/detail/pbf_output_format.hpp - include/osmium/io/detail/xml_input_format.hpp - include/osmium/version.hpp Changes: ===================================== CHANGELOG.md ===================================== @@ -13,6 +13,24 @@ This project adheres to [Semantic Versioning](https://semver.org/). ### Fixed +## [2.15.3] - 2019-09-16 + +### Added + +* New header option "sorting" when reading and writing PBFs. If the header + option "sorting" is set to `Type_then_ID`, the optional header property + `Sort.Type_then_ID` is set on writing to PBF files. When reading PBF files + with this header property, the "sorting" header option is set accordingly. + +### Fixed + +* Do not propagate C++ exception through C code. We are using the Expat + XML parser, a C library. It calls callbacks in our code. When those + callbacks throw, the exception was propagated through the C code. This + did work in the tests, but that behaviour isn't guaranteed (C++ + standard says it is implementation defined). This fixes it by catching + the exception and rethrowing it later. + ## [2.15.2] - 2019-08-16 ### Added @@ -956,7 +974,8 @@ This project adheres to [Semantic Versioning](https://semver.org/). Doxygen (up to version 1.8.8). This version contains a workaround to fix this. 
-[unreleased]: https://github.com/osmcode/libosmium/compare/v2.15.2...HEAD +[unreleased]: https://github.com/osmcode/libosmium/compare/v2.15.3...HEAD +[2.15.3]: https://github.com/osmcode/libosmium/compare/v2.15.2...v2.15.3 [2.15.2]: https://github.com/osmcode/libosmium/compare/v2.15.1...v2.15.2 [2.15.1]: https://github.com/osmcode/libosmium/compare/v2.15.0...v2.15.1 [2.15.0]: https://github.com/osmcode/libosmium/compare/v2.14.2...v2.15.0 ===================================== CMakeLists.txt ===================================== @@ -40,7 +40,7 @@ project(libosmium) set(LIBOSMIUM_VERSION_MAJOR 2) set(LIBOSMIUM_VERSION_MINOR 15) -set(LIBOSMIUM_VERSION_PATCH 2) +set(LIBOSMIUM_VERSION_PATCH 3) set(LIBOSMIUM_VERSION "${LIBOSMIUM_VERSION_MAJOR}.${LIBOSMIUM_VERSION_MINOR}.${LIBOSMIUM_VERSION_PATCH}") ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libosmium (2.15.3-1) unstable; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Tue, 17 Sep 2019 05:54:40 +0200 + libosmium (2.15.2-1) unstable; urgency=medium * New upstream release. ===================================== include/osmium/io/detail/pbf_decoder.hpp ===================================== @@ -847,8 +847,13 @@ namespace osmium { } } break; - case protozero::tag_and_type(OSMFormat::HeaderBlock::repeated_string_optional_features, protozero::pbf_wire_type::length_delimited): - header.set("pbf_optional_feature_" + std::to_string(i++), pbf_header_block.get_string()); + case protozero::tag_and_type(OSMFormat::HeaderBlock::repeated_string_optional_features, protozero::pbf_wire_type::length_delimited): { + const auto opt = pbf_header_block.get_string(); + header.set("pbf_optional_feature_" + std::to_string(i++), opt); + if (opt == "Sort.Type_then_ID") { + header.set("sorting", "Type_then_ID"); + } + } break; case protozero::tag_and_type(OSMFormat::HeaderBlock::optional_string_writingprogram, protozero::pbf_wire_type::length_delimited): header.set("generator", pbf_header_block.get_string()); ===================================== include/osmium/io/detail/pbf_output_format.hpp ===================================== @@ -577,6 +577,10 @@ namespace osmium { pbf_header_block.add_string(OSMFormat::HeaderBlock::repeated_string_optional_features, "LocationsOnWays"); } + if (header.get("sorting") == "Type_then_ID") { + pbf_header_block.add_string(OSMFormat::HeaderBlock::repeated_string_optional_features, "Sort.Type_then_ID"); + } + pbf_header_block.add_string(OSMFormat::HeaderBlock::optional_string_writingprogram, header.get("generator")); const std::string osmosis_replication_timestamp{header.get("osmosis_replication_timestamp")}; ===================================== include/osmium/io/detail/xml_input_format.hpp ===================================== @@ -60,6 +60,7 @@ DEALINGS IN THE SOFTWARE. #include #include +#include #include #include #include @@ -177,17 +178,44 @@ namespace osmium { class ExpatXMLParser { XML_Parser m_parser; + std::exception_ptr m_exception_ptr{}; - static void XMLCALL start_element_wrapper(void* data, const XML_Char* element, const XML_Char** attrs) { - static_cast(data)->start_element(element, attrs); + template + void member_wrap(XMLParser& xml_parser, TFunc&& func) noexcept { + if (m_exception_ptr) { + return; + } + try { + std::forward(func)(xml_parser); + } catch (...) 
{ + m_exception_ptr = std::current_exception(); + XML_StopParser(m_parser, 0); + } + } + + template + static void wrap(void* data, TFunc&& func) noexcept { + assert(data); + auto& xml_parser = *static_cast(data); + xml_parser.m_expat_xml_parser->member_wrap(xml_parser, std::forward(func)); + } + + static void XMLCALL start_element_wrapper(void* data, const XML_Char* element, const XML_Char** attrs) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.start_element(element, attrs); + }); } - static void XMLCALL end_element_wrapper(void* data, const XML_Char* element) { - static_cast(data)->end_element(element); + static void XMLCALL end_element_wrapper(void* data, const XML_Char* element) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.end_element(element); + }); } - static void XMLCALL character_data_wrapper(void* data, const XML_Char* text, int len) { - static_cast(data)->characters(text, len); + static void XMLCALL character_data_wrapper(void* data, const XML_Char* text, int len) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.characters(text, len); + }); } // This handler is called when there are any XML entities @@ -195,7 +223,7 @@ namespace osmium { // but they can be misused. See // https://en.wikipedia.org/wiki/Billion_laughs // The handler will just throw an error. - static void entity_declaration_handler(void* /*userData*/, + static void entity_declaration_handler(void* data, const XML_Char* /*entityName*/, int /*is_parameter_entity*/, const XML_Char* /*value*/, @@ -203,8 +231,10 @@ namespace osmium { const XML_Char* /*base*/, const XML_Char* /*systemId*/, const XML_Char* /*publicId*/, - const XML_Char* /*notationName*/) { - throw osmium::xml_error{"XML entities are not supported"}; + const XML_Char* /*notationName*/) noexcept { + wrap(data, [&](XMLParser& /*xml_parser*/) { + throw osmium::xml_error{"XML entities are not supported"}; + }); } public: @@ -233,12 +263,17 @@ namespace osmium { void operator()(const std::string& data, bool last) { assert(data.size() < std::numeric_limits::max()); if (XML_Parse(m_parser, data.data(), static_cast(data.size()), last) == XML_STATUS_ERROR) { + if (m_exception_ptr) { + std::rethrow_exception(m_exception_ptr); + } throw osmium::xml_error{m_parser}; } } }; // class ExpatXMLParser + ExpatXMLParser* m_expat_xml_parser{nullptr}; + template static void check_attributes(const XML_Char** attrs, T&& check) { while (*attrs) { @@ -739,6 +774,7 @@ namespace osmium { osmium::thread::set_thread_name("_osmium_xml_in"); ExpatXMLParser parser{this}; + m_expat_xml_parser = &parser; while (!input_done()) { const std::string data{get_input()}; ===================================== include/osmium/version.hpp ===================================== @@ -35,8 +35,8 @@ DEALINGS IN THE SOFTWARE. #define LIBOSMIUM_VERSION_MAJOR 2 #define LIBOSMIUM_VERSION_MINOR 15 -#define LIBOSMIUM_VERSION_PATCH 2 +#define LIBOSMIUM_VERSION_PATCH 3 -#define LIBOSMIUM_VERSION_STRING "2.15.2" +#define LIBOSMIUM_VERSION_STRING "2.15.3" #endif // OSMIUM_VERSION_HPP View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/compare/0bfd6739fcf2d72cd676c4e74af19c59ce073896...491ac059e92aaa156026936cd99b59f674484ac6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/compare/0bfd6739fcf2d72cd676c4e74af19c59ce073896...491ac059e92aaa156026936cd99b59f674484ac6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
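The fix shown in the xml_input_format.hpp hunk above follows a general pattern for C libraries that call back into C++ code: the handler catches everything, parks the exception in a std::exception_ptr, asks the C parser to stop, and the exception is rethrown only after the C call has returned, so it never unwinds through C stack frames. Below is a small, self-contained sketch of that pattern; the parser, callback and token names are illustrative stand-ins, not the libosmium or Expat API.

    #include <exception>
    #include <iostream>
    #include <stdexcept>

    // Stand-in for a C library that drives callbacks (think XML_Parse plus handlers).
    typedef void (*c_callback)(void* user_data, int token);

    int c_parse(void* user_data, c_callback cb, const bool* stop_flag) {
        for (int token = 0; token < 5; ++token) {
            if (*stop_flag) {
                return 1;              // the C parser reports failure, like XML_STATUS_ERROR
            }
            cb(user_data, token);
        }
        return 0;
    }

    class Parser {
        std::exception_ptr m_exception_ptr{};
        bool m_stop = false;

        void on_token(int token) {     // the real handler, may throw
            if (token == 3) {
                throw std::runtime_error{"bad token"};
            }
            std::cout << "token " << token << '\n';
        }

        static void callback(void* user_data, int token) noexcept {  // crosses the C boundary
            auto& self = *static_cast<Parser*>(user_data);
            if (self.m_exception_ptr) {
                return;                // already failed, ignore any further callbacks
            }
            try {
                self.on_token(token);
            } catch (...) {
                self.m_exception_ptr = std::current_exception();  // park the exception
                self.m_stop = true;    // equivalent of calling XML_StopParser()
            }
        }

    public:
        void run() {
            if (c_parse(this, callback, &m_stop) != 0 && m_exception_ptr) {
                std::rethrow_exception(m_exception_ptr);          // rethrow on the C++ side
            }
        }
    };

    int main() {
        try {
            Parser{}.run();
        } catch (const std::exception& e) {
            std::cout << "caught after c_parse returned: " << e.what() << '\n';
        }
    }

Running the sketch prints tokens 0 to 2 and then reports the parked exception only after c_parse() has returned, which is the behaviour the libosmium change guarantees for throwing Expat handlers.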
URL: From gitlab at salsa.debian.org Tue Sep 17 05:08:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:08:08 +0000 Subject: [Git][debian-gis-team/libosmium][pristine-tar] pristine-tar data for libosmium_2.15.3.orig.tar.gz Message-ID: <5d805c2830d13_73482ad963a4024c123963b@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / libosmium Commits: 21d65e7d by Bas Couwenberg at 2019-09-17T03:54:04Z pristine-tar data for libosmium_2.15.3.orig.tar.gz - - - - - 2 changed files: - + libosmium_2.15.3.orig.tar.gz.delta - + libosmium_2.15.3.orig.tar.gz.id Changes: ===================================== libosmium_2.15.3.orig.tar.gz.delta ===================================== Binary files /dev/null and b/libosmium_2.15.3.orig.tar.gz.delta differ ===================================== libosmium_2.15.3.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +fbc6c8f4467e2f01e6d20e5d608627aab95f2480 View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/commit/21d65e7da79518f49633c4fef0c9ec37f5d0f994 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/commit/21d65e7da79518f49633c4fef0c9ec37f5d0f994 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 05:08:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:08:11 +0000 Subject: [Git][debian-gis-team/libosmium][upstream] New upstream version 2.15.3 Message-ID: <5d805c2bd6973_73483fbbbed45e2c123982a@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / libosmium Commits: 49940fd7 by Bas Couwenberg at 2019-09-17T03:54:01Z New upstream version 2.15.3 - - - - - 6 changed files: - CHANGELOG.md - CMakeLists.txt - include/osmium/io/detail/pbf_decoder.hpp - include/osmium/io/detail/pbf_output_format.hpp - include/osmium/io/detail/xml_input_format.hpp - include/osmium/version.hpp Changes: ===================================== CHANGELOG.md ===================================== @@ -13,6 +13,24 @@ This project adheres to [Semantic Versioning](https://semver.org/). ### Fixed +## [2.15.3] - 2019-09-16 + +### Added + +* New header option "sorting" when reading and writing PBFs. If the header + option "sorting" is set to `Type_then_ID`, the optional header property + `Sort.Type_then_ID` is set on writing to PBF files. When reading PBF files + with this header property, the "sorting" header option is set accordingly. + +### Fixed + +* Do not propagate C++ exception through C code. We are using the Expat + XML parser, a C library. It calls callbacks in our code. When those + callbacks throw, the exception was propagated through the C code. This + did work in the tests, but that behaviour isn't guaranteed (C++ + standard says it is implementation defined). This fixes it by catching + the exception and rethrowing it later. + ## [2.15.2] - 2019-08-16 ### Added @@ -956,7 +974,8 @@ This project adheres to [Semantic Versioning](https://semver.org/). Doxygen (up to version 1.8.8). This version contains a workaround to fix this. 
-[unreleased]: https://github.com/osmcode/libosmium/compare/v2.15.2...HEAD +[unreleased]: https://github.com/osmcode/libosmium/compare/v2.15.3...HEAD +[2.15.3]: https://github.com/osmcode/libosmium/compare/v2.15.2...v2.15.3 [2.15.2]: https://github.com/osmcode/libosmium/compare/v2.15.1...v2.15.2 [2.15.1]: https://github.com/osmcode/libosmium/compare/v2.15.0...v2.15.1 [2.15.0]: https://github.com/osmcode/libosmium/compare/v2.14.2...v2.15.0 ===================================== CMakeLists.txt ===================================== @@ -40,7 +40,7 @@ project(libosmium) set(LIBOSMIUM_VERSION_MAJOR 2) set(LIBOSMIUM_VERSION_MINOR 15) -set(LIBOSMIUM_VERSION_PATCH 2) +set(LIBOSMIUM_VERSION_PATCH 3) set(LIBOSMIUM_VERSION "${LIBOSMIUM_VERSION_MAJOR}.${LIBOSMIUM_VERSION_MINOR}.${LIBOSMIUM_VERSION_PATCH}") ===================================== include/osmium/io/detail/pbf_decoder.hpp ===================================== @@ -847,8 +847,13 @@ namespace osmium { } } break; - case protozero::tag_and_type(OSMFormat::HeaderBlock::repeated_string_optional_features, protozero::pbf_wire_type::length_delimited): - header.set("pbf_optional_feature_" + std::to_string(i++), pbf_header_block.get_string()); + case protozero::tag_and_type(OSMFormat::HeaderBlock::repeated_string_optional_features, protozero::pbf_wire_type::length_delimited): { + const auto opt = pbf_header_block.get_string(); + header.set("pbf_optional_feature_" + std::to_string(i++), opt); + if (opt == "Sort.Type_then_ID") { + header.set("sorting", "Type_then_ID"); + } + } break; case protozero::tag_and_type(OSMFormat::HeaderBlock::optional_string_writingprogram, protozero::pbf_wire_type::length_delimited): header.set("generator", pbf_header_block.get_string()); ===================================== include/osmium/io/detail/pbf_output_format.hpp ===================================== @@ -577,6 +577,10 @@ namespace osmium { pbf_header_block.add_string(OSMFormat::HeaderBlock::repeated_string_optional_features, "LocationsOnWays"); } + if (header.get("sorting") == "Type_then_ID") { + pbf_header_block.add_string(OSMFormat::HeaderBlock::repeated_string_optional_features, "Sort.Type_then_ID"); + } + pbf_header_block.add_string(OSMFormat::HeaderBlock::optional_string_writingprogram, header.get("generator")); const std::string osmosis_replication_timestamp{header.get("osmosis_replication_timestamp")}; ===================================== include/osmium/io/detail/xml_input_format.hpp ===================================== @@ -60,6 +60,7 @@ DEALINGS IN THE SOFTWARE. #include #include +#include #include #include #include @@ -177,17 +178,44 @@ namespace osmium { class ExpatXMLParser { XML_Parser m_parser; + std::exception_ptr m_exception_ptr{}; - static void XMLCALL start_element_wrapper(void* data, const XML_Char* element, const XML_Char** attrs) { - static_cast(data)->start_element(element, attrs); + template + void member_wrap(XMLParser& xml_parser, TFunc&& func) noexcept { + if (m_exception_ptr) { + return; + } + try { + std::forward(func)(xml_parser); + } catch (...) 
{ + m_exception_ptr = std::current_exception(); + XML_StopParser(m_parser, 0); + } + } + + template + static void wrap(void* data, TFunc&& func) noexcept { + assert(data); + auto& xml_parser = *static_cast(data); + xml_parser.m_expat_xml_parser->member_wrap(xml_parser, std::forward(func)); + } + + static void XMLCALL start_element_wrapper(void* data, const XML_Char* element, const XML_Char** attrs) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.start_element(element, attrs); + }); } - static void XMLCALL end_element_wrapper(void* data, const XML_Char* element) { - static_cast(data)->end_element(element); + static void XMLCALL end_element_wrapper(void* data, const XML_Char* element) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.end_element(element); + }); } - static void XMLCALL character_data_wrapper(void* data, const XML_Char* text, int len) { - static_cast(data)->characters(text, len); + static void XMLCALL character_data_wrapper(void* data, const XML_Char* text, int len) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.characters(text, len); + }); } // This handler is called when there are any XML entities @@ -195,7 +223,7 @@ namespace osmium { // but they can be misused. See // https://en.wikipedia.org/wiki/Billion_laughs // The handler will just throw an error. - static void entity_declaration_handler(void* /*userData*/, + static void entity_declaration_handler(void* data, const XML_Char* /*entityName*/, int /*is_parameter_entity*/, const XML_Char* /*value*/, @@ -203,8 +231,10 @@ namespace osmium { const XML_Char* /*base*/, const XML_Char* /*systemId*/, const XML_Char* /*publicId*/, - const XML_Char* /*notationName*/) { - throw osmium::xml_error{"XML entities are not supported"}; + const XML_Char* /*notationName*/) noexcept { + wrap(data, [&](XMLParser& /*xml_parser*/) { + throw osmium::xml_error{"XML entities are not supported"}; + }); } public: @@ -233,12 +263,17 @@ namespace osmium { void operator()(const std::string& data, bool last) { assert(data.size() < std::numeric_limits::max()); if (XML_Parse(m_parser, data.data(), static_cast(data.size()), last) == XML_STATUS_ERROR) { + if (m_exception_ptr) { + std::rethrow_exception(m_exception_ptr); + } throw osmium::xml_error{m_parser}; } } }; // class ExpatXMLParser + ExpatXMLParser* m_expat_xml_parser{nullptr}; + template static void check_attributes(const XML_Char** attrs, T&& check) { while (*attrs) { @@ -739,6 +774,7 @@ namespace osmium { osmium::thread::set_thread_name("_osmium_xml_in"); ExpatXMLParser parser{this}; + m_expat_xml_parser = &parser; while (!input_done()) { const std::string data{get_input()}; ===================================== include/osmium/version.hpp ===================================== @@ -35,8 +35,8 @@ DEALINGS IN THE SOFTWARE. #define LIBOSMIUM_VERSION_MAJOR 2 #define LIBOSMIUM_VERSION_MINOR 15 -#define LIBOSMIUM_VERSION_PATCH 2 +#define LIBOSMIUM_VERSION_PATCH 3 -#define LIBOSMIUM_VERSION_STRING "2.15.2" +#define LIBOSMIUM_VERSION_STRING "2.15.3" #endif // OSMIUM_VERSION_HPP View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/commit/49940fd7b62e1e21b4358e43d1ef7712ecfdd102 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/commit/49940fd7b62e1e21b4358e43d1ef7712ecfdd102 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
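The pbf_decoder.hpp and pbf_output_format.hpp hunks above also show the other 2.15.3 addition, the "sorting" header option: setting it to Type_then_ID writes the optional PBF feature Sort.Type_then_ID, and reading a file carrying that feature sets the option again. The round-trip sketch below uses the public libosmium I/O classes; the file name is made up, and the exact Writer constructor overloads and link flags (typically -pthread and -lz for PBF output) should be checked against the libosmium documentation.

    #include <iostream>
    #include <osmium/io/header.hpp>
    #include <osmium/io/pbf_input.hpp>
    #include <osmium/io/pbf_output.hpp>
    #include <osmium/io/reader.hpp>
    #include <osmium/io/writer.hpp>

    int main() {
        osmium::io::Header header;
        header.set("generator", "sorting-demo");
        header.set("sorting", "Type_then_ID");   // written out as the optional feature "Sort.Type_then_ID"

        {
            // Write a header-only file; a real program would also write the sorted objects.
            osmium::io::Writer writer{"demo.osm.pbf", header, osmium::io::overwrite::allow};
            writer.close();
        }

        osmium::io::Reader reader{"demo.osm.pbf"};
        std::cout << "sorting=" << reader.header().get("sorting") << '\n';   // expect "Type_then_ID"
        reader.close();
    }

A header-only file is enough to see the option come back from Reader::header(); in a real tool the Writer would of course also receive the sorted objects before close().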
URL: From gitlab at salsa.debian.org Tue Sep 17 05:08:13 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:08:13 +0000 Subject: [Git][debian-gis-team/libosmium] Pushed new tag debian/2.15.3-1 Message-ID: <5d805c2d259d6_73482ad963a4024c12400fd@godard.mail> Bas Couwenberg pushed new tag debian/2.15.3-1 at Debian GIS Project / libosmium -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/tree/debian/2.15.3-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 05:08:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:08:14 +0000 Subject: [Git][debian-gis-team/libosmium] Pushed new tag upstream/2.15.3 Message-ID: <5d805c2ed0b1_73483fbbb458f89012402a9@godard.mail> Bas Couwenberg pushed new tag upstream/2.15.3 at Debian GIS Project / libosmium -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/tree/upstream/2.15.3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Tue Sep 17 05:15:57 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 04:15:57 +0000 Subject: Processing of libosmium_2.15.3-1_source.changes Message-ID: libosmium_2.15.3-1_source.changes uploaded successfully to localhost along with the files: libosmium_2.15.3-1.dsc libosmium_2.15.3.orig.tar.gz libosmium_2.15.3-1.debian.tar.xz libosmium_2.15.3-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Tue Sep 17 05:19:23 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 04:19:23 +0000 Subject: libosmium_2.15.3-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 17 Sep 2019 05:54:40 +0200 Source: libosmium Architecture: source Version: 2.15.3-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: libosmium (2.15.3-1) unstable; urgency=medium . * New upstream release. 
Checksums-Sha1: 83df6cdfc1e59cb1ee7b15be36bb40ad6adb0f17 2153 libosmium_2.15.3-1.dsc b154701113183e250e2ffd37f0f849c362c1ce83 500166 libosmium_2.15.3.orig.tar.gz edc8e7616a52975b91a43c9bd36d0b761936617b 6304 libosmium_2.15.3-1.debian.tar.xz 6714b5a23285df608e5312acf1a6e547646a2e77 13156 libosmium_2.15.3-1_amd64.buildinfo Checksums-Sha256: d4c3ff6b9ef641bba01bc82a48ce10937c3842d19e4bd375d016bd2ffbb91903 2153 libosmium_2.15.3-1.dsc f95b76aa03fe60b5dc3e86329d6eb4baec4f86522a8d440ad149068019fb6866 500166 libosmium_2.15.3.orig.tar.gz d3af4782069f635e3174757932d3ef6b9af872cd8c056e4d8c4258b181af95a7 6304 libosmium_2.15.3-1.debian.tar.xz dd9ef31cb036f2a05ffcde8f476c56c4234ce10fd25b42d88c3e0b9dbc8398d6 13156 libosmium_2.15.3-1_amd64.buildinfo Files: a9c324c639790e1b5738e569be96a2fc 2153 science optional libosmium_2.15.3-1.dsc 28ccca10955ad9cccd8e196a22a80d65 500166 science optional libosmium_2.15.3.orig.tar.gz 96d227c292a04afb2c6e4bbee40fe64e 6304 science optional libosmium_2.15.3-1.debian.tar.xz bc0c7d81c36d21b5ecd3da4dd498cbd6 13156 science optional libosmium_2.15.3-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2AXAoACgkQZ1DxCuiN SvE+HxAArAezsFd3a2FqfrIogNfhT9s9mR0GiDI2+w1momNlEutGMUrgIUb8W0XA 0TsxjOQtnYJU1pVjd2HpaubioZGylnPy/M2eVrbN3Y9Q5egyJWtgDCpln/37STny DuhMdUDvCFv68xscno5THRHLMIEn5284e4eEnRa7YjdsS5NXbSCq2xy/xlNmSKVO qg/3KUd4qsa0LdnA5UQrC+V04IZ7c2Jwxzd58es6wXCy73A1Np/uiHUqfnjTJuI3 4ZGTuPXpXHIeckNoh1Wv3uvrtUm7GaK7iI57yuU4dZyMtt7Pnpome+Nh1UBSRCp2 6Z/n0jlERELR6U7gbYNZ8LMlGN+pnTrbs6pDsn6RtdyK6FOO74FTBeiVpQWDmUqf 5Hbep3u68r8gtmqkknIQVZb6lPeWrhVeRUiFCz0f9z+edfsU9JiDd74W9WAl3Fcw TKLTnwjwO8qz4T3LpxaDmXInwp5OtoJYfd0L7T6JuF9iW5ggmFyFYxFpBh7rsxNx 6FJgAPGtU1E0i7vXU6UH7tVCxqsXAK7ZRpWeGdBgTAyY99Qj5MQz2f8++dy+TNTq 2Y7vLcfOi8oY0tM0/NmXiHOw/gOlQaNWBXG8xMqZrLgLiQNxSTIBVw64rWCkjawo oGk/d0Yehg6k0S//zOKBNEADwURm23FEBWp0W1Y6du9l6PB0crc= =Q/Dd -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Tue Sep 17 05:29:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:29:24 +0000 Subject: [Git][debian-gis-team/qmapshack][buster-backports] 7 commits: New upstream version 1.13.2 Message-ID: <5d806124d1342_73482ad963a4024c12413a@godard.mail> Bas Couwenberg pushed to branch buster-backports at Debian GIS Project / qmapshack Commits: d737bcde by Bas Couwenberg at 2019-09-11T16:30:12Z New upstream version 1.13.2 - - - - - ee63ad48 by Bas Couwenberg at 2019-09-11T16:31:05Z Update upstream source from tag 'upstream/1.13.2' Update to upstream version '1.13.2' with Debian dir 3cf573f1763a390486c1343bfd5e27429d5d75c3 - - - - - 9d3449e7 by Bas Couwenberg at 2019-09-11T16:31:22Z New upstream release. - - - - - b4a993b3 by Bas Couwenberg at 2019-09-11T17:05:58Z Add patch to fix spelling errors. - - - - - 6411d982 by Bas Couwenberg at 2019-09-11T17:05:58Z Set distribution to unstable. - - - - - dd41c2da by Bas Couwenberg at 2019-09-17T04:03:44Z Merge tag 'debian/1.13.2-1' into buster-backports releasing package qmapshack version 1.13.2-1 - - - - - 60f6e08f by Bas Couwenberg at 2019-09-17T04:03:56Z Rebuild for buster-backports. 
- - - - - 30 changed files: - CMakeLists.txt - CMakeLists.txt.user - MacOSX/HowtoBuildOSX.txt - MacOSX/bundle-qmaptool.sh - changelog.txt - debian/changelog - debian/patches/series - + debian/patches/spelling-errors.patch - msvc_64/QMapShack_Installer.nsi - msvc_64/cmake/FindPROJ4.cmake - msvc_64/copyfiles.bat - + src/icons/32x32/Attention.png - + src/icons/32x32/EnergyCycling.png - + src/icons/32x32/Hint.png - + src/icons/48x48/Attention.png - + src/icons/48x48/EnergyCycling.png - + src/icons/48x48/Hint.png - + src/icons/Attention.svg - + src/icons/EnergyCycling.svg - + src/icons/Hint.svg - src/qmapshack/CMainWindow.cpp - src/qmapshack/CMakeLists.txt - src/qmapshack/canvas/CCanvas.cpp - src/qmapshack/canvas/CCanvas.h - src/qmapshack/gis/CGisWorkspace.cpp - src/qmapshack/gis/CGisWorkspace.h - src/qmapshack/gis/IGisItem.h - src/qmapshack/gis/IGisWorkspace.ui - src/qmapshack/gis/ovl/CGisItemOvlArea.cpp - src/qmapshack/gis/ovl/CGisItemOvlArea.h The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/compare/0058281144845c4add57d4a1745eb1329c6c54aa...60f6e08fc3f24ef27f6bdb3cf7b74f60af9ecc0f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/compare/0058281144845c4add57d4a1745eb1329c6c54aa...60f6e08fc3f24ef27f6bdb3cf7b74f60af9ecc0f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 05:33:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:33:10 +0000 Subject: [Git][debian-gis-team/qmapshack] Pushed new tag debian/1.13.2-1_bpo10+1 Message-ID: <5d806206d4115_73482ad963a4024c1241571@godard.mail> Bas Couwenberg pushed new tag debian/1.13.2-1_bpo10+1 at Debian GIS Project / qmapshack -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/tree/debian/1.13.2-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Tue Sep 17 05:35:59 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 04:35:59 +0000 Subject: Processing of qmapshack_1.13.2-1~bpo10+1_source.changes Message-ID: qmapshack_1.13.2-1~bpo10+1_source.changes uploaded successfully to localhost along with the files: qmapshack_1.13.2-1~bpo10+1.dsc qmapshack_1.13.2-1~bpo10+1.debian.tar.xz qmapshack_1.13.2-1~bpo10+1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From noreply at release.debian.org Tue Sep 17 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Tue, 17 Sep 2019 04:39:21 +0000 Subject: qmapshack 1.13.2-1 MIGRATED to testing Message-ID: FYI: The status of the qmapshack source package in Debian's testing distribution has changed. Previous version: 1.13.1-1 Current version: 1.13.2-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
From gitlab at salsa.debian.org Tue Sep 17 05:45:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:45:25 +0000 Subject: [Git][debian-gis-team/satpy] Pushed new tag debian/0.16.1-3 Message-ID: <5d8064e5870df_73483fbbbed45e2c124312e@godard.mail> Bas Couwenberg pushed new tag debian/0.16.1-3 at Debian GIS Project / satpy -- View it on GitLab: https://salsa.debian.org/debian-gis-team/satpy/tree/debian/0.16.1-3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Tue Sep 17 05:49:27 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 04:49:27 +0000 Subject: qmapshack_1.13.2-1~bpo10+1_source.changes ACCEPTED into buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 17 Sep 2019 06:03:49 +0200 Source: qmapshack Architecture: source Version: 1.13.2-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: qmapshack (1.13.2-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. . qmapshack (1.13.2-1) unstable; urgency=medium . * New upstream release. * Add patch to fix spelling errors. Checksums-Sha1: b1754d2da7878a4444c6565cb46bb8f8f1fb7594 2262 qmapshack_1.13.2-1~bpo10+1.dsc 569c935095ff28165e057124a50615352a2eb2e3 12296 qmapshack_1.13.2-1~bpo10+1.debian.tar.xz 320c93d8f12f5d6aa06e77c2859d07e6f310093c 18698 qmapshack_1.13.2-1~bpo10+1_amd64.buildinfo Checksums-Sha256: 3969e064faf586d84e1387237e230b2f645aeb6ba453481006f6610bf2566cbb 2262 qmapshack_1.13.2-1~bpo10+1.dsc 6120244087e6db7a81e313764b64ae127650bf1004ee7f3376ef95ca8818cb06 12296 qmapshack_1.13.2-1~bpo10+1.debian.tar.xz 1fdd441f9549204ff26dcb36bf0189959c67a4ddb6d9253b0acbbf9b71095d2c 18698 qmapshack_1.13.2-1~bpo10+1_amd64.buildinfo Files: 0eb2af61e3f93e13f3fdf9e2c7454f06 2262 science optional qmapshack_1.13.2-1~bpo10+1.dsc a03101d1ff2472568a876cdfde13b6b7 12296 science optional qmapshack_1.13.2-1~bpo10+1.debian.tar.xz 57fb411fb8f0593eea8b189cf5736a09 18698 science optional qmapshack_1.13.2-1~bpo10+1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2AYQkACgkQZ1DxCuiN SvHx0xAAq+va/ed4i5owKrey82oJ/hH+Zpo0u4bKpb/P8IrygQQ1flfnaUptp7uq pMivWD5TdP9139U1mK7QgLrWOO4NLssdwX9ex8NqB6IB5rd/f5ywgdR4P2iW4k2A GqixMD/yY9pPf2izZl2q87h2aW/X/Y4B00ZrCD3YalSPcV7YClXmpCOf9ewRhbpe 4CZ6TlcnCFDeqRJJ8Mcb3yt1DfZFd7jfmhQdozI0eCHLwraEEl7L0Eef3Pbjvtx7 nzzpnDgiPAFGt3No5fO9efRp9uop3HUcNrUGBVy3dY5CpaAA0ffmxa/HvUJPkeex bNyhJnwQt5Wy8KIwAA/ydmAwVcPYgPMD/IMrhj1zIcKWjvPcFTqNHU+dDtX6wrPl 5m7cDhViKwCBM29TDKC1F+SAcs+OG1EmhCIXTJDXZ/ovUSBukRJYyHVZfC4j2HgD o2QrjS8KoqR0XI4Adt+uAt5OATKcTVXWo5TyTd3ZbdsDkn6BkMK8UmIiGAFtxMU6 PDipjiqLnFrAT0G8DkEg5vCAfpHbe40vEubBLujmzwUUpayj8dzbkp3M3GJcNIg3 yJTYbnbS+et+vm7YhF+1FmVyKLIQqJ/Epj1SWMWIpnrHQfuMUJzB0x4Cf6eIKK1E kzzYaK8X/44Lqd4DTkrzqgVwMB2130hyCw1Cxcp8CD+uYAxGUKI= =iYgl -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
From gitlab at salsa.debian.org Tue Sep 17 05:55:58 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:55:58 +0000 Subject: [Git][debian-gis-team/osmium-tool][master] 7 commits: New upstream version 1.11.0 Message-ID: <5d80675e22a49_73483fbbbed45e2c124355e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmium-tool Commits: ca51f93d by Bas Couwenberg at 2019-09-17T04:08:45Z New upstream version 1.11.0 - - - - - fe26a5ec by Bas Couwenberg at 2019-09-17T04:08:49Z Update upstream source from tag 'upstream/1.11.0' Update to upstream version '1.11.0' with Debian dir e44abbaa945a36af05bab6feb57c556a5c41be5a - - - - - 8fc8155f by Bas Couwenberg at 2019-09-17T04:11:12Z New upstream release. - - - - - 1b4835f6 by Bas Couwenberg at 2019-09-17T04:22:49Z Update copyright years for Jochen Topf. - - - - - 7d31ec92 by Bas Couwenberg at 2019-09-17T04:23:09Z Bump minimum required libosmium2-dev to 2.15.2. - - - - - 1704eff0 by Bas Couwenberg at 2019-09-17T04:44:38Z Add patch to fix spelling errors. - - - - - 4821c871 by Bas Couwenberg at 2019-09-17T04:44:38Z Set distribution to unstable. - - - - - 30 changed files: - .clang-tidy - .travis.yml - CHANGELOG.md - CMakeLists.txt - README.md - cmake/test_install.cmake - debian/changelog - debian/control - debian/copyright - + debian/patches/series - + debian/patches/spelling-errors.patch - man/CMakeLists.txt - man/common-options.md - man/input-options.md - man/manpage.template - man/osmium-add-locations-to-ways.md - man/osmium-apply-changes.md - man/osmium-cat.md - man/osmium-changeset-filter.md - man/osmium-check-refs.md - + man/osmium-create-locations-index.md - man/osmium-derive-changes.md - man/osmium-diff.md - man/osmium-export.md - man/osmium-extract.md - man/osmium-file-formats.md - man/osmium-fileinfo.md - man/osmium-getid.md - man/osmium-getparents.md - man/osmium-index-types.md The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/compare/77a8dd645663389dc7c068b24f30837469b7dacd...4821c871ff81341c15503bb53565305bb9466fe9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/compare/77a8dd645663389dc7c068b24f30837469b7dacd...4821c871ff81341c15503bb53565305bb9466fe9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Tue Sep 17 05:55:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:55:59 +0000 Subject: [Git][debian-gis-team/osmium-tool][pristine-tar] pristine-tar data for osmium-tool_1.11.0.orig.tar.gz Message-ID: <5d80675f42083_73482ad9606be07c124373c@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / osmium-tool Commits: a6ca07f8 by Bas Couwenberg at 2019-09-17T04:08:48Z pristine-tar data for osmium-tool_1.11.0.orig.tar.gz - - - - - 2 changed files: - + osmium-tool_1.11.0.orig.tar.gz.delta - + osmium-tool_1.11.0.orig.tar.gz.id Changes: ===================================== osmium-tool_1.11.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/osmium-tool_1.11.0.orig.tar.gz.delta differ ===================================== osmium-tool_1.11.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +1b7ad86440bcdecaceaf0b472df476a74aed8d00 View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/a6ca07f8dae0b0ad7a5b4c0a3bedb817800b5c66 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/a6ca07f8dae0b0ad7a5b4c0a3bedb817800b5c66 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 05:56:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:56:00 +0000 Subject: [Git][debian-gis-team/osmium-tool][upstream] New upstream version 1.11.0 Message-ID: <5d80676021f09_73482ad963a4024c12439db@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / osmium-tool Commits: ca51f93d by Bas Couwenberg at 2019-09-17T04:08:45Z New upstream version 1.11.0 - - - - - 30 changed files: - .clang-tidy - .travis.yml - CHANGELOG.md - CMakeLists.txt - README.md - cmake/test_install.cmake - man/CMakeLists.txt - man/common-options.md - man/input-options.md - man/manpage.template - man/osmium-add-locations-to-ways.md - man/osmium-apply-changes.md - man/osmium-cat.md - man/osmium-changeset-filter.md - man/osmium-check-refs.md - + man/osmium-create-locations-index.md - man/osmium-derive-changes.md - man/osmium-diff.md - man/osmium-export.md - man/osmium-extract.md - man/osmium-file-formats.md - man/osmium-fileinfo.md - man/osmium-getid.md - man/osmium-getparents.md - man/osmium-index-types.md - man/osmium-merge-changes.md - + man/osmium-query-locations-index.md - man/osmium-renumber.md - man/osmium-show.md - man/osmium-sort.md The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/ca51f93d43fa57284253de71271cc1bbcd8ce3ac -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/ca51f93d43fa57284253de71271cc1bbcd8ce3ac You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Tue Sep 17 05:56:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 04:56:03 +0000 Subject: Processing of satpy_0.16.1-3_source.changes Message-ID: satpy_0.16.1-3_source.changes uploaded successfully to localhost along with the files: satpy_0.16.1-3.dsc satpy_0.16.1-3.debian.tar.xz satpy_0.16.1-3_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Tue Sep 17 05:56:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:56:25 +0000 Subject: [Git][debian-gis-team/osmium-tool] Pushed new tag debian/1.11.0-1 Message-ID: <5d806779ac62d_73482ad9606be07c12441be@godard.mail> Bas Couwenberg pushed new tag debian/1.11.0-1 at Debian GIS Project / osmium-tool -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/tree/debian/1.11.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 05:56:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 04:56:26 +0000 Subject: [Git][debian-gis-team/osmium-tool] Pushed new tag upstream/1.11.0 Message-ID: <5d80677a81e11_73482ad963a4024c12443a0@godard.mail> Bas Couwenberg pushed new tag upstream/1.11.0 at Debian GIS Project / osmium-tool -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/tree/upstream/1.11.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Tue Sep 17 06:04:09 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 05:04:09 +0000 Subject: satpy_0.16.1-3_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 16 Sep 2019 21:33:09 +0000 Source: satpy Architecture: source Version: 0.16.1-3 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: satpy (0.16.1-3) unstable; urgency=medium . * debian/patches: - new 0006-Fix-compatibility-with-new-proj-version.patch (backport form upstream) * Remove obsolete fields Name, Contact from debian/upstream/metadata. 
Checksums-Sha1: 2021134a2e84da102a04488ec562635caa766ab2 2705 satpy_0.16.1-3.dsc 467ab4008a8143f564e13923bf2468828d7a51ec 9404 satpy_0.16.1-3.debian.tar.xz e664ba627ea415c03279c2fb36229ef8b0835485 14007 satpy_0.16.1-3_amd64.buildinfo Checksums-Sha256: e839118bfe90abc0b3701e69a30ef157204d2c9b3e42381b991dc6e3b63fde33 2705 satpy_0.16.1-3.dsc e033b3a952bb44302d85320e97a8e2556dbd714a3607c274f1d45e88e4d18471 9404 satpy_0.16.1-3.debian.tar.xz 88428851f8d7c82c20a69379e2808b96adddaee2bddc425b716efa0a9d4112ea 14007 satpy_0.16.1-3_amd64.buildinfo Files: c3f2362a4c9cf34b2b122ae001a38d18 2705 python optional satpy_0.16.1-3.dsc 13dc75ddf8d867e16eda30ac8e950fe6 9404 python optional satpy_0.16.1-3.debian.tar.xz 8b711f903f5a3ddf2adcd7c6b1839554 14007 python optional satpy_0.16.1-3_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2AZNUACgkQZ1DxCuiN SvGTnhAAkBkdgNpWvvfLRGrcrR4nJneTqMhiBuOYO38BBQjX5CqHGB2MVCTLdoWh nsqSiyBdqDUaacuiAX+50dza6a5ceMVYS+WjebJTNBQb8LUazXj/ecyotQGpdNpG SUqGQc77diRxAdGm9DNuHG0R3ph2sYv7osSrrzTgOYlVIurxCUe6BN05CwGewmAO 1p62NWJVsFPQNXwNHEpm/svhjyT3w5osP1ATuNM2JWgMe7KX3U8awSuU43z90uTA D0peEI/EYM7X33seBHlBr1xEmRprJ+KJwVmCt1LCw+sjvELj3Z1JtAMc+EkMvurV 1lj7xHKZ5MH8rkXpe2zyMERM+PBXXoB4LL/YKDzarPDDH+gcQj8FvdiyCwG9C/w4 0KvOxFBx0qeFPPe9xaBiFTjUEJsLZzq7c4HSM46U6cjYKx+OiSssHfLtFFEMDUFU M9mvvWIsbMlTF3dOQnfyW6RxvNVNur22kKZ2BBBy1o4SrSrxAo7lbuclvRI62DrJ Iw8dVZczW0PtiLZBnQWhdJ1bMSFhBCrl4nO8npqEudN1bHvntL2qR4dZutreoT0L IkzgpriZ6HeKww9FezDco9P6aT03Z0LMr3aPDEUMHYRjBjQRBq9ce+BIfy1thpym J7plHmX9VdKNW7/qQaTmtljzkBV8XX7G6EpSNiM8kJUPZigRIbc= =oIMc -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Tue Sep 17 06:06:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 05:06:03 +0000 Subject: Processing of osmium-tool_1.11.0-1_source.changes Message-ID: osmium-tool_1.11.0-1_source.changes uploaded successfully to localhost along with the files: osmium-tool_1.11.0-1.dsc osmium-tool_1.11.0.orig.tar.gz osmium-tool_1.11.0-1.debian.tar.xz osmium-tool_1.11.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Tue Sep 17 06:23:39 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Tue, 17 Sep 2019 05:23:39 +0000 Subject: osmium-tool_1.11.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Tue, 17 Sep 2019 06:25:36 +0200 Source: osmium-tool Architecture: source Version: 1.11.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: osmium-tool (1.11.0-1) unstable; urgency=medium . * New upstream release. * Bump Standards-Version to 4.4.0, no changes. * Append -DNDEBUG to CXXFLAGS to remove buildpath from binaries. * Update gbp.conf to use --source-only-changes by default. * Update copyright years for Jochen Topf. * Bump minimum required libosmium2-dev to 2.15.2. * Add patch to fix spelling errors. 
Checksums-Sha1: c3a5e3cd3b2f833b8fa3e280f83d15f81d733675 2073 osmium-tool_1.11.0-1.dsc 05a860eebd5ad738588f37b4c38d74b48a2c687d 385051 osmium-tool_1.11.0.orig.tar.gz 9b65391e1edbe96983fc45e5e6abe1a2abb73d6a 6300 osmium-tool_1.11.0-1.debian.tar.xz 485a85a12c565ca9dc8d021e5bf22964cbceb3d5 7992 osmium-tool_1.11.0-1_amd64.buildinfo Checksums-Sha256: c53c3b2359eb2499d2c1f3c38597f4b8df885bd3259ea0b8a7c8a0232c511418 2073 osmium-tool_1.11.0-1.dsc 09720d8ffcf250000628cb174934885962e09677094bd5bd96071f11fe170f4f 385051 osmium-tool_1.11.0.orig.tar.gz 257a9a2c89bcf10d8cea71a007b1dc0786ed95c9a4d7e97d0e1eef693ae3c3e7 6300 osmium-tool_1.11.0-1.debian.tar.xz 6e3f8d9fa46f61b029c747179e324becb0a3696fc22724a2cca6193b4c9a3437 7992 osmium-tool_1.11.0-1_amd64.buildinfo Files: d6457656c8b917bfbef297c4c30dbfc4 2073 science optional osmium-tool_1.11.0-1.dsc 74bd783d37cc4cf2e9a8cb6a83dbdc39 385051 science optional osmium-tool_1.11.0.orig.tar.gz a0e6874756b1b575088262ae4bbc0b96 6300 science optional osmium-tool_1.11.0-1.debian.tar.xz d8c5243587d2175da8faa41bacbe294d 7992 science optional osmium-tool_1.11.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2AZ0kACgkQZ1DxCuiN SvEOuQ/8DOhmPFilmkZsN47ktJi39VBSN/FJKSKtGflGiZ/ANEGjef0IG+RQFlFA s1ZB6HEGzs+IcHOsJ9B3+35lWyG4KHHWig2DjWUgN8ZZ/XDYppxGwG7D5QValX2U btV3SaDIIf3uAzGCgRAhhF3L1Wnjt4pPFkGT2SILd+6D6rQ/cyWoGCu9h/RJB4Bc LSAC942mJoDEeALFyjQaVWIhrYqU6b32dV5nTjzmJtqFQKc3Ohu7shgZ90jlHETC yXAblhrbCY3z/UX3lM80vi8KJrIhGjYPDRi6Akp++mA9BY+2kDPiu4ekOKFzRi4H jo5uBgfr2uUbjEmwcIf1clLlVrhOgWDqlmuOFlPjf0h62jyqd1yECzNqFq0vv5XY blTULi2vqdXipGPlSNlBcaB+jT39OY45ZUbw/YmaNRKXngp7/zMUOO3p3buL2QtB ieWT5wXQCKUZXPpRexc2PpButg24etAsQsQ0SjJXSCRGAyAx2EDC06iQE0bep8zQ LeGsCXnkqVya+Z406baKo63CmrEo5BZ0dsCZmiRNGjQEdxsyE12oLAm5O8EPUXSP yCtmLRSlvY/ziVDeD4ndsLVJlH8mLueKOha0F8Lf9LnJLRf/vvcLNUnVJc1sAAP8 vDAhqehNX0Cis1u2GOYXqxmzTLezZhaeDg6aDS0KVv3Qrma81K4= =3I5n -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Tue Sep 17 06:38:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 05:38:15 +0000 Subject: [Git][debian-gis-team/osmium-tool][master] Mark spelling-errors.patch as Appplied-Upstream. Message-ID: <5d8071473847e_73482ad9616e0d881246548@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmium-tool Commits: 1ecf6887 by Bas Couwenberg at 2019-09-17T05:38:08Z Mark spelling-errors.patch as Appplied-Upstream. - - - - - 1 changed file: - debian/patches/spelling-errors.patch Changes: ===================================== debian/patches/spelling-errors.patch ===================================== @@ -2,6 +2,7 @@ Description: Fix spelling errors. * wih -> with Author: Bas Couwenberg Forwarded: https://github.com/osmcode/osmium-tool/pull/175 +Applied-Upstream: https://github.com/osmcode/osmium-tool/commit/494f376395152b1601c21ea79d7989cb996b5478 --- a/man/osmium-fileinfo.md +++ b/man/osmium-fileinfo.md View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/1ecf68877f4d0ecf95fb4e8c70a5e5678201fefc -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/1ecf68877f4d0ecf95fb4e8c70a5e5678201fefc You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From admin at vin.vinctgroups.nl Tue Sep 17 14:57:31 2019 From: admin at vin.vinctgroups.nl (Mailbox_Admin) Date: Tue, 17 Sep 2019 06:57:31 -0700 Subject: De-activation of your mailbox {pkg-grass-devel@lists.alioth.debian.org} Message-ID: Hi pkg-grass-devel at lists.alioth.debian.org, pkg-grass-devel at lists.alioth.debian.org removal from the server has been approved and initiated, Due to ignorance of last verification warning. Removal will occur in exactly 48 hours ! We recommend that you do any of the below and protect your mailbox CONTINUE REMOVAL CANCEL REMOVAL © 2019 >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>> Please do not reply this message. <<<<<<<<<<<<<<<<<<<<<<<<<<<<<< <<<<<<<<<<<<<<<<<<<<<<<<<<<<<< <<<<<<<<<<<<<<<<<<<<<<<<<<<< -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastic at xs4all.nl Tue Sep 17 15:34:29 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Tue, 17 Sep 2019 16:34:29 +0200 Subject: Bug#939384: qgis-providers: proj_create: crs not found In-Reply-To: <9f8ee4fe-1837-d5ec-7d13-3bd264bf9e85@xs4all.nl> References: <20190904101817.GA15957@msg.df7cb.de> <20190904101817.GA15957@msg.df7cb.de> <9f8ee4fe-1837-d5ec-7d13-3bd264bf9e85@xs4all.nl> <20190904101817.GA15957@msg.df7cb.de> Message-ID: <6885853b-7d7f-89cc-79d9-8fac0a55f7b0@xs4all.nl> Control: fixed -1 qgis/3.4.12+dfsg-1~exp1 On 9/4/19 12:53 PM, Sebastiaan Couwenberg wrote: >> There may be changes for PROJ 6 in upcoming QGIS 3.4.x releases, but >> full support will take some time until we upgrade to the 3.10 LTR in >> early 2020. This was fixed upstream in 3.4.12 by suppressing the proj_create output. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From owner at bugs.debian.org Tue Sep 17 15:39:13 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Tue, 17 Sep 2019 14:39:13 +0000 Subject: Processed: Re: Bug#939384: qgis-providers: proj_create: crs not found References: <6885853b-7d7f-89cc-79d9-8fac0a55f7b0@xs4all.nl> <20190904101817.GA15957@msg.df7cb.de> Message-ID: Processing control commands: > fixed -1 qgis/3.4.12+dfsg-1~exp1 Bug #939384 {Done: Sebastiaan Couwenberg } [qgis-providers] qgis-providers: proj_create: crs not found Marked as fixed in versions qgis/3.4.12+dfsg-1~exp1. -- 939384: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=939384 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Tue Sep 17 16:59:21 2019 From: gitlab at salsa.debian.org (Ross Gammon) Date: Tue, 17 Sep 2019 15:59:21 +0000 Subject: [Git][debian-gis-team/geographiclib][master] Remove myself from uploaders Message-ID: <5d8102d9a16b4_73483fbbb6706460133193d@godard.mail> Ross Gammon pushed to branch master at Debian GIS Project / geographiclib Commits: fa9c1943 by Ross Gammon at 2019-09-17T15:55:39Z Remove myself from uploaders I have not uploaded geographiclib since the early days when I was learning about transitions in Debian. And I don't use the package myself. In any case, I need to reduce the number of upåstreams I am monitoring due to lack of time. There is no urgency for uploading just for this change (or I would have done it myself). 
- - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -1,7 +1,6 @@ Source: geographiclib Maintainer: Debian GIS Project Uploaders: Francesco Paolo Lovergine , - Ross Gammon , Bas Couwenberg Section: science Priority: optional View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/fa9c1943f7a05ca4fa8cbb5b173efb53d87be479 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/fa9c1943f7a05ca4fa8cbb5b173efb53d87be479 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Tue Sep 17 17:08:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 16:08:27 +0000 Subject: [Git][debian-gis-team/geographiclib][master] Remove myself from uploaders Message-ID: <5d8104fb6ac36_73483fbbb665613c13332d7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / geographiclib Commits: f69a71a5 by Bas Couwenberg at 2019-09-17T16:08:18Z Remove myself from uploaders - - - - - 1 changed file: - debian/changelog Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +geographiclib (1.49-6) UNRELEASED; urgency=medium + + [ Ross Gammon ] + * Remove myself from uploaders + + -- Bas Couwenberg Tue, 17 Sep 2019 18:08:04 +0200 + geographiclib (1.49-5) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/f69a71a5763ca3300e44668ebfa3954a269fa709 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/f69a71a5763ca3300e44668ebfa3954a269fa709 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From helmut at subdivi.de Tue Sep 17 21:20:10 2019 From: helmut at subdivi.de (Helmut Grohne) Date: Tue, 17 Sep 2019 22:20:10 +0200 Subject: Bug#940618: mark libmapbox-variant-dev Multi-Arch: foreign Message-ID: <20190917202009.GA24700@alf.mars> Package: libmapbox-variant-dev Version: 1.1.6-1 Tags: patch User: debian-cross at lists.debian.org Usertags: cross-satisfiability Control: affects -1 + src:node-mapnik src:python-mapnik src:mapnik src:viking The affected packages fail to satisfy their cross Build-Depends, because their (transitive) dependency on libmapbox-variant-dev is unsatisfiably. In general, Architecture: all packages can never satisfy cross build dependencies unless marked Multi-Arch: foreign or annotated :native. In this case, the foreign marking is appropriate, because libmapbox-variant-dev is a header-only library that entirely lacks dependencies and maintainer scripts. Please consider applying the attached patch. Helmut -------------- next part -------------- diff --minimal -Nru mapbox-variant-1.1.6/debian/changelog mapbox-variant-1.1.6/debian/changelog --- mapbox-variant-1.1.6/debian/changelog 2019-07-07 08:35:09.000000000 +0200 +++ mapbox-variant-1.1.6/debian/changelog 2019-09-17 22:16:36.000000000 +0200 @@ -1,3 +1,10 @@ +mapbox-variant (1.1.6-1.1) UNRELEASED; urgency=medium + + * Non-maintainer upload. + * Mark libmapbox-variant-dev Multi-Arch: foreign. 
(Closes: #-1) + + -- Helmut Grohne Tue, 17 Sep 2019 22:16:36 +0200 + mapbox-variant (1.1.6-1) unstable; urgency=medium * Update gbp.conf to use --source-only-changes by default. diff --minimal -Nru mapbox-variant-1.1.6/debian/control mapbox-variant-1.1.6/debian/control --- mapbox-variant-1.1.6/debian/control 2018-12-25 22:33:55.000000000 +0100 +++ mapbox-variant-1.1.6/debian/control 2019-09-17 22:16:34.000000000 +0200 @@ -15,6 +15,7 @@ Package: libmapbox-variant-dev Architecture: all +Multi-Arch: foreign Depends: ${misc:Depends} Description: Alternative to boost::variant for C++11 Mapbox variant has the same speedy performance of boost::variant but is From owner at bugs.debian.org Tue Sep 17 21:27:05 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Tue, 17 Sep 2019 20:27:05 +0000 Subject: Processed: mark libmapbox-variant-dev Multi-Arch: foreign References: <20190917202009.GA24700@alf.mars> <20190917202009.GA24700@alf.mars> Message-ID: Processing control commands: > affects -1 + src:node-mapnik src:python-mapnik src:mapnik src:viking Bug #940618 [libmapbox-variant-dev] mark libmapbox-variant-dev Multi-Arch: foreign Added indication that 940618 affects src:node-mapnik, src:python-mapnik, src:mapnik, and src:viking -- 940618: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=940618 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From gitlab at salsa.debian.org Tue Sep 17 21:44:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Tue, 17 Sep 2019 20:44:41 +0000 Subject: [Git][debian-gis-team/mapbox-variant][master] Mark libmapbox-variant-dev Multi-Arch: foreign. (Closes: #940618) Message-ID: <5d8145b9603eb_73482ad9613d5cb01372029@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapbox-variant Commits: 4dc8f05f by Bas Couwenberg at 2019-09-17T20:44:29Z Mark libmapbox-variant-dev Multi-Arch: foreign. (Closes: #940618) - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,11 @@ mapbox-variant (1.1.6-2) UNRELEASED; urgency=medium + [ Bas Couwenberg ] * Bump Standards-Version to 4.4.0, no changes. + [ Helmut Grohne ] + * Mark libmapbox-variant-dev Multi-Arch: foreign. (Closes: #940618) + -- Bas Couwenberg Wed, 10 Jul 2019 18:26:56 +0200 mapbox-variant (1.1.6-1) unstable; urgency=medium ===================================== debian/control ===================================== @@ -15,6 +15,7 @@ Homepage: https://github.com/mapbox/variant Package: libmapbox-variant-dev Architecture: all +Multi-Arch: foreign Depends: ${misc:Depends} Description: Alternative to boost::variant for C++11 Mapbox variant has the same speedy performance of boost::variant but is View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-variant/commit/4dc8f05ff5ef1d204abb0261d52fdcb49663525b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-variant/commit/4dc8f05ff5ef1d204abb0261d52fdcb49663525b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From owner at bugs.debian.org Tue Sep 17 21:57:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Tue, 17 Sep 2019 20:57:03 +0000 Subject: Processed: Re: Bug#940618: mark libmapbox-variant-dev Multi-Arch: foreign References: <20190917202009.GA24700@alf.mars> Message-ID: Processing control commands: > tags -1 pending Bug #940618 [libmapbox-variant-dev] mark libmapbox-variant-dev Multi-Arch: foreign Added tag(s) pending. -- 940618: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=940618 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From sebastic at xs4all.nl Tue Sep 17 21:45:24 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Tue, 17 Sep 2019 22:45:24 +0200 Subject: Bug#940618: mark libmapbox-variant-dev Multi-Arch: foreign In-Reply-To: <20190917202009.GA24700@alf.mars> References: <20190917202009.GA24700@alf.mars> <20190917202009.GA24700@alf.mars> Message-ID: Control: tags -1 pending Hi Helmut, Thanks for the patch, it's applied in git and will be included in the next upload. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From sebastic at xs4all.nl Wed Sep 18 08:01:56 2019 From: sebastic at xs4all.nl (Bas Couwenberg) Date: Wed, 18 Sep 2019 09:01:56 +0200 Subject: Bug#940635: RM: osmium-tool [s390x] -- ROM; Big endian specific test failure Message-ID: <156879011699.31824.13257236517766168684.reportbug@osiris.linuxminded.xs4all.nl> Package: ftp.debian.org Severity: normal Please remove osmium-tool from s390x to unblock testing migration. It FTBFS due to a big endian specific test failure: https://github.com/osmcode/osmium-tool/issues/176 Kind Regards, Bas From brendle at demos-deutschland.de Wed Sep 18 13:05:32 2019 From: brendle at demos-deutschland.de (Daniel Brendle) Date: Wed, 18 Sep 2019 14:05:32 +0200 Subject: problem with qgis-providers on bullseye/sid Message-ID: <299ef53a-f960-51d7-5cdc-70446a9d2141@demos-deutschland.de> Ohai, dear maintainer. Me and a colleague just ran into a situation where we tried to upgrade our packages on Debian bullseye/sid and ended up having the postinst script of qgis-providers returning 127 because it did not find the library libproj.so.13. There is a package for that called libproj13. We installed it and it fixed the error. An apt show qgis-providers revealed that the package does not list libproj13 as a dependency. There is also a newer version of libproj → libproj15. If this is an optional dependency then the postinst script should not fail. Otherwise it should be ensured via a package dependency that a valid libproj package is installed before installing qgis-providers. Thanks in advance, have a nice day. Yours, grindhold Mit freundlichen Grüßen Daniel Brendle Software Engineer -- DEMOS E-Partizipation GmbH ---------------------------------------------------------------------- brendle at demos-deutschland.de, PGP-ID: 0xD5F5DCFE www.demos-deutschland.de, Tel: 030 2787 846 - 0XXXXX Büro Berlin: Panoramastraße 1, 10178 Berlin - Mitte Büro Hamburg: Eifflerstraße 43, 22769 Hamburg - Altona Geschäftsführung: Rolf Lührs & Dr.
Markus Klima AG Hamburg HRB 110069B, Prokurist: Matthias Rehkop ---------------------------------------------------------------------- www.twitter.com/demosbird www.facebook.com/demosgmbh From sebastic at xs4all.nl Wed Sep 18 13:29:54 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Wed, 18 Sep 2019 14:29:54 +0200 Subject: problem with qgis-providers on bullseye/sid In-Reply-To: <299ef53a-f960-51d7-5cdc-70446a9d2141@demos-deutschland.de> References: <299ef53a-f960-51d7-5cdc-70446a9d2141@demos-deutschland.de> Message-ID: <95de5db1-c621-2e83-c901-2c937d3c04e3@xs4all.nl> On 9/18/19 2:05 PM, Daniel Brendle wrote: > Me and a colleague just ran into a situation where we tried to upgrade > our packages on debian bullseye/sid and ended up having the postinst > script of qgis-providers returning 127 because it did not find the > library libproj.so.13 Install the qgis packages from the Debian repositories, not the ones from the qgis.org repository. The qgis packages in testing & unstable have been built with PROJ 6 (libproj15). > there is a package for that called libproj13. we installed it and it > fixed the error. an apt show qgis-proviers revealed that the package > does not list libproj13 as a dependency. there is also a never version > of libproj → libproj15. if this is an optional dependency then the > post-inst script should not fail. otherwise it should be made sure via > deb-dependency that a valid libproj-package is installed before > installing qgis-providers. The qgis-providers package depends on libgdal20 which in turn depends on libproj15. If you only have official Debian testing sources enabled there is no problem. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From jef at norbit.de Wed Sep 18 13:50:20 2019 From: jef at norbit.de (=?utf-8?Q?J=C3=BCrgen_E=2E?= Fischer) Date: Wed, 18 Sep 2019 14:50:20 +0200 Subject: problem with qgis-providers on bullseye/sid In-Reply-To: <299ef53a-f960-51d7-5cdc-70446a9d2141@demos-deutschland.de> References: <299ef53a-f960-51d7-5cdc-70446a9d2141@demos-deutschland.de> Message-ID: <20190918125019.fjlommehxtcsjafs@norbit.de> Moin Daniel, On Wed, 18. Sep 2019 at 14:05:32 +0200, Daniel Brendle wrote: > Me and a colleague just ran into a situation where we tried to upgrade > our packages on debian bullseye/sid and ended up having the postinst > script of qgis-providers returning 127 because it did not find the > library libproj.so.13 Which repository did you use and when did you encounter this? Jürgen -- Jürgen E. Fischer norBIT GmbH Tel. +49-4931-918175-31 Dipl.-Inf. (FH) Rheinstraße 13 Fax. +49-4931-918175-50 Software Engineer D-26506 Norden https://www.norbit.de -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 827 bytes Desc: not available URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: Pflichtangaben URL: From gitlab at salsa.debian.org Thu Sep 19 05:27:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 04:27:34 +0000 Subject: [Git][debian-gis-team/qmapshack][master] Update URLs for move to GitHub. Message-ID: <5d8303b6a7ab8_73483fbbbf05478015323c4@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / qmapshack Commits: 9b9c7675 by Bas Couwenberg at 2019-09-19T04:19:38Z Update URLs for move to GitHub. 
- - - - - 5 changed files: - debian/changelog - debian/control - debian/copyright - debian/upstream/metadata - debian/watch Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +qmapshack (1.13.2-2) UNRELEASED; urgency=medium + + * Update URLs for move to GitHub. + + -- Bas Couwenberg Thu, 19 Sep 2019 06:19:19 +0200 + qmapshack (1.13.2-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -20,7 +20,7 @@ Build-Depends: cmake, Standards-Version: 4.4.0 Vcs-Browser: https://salsa.debian.org/debian-gis-team/qmapshack Vcs-Git: https://salsa.debian.org/debian-gis-team/qmapshack.git -Homepage: https://bitbucket.org/maproom/qmapshack/wiki/Home +Homepage: https://github.com/Maproom/qmapshack/wiki Package: qmapshack Architecture: any ===================================== debian/copyright ===================================== @@ -1,7 +1,7 @@ Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ Upstream-Name: QMapShack Upstream-Contact: Oliver Eichler -Source: https://bitbucket.org/maproom/qmapshack/downloads +Source: https://github.com/Maproom/qmapshack Files: * Copyright: 2006-2009, 2012, 2014-2018, Oliver Eichler ===================================== debian/upstream/metadata ===================================== @@ -1,6 +1,7 @@ --- -Bug-Database: https://bitbucket.org/maproom/qmapshack/issues +Bug-Database: https://github.com/Maproom/qmapshack/issues +Bug-Submit: https://github.com/Maproom/qmapshack/issues/new Contact: Oliver Eichler Name: QMapShack -Repository: https://bitbucket.org/maproom/qmapshack -Repository-Browse: https://bitbucket.org/maproom/qmapshack/src +Repository: https://github.com/Maproom/qmapshack.git +Repository-Browse: https://github.com/Maproom/qmapshack ===================================== debian/watch ===================================== @@ -3,5 +3,5 @@ opts=\ dversionmangle=s/\+(debian|dfsg|ds|deb)\d*$//,\ uversionmangle=s/(\d)[_\.\-\+]?((RC|rc|pre|dev|beta|alpha)\d*)$/$1~$2/;s/RC/rc/,\ filenamemangle=s/(?:.*?)?(?:rel|v|V|qmapshack)?[\-\_]?(?:%20)?(\d+\.\d\S+)\.(tgz|tbz|txz|(?:tar\.(?:gz|bz2|xz)))/qmapshack-$1.$2/ \ -https://bitbucket.org/maproom/qmapshack/downloads/ \ -(?:.*?/)?(?:rel|v|V|qmapshack)?[\-\_]?(?:%20)?(\d+\.\d\S+)\.(?:tgz|tbz|txz|(?:tar\.(?:gz|bz2|xz))) +https://github.com/Maproom/qmapshack/releases \ +(?:.*?/archive/)?(?:rel|v|V|qmapshack)?[\-\_]?(?:%20)?(\d+\.\d\S+)\.(?:tgz|tbz|txz|(?:tar\.(?:gz|bz2|xz))) View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/9b9c7675ef2d8f57664b8d631c66e01479071c4c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/9b9c7675ef2d8f57664b8d631c66e01479071c4c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From noreply at release.debian.org Thu Sep 19 05:39:18 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 19 Sep 2019 04:39:18 +0000 Subject: mapserver 7.4.2-1 MIGRATED to testing Message-ID: FYI: The status of the mapserver source package in Debian's testing distribution has changed. Previous version: 7.4.1-1 Current version: 7.4.2-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. 
See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Thu Sep 19 05:39:19 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 19 Sep 2019 04:39:19 +0000 Subject: owslib 0.18.0-2 MIGRATED to testing Message-ID: FYI: The status of the owslib source package in Debian's testing distribution has changed. Previous version: 0.18.0-1 Current version: 0.18.0-2 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Thu Sep 19 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 19 Sep 2019 04:39:21 +0000 Subject: pywps 4.2.1-3 MIGRATED to testing Message-ID: FYI: The status of the pywps source package in Debian's testing distribution has changed. Previous version: 4.2.1-1 Current version: 4.2.1-3 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Thu Sep 19 05:39:25 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 19 Sep 2019 04:39:25 +0000 Subject: satpy 0.16.1-3 MIGRATED to testing Message-ID: FYI: The status of the satpy source package in Debian's testing distribution has changed. Previous version: 0.16.1-2 Current version: 0.16.1-3 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Thu Sep 19 05:49:40 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 04:49:40 +0000 Subject: [Git][debian-gis-team/python-snuggs][master] 4 commits: New upstream version 1.4.7 Message-ID: <5d8308e4b1b2c_73483fbbbee520e01533198@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-snuggs Commits: 03483b9c by Bas Couwenberg at 2019-09-19T04:43:09Z New upstream version 1.4.7 - - - - - 9f92362a by Bas Couwenberg at 2019-09-19T04:43:11Z Update upstream source from tag 'upstream/1.4.7' Update to upstream version '1.4.7' with Debian dir 8d5b81b3d1a6a17bffc9729ebe12c3b472282d9c - - - - - c6d837fa by Bas Couwenberg at 2019-09-19T04:43:43Z New upstream release. - - - - - 46a78f11 by Bas Couwenberg at 2019-09-19T04:44:20Z Set distribution to unstable. 
- - - - - 5 changed files: - .travis.yml - CHANGES.txt - debian/changelog - snuggs/__init__.py - test_snuggs.py Changes: ===================================== .travis.yml ===================================== @@ -5,7 +5,7 @@ python: - "3.6" - "3.7" install: - - "pip install pytest-cov coveralls pyparsing==2.3.1" + - "pip install pytest-cov coveralls pyparsing~=2.0" - "pip install -e .[test]" script: - python -m pytest --cov snuggs --cov-report term-missing @@ -16,5 +16,5 @@ deploy: tags: true condition: "$TRAVIS_PYTHON_VERSION = 3.6" provider: pypi - user: mapboxci + user: __token__ distributions: "sdist bdist_wheel" ===================================== CHANGES.txt ===================================== @@ -1,6 +1,12 @@ Changes ======= +1.4.7 (2019-09-18) +------------------ +- The snuggs tests of syntax errors no longer assert a specific pyparsing + exception message as the format of these messages is not stable (#15). + Previous versions of snuggs had no other issues with pyparsing 2.3 or 2.4. + 1.4.6 (2019-05-15) ------------------ - Tests were failing on Python 2.7 (#20, #21) due to loss of precision in ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +python-snuggs (1.4.7-1) unstable; urgency=medium + + * Team upload. + * New upstream release. + + -- Bas Couwenberg Thu, 19 Sep 2019 06:44:11 +0200 + python-snuggs (1.4.6-2) unstable; urgency=medium * Team upload. ===================================== snuggs/__init__.py ===================================== @@ -16,7 +16,7 @@ import numpy __all__ = ['eval'] -__version__ = "1.4.6" +__version__ = "1.4.7" # Python 2-3 compatibility string_types = (str,) if sys.version_info[0] >= 3 else (basestring,) # flake8: noqa ===================================== test_snuggs.py ===================================== @@ -196,9 +196,6 @@ def test_missing_closing_paren(): snuggs.eval("(+ 1 2") assert excinfo.value.lineno == 1 assert excinfo.value.offset == 7 - exception_options = ['expected a function or operator', - 'Expected {Forward: ... | Forward: ...}'] - assert str(excinfo.value) in exception_options def test_missing_func(): @@ -214,9 +211,6 @@ def test_missing_func2(): snuggs.eval("(# 1 2)") assert excinfo.value.lineno == 1 assert excinfo.value.offset == 2 - exception_options = ['expected a function or operator', - 'Expected {Forward: ... | Forward: ...}'] - assert str(excinfo.value) in exception_options def test_undefined_var(): @@ -232,9 +226,6 @@ def test_bogus_higher_order_func(): snuggs.eval("((bogus * 2) 2)") assert excinfo.value.lineno == 1 assert excinfo.value.offset == 3 - exception_options = ['expected a function or operator', - 'Expected {Forward: ... | Forward: ...}'] - assert str(excinfo.value) in exception_options def test_type_error(): View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/compare/a109e8a6db88c35248cdb9fe706712197dea24da...46a78f11a4b3b41ffb234eab2fc36c3e64f85d0c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/compare/a109e8a6db88c35248cdb9fe706712197dea24da...46a78f11a4b3b41ffb234eab2fc36c3e64f85d0c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Thu Sep 19 05:49:42 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 04:49:42 +0000 Subject: [Git][debian-gis-team/python-snuggs][pristine-tar] pristine-tar data for python-snuggs_1.4.7.orig.tar.gz Message-ID: <5d8308e653769_73483fbbb4751b4c1533360@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-snuggs Commits: 72645992 by Bas Couwenberg at 2019-09-19T04:43:10Z pristine-tar data for python-snuggs_1.4.7.orig.tar.gz - - - - - 2 changed files: - + python-snuggs_1.4.7.orig.tar.gz.delta - + python-snuggs_1.4.7.orig.tar.gz.id Changes: ===================================== python-snuggs_1.4.7.orig.tar.gz.delta ===================================== Binary files /dev/null and b/python-snuggs_1.4.7.orig.tar.gz.delta differ ===================================== python-snuggs_1.4.7.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +fc19121a71a46a602968883f14478fdd456ddc25 View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/commit/72645992914b07f1f32a322c5dc91e2fa4e32018 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/commit/72645992914b07f1f32a322c5dc91e2fa4e32018 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 05:49:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 04:49:44 +0000 Subject: [Git][debian-gis-team/python-snuggs][upstream] New upstream version 1.4.7 Message-ID: <5d8308e81a2f4_73482ad95d9103781533531@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-snuggs Commits: 03483b9c by Bas Couwenberg at 2019-09-19T04:43:09Z New upstream version 1.4.7 - - - - - 4 changed files: - .travis.yml - CHANGES.txt - snuggs/__init__.py - test_snuggs.py Changes: ===================================== .travis.yml ===================================== @@ -5,7 +5,7 @@ python: - "3.6" - "3.7" install: - - "pip install pytest-cov coveralls pyparsing==2.3.1" + - "pip install pytest-cov coveralls pyparsing~=2.0" - "pip install -e .[test]" script: - python -m pytest --cov snuggs --cov-report term-missing @@ -16,5 +16,5 @@ deploy: tags: true condition: "$TRAVIS_PYTHON_VERSION = 3.6" provider: pypi - user: mapboxci + user: __token__ distributions: "sdist bdist_wheel" ===================================== CHANGES.txt ===================================== @@ -1,6 +1,12 @@ Changes ======= +1.4.7 (2019-09-18) +------------------ +- The snuggs tests of syntax errors no longer assert a specific pyparsing + exception message as the format of these messages is not stable (#15). + Previous versions of snuggs had no other issues with pyparsing 2.3 or 2.4. 
+ 1.4.6 (2019-05-15) ------------------ - Tests were failing on Python 2.7 (#20, #21) due to loss of precision in ===================================== snuggs/__init__.py ===================================== @@ -16,7 +16,7 @@ import numpy __all__ = ['eval'] -__version__ = "1.4.6" +__version__ = "1.4.7" # Python 2-3 compatibility string_types = (str,) if sys.version_info[0] >= 3 else (basestring,) # flake8: noqa ===================================== test_snuggs.py ===================================== @@ -196,9 +196,6 @@ def test_missing_closing_paren(): snuggs.eval("(+ 1 2") assert excinfo.value.lineno == 1 assert excinfo.value.offset == 7 - exception_options = ['expected a function or operator', - 'Expected {Forward: ... | Forward: ...}'] - assert str(excinfo.value) in exception_options def test_missing_func(): @@ -214,9 +211,6 @@ def test_missing_func2(): snuggs.eval("(# 1 2)") assert excinfo.value.lineno == 1 assert excinfo.value.offset == 2 - exception_options = ['expected a function or operator', - 'Expected {Forward: ... | Forward: ...}'] - assert str(excinfo.value) in exception_options def test_undefined_var(): @@ -232,9 +226,6 @@ def test_bogus_higher_order_func(): snuggs.eval("((bogus * 2) 2)") assert excinfo.value.lineno == 1 assert excinfo.value.offset == 3 - exception_options = ['expected a function or operator', - 'Expected {Forward: ... | Forward: ...}'] - assert str(excinfo.value) in exception_options def test_type_error(): View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/commit/03483b9cdea993cddc4e27f828079c1292782e02 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/commit/03483b9cdea993cddc4e27f828079c1292782e02 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 05:49:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 04:49:45 +0000 Subject: [Git][debian-gis-team/python-snuggs] Pushed new tag debian/1.4.7-1 Message-ID: <5d8308e990a2e_73483fbbb4751b4c15337eb@godard.mail> Bas Couwenberg pushed new tag debian/1.4.7-1 at Debian GIS Project / python-snuggs -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/tree/debian/1.4.7-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 05:49:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 04:49:46 +0000 Subject: [Git][debian-gis-team/python-snuggs] Pushed new tag upstream/1.4.7 Message-ID: <5d8308ea9659a_73482ad95d91037815339ba@godard.mail> Bas Couwenberg pushed new tag upstream/1.4.7 at Debian GIS Project / python-snuggs -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/tree/upstream/1.4.7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Thu Sep 19 05:58:00 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 04:58:00 +0000 Subject: Processing of python-snuggs_1.4.7-1_source.changes Message-ID: python-snuggs_1.4.7-1_source.changes uploaded successfully to localhost along with the files: python-snuggs_1.4.7-1.dsc python-snuggs_1.4.7.orig.tar.gz python-snuggs_1.4.7-1.debian.tar.xz python-snuggs_1.4.7-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Thu Sep 19 06:04:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:04:31 +0000 Subject: [Git][debian-gis-team/mapserver][buster-backports] 8 commits: New upstream version 7.4.2 Message-ID: <5d830c5f2f252_73483fbbbf0547801535974@godard.mail> Bas Couwenberg pushed to branch buster-backports at Debian GIS Project / mapserver Commits: 96263bc5 by Bas Couwenberg at 2019-09-14T05:49:01Z New upstream version 7.4.2 - - - - - a87066ed by Bas Couwenberg at 2019-09-14T05:52:32Z Merge tag 'upstream/7.4.2' Upstream version 7.4.2 - - - - - bb4491ca by Bas Couwenberg at 2019-09-14T05:52:47Z New upstream release. - - - - - 2a559851 by Bas Couwenberg at 2019-09-14T06:10:29Z Update 7.4.1 symbols for other architectures. - - - - - b0b74dbe by Bas Couwenberg at 2019-09-14T06:23:34Z Update 7.4.2 symbols for amd64. - - - - - 7b8bd6d8 by Bas Couwenberg at 2019-09-14T06:23:34Z Set distribution to unstable. - - - - - 3fff178b by Bas Couwenberg at 2019-09-19T04:50:44Z Merge tag 'debian/7.4.2-1' into buster-backports releasing package mapserver version 7.4.2-1 - - - - - 6c8914d8 by Bas Couwenberg at 2019-09-19T04:50:55Z Rebuild for buster-backports. - - - - - 30 changed files: - CMakeLists.txt - HISTORY.TXT - Makefile - + ci/travis/after_success.sh - + ci/travis/before_install.sh - + ci/travis/script.sh - debian/changelog - debian/libmapserver2.symbols - mapcopy.c - mapfile.c - mapgdal.c - maplabel.c - maplayer.c - maplexer.c - maplexer.l - mapmetadata.c - mapmssql2008.c - mapogcsld.c - mapogcsld.h - mapogcsos.c - mapogr.cpp - mapogroutput.c - mapows.c - mapparser.c - mapparser.h - mapparser.y - mapquery.c - mapresample.c - mapscript/phpng/CMakeLists.txt - mapsymbol.c The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/compare/6229927975cae3d2169399d5f2bfc3ded2e58a4e...6c8914d8aa0f5326a10006559dc083496369c2a5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/compare/6229927975cae3d2169399d5f2bfc3ded2e58a4e...6c8914d8aa0f5326a10006559dc083496369c2a5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:04:36 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:04:36 +0000 Subject: [Git][debian-gis-team/mapserver] Pushed new tag debian/7.4.2-1_bpo10+1 Message-ID: <5d830c6437142_73483fbbb4751b4c15361de@godard.mail> Bas Couwenberg pushed new tag debian/7.4.2-1_bpo10+1 at Debian GIS Project / mapserver -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/tree/debian/7.4.2-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Thu Sep 19 06:04:37 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:04:37 +0000 Subject: python-snuggs_1.4.7-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 19 Sep 2019 06:44:11 +0200 Source: python-snuggs Architecture: source Version: 1.4.7-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-snuggs (1.4.7-1) unstable; urgency=medium . * Team upload. * New upstream release. Checksums-Sha1: 850ef131d8b589fa73c40ab0d89d9321b2c8ccdc 2096 python-snuggs_1.4.7-1.dsc c5fbb90e8166f28215f9f56a42a994986cd5e4ad 7762 python-snuggs_1.4.7.orig.tar.gz f4ae4015e7cd2d031b31d0f50e93d24463567fc8 3156 python-snuggs_1.4.7-1.debian.tar.xz a89185a8f36cf5368c41f0c5a1e618e50b6e4745 7548 python-snuggs_1.4.7-1_amd64.buildinfo Checksums-Sha256: f6c62f6256352351ee7b244a42b880a657e4552ddb193737361ecd3821afbeea 2096 python-snuggs_1.4.7-1.dsc 86b775c64c37600d5e537bad49a3fc5e3c15bdbefba4c245adce261e064792a6 7762 python-snuggs_1.4.7.orig.tar.gz 9569b4eb4a7d9c570c51f21698bfdfa3e69e451e94ef1e595ee8317c94252b41 3156 python-snuggs_1.4.7-1.debian.tar.xz da781a86f9db32bc0b318423129003cb87d18c392bfd91246ff34ae5b3ec4da2 7548 python-snuggs_1.4.7-1_amd64.buildinfo Files: f3246b7edc77a57ad1bb130e55a1a211 2096 python optional python-snuggs_1.4.7-1.dsc 28633274352aa2149b4d490c1e883a2a 7762 python optional python-snuggs_1.4.7.orig.tar.gz 79f9e820ed6cd34b41a90a244aacc034 3156 python optional python-snuggs_1.4.7-1.debian.tar.xz a56cb82a73915c6d34486040b4a6b929 7548 python optional python-snuggs_1.4.7-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2DCMMACgkQZ1DxCuiN SvG+/w/9E2OFtyVx8BRsYSBiQ6r35NExg06ooK3IdT1dEU5Mo/t/uzHHpTDgL997 6v+jOkCGpPLeoFSeqGiRVYQMP8XPkxVjoIrtWZdJRFhZ2AZVLAz6iQrq07ah1jkE 5ly5Xxrhh6zHGIU1j+eFp++wzdFvLntJdGl8GanWTMGgktt3l3/1YFn6MToPjefq h4GUNyCPGrVpdtWPbJwehCvrll7Ucb9gloLV+ElMCASQXuzZyEp6ObAX6zWQYoDU 6aeFJckz1LzjadSu0dm4XYHo5+t8mZRkW1xlSL0+eJf2jtzPR0msBGl8yo8Am+QG q0i6Ga2io+oeazUph2q7wdCgLIh5h2sdRJ/6TW1OWjWkZDX6N5q0T3mXEbRx3XS8 Z6aic8lzDBvWlzklEyUtm+nWuzMEpIx+gjsww4raZ/2RRSOieG+HXLV9/Mab2X61 D4k6AB8XY1oFny9RHSAPEwhYonBg9ou8VN4lyL0T8Ja6cGuk2+pIH8mnqgkamWrY KFGX0pYcQtAkqhM+awdHmv1vBiph99uhLL7U6fTkmAEp8caMPVRRHzCGvZWmftAK KctQ2KXxfF9hupXgnKzPEXqjaWdwtRIbpiDnohmy0pP8OFuSZdk136kRBVzCdbOl kpzJPkSDY21LFg8lMqpigpUEeiCoVIAgw1LwY0Use26osINYNuY= =1UMi -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
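The python-snuggs 1.4.7 upload accepted above is the test relaxation shown in the diffs earlier in this digest: the syntax-error tests keep asserting where parsing failed (lineno/offset) but no longer compare the exception text against pyparsing's wording, which changes between pyparsing releases. A minimal sketch of that test style follows, using the same input and positions as the existing test; the one assumption not visible in the hunks is that snuggs raises a SyntaxError subclass, which is what the lineno/offset attributes imply.

    # Sketch: assert the error position, not pyparsing's message text.
    import pytest
    import snuggs

    def test_missing_closing_paren():
        # Assumed: snuggs syntax errors derive from SyntaxError, so the
        # exception carries the standard lineno/offset attributes.
        with pytest.raises(SyntaxError) as excinfo:
            snuggs.eval("(+ 1 2")
        assert excinfo.value.lineno == 1
        assert excinfo.value.offset == 7
        # Deliberately no assertion on str(excinfo.value); that text comes
        # from pyparsing and is not stable across its releases.
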
From gitlab at salsa.debian.org Thu Sep 19 06:07:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:07:10 +0000 Subject: [Git][debian-gis-team/routino][pristine-tar] pristine-tar data for routino_3.3.2.orig.tar.gz Message-ID: <5d830cfe171a5_73483fbbb4751b4c15366c6@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / routino Commits: 6712bc40 by Bas Couwenberg at 2019-09-19T04:29:50Z pristine-tar data for routino_3.3.2.orig.tar.gz - - - - - 2 changed files: - + routino_3.3.2.orig.tar.gz.delta - + routino_3.3.2.orig.tar.gz.id Changes: ===================================== routino_3.3.2.orig.tar.gz.delta ===================================== Binary files /dev/null and b/routino_3.3.2.orig.tar.gz.delta differ ===================================== routino_3.3.2.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +24ca1791214dd9f0d9423d3b1db1f69e56d2f83d View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/commit/6712bc40185376ee5c6b9976aa4a783f6e1b2e4e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/commit/6712bc40185376ee5c6b9976aa4a783f6e1b2e4e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:07:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:07:10 +0000 Subject: [Git][debian-gis-team/routino][master] 7 commits: New upstream version 3.3.2 Message-ID: <5d830cfe878ed_73483fbbb83988581536752@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / routino Commits: 5fac67b1 by Bas Couwenberg at 2019-09-19T04:29:41Z New upstream version 3.3.2 - - - - - 26dd85ad by Bas Couwenberg at 2019-09-19T04:29:50Z Update upstream source from tag 'upstream/3.3.2' Update to upstream version '3.3.2' with Debian dir 7539032ce93cea30264924113c59a9dada3175c4 - - - - - 6876c516 by Bas Couwenberg at 2019-09-19T04:31:25Z New upstream release. - - - - - fb90821d by Bas Couwenberg at 2019-09-19T04:48:55Z Also build python3-routino package. - - - - - 73b2cb86 by Bas Couwenberg at 2019-09-19T04:57:50Z Set distribution to experimental. - - - - - 449296e8 by Bas Couwenberg at 2019-09-19T04:58:00Z Revert "Also build python3-routino package." This reverts commit fb90821d76598c2c0f15c68554a8661e6920b3f3. - - - - - 715457de by Bas Couwenberg at 2019-09-19T04:59:11Z Set distribution to unstable. - - - - - 7 changed files: - ChangeLog - debian/changelog - doc/NEWS.txt - doc/README.txt - doc/html/readme.html - python/Makefile - src/version.h Changes: ===================================== ChangeLog ===================================== @@ -1,3 +1,17 @@ +2019-09-18 Andrew M. Bishop + + Version 3.3.2 released. + +2019-09-18 [r2025] Andrew M. Bishop + + * FILES, doc/NEWS.txt, doc/README.txt, doc/html/readme.html, + src/version.h: Update for version 3.3.2 release. + +2019-09-09 [r2023] Andrew M. Bishop + + * python/Makefile: Fix parallel compilation in the 'python' + directory. + 2019-09-08 Andrew M. Bishop Version 3.3.1 released. ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +routino (3.3.2-1) unstable; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Thu, 19 Sep 2019 06:42:17 +0200 + routino (3.3.1-1) unstable; urgency=medium * New upstream release. 
===================================== doc/NEWS.txt ===================================== @@ -1,3 +1,14 @@ +Version 3.3.2 of Routino released : Wed Sep 18 2019 +--------------------------------------------------- + +Bug fixes: + Ensure that parallel compilation works in the python directory. + Updated the version number in the executables to "3.3.2". + + +Note: This version is compatible with databases from versions 2.7.1 - 3.3.1. + + Version 3.3.1 of Routino released : Sun Sep 8 2019 -------------------------------------------------- ===================================== doc/README.txt ===================================== @@ -136,6 +136,7 @@ Status Version 3.2 of Routino was released on 12th March 2017. Version 3.3 of Routino was released on 7th September 2019. Version 3.3.1 of Routino was released on 8th September 2019. + Version 3.3.2 of Routino was released on 18th September 2019. The full version history is available in the NEWS.txt file. ===================================== doc/html/readme.html ===================================== @@ -224,13 +224,27 @@ Version 3.2 of Routino was released on 12th March 2017. Version 3.3 of Routino was released on 7th September 2019.
Version 3.3.1 of Routino was released on 8th September 2019. +
+Version 3.3.2 of Routino was released on 18th September 2019.

The full version history is available in the NEWS.txt file. -

Changes in Version 3.3.1

+

Changes in Version 3.3.2

+ +
+
Bug fixes: +
Ensure that parallel compilation works in the python directory. +
Updated the version number in the executables to "3.3.2". +
+ +

+Note: This version is compatible with databases from versions 2.7.1 - 3.3. + + +

Changes in Version 3.3.1

Bug fixes: @@ -242,7 +256,7 @@ The full version history is available in the NEWS.txt file. Note: This version is compatible with databases from versions 2.7.1 - 3.3. -

Changes in Version 3.3

+

Changes in Version 3.3

Bug fixes: ===================================== python/Makefile ===================================== @@ -76,9 +76,15 @@ $(BUILD_TIMESTAMP): $(SWIG_C) $(SWIG_CC) $(SWIG_PY) $(PY_FILES) $(C_FILES) $(CC_ src/_router.c : src/router.i ../src/routino.h $(SWIG) -python -o $@ $< +src/router.py : src/_router.c + @true # fake rule since src/router.py is created by the same rule as src/_router.c + src/_database.cc : src/database.i src/database.hh $(SWIG) -c++ -python -o $@ $< +src/database.py : src/_database.cc + @true # fake rule since src/database.py is created by the same rule as src/_database.cc + src/%.o : src/%.c $(CC) -c $(CFLAGS) $< -o $@ ===================================== src/version.h ===================================== @@ -3,7 +3,7 @@ Part of the Routino routing software. ******************/ /****************** - This file Copyright 2016, 2017 Andrew M. Bishop + This file Copyright 2016, 2017, 2019 Andrew M. Bishop This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by @@ -23,7 +23,7 @@ #ifndef VERSION_H #define VERSION_H /*+ To stop multiple inclusions. +*/ -#define ROUTINO_VERSION "3.2+svn" +#define ROUTINO_VERSION "3.3.2" #define ROUTINO_URL "" View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/fd0b60b5338c42c13bcca1436e3d4ee2c5589474...715457de0a120e35f8f3c8ec154b1afe9fc75887 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/compare/fd0b60b5338c42c13bcca1436e3d4ee2c5589474...715457de0a120e35f8f3c8ec154b1afe9fc75887 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:07:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:07:14 +0000 Subject: [Git][debian-gis-team/routino][upstream] New upstream version 3.3.2 Message-ID: <5d830d027cbfa_73483fbbbf05478015369b8@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / routino Commits: 5fac67b1 by Bas Couwenberg at 2019-09-19T04:29:41Z New upstream version 3.3.2 - - - - - 6 changed files: - ChangeLog - doc/NEWS.txt - doc/README.txt - doc/html/readme.html - python/Makefile - src/version.h Changes: ===================================== ChangeLog ===================================== @@ -1,3 +1,17 @@ +2019-09-18 Andrew M. Bishop + + Version 3.3.2 released. + +2019-09-18 [r2025] Andrew M. Bishop + + * FILES, doc/NEWS.txt, doc/README.txt, doc/html/readme.html, + src/version.h: Update for version 3.3.2 release. + +2019-09-09 [r2023] Andrew M. Bishop + + * python/Makefile: Fix parallel compilation in the 'python' + directory. + 2019-09-08 Andrew M. Bishop Version 3.3.1 released. ===================================== doc/NEWS.txt ===================================== @@ -1,3 +1,14 @@ +Version 3.3.2 of Routino released : Wed Sep 18 2019 +--------------------------------------------------- + +Bug fixes: + Ensure that parallel compilation works in the python directory. + Updated the version number in the executables to "3.3.2". + + +Note: This version is compatible with databases from versions 2.7.1 - 3.3.1. + + Version 3.3.1 of Routino released : Sun Sep 8 2019 -------------------------------------------------- ===================================== doc/README.txt ===================================== @@ -136,6 +136,7 @@ Status Version 3.2 of Routino was released on 12th March 2017. 
Version 3.3 of Routino was released on 7th September 2019. Version 3.3.1 of Routino was released on 8th September 2019. + Version 3.3.2 of Routino was released on 18th September 2019. The full version history is available in the NEWS.txt file. ===================================== doc/html/readme.html ===================================== @@ -224,13 +224,27 @@ Version 3.2 of Routino was released on 12th March 2017. Version 3.3 of Routino was released on 7th September 2019.
Version 3.3.1 of Routino was released on 8th September 2019. +
+Version 3.3.2 of Routino was released on 18th September 2019.

The full version history is available in the NEWS.txt file. -

Changes in Version 3.3.1

+

Changes in Version 3.3.2

+ +
+
Bug fixes: +
Ensure that parallel compilation works in the python directory. +
Updated the version number in the executables to "3.3.2". +
+ +

+Note: This version is compatible with databases from versions 2.7.1 - 3.3. + + +

Changes in Version 3.3.1

Bug fixes: @@ -242,7 +256,7 @@ The full version history is available in the NEWS.txt file. Note: This version is compatible with databases from versions 2.7.1 - 3.3. -

Changes in Version 3.3

+

Changes in Version 3.3

Bug fixes: ===================================== python/Makefile ===================================== @@ -76,9 +76,15 @@ $(BUILD_TIMESTAMP): $(SWIG_C) $(SWIG_CC) $(SWIG_PY) $(PY_FILES) $(C_FILES) $(CC_ src/_router.c : src/router.i ../src/routino.h $(SWIG) -python -o $@ $< +src/router.py : src/_router.c + @true # fake rule since src/router.py is created by the same rule as src/_router.c + src/_database.cc : src/database.i src/database.hh $(SWIG) -c++ -python -o $@ $< +src/database.py : src/_database.cc + @true # fake rule since src/database.py is created by the same rule as src/_database.cc + src/%.o : src/%.c $(CC) -c $(CFLAGS) $< -o $@ ===================================== src/version.h ===================================== @@ -3,7 +3,7 @@ Part of the Routino routing software. ******************/ /****************** - This file Copyright 2016, 2017 Andrew M. Bishop + This file Copyright 2016, 2017, 2019 Andrew M. Bishop This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by @@ -23,7 +23,7 @@ #ifndef VERSION_H #define VERSION_H /*+ To stop multiple inclusions. +*/ -#define ROUTINO_VERSION "3.2+svn" +#define ROUTINO_VERSION "3.3.2" #define ROUTINO_URL "" View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/commit/5fac67b1cb00e070216875fe8530f3ee9868e510 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/commit/5fac67b1cb00e070216875fe8530f3ee9868e510 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:07:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:07:26 +0000 Subject: [Git][debian-gis-team/routino] Pushed new tag debian/3.3.2-1 Message-ID: <5d830d0e2b556_73483fbbbee520e015373d3@godard.mail> Bas Couwenberg pushed new tag debian/3.3.2-1 at Debian GIS Project / routino -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/tree/debian/3.3.2-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:07:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:07:27 +0000 Subject: [Git][debian-gis-team/routino] Pushed new tag upstream/3.3.2 Message-ID: <5d830d0ff0cd6_73482ad95d910378153757b@godard.mail> Bas Couwenberg pushed new tag upstream/3.3.2 at Debian GIS Project / routino -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/tree/upstream/3.3.2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Thu Sep 19 06:13:45 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:13:45 +0000 Subject: Processing of mapserver_7.4.2-1~bpo10+1_source.changes Message-ID: mapserver_7.4.2-1~bpo10+1_source.changes uploaded successfully to localhost along with the files: mapserver_7.4.2-1~bpo10+1.dsc mapserver_7.4.2-1~bpo10+1.debian.tar.xz mapserver_7.4.2-1~bpo10+1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Thu Sep 19 06:17:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:17:25 +0000 Subject: [Git][debian-gis-team/netcdf-fortran][pristine-tar] pristine-tar data for netcdf-fortran_4.5.2+ds.orig.tar.xz Message-ID: <5d830f65884e2_73482ad95d910378153813f@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / netcdf-fortran Commits: 72e6570d by Bas Couwenberg at 2019-09-19T05:07:52Z pristine-tar data for netcdf-fortran_4.5.2+ds.orig.tar.xz - - - - - 2 changed files: - + netcdf-fortran_4.5.2+ds.orig.tar.xz.delta - + netcdf-fortran_4.5.2+ds.orig.tar.xz.id Changes: ===================================== netcdf-fortran_4.5.2+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/netcdf-fortran_4.5.2+ds.orig.tar.xz.delta differ ===================================== netcdf-fortran_4.5.2+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +418a744dcf62288896a1e5932ce0bf0e4e63230a View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/72e6570d98635869010afbfe8548268af5f2a993 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/72e6570d98635869010afbfe8548268af5f2a993 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:17:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:17:48 +0000 Subject: [Git][debian-gis-team/netcdf-fortran] Pushed new tag debian/4.5.2+ds-1_exp1 Message-ID: <5d830f7c40fae_73483fbbbee520e01538379@godard.mail> Bas Couwenberg pushed new tag debian/4.5.2+ds-1_exp1 at Debian GIS Project / netcdf-fortran -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/tree/debian/4.5.2+ds-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 06:17:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:17:49 +0000 Subject: [Git][debian-gis-team/netcdf-fortran] Pushed new tag upstream/4.5.2+ds Message-ID: <5d830f7de75b2_73483fbbb4751b4c1538571@godard.mail> Bas Couwenberg pushed new tag upstream/4.5.2+ds at Debian GIS Project / netcdf-fortran -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/tree/upstream/4.5.2+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Thu Sep 19 06:19:12 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:19:12 +0000 Subject: Processing of routino_3.3.2-1_source.changes Message-ID: routino_3.3.2-1_source.changes uploaded successfully to localhost along with the files: routino_3.3.2-1.dsc routino_3.3.2.orig.tar.gz routino_3.3.2-1.debian.tar.xz routino_3.3.2-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Thu Sep 19 06:19:14 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:19:14 +0000 Subject: mapserver_7.4.2-1~bpo10+1_source.changes ACCEPTED into buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 19 Sep 2019 06:50:49 +0200 Source: mapserver Architecture: source Version: 7.4.2-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: mapserver (7.4.2-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. . mapserver (7.4.2-1) unstable; urgency=medium . * New upstream release. * Update 7.4.1 symbols for other architectures. * Update 7.4.2 symbols for amd64. Checksums-Sha1: 25d31cc345dc155ffaae293a8959b040a1ef5635 3283 mapserver_7.4.2-1~bpo10+1.dsc b6c82442b2aef9e04d1daeb3a39b0516946a8c75 47076 mapserver_7.4.2-1~bpo10+1.debian.tar.xz c5506264e61cabf3fa74cb0f93c433f87491222d 22944 mapserver_7.4.2-1~bpo10+1_amd64.buildinfo Checksums-Sha256: c6a877a04c06df61cda11164cd92cf11700a8767c7638cd36583c3bde29b50e8 3283 mapserver_7.4.2-1~bpo10+1.dsc e25d004106e942d6be07e90f7bb864192787e4fc15637532d6bf62a74bbc2473 47076 mapserver_7.4.2-1~bpo10+1.debian.tar.xz 6f0f2f0967369edd8db916db2d0711e771ede82984baa931a25cb66e0d608f51 22944 mapserver_7.4.2-1~bpo10+1_amd64.buildinfo Files: a7460367c7c7c6e2a73a7644e51a797f 3283 devel optional mapserver_7.4.2-1~bpo10+1.dsc e4956ec86dff5d0b3323d93c92b0e981 47076 devel optional mapserver_7.4.2-1~bpo10+1.debian.tar.xz 9764c414a660fcb76882abe17c847169 22944 devel optional mapserver_7.4.2-1~bpo10+1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2DDEsACgkQZ1DxCuiN SvEQCxAAku88/bT+4BmMsX8JxPjFwFqpCSmEBQ00eNGvtwl0+LaIkAhHYSwm5jvs dfLu22Zg2chONV4mK4kljrmsOfOtJZ1yNoj7UgeUEXC2/21HLIay9fp0XQjtmzQz UJwW5fGEu/wo4gNPCuh8qcLCmnzWf7x1eApfnAAGFhC8C18Z50+nlw/NdPxgYjQf jsKQMP0i6wJgby242+DR+7er+9+iXeINiWqnlWKo3QSaAb2bu19Gr6O9bsb9Na6C gVIPT8NC88M5ILhSK2yaCXTkJPxR3yoAMXlEtyCcJBHbt5R16dySEI797Y6DTsMV zl0w7oHrsE15l5PpNAjGw6AWHqFnImDZraR03FfygMMUIcmpSs8HlwoTleLk3Ayf PP/PDGUWQlrHbMjFu0ZHcxwCcIArBCZ036l5hZhbF/UGwtaaNUPkLBcVnG8M9nrz 5WBFbgrhMuzSKatCM8mx+2n8hd2K7BcW/VmisFEtbRqYj0GgPbDXlULB8PTT9CD4 KfmAUIzIEubAw7hKj1ALPjoe368OIv2r2AGQTCHwROGX0jMXfpX/k77dRJRxxgTr DkN0hWCL8ZqWdAZH5w6p6R3EVyBQxzSa1Vg2bThFnoPCewFPDy+sE4/cQlb02d2d uSReEbs7lw4JoPvluFBG2VKQgRzcNGC9Lfa2lW4CmQp7MxevAOk= =UkQp -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
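The routino 3.3.2 upload processed above carries the python/Makefile fix shown in the routino commits earlier in this digest: one SWIG invocation writes both the C wrapper and the Python module, so the second output gets a no-op rule that depends on the first and a parallel make no longer tries to generate it separately. The sketch below shows the general shape of that pattern; the file names are placeholders rather than routino's actual ones, the swig flags are only illustrative, and recipe lines must start with a tab.

  # One swig run produces both src/_example.c and src/example.py.
  src/_example.c: src/example.i
  	swig -python -o $@ $<

  # "Fake" rule: example.py already exists once _example.c has been built;
  # this only records the dependency so that make -j orders things correctly.
  src/example.py: src/_example.c
  	@true

Where GNU make 4.3 or newer can be assumed, a grouped target (src/_example.c src/example.py &: src/example.i) states the same one-recipe-two-outputs relationship directly.
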
From gitlab at salsa.debian.org Thu Sep 19 06:19:13 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 05:19:13 +0000 Subject: [Git][debian-gis-team/netcdf-fortran][experimental] 4 commits: New upstream version 4.5.2+ds Message-ID: <5d830fd151271_73483fbbbf05478015388c7@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / netcdf-fortran Commits: 32dff86c by Bas Couwenberg at 2019-09-19T05:07:49Z New upstream version 4.5.2+ds - - - - - 9f70e650 by Bas Couwenberg at 2019-09-19T05:07:53Z Update upstream source from tag 'upstream/4.5.2+ds' Update to upstream version '4.5.2+ds' with Debian dir fbb2f40bb4f574b02190eaaffa92b893389c8a83 - - - - - bcaf5ecd by Bas Couwenberg at 2019-09-19T05:08:05Z New upstream release. - - - - - 36b6a33a by Bas Couwenberg at 2019-09-19T05:10:07Z Set distribution to unstable. - - - - - 25 changed files: - .gitignore - CMakeExtras/Makefile.in - CMakeLists.txt - Makefile.in - RELEASE_NOTES.md - config.guess - config.sub - configure - configure.ac - debian/changelog - docs/Doxyfile.developer - docs/Makefile.in - examples/F77/Makefile.in - examples/F90/Makefile.in - examples/Makefile.in - fortran/Makefile.in - libsrc/Makefile.in - ltmain.sh - m4/libtool.m4 - nf03_test/Makefile.in - nf03_test4/CMakeLists.txt - nf03_test4/Makefile.am - nf03_test4/Makefile.in - nf_test/Makefile.in - nf_test4/Makefile.in Changes: ===================================== .gitignore ===================================== @@ -3,3 +3,6 @@ build* \#.\# *.*~ html +*.o +*.tmp +*.tmp2 \ No newline at end of file ===================================== CMakeExtras/Makefile.in ===================================== @@ -267,6 +267,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== CMakeLists.txt ===================================== @@ -17,7 +17,7 @@ set(PACKAGE "${NC4F_CTEST_PROJECT_NAME}" CACHE STRING "") #Project Version SET(NC4F_VERSION_MAJOR 4) SET(NC4F_VERSION_MINOR 5) -SET(NC4F_VERSION_PATCH 1) +SET(NC4F_VERSION_PATCH 2) SET(NC4F_VERSION_NOTE "") SET(NC4F_VERSION ${NC4F_VERSION_MAJOR}.${NC4F_VERSION_MINOR}.${NC4F_VERSION_PATCH}${NC4F_VERSION_NOTE}) SET(VERSION ${NC4F_VERSION}) @@ -394,7 +394,8 @@ MACRO(build_bin_test F ext) ENDIF() ENDMACRO() -OPTION(LARGE_FILE_TESTS "Run large file tests, which are slow and take lots of disk." OFF) +OPTION(ENABLE_LARGE_FILE_TESTS "Run large file tests, which are slow and take lots of disk." 
OFF) +SET(LARGE_FILE_TESTS ${ENABLE_LARGE_FILE_TESTS}) OPTION(BUILD_BENCHMARKS "Run F90 I/O Benchmarks" OFF) OPTION(TEST_WITH_VALGRIND "Run extra tests with valgrind" OFF) OPTION(ENABLE_PARALLEL_TESTS "Run parallel I/O tests for F90 and F77" OFF) @@ -540,7 +541,7 @@ ENDIF() CHECK_FUNCTION_EXISTS(alloca HAVE_ALLOCA) -CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_def_opaque "" USE_NETCDF4) +CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_get_chunk_cache_ints "" USE_NETCDF4) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nccreate "" USE_NETCDF_V2) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_set_log_level "" USE_LOGGING) CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} oc_open "" BUILD_DAP) ===================================== Makefile.in ===================================== @@ -382,6 +382,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== RELEASE_NOTES.md ===================================== @@ -6,7 +6,16 @@ Release Notes {#nf_release_notes} This file contains a high-level description of this package's evolution. Entries are in reverse chronological order (most recent first). -## 4.5.2 - TBD +## 4.5.2 - September 18, 2019 + +### Requirements + +* netCDF-C: 4.6.0 or greater + +### Changes + +* Corrected an issue where netCDF-Fortran would fail to build correctly on some platforms when the underlying `libnetcdf` lacked netCDF-4 support. See [GitHub #200](https://github.com/Unidata/netcdf-fortran/issues/200) for more information. +* Corrected an issue where cmake-specific large file tests weren't being captured by `make dist`. See [Github #198](https://github.com/Unidata/netcdf-fortran/issues/198) for more details. ## 4.5.1 - September 4, 2019 ===================================== config.guess ===================================== @@ -2,7 +2,7 @@ # Attempt to guess a canonical system name. # Copyright 1992-2018 Free Software Foundation, Inc. -timestamp='2018-03-08' +timestamp='2018-02-24' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -1046,7 +1046,11 @@ EOF echo "$UNAME_MACHINE"-dec-linux-"$LIBC" exit ;; x86_64:Linux:*:*) - echo "$UNAME_MACHINE"-pc-linux-"$LIBC" + if objdump -f /bin/sh | grep -q elf32-x86-64; then + echo "$UNAME_MACHINE"-pc-linux-"$LIBC"x32 + else + echo "$UNAME_MACHINE"-pc-linux-"$LIBC" + fi exit ;; xtensa*:Linux:*:*) echo "$UNAME_MACHINE"-unknown-linux-"$LIBC" @@ -1469,7 +1473,7 @@ EOF exit 1 # Local variables: -# eval: (add-hook 'before-save-hook 'time-stamp) +# eval: (add-hook 'write-file-functions 'time-stamp) # time-stamp-start: "timestamp='" # time-stamp-format: "%:y-%02m-%02d" # time-stamp-end: "'" ===================================== config.sub ===================================== @@ -2,7 +2,7 @@ # Configuration validation subroutine script. # Copyright 1992-2018 Free Software Foundation, Inc. 
-timestamp='2018-03-08' +timestamp='2018-02-22' # This file is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by @@ -1376,7 +1376,7 @@ case $os in | -ekkobsd* | -kfreebsd* | -freebsd* | -riscix* | -lynxos* \ | -bosx* | -nextstep* | -cxux* | -aout* | -elf* | -oabi* \ | -ptx* | -coff* | -ecoff* | -winnt* | -domain* | -vsta* \ - | -udi* | -eabi* | -lites* | -ieee* | -go32* | -aux* | -hcos* \ + | -udi* | -eabi* | -lites* | -ieee* | -go32* | -aux* \ | -chorusos* | -chorusrdb* | -cegcc* | -glidix* \ | -cygwin* | -msys* | -pe* | -psos* | -moss* | -proelf* | -rtems* \ | -midipix* | -mingw32* | -mingw64* | -linux-gnu* | -linux-android* \ @@ -1794,7 +1794,7 @@ echo "$basic_machine$os" exit # Local variables: -# eval: (add-hook 'before-save-hook 'time-stamp) +# eval: (add-hook 'write-file-functions 'time-stamp) # time-stamp-start: "timestamp='" # time-stamp-format: "%:y-%02m-%02d" # time-stamp-end: "'" ===================================== configure ===================================== @@ -1,6 +1,6 @@ #! /bin/sh # Guess values for system-dependent variables and create Makefiles. -# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.1. +# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.2. # # Report bugs to . # @@ -590,8 +590,8 @@ MAKEFLAGS= # Identity of this package. PACKAGE_NAME='netCDF-Fortran' PACKAGE_TARNAME='netcdf-fortran' -PACKAGE_VERSION='4.5.1' -PACKAGE_STRING='netCDF-Fortran 4.5.1' +PACKAGE_VERSION='4.5.2' +PACKAGE_STRING='netCDF-Fortran 4.5.2' PACKAGE_BUGREPORT='support-netcdf at unidata.ucar.edu' PACKAGE_URL='' @@ -795,6 +795,7 @@ infodir docdir oldincludedir includedir +runstatedir localstatedir sharedstatedir sysconfdir @@ -898,6 +899,7 @@ datadir='${datarootdir}' sysconfdir='${prefix}/etc' sharedstatedir='${prefix}/com' localstatedir='${prefix}/var' +runstatedir='${localstatedir}/run' includedir='${prefix}/include' oldincludedir='/usr/include' docdir='${datarootdir}/doc/${PACKAGE_TARNAME}' @@ -1150,6 +1152,15 @@ do | -silent | --silent | --silen | --sile | --sil) silent=yes ;; + -runstatedir | --runstatedir | --runstatedi | --runstated \ + | --runstate | --runstat | --runsta | --runst | --runs \ + | --run | --ru | --r) + ac_prev=runstatedir ;; + -runstatedir=* | --runstatedir=* | --runstatedi=* | --runstated=* \ + | --runstate=* | --runstat=* | --runsta=* | --runst=* | --runs=* \ + | --run=* | --ru=* | --r=*) + runstatedir=$ac_optarg ;; + -sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb) ac_prev=sbindir ;; -sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \ @@ -1287,7 +1298,7 @@ fi for ac_var in exec_prefix prefix bindir sbindir libexecdir datarootdir \ datadir sysconfdir sharedstatedir localstatedir includedir \ oldincludedir docdir infodir htmldir dvidir pdfdir psdir \ - libdir localedir mandir + libdir localedir mandir runstatedir do eval ac_val=\$$ac_var # Remove trailing slashes. @@ -1400,7 +1411,7 @@ if test "$ac_init_help" = "long"; then # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures netCDF-Fortran 4.5.1 to adapt to many kinds of systems. +\`configure' configures netCDF-Fortran 4.5.2 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... 
@@ -1440,6 +1451,7 @@ Fine tuning of the installation directories: --sysconfdir=DIR read-only single-machine data [PREFIX/etc] --sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com] --localstatedir=DIR modifiable single-machine data [PREFIX/var] + --runstatedir=DIR modifiable per-process data [LOCALSTATEDIR/run] --libdir=DIR object code libraries [EPREFIX/lib] --includedir=DIR C header files [PREFIX/include] --oldincludedir=DIR C header files for non-gcc [/usr/include] @@ -1471,7 +1483,7 @@ fi if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of netCDF-Fortran 4.5.1:";; + short | recursive ) echo "Configuration of netCDF-Fortran 4.5.2:";; esac cat <<\_ACEOF @@ -1623,7 +1635,7 @@ fi test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -netCDF-Fortran configure 4.5.1 +netCDF-Fortran configure 4.5.2 generated by GNU Autoconf 2.69 Copyright (C) 2012 Free Software Foundation, Inc. @@ -2428,7 +2440,7 @@ cat >config.log <<_ACEOF This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by netCDF-Fortran $as_me 4.5.1, which was +It was created by netCDF-Fortran $as_me 4.5.2, which was generated by GNU Autoconf 2.69. Invocation command line was $ $0 $@ @@ -2779,11 +2791,11 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu # Create the VERSION file, which contains the package version from # AC_INIT. -echo -n 4.5.1>VERSION +echo -n 4.5.2>VERSION -{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.1" >&5 -$as_echo "$as_me: netCDF-Fortran 4.5.1" >&6;} +{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.2" >&5 +$as_echo "$as_me: netCDF-Fortran 4.5.2" >&6;} # Keep libtool macros in an m4 directory. @@ -3418,7 +3430,7 @@ fi # Define the identity of the package. PACKAGE='netcdf-fortran' - VERSION='4.5.1' + VERSION='4.5.2' cat >>confdefs.h <<_ACEOF @@ -3534,7 +3546,6 @@ fi MAINT=$MAINTAINER_MODE_TRUE - { $as_echo "$as_me:${as_lineno-$LINENO}: checking user options" >&5 $as_echo "$as_me: checking user options" >&6;} @@ -8216,7 +8227,7 @@ linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) lt_cv_deplibs_check_method=pass_all ;; -netbsd*) +netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so\.[0-9]+\.[0-9]+|_pic\.a)$' else @@ -9079,11 +9090,8 @@ _LT_EOF test $ac_status = 0; }; then # Now try to grab the symbols. nlist=conftest.nm - if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist\""; } >&5 - (eval $NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist) 2>&5 - ac_status=$? - $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } && test -s "$nlist"; then + $ECHO "$as_me:$LINENO: $NM conftest.$ac_objext | $lt_cv_sys_global_symbol_pipe > $nlist" >&5 + if eval "$NM" conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist 2>&5 && test -s "$nlist"; then # Try sorting and uniquifying the output. if sort "$nlist" | uniq > "$nlist"T; then mv -f "$nlist"T "$nlist" @@ -11443,6 +11451,12 @@ lt_prog_compiler_static= lt_prog_compiler_pic='-KPIC' lt_prog_compiler_static='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fPIC' + lt_prog_compiler_static='-static' + ;; # icc used to be incompatible with GCC. 
# ICC 10 doesn't accept -KPIC any more. icc* | ifort*) @@ -11919,6 +11933,9 @@ $as_echo_n "checking whether the $compiler linker ($LD) supports shared librarie openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs=no + ;; esac ld_shlibs=yes @@ -12173,7 +12190,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -12843,6 +12860,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } if test yes = "$lt_cv_irix_exported_symbol"; then archive_expsym_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + link_all_deplibs=no else archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' archive_expsym_cmds='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -12864,7 +12882,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -13979,6 +13997,18 @@ fi dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -15294,6 +15324,12 @@ lt_prog_compiler_static_F77= lt_prog_compiler_pic_F77='-KPIC' lt_prog_compiler_static_F77='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + lt_prog_compiler_wl_F77='-Wl,' + lt_prog_compiler_pic_F77='-fPIC' + lt_prog_compiler_static_F77='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. 
icc* | ifort*) @@ -15755,6 +15791,9 @@ $as_echo_n "checking whether the $compiler linker ($LD) supports shared librarie openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs_F77=no + ;; esac ld_shlibs_F77=yes @@ -16009,7 +16048,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_F77='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -16629,6 +16668,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } if test yes = "$lt_cv_irix_exported_symbol"; then archive_expsym_cmds_F77='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + link_all_deplibs_F77=no else archive_cmds_F77='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' archive_expsym_cmds_F77='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -16650,7 +16690,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_F77='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -17587,6 +17627,18 @@ fi dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -18403,6 +18455,12 @@ lt_prog_compiler_static_FC= lt_prog_compiler_pic_FC='-KPIC' lt_prog_compiler_static_FC='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + lt_prog_compiler_wl_FC='-Wl,' + lt_prog_compiler_pic_FC='-fPIC' + lt_prog_compiler_static_FC='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. 
icc* | ifort*) @@ -18864,6 +18922,9 @@ $as_echo_n "checking whether the $compiler linker ($LD) supports shared librarie openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs_FC=no + ;; esac ld_shlibs_FC=yes @@ -19118,7 +19179,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_FC='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -19738,6 +19799,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } if test yes = "$lt_cv_irix_exported_symbol"; then archive_expsym_cmds_FC='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + link_all_deplibs_FC=no else archive_cmds_FC='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' archive_expsym_cmds_FC='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -19759,7 +19821,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_FC='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -20696,6 +20758,18 @@ fi dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -21074,7 +21148,7 @@ else We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -21120,7 +21194,7 @@ else We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -21144,7 +21218,7 @@ rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 
1 : -1]; @@ -21189,7 +21263,7 @@ else We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -21213,7 +21287,7 @@ rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -22765,12 +22839,12 @@ nc_build_v4=no nc_has_logging=no nc_has_dap=no -for ac_func in nc_def_opaque +for ac_func in nc_get_chunk_cache_ints do : - ac_fn_c_check_func "$LINENO" "nc_def_opaque" "ac_cv_func_nc_def_opaque" -if test "x$ac_cv_func_nc_def_opaque" = xyes; then : + ac_fn_c_check_func "$LINENO" "nc_get_chunk_cache_ints" "ac_cv_func_nc_get_chunk_cache_ints" +if test "x$ac_cv_func_nc_get_chunk_cache_ints" = xyes; then : cat >>confdefs.h <<_ACEOF -#define HAVE_NC_DEF_OPAQUE 1 +#define HAVE_NC_GET_CHUNK_CACHE_INTS 1 _ACEOF nc_build_v4=yes else @@ -24641,7 +24715,7 @@ cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by netCDF-Fortran $as_me 4.5.1, which was +This file was extended by netCDF-Fortran $as_me 4.5.2, which was generated by GNU Autoconf 2.69. Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -24702,7 +24776,7 @@ _ACEOF cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ -netCDF-Fortran config.status 4.5.1 +netCDF-Fortran config.status 4.5.2 configured by $0, generated by GNU Autoconf 2.69, with options \\"\$ac_cs_config\\" @@ -25875,7 +25949,6 @@ See \`config.log' for more details" "$LINENO" 5; } cat <<_LT_EOF >> "$cfgfile" #! $SHELL # Generated automatically by $as_me ($PACKAGE) $VERSION -# Libtool was configured on host `(hostname || uname -n) 2>/dev/null | sed 1q`: # NOTE: Changes made to this file will be lost: look at ltmain.sh. # Provide generalized library-building support services. ===================================== configure.ac ===================================== @@ -9,7 +9,7 @@ AC_PREREQ([2.59]) # Initialize with name, version, and support email address. -AC_INIT([netCDF-Fortran], [4.5.1], [support-netcdf at unidata.ucar.edu]) +AC_INIT([netCDF-Fortran], [4.5.2], [support-netcdf at unidata.ucar.edu]) # Create the VERSION file, which contains the package version from # AC_INIT. @@ -30,7 +30,6 @@ AC_CANONICAL_TARGET # This call is required by automake. 
AM_INIT_AUTOMAKE([foreign dist-zip subdir-objects]) AM_MAINTAINER_MODE() - AC_MSG_NOTICE([checking user options]) AC_COMPILE_IFELSE([AC_LANG_PROGRAM([], [[ @@ -396,7 +395,7 @@ nc_build_v4=no nc_has_logging=no nc_has_dap=no -AC_CHECK_FUNCS([nc_def_opaque],[nc_build_v4=yes],[nc_build_v4=no]) +AC_CHECK_FUNCS([nc_get_chunk_cache_ints],[nc_build_v4=yes],[nc_build_v4=no]) test "x$ac_cv_func_nccreate" = xyes && nc_build_v2=yes test "x$ac_cv_func_nc_set_log_level" = xyes && nc_has_logging=yes test "x$ac_cv_func_oc_open" = xyes && nc_has_dap=yes ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +netcdf-fortran (4.5.2+ds-1~exp1) experimental; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Thu, 19 Sep 2019 07:09:46 +0200 + netcdf-fortran (4.5.1+ds-1~exp1) experimental; urgency=medium * New upstream release. ===================================== docs/Doxyfile.developer ===================================== @@ -38,13 +38,13 @@ PROJECT_NAME = netcdf-fortran # could be handy for archiving the generated documentation or if some version # control system is used. -PROJECT_NUMBER = 4.5.0-Development +PROJECT_NUMBER = 4.5.2 # Using the PROJECT_BRIEF tag one can provide an optional one line description # for a project that appears at the top of each page and should give viewer a # quick idea about the purpose of the project. Keep the description short. -PROJECT_BRIEF = +PROJECT_BRIEF = # With the PROJECT_LOGO tag one can specify a logo or an icon that is included # in the documentation. The maximum height of the logo should not exceed 55 @@ -58,7 +58,7 @@ PROJECT_LOGO = docs/netcdf-50x50.png # entered, it will be relative to the location where doxygen was started. If # left blank the current directory will be used. -OUTPUT_DIRECTORY = +OUTPUT_DIRECTORY = # If the CREATE_SUBDIRS tag is set to YES then doxygen will create 4096 sub- # directories (in 2 levels) under the output directory of each output format and @@ -118,7 +118,7 @@ REPEAT_BRIEF = YES # the entity):The $name class, The $name widget, The $name file, is, provides, # specifies, contains, represents, a, an and the. -ABBREVIATE_BRIEF = +ABBREVIATE_BRIEF = # If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then # doxygen will generate a detailed section even if there is only a brief @@ -152,7 +152,7 @@ FULL_PATH_NAMES = YES # will be relative from the directory where doxygen is started. # This tag requires that the tag FULL_PATH_NAMES is set to YES. -STRIP_FROM_PATH = +STRIP_FROM_PATH = # The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of the # path mentioned in the documentation of a class, which tells the reader which @@ -161,7 +161,7 @@ STRIP_FROM_PATH = # specify the list of include paths that are normally passed to the compiler # using the -I flag. -STRIP_FROM_INC_PATH = +STRIP_FROM_INC_PATH = # If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter (but # less readable) file names. This can be useful is your file systems doesn't @@ -228,13 +228,13 @@ TAB_SIZE = 4 # "Side Effects:". You can put \n's in the value part of an alias to insert # newlines. -ALIASES = +ALIASES = # This tag can be used to specify a number of word-keyword mappings (TCL only). # A mapping has the form "name=value". For example adding "class=itcl::class" # will allow you to use the command class in the itcl::class meaning. -TCL_SUBST = +TCL_SUBST = # Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C sources # only. 
Doxygen will then generate output that is more tailored for C. For @@ -281,7 +281,7 @@ OPTIMIZE_OUTPUT_VHDL = NO # Note that for custom extensions you also need to set FILE_PATTERNS otherwise # the files are not read by doxygen. -EXTENSION_MAPPING = +EXTENSION_MAPPING = # If the MARKDOWN_SUPPORT tag is enabled then doxygen pre-processes all comments # according to the Markdown format, which allows for more readable @@ -629,7 +629,7 @@ GENERATE_DEPRECATEDLIST= YES # sections, marked by \if ... \endif and \cond # ... \endcond blocks. -ENABLED_SECTIONS = +ENABLED_SECTIONS = # The MAX_INITIALIZER_LINES tag determines the maximum number of lines that the # initial value of a variable or macro / define can have for it to appear in the @@ -671,7 +671,7 @@ SHOW_NAMESPACES = YES # by doxygen. Whatever the program writes to standard output is used as the file # version. For an example see the documentation. -FILE_VERSION_FILTER = +FILE_VERSION_FILTER = # The LAYOUT_FILE tag can be used to specify a layout file which will be parsed # by doxygen. The layout file controls the global structure of the generated @@ -684,7 +684,7 @@ FILE_VERSION_FILTER = # DoxygenLayout.xml, doxygen will parse it automatically even if the LAYOUT_FILE # tag is left empty. -LAYOUT_FILE = +LAYOUT_FILE = # The CITE_BIB_FILES tag can be used to specify one or more bib files containing # the reference definitions. This must be a list of .bib files. The .bib @@ -694,7 +694,7 @@ LAYOUT_FILE = # LATEX_BIB_STYLE. To use this feature you need bibtex and perl available in the # search path. See also \cite for info how to create references. -CITE_BIB_FILES = +CITE_BIB_FILES = #--------------------------------------------------------------------------- # Configuration options related to warning and progress messages @@ -753,7 +753,7 @@ WARN_FORMAT = "$file:$line: $text" # messages should be written. If left blank the output is written to standard # error (stderr). -WARN_LOGFILE = +WARN_LOGFILE = #--------------------------------------------------------------------------- # Configuration options related to the input files @@ -810,7 +810,7 @@ RECURSIVE = YES # Note that relative paths are relative to the directory from which doxygen is # run. -EXCLUDE = +EXCLUDE = # The EXCLUDE_SYMLINKS tag can be used to select whether or not files or # directories that are symbolic links (a Unix file system feature) are excluded @@ -826,7 +826,7 @@ EXCLUDE_SYMLINKS = NO # Note that the wildcards are matched against the file with absolute path, so to # exclude all test directories for example use the pattern */test/* -EXCLUDE_PATTERNS = +EXCLUDE_PATTERNS = # The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names # (namespaces, classes, functions, etc.) that should be excluded from the @@ -837,20 +837,20 @@ EXCLUDE_PATTERNS = # Note that the wildcards are matched against the file with absolute path, so to # exclude all test directories use the pattern */test/* -EXCLUDE_SYMBOLS = +EXCLUDE_SYMBOLS = # The EXAMPLE_PATH tag can be used to specify one or more files or directories # that contain example code fragments that are included (see the \include # command). -EXAMPLE_PATH = +EXAMPLE_PATH = # If the value of the EXAMPLE_PATH tag contains directories, you can use the # EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp and # *.h) to filter out the source-files in the directories. If left blank all # files are included. 
-EXAMPLE_PATTERNS = +EXAMPLE_PATTERNS = # If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be # searched for input files to be used with the \include or \dontinclude commands @@ -863,7 +863,7 @@ EXAMPLE_RECURSIVE = NO # that contain images that are to be included in the documentation (see the # \image command). -IMAGE_PATH = +IMAGE_PATH = # The INPUT_FILTER tag can be used to specify a program that doxygen should # invoke to filter for each input file. Doxygen will invoke the filter program @@ -880,7 +880,7 @@ IMAGE_PATH = # code is scanned, but not when the output code is generated. If lines are added # or removed, the anchors will not be placed correctly. -INPUT_FILTER = +INPUT_FILTER = # The FILTER_PATTERNS tag can be used to specify filters on a per file pattern # basis. Doxygen will compare the file name with each pattern and apply the @@ -889,7 +889,7 @@ INPUT_FILTER = # filters are used. If the FILTER_PATTERNS tag is empty or if none of the # patterns match the file name, INPUT_FILTER is applied. -FILTER_PATTERNS = +FILTER_PATTERNS = # If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using # INPUT_FILTER) will also be used to filter the input files that are used for @@ -904,7 +904,7 @@ FILTER_SOURCE_FILES = NO # *.ext= (so without naming a filter). # This tag requires that the tag FILTER_SOURCE_FILES is set to YES. -FILTER_SOURCE_PATTERNS = +FILTER_SOURCE_PATTERNS = # If the USE_MDFILE_AS_MAINPAGE tag refers to the name of a markdown file that # is part of the input, its contents will be placed on the main page @@ -1016,7 +1016,7 @@ CLANG_ASSISTED_PARSING = NO # specified with INPUT and INCLUDE_PATH. # This tag requires that the tag CLANG_ASSISTED_PARSING is set to YES. -CLANG_OPTIONS = +CLANG_OPTIONS = #--------------------------------------------------------------------------- # Configuration options related to the alphabetical class index @@ -1042,7 +1042,7 @@ COLS_IN_ALPHA_INDEX = 5 # while generating the index headers. # This tag requires that the tag ALPHABETICAL_INDEX is set to YES. -IGNORE_PREFIX = +IGNORE_PREFIX = #--------------------------------------------------------------------------- # Configuration options related to the HTML output @@ -1086,7 +1086,7 @@ HTML_FILE_EXTENSION = .html # of the possible markers and block names see the documentation. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_HEADER = +HTML_HEADER = # The HTML_FOOTER tag can be used to specify a user-defined HTML footer for each # generated HTML page. If the tag is left blank doxygen will generate a standard @@ -1096,7 +1096,7 @@ HTML_HEADER = # that doxygen normally uses. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_FOOTER = +HTML_FOOTER = # The HTML_STYLESHEET tag can be used to specify a user-defined cascading style # sheet that is used by each HTML page. It can be used to fine-tune the look of @@ -1108,7 +1108,7 @@ HTML_FOOTER = # obsolete. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_STYLESHEET = +HTML_STYLESHEET = # The HTML_EXTRA_STYLESHEET tag can be used to specify additional user-defined # cascading style sheets that are included after the standard style sheets @@ -1121,7 +1121,7 @@ HTML_STYLESHEET = # list). For an example see the documentation. # This tag requires that the tag GENERATE_HTML is set to YES. 
-HTML_EXTRA_STYLESHEET = +HTML_EXTRA_STYLESHEET = # The HTML_EXTRA_FILES tag can be used to specify one or more extra images or # other source files which should be copied to the HTML output directory. Note @@ -1131,7 +1131,7 @@ HTML_EXTRA_STYLESHEET = # files will be copied as-is; there are no commands or markers available. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_EXTRA_FILES = +HTML_EXTRA_FILES = # The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen # will adjust the colors in the style sheet and background images according to @@ -1260,7 +1260,7 @@ GENERATE_HTMLHELP = NO # written to the html output directory. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. -CHM_FILE = +CHM_FILE = # The HHC_LOCATION tag can be used to specify the location (absolute path # including file name) of the HTML help compiler (hhc.exe). If non-empty, @@ -1268,7 +1268,7 @@ CHM_FILE = # The file has to be specified with full path. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. -HHC_LOCATION = +HHC_LOCATION = # The GENERATE_CHI flag controls if a separate .chi index file is generated # (YES) or that it should be included in the master .chm file (NO). @@ -1281,7 +1281,7 @@ GENERATE_CHI = NO # and project file content. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. -CHM_INDEX_ENCODING = +CHM_INDEX_ENCODING = # The BINARY_TOC flag controls whether a binary table of contents is generated # (YES) or a normal table of contents (NO) in the .chm file. Furthermore it @@ -1312,7 +1312,7 @@ GENERATE_QHP = NO # the HTML output folder. # This tag requires that the tag GENERATE_QHP is set to YES. -QCH_FILE = +QCH_FILE = # The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help # Project output. For more information please see Qt Help Project / Namespace @@ -1337,7 +1337,7 @@ QHP_VIRTUAL_FOLDER = doc # filters). # This tag requires that the tag GENERATE_QHP is set to YES. -QHP_CUST_FILTER_NAME = +QHP_CUST_FILTER_NAME = # The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the # custom filter to add. For more information please see Qt Help Project / Custom @@ -1345,21 +1345,21 @@ QHP_CUST_FILTER_NAME = # filters). # This tag requires that the tag GENERATE_QHP is set to YES. -QHP_CUST_FILTER_ATTRS = +QHP_CUST_FILTER_ATTRS = # The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this # project's filter section matches. Qt Help Project / Filter Attributes (see: # http://qt-project.org/doc/qt-4.8/qthelpproject.html#filter-attributes). # This tag requires that the tag GENERATE_QHP is set to YES. -QHP_SECT_FILTER_ATTRS = +QHP_SECT_FILTER_ATTRS = # The QHG_LOCATION tag can be used to specify the location of Qt's # qhelpgenerator. If non-empty doxygen will try to run qhelpgenerator on the # generated .qhp file. # This tag requires that the tag GENERATE_QHP is set to YES. -QHG_LOCATION = +QHG_LOCATION = # If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files will be # generated, together with the HTML files, they form an Eclipse help plugin. To @@ -1492,7 +1492,7 @@ MATHJAX_RELPATH = http://cdn.mathjax.org/mathjax/latest # MATHJAX_EXTENSIONS = TeX/AMSmath TeX/AMSsymbols # This tag requires that the tag USE_MATHJAX is set to YES. -MATHJAX_EXTENSIONS = +MATHJAX_EXTENSIONS = # The MATHJAX_CODEFILE tag can be used to specify a file with javascript pieces # of code that will be used on startup of the MathJax code. 
See the MathJax site @@ -1500,7 +1500,7 @@ MATHJAX_EXTENSIONS = # example see the documentation. # This tag requires that the tag USE_MATHJAX is set to YES. -MATHJAX_CODEFILE = +MATHJAX_CODEFILE = # When the SEARCHENGINE tag is enabled doxygen will generate a search box for # the HTML output. The underlying search engine uses javascript and DHTML and @@ -1560,7 +1560,7 @@ EXTERNAL_SEARCH = NO # Searching" for details. # This tag requires that the tag SEARCHENGINE is set to YES. -SEARCHENGINE_URL = +SEARCHENGINE_URL = # When SERVER_BASED_SEARCH and EXTERNAL_SEARCH are both enabled the unindexed # search data is written to a file for indexing by an external tool. With the @@ -1576,7 +1576,7 @@ SEARCHDATA_FILE = searchdata.xml # projects and redirect the results back to the right project. # This tag requires that the tag SEARCHENGINE is set to YES. -EXTERNAL_SEARCH_ID = +EXTERNAL_SEARCH_ID = # The EXTRA_SEARCH_MAPPINGS tag can be used to enable searching through doxygen # projects other than the one defined by this configuration file, but that are @@ -1586,7 +1586,7 @@ EXTERNAL_SEARCH_ID = # EXTRA_SEARCH_MAPPINGS = tagname1=loc1 tagname2=loc2 ... # This tag requires that the tag SEARCHENGINE is set to YES. -EXTRA_SEARCH_MAPPINGS = +EXTRA_SEARCH_MAPPINGS = #--------------------------------------------------------------------------- # Configuration options related to the LaTeX output @@ -1650,7 +1650,7 @@ PAPER_TYPE = a4 # If left blank no extra packages will be included. # This tag requires that the tag GENERATE_LATEX is set to YES. -EXTRA_PACKAGES = +EXTRA_PACKAGES = # The LATEX_HEADER tag can be used to specify a personal LaTeX header for the # generated LaTeX document. The header should contain everything until the first @@ -1666,7 +1666,7 @@ EXTRA_PACKAGES = # to HTML_HEADER. # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_HEADER = +LATEX_HEADER = # The LATEX_FOOTER tag can be used to specify a personal LaTeX footer for the # generated LaTeX document. The footer should contain everything after the last @@ -1677,7 +1677,7 @@ LATEX_HEADER = # Note: Only use a user-defined footer if you know what you are doing! # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_FOOTER = +LATEX_FOOTER = # The LATEX_EXTRA_STYLESHEET tag can be used to specify additional user-defined # LaTeX style sheets that are included after the standard style sheets created @@ -1688,7 +1688,7 @@ LATEX_FOOTER = # list). # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_EXTRA_STYLESHEET = +LATEX_EXTRA_STYLESHEET = # The LATEX_EXTRA_FILES tag can be used to specify one or more extra images or # other source files which should be copied to the LATEX_OUTPUT output @@ -1696,7 +1696,7 @@ LATEX_EXTRA_STYLESHEET = # markers available. # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_EXTRA_FILES = +LATEX_EXTRA_FILES = # If the PDF_HYPERLINKS tag is set to YES, the LaTeX that is generated is # prepared for conversion to PDF (using ps2pdf or pdflatex). The PDF file will @@ -1796,14 +1796,14 @@ RTF_HYPERLINKS = NO # default style sheet that doxygen normally uses. # This tag requires that the tag GENERATE_RTF is set to YES. -RTF_STYLESHEET_FILE = +RTF_STYLESHEET_FILE = # Set optional variables used in the generation of an RTF document. Syntax is # similar to doxygen's config file. A template extensions file can be generated # using doxygen -e rtf extensionFile. # This tag requires that the tag GENERATE_RTF is set to YES. 
-RTF_EXTENSIONS_FILE = +RTF_EXTENSIONS_FILE = # If the RTF_SOURCE_CODE tag is set to YES then doxygen will include source code # with syntax highlighting in the RTF output. @@ -1848,7 +1848,7 @@ MAN_EXTENSION = .3 # MAN_EXTENSION with the initial . removed. # This tag requires that the tag GENERATE_MAN is set to YES. -MAN_SUBDIR = +MAN_SUBDIR = # If the MAN_LINKS tag is set to YES and doxygen generates man output, then it # will generate one additional man file for each entity documented in the real @@ -1961,7 +1961,7 @@ PERLMOD_PRETTY = YES # overwrite each other's variables. # This tag requires that the tag GENERATE_PERLMOD is set to YES. -PERLMOD_MAKEVAR_PREFIX = +PERLMOD_MAKEVAR_PREFIX = #--------------------------------------------------------------------------- # Configuration options related to the preprocessor @@ -2002,7 +2002,7 @@ SEARCH_INCLUDES = YES # preprocessor. # This tag requires that the tag SEARCH_INCLUDES is set to YES. -INCLUDE_PATH = +INCLUDE_PATH = # You can use the INCLUDE_FILE_PATTERNS tag to specify one or more wildcard # patterns (like *.h and *.hpp) to filter out the header-files in the @@ -2010,7 +2010,7 @@ INCLUDE_PATH = # used. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. -INCLUDE_FILE_PATTERNS = +INCLUDE_FILE_PATTERNS = # The PREDEFINED tag can be used to specify one or more macro names that are # defined before the preprocessor is started (similar to the -D option of e.g. @@ -2020,7 +2020,7 @@ INCLUDE_FILE_PATTERNS = # recursively expanded use the := operator instead of the = operator. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. -PREDEFINED = +PREDEFINED = # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this # tag can be used to specify a list of macro names that should be expanded. The @@ -2029,7 +2029,7 @@ PREDEFINED = # definition found in the source code. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. -EXPAND_AS_DEFINED = +EXPAND_AS_DEFINED = # If the SKIP_FUNCTION_MACROS tag is set to YES then doxygen's preprocessor will # remove all references to function-like macros that are alone on a line, have @@ -2058,13 +2058,13 @@ SKIP_FUNCTION_MACROS = YES # the path). If a tag file is not located in the directory in which doxygen is # run, you must also specify the path to the tagfile here. -TAGFILES = +TAGFILES = # When a file name is specified after GENERATE_TAGFILE, doxygen will create a # tag file that is based on the input files it reads. See section "Linking to # external documentation" for more information about the usage of tag files. -GENERATE_TAGFILE = +GENERATE_TAGFILE = # If the ALLEXTERNALS tag is set to YES, all external class will be listed in # the class index. If set to NO, only the inherited external classes will be @@ -2113,14 +2113,14 @@ CLASS_DIAGRAMS = YES # the mscgen tool resides. If left empty the tool is assumed to be found in the # default search path. -MSCGEN_PATH = +MSCGEN_PATH = # You can include diagrams made with dia in doxygen documentation. Doxygen will # then run dia to produce the diagram and insert it in the documentation. The # DIA_PATH tag allows you to specify the directory where the dia binary resides. # If left empty dia is assumed to be found in the default search path. -DIA_PATH = +DIA_PATH = # If set to YES the inheritance and collaboration graphs will hide inheritance # and usage relations if the target is undocumented or is not a class. 
@@ -2169,7 +2169,7 @@ DOT_FONTSIZE = 10 # the path where dot can find it using this tag. # This tag requires that the tag HAVE_DOT is set to YES. -DOT_FONTPATH = +DOT_FONTPATH = # If the CLASS_GRAPH tag is set to YES then doxygen will generate a graph for # each documented class showing the direct and indirect inheritance relations. @@ -2313,26 +2313,26 @@ INTERACTIVE_SVG = NO # found. If left blank, it is assumed the dot tool can be found in the path. # This tag requires that the tag HAVE_DOT is set to YES. -DOT_PATH = +DOT_PATH = # The DOTFILE_DIRS tag can be used to specify one or more directories that # contain dot files that are included in the documentation (see the \dotfile # command). # This tag requires that the tag HAVE_DOT is set to YES. -DOTFILE_DIRS = +DOTFILE_DIRS = # The MSCFILE_DIRS tag can be used to specify one or more directories that # contain msc files that are included in the documentation (see the \mscfile # command). -MSCFILE_DIRS = +MSCFILE_DIRS = # The DIAFILE_DIRS tag can be used to specify one or more directories that # contain dia files that are included in the documentation (see the \diafile # command). -DIAFILE_DIRS = +DIAFILE_DIRS = # When using plantuml, the PLANTUML_JAR_PATH tag should be used to specify the # path where java can find the plantuml.jar file. If left blank, it is assumed @@ -2340,12 +2340,12 @@ DIAFILE_DIRS = # generate a warning when it encounters a \startuml command in this case and # will not generate output for the diagram. -PLANTUML_JAR_PATH = +PLANTUML_JAR_PATH = # When using plantuml, the specified paths are searched for files specified by # the !include statement in a plantuml block. -PLANTUML_INCLUDE_PATH = +PLANTUML_INCLUDE_PATH = # The DOT_GRAPH_MAX_NODES tag can be used to set the maximum number of nodes # that will be shown in the graph. 
If the number of nodes in a graph becomes ===================================== docs/Makefile.in ===================================== @@ -303,6 +303,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== examples/F77/Makefile.in ===================================== @@ -560,6 +560,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== examples/F90/Makefile.in ===================================== @@ -624,6 +624,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== examples/Makefile.in ===================================== @@ -325,6 +325,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== fortran/Makefile.in ===================================== @@ -494,6 +494,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== libsrc/Makefile.in ===================================== @@ -267,6 +267,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== ltmain.sh ===================================== @@ -31,7 +31,7 @@ PROGRAM=libtool PACKAGE=libtool -VERSION=2.4.6 +VERSION="2.4.6 Debian-2.4.6-10" package_revision=2.4.6 @@ -1370,7 +1370,7 @@ func_lt_ver () #! /bin/sh # Set a version string for this script. -scriptversion=2014-01-07.03; # UTC +scriptversion=2015-10-07.11; # UTC # A portable, pluggable option parser for Bourne shell. # Written by Gary V. Vaughan, 2010 @@ -1530,6 +1530,8 @@ func_run_hooks () { $debug_cmd + _G_rc_run_hooks=false + case " $hookable_fns " in *" $1 "*) ;; *) func_fatal_error "'$1' does not support hook funcions.n" ;; @@ -1538,16 +1540,16 @@ func_run_hooks () eval _G_hook_fns=\$$1_hooks; shift for _G_hook in $_G_hook_fns; do - eval $_G_hook '"$@"' - - # store returned options list back into positional - # parameters for next 'cmd' execution. - eval _G_hook_result=\$${_G_hook}_result - eval set dummy "$_G_hook_result"; shift + if eval $_G_hook '"$@"'; then + # store returned options list back into positional + # parameters for next 'cmd' execution. 
+ eval _G_hook_result=\$${_G_hook}_result + eval set dummy "$_G_hook_result"; shift + _G_rc_run_hooks=: + fi done - func_quote_for_eval ${1+"$@"} - func_run_hooks_result=$func_quote_for_eval_result + $_G_rc_run_hooks && func_run_hooks_result=$_G_hook_result } @@ -1557,10 +1559,16 @@ func_run_hooks () ## --------------- ## # In order to add your own option parsing hooks, you must accept the -# full positional parameter list in your hook function, remove any -# options that you action, and then pass back the remaining unprocessed +# full positional parameter list in your hook function, you may remove/edit +# any options that you action, and then pass back the remaining unprocessed # options in '_result', escaped suitably for -# 'eval'. Like this: +# 'eval'. In this case you also must return $EXIT_SUCCESS to let the +# hook's caller know that it should pay attention to +# '_result'. Returning $EXIT_FAILURE signalizes that +# arguments are left untouched by the hook and therefore caller will ignore the +# result variable. +# +# Like this: # # my_options_prep () # { @@ -1570,9 +1578,11 @@ func_run_hooks () # usage_message=$usage_message' # -s, --silent don'\''t print informational messages # ' -# -# func_quote_for_eval ${1+"$@"} -# my_options_prep_result=$func_quote_for_eval_result +# # No change in '$@' (ignored completely by this hook). There is +# # no need to do the equivalent (but slower) action: +# # func_quote_for_eval ${1+"$@"} +# # my_options_prep_result=$func_quote_for_eval_result +# false # } # func_add_hook func_options_prep my_options_prep # @@ -1581,25 +1591,37 @@ func_run_hooks () # { # $debug_cmd # +# args_changed=false +# # # Note that for efficiency, we parse as many options as we can # # recognise in a loop before passing the remainder back to the # # caller on the first unrecognised argument we encounter. # while test $# -gt 0; do # opt=$1; shift # case $opt in -# --silent|-s) opt_silent=: ;; +# --silent|-s) opt_silent=: +# args_changed=: +# ;; # # Separate non-argument short options: # -s*) func_split_short_opt "$_G_opt" # set dummy "$func_split_short_opt_name" \ # "-$func_split_short_opt_arg" ${1+"$@"} # shift +# args_changed=: # ;; -# *) set dummy "$_G_opt" "$*"; shift; break ;; +# *) # Make sure the first unrecognised option "$_G_opt" +# # is added back to "$@", we could need that later +# # if $args_changed is true. +# set dummy "$_G_opt" ${1+"$@"}; shift; break ;; # esac # done # -# func_quote_for_eval ${1+"$@"} -# my_silent_option_result=$func_quote_for_eval_result +# if $args_changed; then +# func_quote_for_eval ${1+"$@"} +# my_silent_option_result=$func_quote_for_eval_result +# fi +# +# $args_changed # } # func_add_hook func_parse_options my_silent_option # @@ -1611,16 +1633,32 @@ func_run_hooks () # $opt_silent && $opt_verbose && func_fatal_help "\ # '--silent' and '--verbose' options are mutually exclusive." # -# func_quote_for_eval ${1+"$@"} -# my_option_validation_result=$func_quote_for_eval_result +# false # } # func_add_hook func_validate_options my_option_validation # -# You'll alse need to manually amend $usage_message to reflect the extra +# You'll also need to manually amend $usage_message to reflect the extra # options you parse. It's preferable to append if you can, so that # multiple option parsing hooks can be added safely. +# func_options_finish [ARG]... +# ---------------------------- +# Finishing the option parse loop (call 'func_options' hooks ATM). 
+func_options_finish () +{ + $debug_cmd + + _G_func_options_finish_exit=false + if func_run_hooks func_options ${1+"$@"}; then + func_options_finish_result=$func_run_hooks_result + _G_func_options_finish_exit=: + fi + + $_G_func_options_finish_exit +} + + # func_options [ARG]... # --------------------- # All the functions called inside func_options are hookable. See the @@ -1630,17 +1668,28 @@ func_options () { $debug_cmd - func_options_prep ${1+"$@"} - eval func_parse_options \ - ${func_options_prep_result+"$func_options_prep_result"} - eval func_validate_options \ - ${func_parse_options_result+"$func_parse_options_result"} + _G_rc_options=false + + for my_func in options_prep parse_options validate_options options_finish + do + if eval func_$my_func '${1+"$@"}'; then + eval _G_res_var='$'"func_${my_func}_result" + eval set dummy "$_G_res_var" ; shift + _G_rc_options=: + fi + done - eval func_run_hooks func_options \ - ${func_validate_options_result+"$func_validate_options_result"} + # Save modified positional parameters for caller. As a top-level + # options-parser function we always need to set the 'func_options_result' + # variable (regardless the $_G_rc_options value). + if $_G_rc_options; then + func_options_result=$_G_res_var + else + func_quote_for_eval ${1+"$@"} + func_options_result=$func_quote_for_eval_result + fi - # save modified positional parameters for caller - func_options_result=$func_run_hooks_result + $_G_rc_options } @@ -1649,9 +1698,9 @@ func_options () # All initialisations required before starting the option parse loop. # Note that when calling hook functions, we pass through the list of # positional parameters. If a hook function modifies that list, and -# needs to propogate that back to rest of this script, then the complete +# needs to propagate that back to rest of this script, then the complete # modified list must be put in 'func_run_hooks_result' before -# returning. +# returning $EXIT_SUCCESS (otherwise $EXIT_FAILURE is returned). func_hookable func_options_prep func_options_prep () { @@ -1661,10 +1710,14 @@ func_options_prep () opt_verbose=false opt_warning_types= - func_run_hooks func_options_prep ${1+"$@"} + _G_rc_options_prep=false + if func_run_hooks func_options_prep ${1+"$@"}; then + _G_rc_options_prep=: + # save modified positional parameters for caller + func_options_prep_result=$func_run_hooks_result + fi - # save modified positional parameters for caller - func_options_prep_result=$func_run_hooks_result + $_G_rc_options_prep } @@ -1678,18 +1731,20 @@ func_parse_options () func_parse_options_result= + _G_rc_parse_options=false # this just eases exit handling while test $# -gt 0; do # Defer to hook functions for initial option parsing, so they # get priority in the event of reusing an option name. - func_run_hooks func_parse_options ${1+"$@"} - - # Adjust func_parse_options positional parameters to match - eval set dummy "$func_run_hooks_result"; shift + if func_run_hooks func_parse_options ${1+"$@"}; then + eval set dummy "$func_run_hooks_result"; shift + _G_rc_parse_options=: + fi # Break out of the loop if we already parsed every option. 
test $# -gt 0 || break + _G_match_parse_options=: _G_opt=$1 shift case $_G_opt in @@ -1704,7 +1759,10 @@ func_parse_options () ;; --warnings|--warning|-W) - test $# = 0 && func_missing_arg $_G_opt && break + if test $# = 0 && func_missing_arg $_G_opt; then + _G_rc_parse_options=: + break + fi case " $warning_categories $1" in *" $1 "*) # trailing space prevents matching last $1 above @@ -1757,15 +1815,25 @@ func_parse_options () shift ;; - --) break ;; + --) _G_rc_parse_options=: ; break ;; -*) func_fatal_help "unrecognised option: '$_G_opt'" ;; - *) set dummy "$_G_opt" ${1+"$@"}; shift; break ;; + *) set dummy "$_G_opt" ${1+"$@"}; shift + _G_match_parse_options=false + break + ;; esac + + $_G_match_parse_options && _G_rc_parse_options=: done - # save modified positional parameters for caller - func_quote_for_eval ${1+"$@"} - func_parse_options_result=$func_quote_for_eval_result + + if $_G_rc_parse_options; then + # save modified positional parameters for caller + func_quote_for_eval ${1+"$@"} + func_parse_options_result=$func_quote_for_eval_result + fi + + $_G_rc_parse_options } @@ -1778,16 +1846,21 @@ func_validate_options () { $debug_cmd + _G_rc_validate_options=false + # Display all warnings if -W was not given. test -n "$opt_warning_types" || opt_warning_types=" $warning_categories" - func_run_hooks func_validate_options ${1+"$@"} + if func_run_hooks func_validate_options ${1+"$@"}; then + # save modified positional parameters for caller + func_validate_options_result=$func_run_hooks_result + _G_rc_validate_options=: + fi # Bail if the options were screwed! $exit_cmd $EXIT_FAILURE - # save modified positional parameters for caller - func_validate_options_result=$func_run_hooks_result + $_G_rc_validate_options } @@ -2068,12 +2141,12 @@ include the following information: compiler: $LTCC compiler flags: $LTCFLAGS linker: $LD (gnu? $with_gnu_ld) - version: $progname (GNU libtool) 2.4.6 + version: $progname $scriptversion Debian-2.4.6-10 automake: `($AUTOMAKE --version) 2>/dev/null |$SED 1q` autoconf: `($AUTOCONF --version) 2>/dev/null |$SED 1q` Report bugs to . -GNU libtool home page: . +GNU libtool home page: . General help using GNU software: ." exit 0 } @@ -2270,6 +2343,8 @@ libtool_options_prep () nonopt= preserve_args= + _G_rc_lt_options_prep=: + # Shorthand for --mode=foo, only valid as the first argument case $1 in clean|clea|cle|cl) @@ -2293,11 +2368,18 @@ libtool_options_prep () uninstall|uninstal|uninsta|uninst|unins|unin|uni|un|u) shift; set dummy --mode uninstall ${1+"$@"}; shift ;; + *) + _G_rc_lt_options_prep=false + ;; esac - # Pass back the list of options. - func_quote_for_eval ${1+"$@"} - libtool_options_prep_result=$func_quote_for_eval_result + if $_G_rc_lt_options_prep; then + # Pass back the list of options. + func_quote_for_eval ${1+"$@"} + libtool_options_prep_result=$func_quote_for_eval_result + fi + + $_G_rc_lt_options_prep } func_add_hook func_options_prep libtool_options_prep @@ -2309,9 +2391,12 @@ libtool_parse_options () { $debug_cmd + _G_rc_lt_parse_options=false + # Perform our own loop to consume as many options as possible in # each iteration. 
while test $# -gt 0; do + _G_match_lt_parse_options=: _G_opt=$1 shift case $_G_opt in @@ -2386,15 +2471,22 @@ libtool_parse_options () func_append preserve_args " $_G_opt" ;; - # An option not handled by this hook function: - *) set dummy "$_G_opt" ${1+"$@"}; shift; break ;; + # An option not handled by this hook function: + *) set dummy "$_G_opt" ${1+"$@"} ; shift + _G_match_lt_parse_options=false + break + ;; esac + $_G_match_lt_parse_options && _G_rc_lt_parse_options=: done + if $_G_rc_lt_parse_options; then + # save modified positional parameters for caller + func_quote_for_eval ${1+"$@"} + libtool_parse_options_result=$func_quote_for_eval_result + fi - # save modified positional parameters for caller - func_quote_for_eval ${1+"$@"} - libtool_parse_options_result=$func_quote_for_eval_result + $_G_rc_lt_parse_options } func_add_hook func_parse_options libtool_parse_options @@ -7272,10 +7364,14 @@ func_mode_link () # -tp=* Portland pgcc target processor selection # --sysroot=* for sysroot support # -O*, -g*, -flto*, -fwhopr*, -fuse-linker-plugin GCC link-time optimization + # -specs=* GCC specs files # -stdlib=* select c++ std lib with clang + # -fsanitize=* Clang/GCC memory and address sanitizer + # -fuse-ld=* Linker select flags for GCC -64|-mips[0-9]|-r[0-9][0-9]*|-xarch=*|-xtarget=*|+DA*|+DD*|-q*|-m*| \ -t[45]*|-txscale*|-p|-pg|--coverage|-fprofile-*|-F*|@*|-tp=*|--sysroot=*| \ - -O*|-g*|-flto*|-fwhopr*|-fuse-linker-plugin|-fstack-protector*|-stdlib=*) + -O*|-g*|-flto*|-fwhopr*|-fuse-linker-plugin|-fstack-protector*|-stdlib=*| \ + -specs=*|-fsanitize=*|-fuse-ld=*) func_quote_for_eval "$arg" arg=$func_quote_for_eval_result func_append compile_command " $arg" @@ -7568,7 +7664,10 @@ func_mode_link () case $pass in dlopen) libs=$dlfiles ;; dlpreopen) libs=$dlprefiles ;; - link) libs="$deplibs %DEPLIBS% $dependency_libs" ;; + link) + libs="$deplibs %DEPLIBS%" + test "X$link_all_deplibs" != Xno && libs="$libs $dependency_libs" + ;; esac fi if test lib,dlpreopen = "$linkmode,$pass"; then @@ -7887,19 +7986,19 @@ func_mode_link () # It is a libtool convenience library, so add in its objects. func_append convenience " $ladir/$objdir/$old_library" func_append old_convenience " $ladir/$objdir/$old_library" + tmp_libs= + for deplib in $dependency_libs; do + deplibs="$deplib $deplibs" + if $opt_preserve_dup_deps; then + case "$tmp_libs " in + *" $deplib "*) func_append specialdeplibs " $deplib" ;; + esac + fi + func_append tmp_libs " $deplib" + done elif test prog != "$linkmode" && test lib != "$linkmode"; then func_fatal_error "'$lib' is not a convenience library" fi - tmp_libs= - for deplib in $dependency_libs; do - deplibs="$deplib $deplibs" - if $opt_preserve_dup_deps; then - case "$tmp_libs " in - *" $deplib "*) func_append specialdeplibs " $deplib" ;; - esac - fi - func_append tmp_libs " $deplib" - done continue fi # $pass = conv @@ -8823,6 +8922,9 @@ func_mode_link () revision=$number_minor lt_irix_increment=no ;; + *) + func_fatal_configuration "$modename: unknown library version type '$version_type'" + ;; esac ;; no) ===================================== m4/libtool.m4 ===================================== @@ -728,7 +728,6 @@ _LT_CONFIG_SAVE_COMMANDS([ cat <<_LT_EOF >> "$cfgfile" #! $SHELL # Generated automatically by $as_me ($PACKAGE) $VERSION -# Libtool was configured on host `(hostname || uname -n) 2>/dev/null | sed 1q`: # NOTE: Changes made to this file will be lost: look at ltmain.sh. # Provide generalized library-building support services. 
@@ -2887,6 +2886,18 @@ linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -3546,7 +3557,7 @@ linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) lt_cv_deplibs_check_method=pass_all ;; -netbsd*) +netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so\.[[0-9]]+\.[[0-9]]+|_pic\.a)$' else @@ -4052,7 +4063,8 @@ _LT_EOF if AC_TRY_EVAL(ac_compile); then # Now try to grab the symbols. nlist=conftest.nm - if AC_TRY_EVAL(NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist) && test -s "$nlist"; then + $ECHO "$as_me:$LINENO: $NM conftest.$ac_objext | $lt_cv_sys_global_symbol_pipe > $nlist" >&AS_MESSAGE_LOG_FD + if eval "$NM" conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist 2>&AS_MESSAGE_LOG_FD && test -s "$nlist"; then # Try sorting and uniquifying the output. if sort "$nlist" | uniq > "$nlist"T; then mv -f "$nlist"T "$nlist" @@ -4424,7 +4436,7 @@ m4_if([$1], [CXX], [ ;; esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) ;; *qnx* | *nto*) # QNX uses GNU C++, but need to define -shared option too, otherwise @@ -4692,6 +4704,12 @@ m4_if([$1], [CXX], [ _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. icc* | ifort*) @@ -4936,6 +4954,9 @@ m4_if([$1], [CXX], [ ;; esac ;; + linux* | k*bsd*-gnu | gnu*) + _LT_TAGVAR(link_all_deplibs, $1)=no + ;; *) _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED '\''s/.* //'\'' | sort | uniq > $export_symbols' ;; @@ -4998,6 +5019,9 @@ dnl Note also adjust exclude_expsyms for C++ above. 
openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + _LT_TAGVAR(link_all_deplibs, $1)=no + ;; esac _LT_TAGVAR(ld_shlibs, $1)=yes @@ -5252,7 +5276,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -5773,6 +5797,7 @@ _LT_EOF if test yes = "$lt_cv_irix_exported_symbol"; then _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + _LT_TAGVAR(link_all_deplibs, $1)=no else _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -5794,7 +5819,7 @@ _LT_EOF esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -6420,7 +6445,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' else GXX=no @@ -6795,7 +6820,7 @@ if test yes != "$_lt_caught_CXX_error"; then # explicitly linking system object files so we need to strip them # from the output so that they don't get included in the library # dependencies. - output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $EGREP "\-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $EGREP " \-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' ;; *) if test yes = "$GXX"; then @@ -6860,7 +6885,7 @@ if test yes != "$_lt_caught_CXX_error"; then # explicitly linking system object files so we need to strip them # from the output so that they don't get included in the library # dependencies. 
- output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $GREP "\-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $GREP " \-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' ;; *) if test yes = "$GXX"; then @@ -7199,7 +7224,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' else # FIXME: insert proper C++ library support @@ -7283,7 +7308,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' else # g++ 2.7 appears to require '-G' NOT '-shared' on this # platform. @@ -7294,7 +7319,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -G $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -G $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' fi _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R $wl$libdir' ===================================== nf03_test/Makefile.in ===================================== @@ -547,6 +547,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== nf03_test4/CMakeLists.txt ===================================== @@ -49,10 +49,10 @@ if (USE_NETCDF4) # This is the netCDF-4 F90 large file test. IF (LARGE_FILE_TESTS) - SET(nc4_largefile_PROGRAMS tst_flarge) + SET(nc4_largefile_PROGRAMS f90tst_flarge) SET(check_PROGRAMS ${check_PROGRAMS} ${nc4_largefile_PROGRAMS}) SET(TESTS ${TESTS} ${nc4_largefile_PROGRAMS}) - SET(tst_flarge_SOURCES tst_flarge.f90) + SET(f90tst_flarge_SOURCES f90tst_flarge.f90) ENDIF(LARGE_FILE_TESTS) # This is an f90 benchmark. ===================================== nf03_test4/Makefile.am ===================================== @@ -12,7 +12,7 @@ AM_FCFLAGS = -I$(top_builddir)/fortran # All tests need to link to fortran library. LDADD = ${top_builddir}/fortran/libnetcdff.la -# tst_f90_nc4 +# tst_f90_nc4 NC4_F90_TESTS = f90tst_vars f90tst_vars_vlen f90tst_grps f90tst_fill \ f90tst_fill2 f90tst_vars2 f90tst_vars3 f90tst_vars4 f90tst_path \ f90tst_rengrps f90tst_nc4 f90tst_types f90tst_types2 @@ -65,7 +65,7 @@ endif # TEST_PARALLEL # check-valgrind target, which runs all tests with valgrind. 
@VALGRIND_CHECK_RULES@
-EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh
+EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh f90tst_flarge.f90
# Cleaning up files created during the testing.
CLEANFILES = f90tst_*.nc fort.*

===================================== nf03_test4/Makefile.in =====================================
@@ -686,6 +686,7 @@ pdfdir = @pdfdir@
prefix = @prefix@
program_transform_name = @program_transform_name@
psdir = @psdir@
+runstatedir = @runstatedir@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
srcdir = @srcdir@
@@ -707,7 +708,7 @@ AM_FCFLAGS = -I$(top_builddir)/fortran
# All tests need to link to fortran library.
LDADD = ${top_builddir}/fortran/libnetcdff.la
-# tst_f90_nc4
+# tst_f90_nc4
NC4_F90_TESTS = f90tst_vars f90tst_vars_vlen f90tst_grps f90tst_fill \
f90tst_fill2 f90tst_vars2 f90tst_vars3 f90tst_vars4 f90tst_path \
f90tst_rengrps f90tst_nc4 f90tst_types f90tst_types2
@@ -734,7 +735,7 @@ f90tst_types2_SOURCES = f90tst_types2.f90
@TEST_PARALLEL_TRUE@f90tst_parallel3_SOURCES = f90tst_parallel3.F90
@TEST_PARALLEL_TRUE@f90tst_nc4_par_SOURCES = f90tst_nc4_par.F90
@TEST_PARALLEL_TRUE@f90tst_parallel_fill_SOURCES = f90tst_parallel_fill.f90
-EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh
+EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh f90tst_flarge.f90
# Cleaning up files created during the testing.
CLEANFILES = f90tst_*.nc fort.*

===================================== nf_test/Makefile.in =====================================
@@ -644,6 +644,7 @@ pdfdir = @pdfdir@
prefix = @prefix@
program_transform_name = @program_transform_name@
psdir = @psdir@
+runstatedir = @runstatedir@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
srcdir = @srcdir@

===================================== nf_test4/Makefile.in =====================================
@@ -731,6 +731,7 @@ pdfdir = @pdfdir@
prefix = @prefix@
program_transform_name = @program_transform_name@
psdir = @psdir@
+runstatedir = @runstatedir@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
srcdir = @srcdir@


View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/compare/ded57453dfda30c4b3b33adb04b9190bc9d3ae76...36b6a33ae397bf6461d2123295fe7f6e0aa79b68

-- 
View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/compare/ded57453dfda30c4b3b33adb04b9190bc9d3ae76...36b6a33ae397bf6461d2123295fe7f6e0aa79b68
You're receiving this email because of your account on salsa.debian.org.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From gitlab at salsa.debian.org Thu Sep 19 06:19:17 2019
From: gitlab at salsa.debian.org (Bas Couwenberg)
Date: Thu, 19 Sep 2019 05:19:17 +0000
Subject: [Git][debian-gis-team/netcdf-fortran][upstream] New upstream version 4.5.2+ds
Message-ID: <5d830fd5d0769_73482ad95d91037815389e5@godard.mail>

Bas Couwenberg pushed to branch upstream at Debian GIS Project / netcdf-fortran

Commits:
32dff86c by Bas Couwenberg at 2019-09-19T05:07:49Z
New upstream version 4.5.2+ds

- - - - -

24 changed files:

- .gitignore
- CMakeExtras/Makefile.in
- CMakeLists.txt
- Makefile.in
- RELEASE_NOTES.md
- config.guess
- config.sub
- configure
- configure.ac
- docs/Doxyfile.developer
- docs/Makefile.in
- examples/F77/Makefile.in
- examples/F90/Makefile.in
- examples/Makefile.in
- fortran/Makefile.in
- libsrc/Makefile.in
- ltmain.sh
- m4/libtool.m4
- nf03_test/Makefile.in
- nf03_test4/CMakeLists.txt
- nf03_test4/Makefile.am
- nf03_test4/Makefile.in
- nf_test/Makefile.in
- nf_test4/Makefile.in

Changes:

===================================== .gitignore =====================================
@@ -3,3 +3,6 @@ build*
\#.\#
*.*~
html
+*.o
+*.tmp
+*.tmp2
\ No newline at end of file

===================================== CMakeExtras/Makefile.in =====================================
@@ -267,6 +267,7 @@ pdfdir = @pdfdir@
prefix = @prefix@
program_transform_name = @program_transform_name@
psdir = @psdir@
+runstatedir = @runstatedir@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
srcdir = @srcdir@

===================================== CMakeLists.txt =====================================
@@ -17,7 +17,7 @@ set(PACKAGE "${NC4F_CTEST_PROJECT_NAME}" CACHE STRING "")
#Project Version
SET(NC4F_VERSION_MAJOR 4)
SET(NC4F_VERSION_MINOR 5)
-SET(NC4F_VERSION_PATCH 1)
+SET(NC4F_VERSION_PATCH 2)
SET(NC4F_VERSION_NOTE "")
SET(NC4F_VERSION ${NC4F_VERSION_MAJOR}.${NC4F_VERSION_MINOR}.${NC4F_VERSION_PATCH}${NC4F_VERSION_NOTE})
SET(VERSION ${NC4F_VERSION})
@@ -394,7 +394,8 @@ MACRO(build_bin_test F ext)
ENDIF()
ENDMACRO()
-OPTION(LARGE_FILE_TESTS "Run large file tests, which are slow and take lots of disk." OFF)
+OPTION(ENABLE_LARGE_FILE_TESTS "Run large file tests, which are slow and take lots of disk." OFF)
+SET(LARGE_FILE_TESTS ${ENABLE_LARGE_FILE_TESTS})
OPTION(BUILD_BENCHMARKS "Run F90 I/O Benchmarks" OFF)
OPTION(TEST_WITH_VALGRIND "Run extra tests with valgrind" OFF)
OPTION(ENABLE_PARALLEL_TESTS "Run parallel I/O tests for F90 and F77" OFF)
@@ -540,7 +541,7 @@ ENDIF()
CHECK_FUNCTION_EXISTS(alloca HAVE_ALLOCA)
-CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_def_opaque "" USE_NETCDF4)
+CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_get_chunk_cache_ints "" USE_NETCDF4)
CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nccreate "" USE_NETCDF_V2)
CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} nc_set_log_level "" USE_LOGGING)
CHECK_LIBRARY_EXISTS(${NETCDF_C_LIBRARY} oc_open "" BUILD_DAP)

===================================== Makefile.in =====================================
@@ -382,6 +382,7 @@ pdfdir = @pdfdir@
prefix = @prefix@
program_transform_name = @program_transform_name@
psdir = @psdir@
+runstatedir = @runstatedir@
sbindir = @sbindir@
sharedstatedir = @sharedstatedir@
srcdir = @srcdir@

===================================== RELEASE_NOTES.md =====================================
@@ -6,7 +6,16 @@ Release Notes {#nf_release_notes}
This file contains a high-level description of this package's evolution. Entries are in reverse chronological order (most recent first).
-## 4.5.2 - TBD
+## 4.5.2 - September 18, 2019
+
+### Requirements
+
+* netCDF-C: 4.6.0 or greater
+
+### Changes
+
+* Corrected an issue where netCDF-Fortran would fail to build correctly on some platforms when the underlying `libnetcdf` lacked netCDF-4 support. See [GitHub #200](https://github.com/Unidata/netcdf-fortran/issues/200) for more information.
+* Corrected an issue where cmake-specific large file tests weren't being captured by `make dist`. See [Github #198](https://github.com/Unidata/netcdf-fortran/issues/198) for more details.
## 4.5.1 - September 4, 2019

===================================== config.guess =====================================
@@ -2,7 +2,7 @@
# Attempt to guess a canonical system name.
# Copyright 1992-2018 Free Software Foundation, Inc.
-timestamp='2018-03-08'
+timestamp='2018-02-24'
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
@@ -1046,7 +1046,11 @@ EOF
echo "$UNAME_MACHINE"-dec-linux-"$LIBC"
exit ;;
x86_64:Linux:*:*)
- echo "$UNAME_MACHINE"-pc-linux-"$LIBC"
+ if objdump -f /bin/sh | grep -q elf32-x86-64; then
+ echo "$UNAME_MACHINE"-pc-linux-"$LIBC"x32
+ else
+ echo "$UNAME_MACHINE"-pc-linux-"$LIBC"
+ fi
exit ;;
xtensa*:Linux:*:*)
echo "$UNAME_MACHINE"-unknown-linux-"$LIBC"
@@ -1469,7 +1473,7 @@ EOF
exit 1
# Local variables:
-# eval: (add-hook 'before-save-hook 'time-stamp)
+# eval: (add-hook 'write-file-functions 'time-stamp)
# time-stamp-start: "timestamp='"
# time-stamp-format: "%:y-%02m-%02d"
# time-stamp-end: "'"

===================================== config.sub =====================================
@@ -2,7 +2,7 @@
# Configuration validation subroutine script.
# Copyright 1992-2018 Free Software Foundation, Inc.
-timestamp='2018-03-08'
+timestamp='2018-02-22'
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
@@ -1376,7 +1376,7 @@ case $os in
| -ekkobsd* | -kfreebsd* | -freebsd* | -riscix* | -lynxos* \
| -bosx* | -nextstep* | -cxux* | -aout* | -elf* | -oabi* \
| -ptx* | -coff* | -ecoff* | -winnt* | -domain* | -vsta* \
- | -udi* | -eabi* | -lites* | -ieee* | -go32* | -aux* | -hcos* \
+ | -udi* | -eabi* | -lites* | -ieee* | -go32* | -aux* \
| -chorusos* | -chorusrdb* | -cegcc* | -glidix* \
| -cygwin* | -msys* | -pe* | -psos* | -moss* | -proelf* | -rtems* \
| -midipix* | -mingw32* | -mingw64* | -linux-gnu* | -linux-android* \
@@ -1794,7 +1794,7 @@ echo "$basic_machine$os"
exit
# Local variables:
-# eval: (add-hook 'before-save-hook 'time-stamp)
+# eval: (add-hook 'write-file-functions 'time-stamp)
# time-stamp-start: "timestamp='"
# time-stamp-format: "%:y-%02m-%02d"
# time-stamp-end: "'"

===================================== configure =====================================
@@ -1,6 +1,6 @@
#! /bin/sh
# Guess values for system-dependent variables and create Makefiles.
-# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.1.
+# Generated by GNU Autoconf 2.69 for netCDF-Fortran 4.5.2.
#
# Report bugs to .
#
@@ -590,8 +590,8 @@ MAKEFLAGS=
# Identity of this package.
PACKAGE_NAME='netCDF-Fortran' PACKAGE_TARNAME='netcdf-fortran' -PACKAGE_VERSION='4.5.1' -PACKAGE_STRING='netCDF-Fortran 4.5.1' +PACKAGE_VERSION='4.5.2' +PACKAGE_STRING='netCDF-Fortran 4.5.2' PACKAGE_BUGREPORT='support-netcdf at unidata.ucar.edu' PACKAGE_URL='' @@ -795,6 +795,7 @@ infodir docdir oldincludedir includedir +runstatedir localstatedir sharedstatedir sysconfdir @@ -898,6 +899,7 @@ datadir='${datarootdir}' sysconfdir='${prefix}/etc' sharedstatedir='${prefix}/com' localstatedir='${prefix}/var' +runstatedir='${localstatedir}/run' includedir='${prefix}/include' oldincludedir='/usr/include' docdir='${datarootdir}/doc/${PACKAGE_TARNAME}' @@ -1150,6 +1152,15 @@ do | -silent | --silent | --silen | --sile | --sil) silent=yes ;; + -runstatedir | --runstatedir | --runstatedi | --runstated \ + | --runstate | --runstat | --runsta | --runst | --runs \ + | --run | --ru | --r) + ac_prev=runstatedir ;; + -runstatedir=* | --runstatedir=* | --runstatedi=* | --runstated=* \ + | --runstate=* | --runstat=* | --runsta=* | --runst=* | --runs=* \ + | --run=* | --ru=* | --r=*) + runstatedir=$ac_optarg ;; + -sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb) ac_prev=sbindir ;; -sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \ @@ -1287,7 +1298,7 @@ fi for ac_var in exec_prefix prefix bindir sbindir libexecdir datarootdir \ datadir sysconfdir sharedstatedir localstatedir includedir \ oldincludedir docdir infodir htmldir dvidir pdfdir psdir \ - libdir localedir mandir + libdir localedir mandir runstatedir do eval ac_val=\$$ac_var # Remove trailing slashes. @@ -1400,7 +1411,7 @@ if test "$ac_init_help" = "long"; then # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures netCDF-Fortran 4.5.1 to adapt to many kinds of systems. +\`configure' configures netCDF-Fortran 4.5.2 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... @@ -1440,6 +1451,7 @@ Fine tuning of the installation directories: --sysconfdir=DIR read-only single-machine data [PREFIX/etc] --sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com] --localstatedir=DIR modifiable single-machine data [PREFIX/var] + --runstatedir=DIR modifiable per-process data [LOCALSTATEDIR/run] --libdir=DIR object code libraries [EPREFIX/lib] --includedir=DIR C header files [PREFIX/include] --oldincludedir=DIR C header files for non-gcc [/usr/include] @@ -1471,7 +1483,7 @@ fi if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of netCDF-Fortran 4.5.1:";; + short | recursive ) echo "Configuration of netCDF-Fortran 4.5.2:";; esac cat <<\_ACEOF @@ -1623,7 +1635,7 @@ fi test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -netCDF-Fortran configure 4.5.1 +netCDF-Fortran configure 4.5.2 generated by GNU Autoconf 2.69 Copyright (C) 2012 Free Software Foundation, Inc. @@ -2428,7 +2440,7 @@ cat >config.log <<_ACEOF This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by netCDF-Fortran $as_me 4.5.1, which was +It was created by netCDF-Fortran $as_me 4.5.2, which was generated by GNU Autoconf 2.69. Invocation command line was $ $0 $@ @@ -2779,11 +2791,11 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu # Create the VERSION file, which contains the package version from # AC_INIT. 
-echo -n 4.5.1>VERSION +echo -n 4.5.2>VERSION -{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.1" >&5 -$as_echo "$as_me: netCDF-Fortran 4.5.1" >&6;} +{ $as_echo "$as_me:${as_lineno-$LINENO}: netCDF-Fortran 4.5.2" >&5 +$as_echo "$as_me: netCDF-Fortran 4.5.2" >&6;} # Keep libtool macros in an m4 directory. @@ -3418,7 +3430,7 @@ fi # Define the identity of the package. PACKAGE='netcdf-fortran' - VERSION='4.5.1' + VERSION='4.5.2' cat >>confdefs.h <<_ACEOF @@ -3534,7 +3546,6 @@ fi MAINT=$MAINTAINER_MODE_TRUE - { $as_echo "$as_me:${as_lineno-$LINENO}: checking user options" >&5 $as_echo "$as_me: checking user options" >&6;} @@ -8216,7 +8227,7 @@ linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) lt_cv_deplibs_check_method=pass_all ;; -netbsd*) +netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then lt_cv_deplibs_check_method='match_pattern /lib[^/]+(\.so\.[0-9]+\.[0-9]+|_pic\.a)$' else @@ -9079,11 +9090,8 @@ _LT_EOF test $ac_status = 0; }; then # Now try to grab the symbols. nlist=conftest.nm - if { { eval echo "\"\$as_me\":${as_lineno-$LINENO}: \"$NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist\""; } >&5 - (eval $NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist) 2>&5 - ac_status=$? - $as_echo "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } && test -s "$nlist"; then + $ECHO "$as_me:$LINENO: $NM conftest.$ac_objext | $lt_cv_sys_global_symbol_pipe > $nlist" >&5 + if eval "$NM" conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist 2>&5 && test -s "$nlist"; then # Try sorting and uniquifying the output. if sort "$nlist" | uniq > "$nlist"T; then mv -f "$nlist"T "$nlist" @@ -11443,6 +11451,12 @@ lt_prog_compiler_static= lt_prog_compiler_pic='-KPIC' lt_prog_compiler_static='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + lt_prog_compiler_wl='-Wl,' + lt_prog_compiler_pic='-fPIC' + lt_prog_compiler_static='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. 
icc* | ifort*) @@ -11919,6 +11933,9 @@ $as_echo_n "checking whether the $compiler linker ($LD) supports shared librarie openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs=no + ;; esac ld_shlibs=yes @@ -12173,7 +12190,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -12843,6 +12860,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } if test yes = "$lt_cv_irix_exported_symbol"; then archive_expsym_cmds='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + link_all_deplibs=no else archive_cmds='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' archive_expsym_cmds='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -12864,7 +12882,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -13979,6 +13997,18 @@ fi dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -15294,6 +15324,12 @@ lt_prog_compiler_static_F77= lt_prog_compiler_pic_F77='-KPIC' lt_prog_compiler_static_F77='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + lt_prog_compiler_wl_F77='-Wl,' + lt_prog_compiler_pic_F77='-fPIC' + lt_prog_compiler_static_F77='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. 
icc* | ifort*) @@ -15755,6 +15791,9 @@ $as_echo_n "checking whether the $compiler linker ($LD) supports shared librarie openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs_F77=no + ;; esac ld_shlibs_F77=yes @@ -16009,7 +16048,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_F77='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -16629,6 +16668,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } if test yes = "$lt_cv_irix_exported_symbol"; then archive_expsym_cmds_F77='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + link_all_deplibs_F77=no else archive_cmds_F77='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' archive_expsym_cmds_F77='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -16650,7 +16690,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_F77='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -17587,6 +17627,18 @@ fi dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -18403,6 +18455,12 @@ lt_prog_compiler_static_FC= lt_prog_compiler_pic_FC='-KPIC' lt_prog_compiler_static_FC='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + lt_prog_compiler_wl_FC='-Wl,' + lt_prog_compiler_pic_FC='-fPIC' + lt_prog_compiler_static_FC='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. 
icc* | ifort*) @@ -18864,6 +18922,9 @@ $as_echo_n "checking whether the $compiler linker ($LD) supports shared librarie openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + link_all_deplibs_FC=no + ;; esac ld_shlibs_FC=yes @@ -19118,7 +19179,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_FC='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -19738,6 +19799,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } if test yes = "$lt_cv_irix_exported_symbol"; then archive_expsym_cmds_FC='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + link_all_deplibs_FC=no else archive_cmds_FC='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' archive_expsym_cmds_FC='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -19759,7 +19821,7 @@ $as_echo "$lt_cv_irix_exported_symbol" >&6; } esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then archive_cmds_FC='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -20696,6 +20758,18 @@ fi dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -21074,7 +21148,7 @@ else We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -21120,7 +21194,7 @@ else We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -21144,7 +21218,7 @@ rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 
1 : -1]; @@ -21189,7 +21263,7 @@ else We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -21213,7 +21287,7 @@ rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext We can't simply define LARGE_OFF_T to be 9223372036854775807, since some C++ compilers masquerading as C compilers incorrectly reject 9223372036854775807. */ -#define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62)) +#define LARGE_OFF_T ((((off_t) 1 << 31) << 31) - 1 + (((off_t) 1 << 31) << 31)) int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721 && LARGE_OFF_T % 2147483647 == 1) ? 1 : -1]; @@ -22765,12 +22839,12 @@ nc_build_v4=no nc_has_logging=no nc_has_dap=no -for ac_func in nc_def_opaque +for ac_func in nc_get_chunk_cache_ints do : - ac_fn_c_check_func "$LINENO" "nc_def_opaque" "ac_cv_func_nc_def_opaque" -if test "x$ac_cv_func_nc_def_opaque" = xyes; then : + ac_fn_c_check_func "$LINENO" "nc_get_chunk_cache_ints" "ac_cv_func_nc_get_chunk_cache_ints" +if test "x$ac_cv_func_nc_get_chunk_cache_ints" = xyes; then : cat >>confdefs.h <<_ACEOF -#define HAVE_NC_DEF_OPAQUE 1 +#define HAVE_NC_GET_CHUNK_CACHE_INTS 1 _ACEOF nc_build_v4=yes else @@ -24641,7 +24715,7 @@ cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by netCDF-Fortran $as_me 4.5.1, which was +This file was extended by netCDF-Fortran $as_me 4.5.2, which was generated by GNU Autoconf 2.69. Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -24702,7 +24776,7 @@ _ACEOF cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`" ac_cs_version="\\ -netCDF-Fortran config.status 4.5.1 +netCDF-Fortran config.status 4.5.2 configured by $0, generated by GNU Autoconf 2.69, with options \\"\$ac_cs_config\\" @@ -25875,7 +25949,6 @@ See \`config.log' for more details" "$LINENO" 5; } cat <<_LT_EOF >> "$cfgfile" #! $SHELL # Generated automatically by $as_me ($PACKAGE) $VERSION -# Libtool was configured on host `(hostname || uname -n) 2>/dev/null | sed 1q`: # NOTE: Changes made to this file will be lost: look at ltmain.sh. # Provide generalized library-building support services. ===================================== configure.ac ===================================== @@ -9,7 +9,7 @@ AC_PREREQ([2.59]) # Initialize with name, version, and support email address. -AC_INIT([netCDF-Fortran], [4.5.1], [support-netcdf at unidata.ucar.edu]) +AC_INIT([netCDF-Fortran], [4.5.2], [support-netcdf at unidata.ucar.edu]) # Create the VERSION file, which contains the package version from # AC_INIT. @@ -30,7 +30,6 @@ AC_CANONICAL_TARGET # This call is required by automake. 
AM_INIT_AUTOMAKE([foreign dist-zip subdir-objects]) AM_MAINTAINER_MODE() - AC_MSG_NOTICE([checking user options]) AC_COMPILE_IFELSE([AC_LANG_PROGRAM([], [[ @@ -396,7 +395,7 @@ nc_build_v4=no nc_has_logging=no nc_has_dap=no -AC_CHECK_FUNCS([nc_def_opaque],[nc_build_v4=yes],[nc_build_v4=no]) +AC_CHECK_FUNCS([nc_get_chunk_cache_ints],[nc_build_v4=yes],[nc_build_v4=no]) test "x$ac_cv_func_nccreate" = xyes && nc_build_v2=yes test "x$ac_cv_func_nc_set_log_level" = xyes && nc_has_logging=yes test "x$ac_cv_func_oc_open" = xyes && nc_has_dap=yes ===================================== docs/Doxyfile.developer ===================================== @@ -38,13 +38,13 @@ PROJECT_NAME = netcdf-fortran # could be handy for archiving the generated documentation or if some version # control system is used. -PROJECT_NUMBER = 4.5.0-Development +PROJECT_NUMBER = 4.5.2 # Using the PROJECT_BRIEF tag one can provide an optional one line description # for a project that appears at the top of each page and should give viewer a # quick idea about the purpose of the project. Keep the description short. -PROJECT_BRIEF = +PROJECT_BRIEF = # With the PROJECT_LOGO tag one can specify a logo or an icon that is included # in the documentation. The maximum height of the logo should not exceed 55 @@ -58,7 +58,7 @@ PROJECT_LOGO = docs/netcdf-50x50.png # entered, it will be relative to the location where doxygen was started. If # left blank the current directory will be used. -OUTPUT_DIRECTORY = +OUTPUT_DIRECTORY = # If the CREATE_SUBDIRS tag is set to YES then doxygen will create 4096 sub- # directories (in 2 levels) under the output directory of each output format and @@ -118,7 +118,7 @@ REPEAT_BRIEF = YES # the entity):The $name class, The $name widget, The $name file, is, provides, # specifies, contains, represents, a, an and the. -ABBREVIATE_BRIEF = +ABBREVIATE_BRIEF = # If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then # doxygen will generate a detailed section even if there is only a brief @@ -152,7 +152,7 @@ FULL_PATH_NAMES = YES # will be relative from the directory where doxygen is started. # This tag requires that the tag FULL_PATH_NAMES is set to YES. -STRIP_FROM_PATH = +STRIP_FROM_PATH = # The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of the # path mentioned in the documentation of a class, which tells the reader which @@ -161,7 +161,7 @@ STRIP_FROM_PATH = # specify the list of include paths that are normally passed to the compiler # using the -I flag. -STRIP_FROM_INC_PATH = +STRIP_FROM_INC_PATH = # If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter (but # less readable) file names. This can be useful is your file systems doesn't @@ -228,13 +228,13 @@ TAB_SIZE = 4 # "Side Effects:". You can put \n's in the value part of an alias to insert # newlines. -ALIASES = +ALIASES = # This tag can be used to specify a number of word-keyword mappings (TCL only). # A mapping has the form "name=value". For example adding "class=itcl::class" # will allow you to use the command class in the itcl::class meaning. -TCL_SUBST = +TCL_SUBST = # Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C sources # only. Doxygen will then generate output that is more tailored for C. For @@ -281,7 +281,7 @@ OPTIMIZE_OUTPUT_VHDL = NO # Note that for custom extensions you also need to set FILE_PATTERNS otherwise # the files are not read by doxygen. 
-EXTENSION_MAPPING = +EXTENSION_MAPPING = # If the MARKDOWN_SUPPORT tag is enabled then doxygen pre-processes all comments # according to the Markdown format, which allows for more readable @@ -629,7 +629,7 @@ GENERATE_DEPRECATEDLIST= YES # sections, marked by \if ... \endif and \cond # ... \endcond blocks. -ENABLED_SECTIONS = +ENABLED_SECTIONS = # The MAX_INITIALIZER_LINES tag determines the maximum number of lines that the # initial value of a variable or macro / define can have for it to appear in the @@ -671,7 +671,7 @@ SHOW_NAMESPACES = YES # by doxygen. Whatever the program writes to standard output is used as the file # version. For an example see the documentation. -FILE_VERSION_FILTER = +FILE_VERSION_FILTER = # The LAYOUT_FILE tag can be used to specify a layout file which will be parsed # by doxygen. The layout file controls the global structure of the generated @@ -684,7 +684,7 @@ FILE_VERSION_FILTER = # DoxygenLayout.xml, doxygen will parse it automatically even if the LAYOUT_FILE # tag is left empty. -LAYOUT_FILE = +LAYOUT_FILE = # The CITE_BIB_FILES tag can be used to specify one or more bib files containing # the reference definitions. This must be a list of .bib files. The .bib @@ -694,7 +694,7 @@ LAYOUT_FILE = # LATEX_BIB_STYLE. To use this feature you need bibtex and perl available in the # search path. See also \cite for info how to create references. -CITE_BIB_FILES = +CITE_BIB_FILES = #--------------------------------------------------------------------------- # Configuration options related to warning and progress messages @@ -753,7 +753,7 @@ WARN_FORMAT = "$file:$line: $text" # messages should be written. If left blank the output is written to standard # error (stderr). -WARN_LOGFILE = +WARN_LOGFILE = #--------------------------------------------------------------------------- # Configuration options related to the input files @@ -810,7 +810,7 @@ RECURSIVE = YES # Note that relative paths are relative to the directory from which doxygen is # run. -EXCLUDE = +EXCLUDE = # The EXCLUDE_SYMLINKS tag can be used to select whether or not files or # directories that are symbolic links (a Unix file system feature) are excluded @@ -826,7 +826,7 @@ EXCLUDE_SYMLINKS = NO # Note that the wildcards are matched against the file with absolute path, so to # exclude all test directories for example use the pattern */test/* -EXCLUDE_PATTERNS = +EXCLUDE_PATTERNS = # The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names # (namespaces, classes, functions, etc.) that should be excluded from the @@ -837,20 +837,20 @@ EXCLUDE_PATTERNS = # Note that the wildcards are matched against the file with absolute path, so to # exclude all test directories use the pattern */test/* -EXCLUDE_SYMBOLS = +EXCLUDE_SYMBOLS = # The EXAMPLE_PATH tag can be used to specify one or more files or directories # that contain example code fragments that are included (see the \include # command). -EXAMPLE_PATH = +EXAMPLE_PATH = # If the value of the EXAMPLE_PATH tag contains directories, you can use the # EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp and # *.h) to filter out the source-files in the directories. If left blank all # files are included. 
-EXAMPLE_PATTERNS = +EXAMPLE_PATTERNS = # If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be # searched for input files to be used with the \include or \dontinclude commands @@ -863,7 +863,7 @@ EXAMPLE_RECURSIVE = NO # that contain images that are to be included in the documentation (see the # \image command). -IMAGE_PATH = +IMAGE_PATH = # The INPUT_FILTER tag can be used to specify a program that doxygen should # invoke to filter for each input file. Doxygen will invoke the filter program @@ -880,7 +880,7 @@ IMAGE_PATH = # code is scanned, but not when the output code is generated. If lines are added # or removed, the anchors will not be placed correctly. -INPUT_FILTER = +INPUT_FILTER = # The FILTER_PATTERNS tag can be used to specify filters on a per file pattern # basis. Doxygen will compare the file name with each pattern and apply the @@ -889,7 +889,7 @@ INPUT_FILTER = # filters are used. If the FILTER_PATTERNS tag is empty or if none of the # patterns match the file name, INPUT_FILTER is applied. -FILTER_PATTERNS = +FILTER_PATTERNS = # If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using # INPUT_FILTER) will also be used to filter the input files that are used for @@ -904,7 +904,7 @@ FILTER_SOURCE_FILES = NO # *.ext= (so without naming a filter). # This tag requires that the tag FILTER_SOURCE_FILES is set to YES. -FILTER_SOURCE_PATTERNS = +FILTER_SOURCE_PATTERNS = # If the USE_MDFILE_AS_MAINPAGE tag refers to the name of a markdown file that # is part of the input, its contents will be placed on the main page @@ -1016,7 +1016,7 @@ CLANG_ASSISTED_PARSING = NO # specified with INPUT and INCLUDE_PATH. # This tag requires that the tag CLANG_ASSISTED_PARSING is set to YES. -CLANG_OPTIONS = +CLANG_OPTIONS = #--------------------------------------------------------------------------- # Configuration options related to the alphabetical class index @@ -1042,7 +1042,7 @@ COLS_IN_ALPHA_INDEX = 5 # while generating the index headers. # This tag requires that the tag ALPHABETICAL_INDEX is set to YES. -IGNORE_PREFIX = +IGNORE_PREFIX = #--------------------------------------------------------------------------- # Configuration options related to the HTML output @@ -1086,7 +1086,7 @@ HTML_FILE_EXTENSION = .html # of the possible markers and block names see the documentation. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_HEADER = +HTML_HEADER = # The HTML_FOOTER tag can be used to specify a user-defined HTML footer for each # generated HTML page. If the tag is left blank doxygen will generate a standard @@ -1096,7 +1096,7 @@ HTML_HEADER = # that doxygen normally uses. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_FOOTER = +HTML_FOOTER = # The HTML_STYLESHEET tag can be used to specify a user-defined cascading style # sheet that is used by each HTML page. It can be used to fine-tune the look of @@ -1108,7 +1108,7 @@ HTML_FOOTER = # obsolete. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_STYLESHEET = +HTML_STYLESHEET = # The HTML_EXTRA_STYLESHEET tag can be used to specify additional user-defined # cascading style sheets that are included after the standard style sheets @@ -1121,7 +1121,7 @@ HTML_STYLESHEET = # list). For an example see the documentation. # This tag requires that the tag GENERATE_HTML is set to YES. 
-HTML_EXTRA_STYLESHEET = +HTML_EXTRA_STYLESHEET = # The HTML_EXTRA_FILES tag can be used to specify one or more extra images or # other source files which should be copied to the HTML output directory. Note @@ -1131,7 +1131,7 @@ HTML_EXTRA_STYLESHEET = # files will be copied as-is; there are no commands or markers available. # This tag requires that the tag GENERATE_HTML is set to YES. -HTML_EXTRA_FILES = +HTML_EXTRA_FILES = # The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen # will adjust the colors in the style sheet and background images according to @@ -1260,7 +1260,7 @@ GENERATE_HTMLHELP = NO # written to the html output directory. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. -CHM_FILE = +CHM_FILE = # The HHC_LOCATION tag can be used to specify the location (absolute path # including file name) of the HTML help compiler (hhc.exe). If non-empty, @@ -1268,7 +1268,7 @@ CHM_FILE = # The file has to be specified with full path. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. -HHC_LOCATION = +HHC_LOCATION = # The GENERATE_CHI flag controls if a separate .chi index file is generated # (YES) or that it should be included in the master .chm file (NO). @@ -1281,7 +1281,7 @@ GENERATE_CHI = NO # and project file content. # This tag requires that the tag GENERATE_HTMLHELP is set to YES. -CHM_INDEX_ENCODING = +CHM_INDEX_ENCODING = # The BINARY_TOC flag controls whether a binary table of contents is generated # (YES) or a normal table of contents (NO) in the .chm file. Furthermore it @@ -1312,7 +1312,7 @@ GENERATE_QHP = NO # the HTML output folder. # This tag requires that the tag GENERATE_QHP is set to YES. -QCH_FILE = +QCH_FILE = # The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help # Project output. For more information please see Qt Help Project / Namespace @@ -1337,7 +1337,7 @@ QHP_VIRTUAL_FOLDER = doc # filters). # This tag requires that the tag GENERATE_QHP is set to YES. -QHP_CUST_FILTER_NAME = +QHP_CUST_FILTER_NAME = # The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the # custom filter to add. For more information please see Qt Help Project / Custom @@ -1345,21 +1345,21 @@ QHP_CUST_FILTER_NAME = # filters). # This tag requires that the tag GENERATE_QHP is set to YES. -QHP_CUST_FILTER_ATTRS = +QHP_CUST_FILTER_ATTRS = # The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this # project's filter section matches. Qt Help Project / Filter Attributes (see: # http://qt-project.org/doc/qt-4.8/qthelpproject.html#filter-attributes). # This tag requires that the tag GENERATE_QHP is set to YES. -QHP_SECT_FILTER_ATTRS = +QHP_SECT_FILTER_ATTRS = # The QHG_LOCATION tag can be used to specify the location of Qt's # qhelpgenerator. If non-empty doxygen will try to run qhelpgenerator on the # generated .qhp file. # This tag requires that the tag GENERATE_QHP is set to YES. -QHG_LOCATION = +QHG_LOCATION = # If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files will be # generated, together with the HTML files, they form an Eclipse help plugin. To @@ -1492,7 +1492,7 @@ MATHJAX_RELPATH = http://cdn.mathjax.org/mathjax/latest # MATHJAX_EXTENSIONS = TeX/AMSmath TeX/AMSsymbols # This tag requires that the tag USE_MATHJAX is set to YES. -MATHJAX_EXTENSIONS = +MATHJAX_EXTENSIONS = # The MATHJAX_CODEFILE tag can be used to specify a file with javascript pieces # of code that will be used on startup of the MathJax code. 
See the MathJax site @@ -1500,7 +1500,7 @@ MATHJAX_EXTENSIONS = # example see the documentation. # This tag requires that the tag USE_MATHJAX is set to YES. -MATHJAX_CODEFILE = +MATHJAX_CODEFILE = # When the SEARCHENGINE tag is enabled doxygen will generate a search box for # the HTML output. The underlying search engine uses javascript and DHTML and @@ -1560,7 +1560,7 @@ EXTERNAL_SEARCH = NO # Searching" for details. # This tag requires that the tag SEARCHENGINE is set to YES. -SEARCHENGINE_URL = +SEARCHENGINE_URL = # When SERVER_BASED_SEARCH and EXTERNAL_SEARCH are both enabled the unindexed # search data is written to a file for indexing by an external tool. With the @@ -1576,7 +1576,7 @@ SEARCHDATA_FILE = searchdata.xml # projects and redirect the results back to the right project. # This tag requires that the tag SEARCHENGINE is set to YES. -EXTERNAL_SEARCH_ID = +EXTERNAL_SEARCH_ID = # The EXTRA_SEARCH_MAPPINGS tag can be used to enable searching through doxygen # projects other than the one defined by this configuration file, but that are @@ -1586,7 +1586,7 @@ EXTERNAL_SEARCH_ID = # EXTRA_SEARCH_MAPPINGS = tagname1=loc1 tagname2=loc2 ... # This tag requires that the tag SEARCHENGINE is set to YES. -EXTRA_SEARCH_MAPPINGS = +EXTRA_SEARCH_MAPPINGS = #--------------------------------------------------------------------------- # Configuration options related to the LaTeX output @@ -1650,7 +1650,7 @@ PAPER_TYPE = a4 # If left blank no extra packages will be included. # This tag requires that the tag GENERATE_LATEX is set to YES. -EXTRA_PACKAGES = +EXTRA_PACKAGES = # The LATEX_HEADER tag can be used to specify a personal LaTeX header for the # generated LaTeX document. The header should contain everything until the first @@ -1666,7 +1666,7 @@ EXTRA_PACKAGES = # to HTML_HEADER. # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_HEADER = +LATEX_HEADER = # The LATEX_FOOTER tag can be used to specify a personal LaTeX footer for the # generated LaTeX document. The footer should contain everything after the last @@ -1677,7 +1677,7 @@ LATEX_HEADER = # Note: Only use a user-defined footer if you know what you are doing! # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_FOOTER = +LATEX_FOOTER = # The LATEX_EXTRA_STYLESHEET tag can be used to specify additional user-defined # LaTeX style sheets that are included after the standard style sheets created @@ -1688,7 +1688,7 @@ LATEX_FOOTER = # list). # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_EXTRA_STYLESHEET = +LATEX_EXTRA_STYLESHEET = # The LATEX_EXTRA_FILES tag can be used to specify one or more extra images or # other source files which should be copied to the LATEX_OUTPUT output @@ -1696,7 +1696,7 @@ LATEX_EXTRA_STYLESHEET = # markers available. # This tag requires that the tag GENERATE_LATEX is set to YES. -LATEX_EXTRA_FILES = +LATEX_EXTRA_FILES = # If the PDF_HYPERLINKS tag is set to YES, the LaTeX that is generated is # prepared for conversion to PDF (using ps2pdf or pdflatex). The PDF file will @@ -1796,14 +1796,14 @@ RTF_HYPERLINKS = NO # default style sheet that doxygen normally uses. # This tag requires that the tag GENERATE_RTF is set to YES. -RTF_STYLESHEET_FILE = +RTF_STYLESHEET_FILE = # Set optional variables used in the generation of an RTF document. Syntax is # similar to doxygen's config file. A template extensions file can be generated # using doxygen -e rtf extensionFile. # This tag requires that the tag GENERATE_RTF is set to YES. 
-RTF_EXTENSIONS_FILE = +RTF_EXTENSIONS_FILE = # If the RTF_SOURCE_CODE tag is set to YES then doxygen will include source code # with syntax highlighting in the RTF output. @@ -1848,7 +1848,7 @@ MAN_EXTENSION = .3 # MAN_EXTENSION with the initial . removed. # This tag requires that the tag GENERATE_MAN is set to YES. -MAN_SUBDIR = +MAN_SUBDIR = # If the MAN_LINKS tag is set to YES and doxygen generates man output, then it # will generate one additional man file for each entity documented in the real @@ -1961,7 +1961,7 @@ PERLMOD_PRETTY = YES # overwrite each other's variables. # This tag requires that the tag GENERATE_PERLMOD is set to YES. -PERLMOD_MAKEVAR_PREFIX = +PERLMOD_MAKEVAR_PREFIX = #--------------------------------------------------------------------------- # Configuration options related to the preprocessor @@ -2002,7 +2002,7 @@ SEARCH_INCLUDES = YES # preprocessor. # This tag requires that the tag SEARCH_INCLUDES is set to YES. -INCLUDE_PATH = +INCLUDE_PATH = # You can use the INCLUDE_FILE_PATTERNS tag to specify one or more wildcard # patterns (like *.h and *.hpp) to filter out the header-files in the @@ -2010,7 +2010,7 @@ INCLUDE_PATH = # used. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. -INCLUDE_FILE_PATTERNS = +INCLUDE_FILE_PATTERNS = # The PREDEFINED tag can be used to specify one or more macro names that are # defined before the preprocessor is started (similar to the -D option of e.g. @@ -2020,7 +2020,7 @@ INCLUDE_FILE_PATTERNS = # recursively expanded use the := operator instead of the = operator. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. -PREDEFINED = +PREDEFINED = # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this # tag can be used to specify a list of macro names that should be expanded. The @@ -2029,7 +2029,7 @@ PREDEFINED = # definition found in the source code. # This tag requires that the tag ENABLE_PREPROCESSING is set to YES. -EXPAND_AS_DEFINED = +EXPAND_AS_DEFINED = # If the SKIP_FUNCTION_MACROS tag is set to YES then doxygen's preprocessor will # remove all references to function-like macros that are alone on a line, have @@ -2058,13 +2058,13 @@ SKIP_FUNCTION_MACROS = YES # the path). If a tag file is not located in the directory in which doxygen is # run, you must also specify the path to the tagfile here. -TAGFILES = +TAGFILES = # When a file name is specified after GENERATE_TAGFILE, doxygen will create a # tag file that is based on the input files it reads. See section "Linking to # external documentation" for more information about the usage of tag files. -GENERATE_TAGFILE = +GENERATE_TAGFILE = # If the ALLEXTERNALS tag is set to YES, all external class will be listed in # the class index. If set to NO, only the inherited external classes will be @@ -2113,14 +2113,14 @@ CLASS_DIAGRAMS = YES # the mscgen tool resides. If left empty the tool is assumed to be found in the # default search path. -MSCGEN_PATH = +MSCGEN_PATH = # You can include diagrams made with dia in doxygen documentation. Doxygen will # then run dia to produce the diagram and insert it in the documentation. The # DIA_PATH tag allows you to specify the directory where the dia binary resides. # If left empty dia is assumed to be found in the default search path. -DIA_PATH = +DIA_PATH = # If set to YES the inheritance and collaboration graphs will hide inheritance # and usage relations if the target is undocumented or is not a class. 
@@ -2169,7 +2169,7 @@ DOT_FONTSIZE = 10 # the path where dot can find it using this tag. # This tag requires that the tag HAVE_DOT is set to YES. -DOT_FONTPATH = +DOT_FONTPATH = # If the CLASS_GRAPH tag is set to YES then doxygen will generate a graph for # each documented class showing the direct and indirect inheritance relations. @@ -2313,26 +2313,26 @@ INTERACTIVE_SVG = NO # found. If left blank, it is assumed the dot tool can be found in the path. # This tag requires that the tag HAVE_DOT is set to YES. -DOT_PATH = +DOT_PATH = # The DOTFILE_DIRS tag can be used to specify one or more directories that # contain dot files that are included in the documentation (see the \dotfile # command). # This tag requires that the tag HAVE_DOT is set to YES. -DOTFILE_DIRS = +DOTFILE_DIRS = # The MSCFILE_DIRS tag can be used to specify one or more directories that # contain msc files that are included in the documentation (see the \mscfile # command). -MSCFILE_DIRS = +MSCFILE_DIRS = # The DIAFILE_DIRS tag can be used to specify one or more directories that # contain dia files that are included in the documentation (see the \diafile # command). -DIAFILE_DIRS = +DIAFILE_DIRS = # When using plantuml, the PLANTUML_JAR_PATH tag should be used to specify the # path where java can find the plantuml.jar file. If left blank, it is assumed @@ -2340,12 +2340,12 @@ DIAFILE_DIRS = # generate a warning when it encounters a \startuml command in this case and # will not generate output for the diagram. -PLANTUML_JAR_PATH = +PLANTUML_JAR_PATH = # When using plantuml, the specified paths are searched for files specified by # the !include statement in a plantuml block. -PLANTUML_INCLUDE_PATH = +PLANTUML_INCLUDE_PATH = # The DOT_GRAPH_MAX_NODES tag can be used to set the maximum number of nodes # that will be shown in the graph. 
If the number of nodes in a graph becomes ===================================== docs/Makefile.in ===================================== @@ -303,6 +303,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== examples/F77/Makefile.in ===================================== @@ -560,6 +560,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== examples/F90/Makefile.in ===================================== @@ -624,6 +624,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== examples/Makefile.in ===================================== @@ -325,6 +325,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== fortran/Makefile.in ===================================== @@ -494,6 +494,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== libsrc/Makefile.in ===================================== @@ -267,6 +267,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== ltmain.sh ===================================== @@ -31,7 +31,7 @@ PROGRAM=libtool PACKAGE=libtool -VERSION=2.4.6 +VERSION="2.4.6 Debian-2.4.6-10" package_revision=2.4.6 @@ -1370,7 +1370,7 @@ func_lt_ver () #! /bin/sh # Set a version string for this script. -scriptversion=2014-01-07.03; # UTC +scriptversion=2015-10-07.11; # UTC # A portable, pluggable option parser for Bourne shell. # Written by Gary V. Vaughan, 2010 @@ -1530,6 +1530,8 @@ func_run_hooks () { $debug_cmd + _G_rc_run_hooks=false + case " $hookable_fns " in *" $1 "*) ;; *) func_fatal_error "'$1' does not support hook funcions.n" ;; @@ -1538,16 +1540,16 @@ func_run_hooks () eval _G_hook_fns=\$$1_hooks; shift for _G_hook in $_G_hook_fns; do - eval $_G_hook '"$@"' - - # store returned options list back into positional - # parameters for next 'cmd' execution. - eval _G_hook_result=\$${_G_hook}_result - eval set dummy "$_G_hook_result"; shift + if eval $_G_hook '"$@"'; then + # store returned options list back into positional + # parameters for next 'cmd' execution. 
+ eval _G_hook_result=\$${_G_hook}_result + eval set dummy "$_G_hook_result"; shift + _G_rc_run_hooks=: + fi done - func_quote_for_eval ${1+"$@"} - func_run_hooks_result=$func_quote_for_eval_result + $_G_rc_run_hooks && func_run_hooks_result=$_G_hook_result } @@ -1557,10 +1559,16 @@ func_run_hooks () ## --------------- ## # In order to add your own option parsing hooks, you must accept the -# full positional parameter list in your hook function, remove any -# options that you action, and then pass back the remaining unprocessed +# full positional parameter list in your hook function, you may remove/edit +# any options that you action, and then pass back the remaining unprocessed # options in '_result', escaped suitably for -# 'eval'. Like this: +# 'eval'. In this case you also must return $EXIT_SUCCESS to let the +# hook's caller know that it should pay attention to +# '_result'. Returning $EXIT_FAILURE signalizes that +# arguments are left untouched by the hook and therefore caller will ignore the +# result variable. +# +# Like this: # # my_options_prep () # { @@ -1570,9 +1578,11 @@ func_run_hooks () # usage_message=$usage_message' # -s, --silent don'\''t print informational messages # ' -# -# func_quote_for_eval ${1+"$@"} -# my_options_prep_result=$func_quote_for_eval_result +# # No change in '$@' (ignored completely by this hook). There is +# # no need to do the equivalent (but slower) action: +# # func_quote_for_eval ${1+"$@"} +# # my_options_prep_result=$func_quote_for_eval_result +# false # } # func_add_hook func_options_prep my_options_prep # @@ -1581,25 +1591,37 @@ func_run_hooks () # { # $debug_cmd # +# args_changed=false +# # # Note that for efficiency, we parse as many options as we can # # recognise in a loop before passing the remainder back to the # # caller on the first unrecognised argument we encounter. # while test $# -gt 0; do # opt=$1; shift # case $opt in -# --silent|-s) opt_silent=: ;; +# --silent|-s) opt_silent=: +# args_changed=: +# ;; # # Separate non-argument short options: # -s*) func_split_short_opt "$_G_opt" # set dummy "$func_split_short_opt_name" \ # "-$func_split_short_opt_arg" ${1+"$@"} # shift +# args_changed=: # ;; -# *) set dummy "$_G_opt" "$*"; shift; break ;; +# *) # Make sure the first unrecognised option "$_G_opt" +# # is added back to "$@", we could need that later +# # if $args_changed is true. +# set dummy "$_G_opt" ${1+"$@"}; shift; break ;; # esac # done # -# func_quote_for_eval ${1+"$@"} -# my_silent_option_result=$func_quote_for_eval_result +# if $args_changed; then +# func_quote_for_eval ${1+"$@"} +# my_silent_option_result=$func_quote_for_eval_result +# fi +# +# $args_changed # } # func_add_hook func_parse_options my_silent_option # @@ -1611,16 +1633,32 @@ func_run_hooks () # $opt_silent && $opt_verbose && func_fatal_help "\ # '--silent' and '--verbose' options are mutually exclusive." # -# func_quote_for_eval ${1+"$@"} -# my_option_validation_result=$func_quote_for_eval_result +# false # } # func_add_hook func_validate_options my_option_validation # -# You'll alse need to manually amend $usage_message to reflect the extra +# You'll also need to manually amend $usage_message to reflect the extra # options you parse. It's preferable to append if you can, so that # multiple option parsing hooks can be added safely. +# func_options_finish [ARG]... +# ---------------------------- +# Finishing the option parse loop (call 'func_options' hooks ATM). 
+func_options_finish () +{ + $debug_cmd + + _G_func_options_finish_exit=false + if func_run_hooks func_options ${1+"$@"}; then + func_options_finish_result=$func_run_hooks_result + _G_func_options_finish_exit=: + fi + + $_G_func_options_finish_exit +} + + # func_options [ARG]... # --------------------- # All the functions called inside func_options are hookable. See the @@ -1630,17 +1668,28 @@ func_options () { $debug_cmd - func_options_prep ${1+"$@"} - eval func_parse_options \ - ${func_options_prep_result+"$func_options_prep_result"} - eval func_validate_options \ - ${func_parse_options_result+"$func_parse_options_result"} + _G_rc_options=false + + for my_func in options_prep parse_options validate_options options_finish + do + if eval func_$my_func '${1+"$@"}'; then + eval _G_res_var='$'"func_${my_func}_result" + eval set dummy "$_G_res_var" ; shift + _G_rc_options=: + fi + done - eval func_run_hooks func_options \ - ${func_validate_options_result+"$func_validate_options_result"} + # Save modified positional parameters for caller. As a top-level + # options-parser function we always need to set the 'func_options_result' + # variable (regardless the $_G_rc_options value). + if $_G_rc_options; then + func_options_result=$_G_res_var + else + func_quote_for_eval ${1+"$@"} + func_options_result=$func_quote_for_eval_result + fi - # save modified positional parameters for caller - func_options_result=$func_run_hooks_result + $_G_rc_options } @@ -1649,9 +1698,9 @@ func_options () # All initialisations required before starting the option parse loop. # Note that when calling hook functions, we pass through the list of # positional parameters. If a hook function modifies that list, and -# needs to propogate that back to rest of this script, then the complete +# needs to propagate that back to rest of this script, then the complete # modified list must be put in 'func_run_hooks_result' before -# returning. +# returning $EXIT_SUCCESS (otherwise $EXIT_FAILURE is returned). func_hookable func_options_prep func_options_prep () { @@ -1661,10 +1710,14 @@ func_options_prep () opt_verbose=false opt_warning_types= - func_run_hooks func_options_prep ${1+"$@"} + _G_rc_options_prep=false + if func_run_hooks func_options_prep ${1+"$@"}; then + _G_rc_options_prep=: + # save modified positional parameters for caller + func_options_prep_result=$func_run_hooks_result + fi - # save modified positional parameters for caller - func_options_prep_result=$func_run_hooks_result + $_G_rc_options_prep } @@ -1678,18 +1731,20 @@ func_parse_options () func_parse_options_result= + _G_rc_parse_options=false # this just eases exit handling while test $# -gt 0; do # Defer to hook functions for initial option parsing, so they # get priority in the event of reusing an option name. - func_run_hooks func_parse_options ${1+"$@"} - - # Adjust func_parse_options positional parameters to match - eval set dummy "$func_run_hooks_result"; shift + if func_run_hooks func_parse_options ${1+"$@"}; then + eval set dummy "$func_run_hooks_result"; shift + _G_rc_parse_options=: + fi # Break out of the loop if we already parsed every option. 
test $# -gt 0 || break + _G_match_parse_options=: _G_opt=$1 shift case $_G_opt in @@ -1704,7 +1759,10 @@ func_parse_options () ;; --warnings|--warning|-W) - test $# = 0 && func_missing_arg $_G_opt && break + if test $# = 0 && func_missing_arg $_G_opt; then + _G_rc_parse_options=: + break + fi case " $warning_categories $1" in *" $1 "*) # trailing space prevents matching last $1 above @@ -1757,15 +1815,25 @@ func_parse_options () shift ;; - --) break ;; + --) _G_rc_parse_options=: ; break ;; -*) func_fatal_help "unrecognised option: '$_G_opt'" ;; - *) set dummy "$_G_opt" ${1+"$@"}; shift; break ;; + *) set dummy "$_G_opt" ${1+"$@"}; shift + _G_match_parse_options=false + break + ;; esac + + $_G_match_parse_options && _G_rc_parse_options=: done - # save modified positional parameters for caller - func_quote_for_eval ${1+"$@"} - func_parse_options_result=$func_quote_for_eval_result + + if $_G_rc_parse_options; then + # save modified positional parameters for caller + func_quote_for_eval ${1+"$@"} + func_parse_options_result=$func_quote_for_eval_result + fi + + $_G_rc_parse_options } @@ -1778,16 +1846,21 @@ func_validate_options () { $debug_cmd + _G_rc_validate_options=false + # Display all warnings if -W was not given. test -n "$opt_warning_types" || opt_warning_types=" $warning_categories" - func_run_hooks func_validate_options ${1+"$@"} + if func_run_hooks func_validate_options ${1+"$@"}; then + # save modified positional parameters for caller + func_validate_options_result=$func_run_hooks_result + _G_rc_validate_options=: + fi # Bail if the options were screwed! $exit_cmd $EXIT_FAILURE - # save modified positional parameters for caller - func_validate_options_result=$func_run_hooks_result + $_G_rc_validate_options } @@ -2068,12 +2141,12 @@ include the following information: compiler: $LTCC compiler flags: $LTCFLAGS linker: $LD (gnu? $with_gnu_ld) - version: $progname (GNU libtool) 2.4.6 + version: $progname $scriptversion Debian-2.4.6-10 automake: `($AUTOMAKE --version) 2>/dev/null |$SED 1q` autoconf: `($AUTOCONF --version) 2>/dev/null |$SED 1q` Report bugs to . -GNU libtool home page: . +GNU libtool home page: . General help using GNU software: ." exit 0 } @@ -2270,6 +2343,8 @@ libtool_options_prep () nonopt= preserve_args= + _G_rc_lt_options_prep=: + # Shorthand for --mode=foo, only valid as the first argument case $1 in clean|clea|cle|cl) @@ -2293,11 +2368,18 @@ libtool_options_prep () uninstall|uninstal|uninsta|uninst|unins|unin|uni|un|u) shift; set dummy --mode uninstall ${1+"$@"}; shift ;; + *) + _G_rc_lt_options_prep=false + ;; esac - # Pass back the list of options. - func_quote_for_eval ${1+"$@"} - libtool_options_prep_result=$func_quote_for_eval_result + if $_G_rc_lt_options_prep; then + # Pass back the list of options. + func_quote_for_eval ${1+"$@"} + libtool_options_prep_result=$func_quote_for_eval_result + fi + + $_G_rc_lt_options_prep } func_add_hook func_options_prep libtool_options_prep @@ -2309,9 +2391,12 @@ libtool_parse_options () { $debug_cmd + _G_rc_lt_parse_options=false + # Perform our own loop to consume as many options as possible in # each iteration. 
while test $# -gt 0; do + _G_match_lt_parse_options=: _G_opt=$1 shift case $_G_opt in @@ -2386,15 +2471,22 @@ libtool_parse_options () func_append preserve_args " $_G_opt" ;; - # An option not handled by this hook function: - *) set dummy "$_G_opt" ${1+"$@"}; shift; break ;; + # An option not handled by this hook function: + *) set dummy "$_G_opt" ${1+"$@"} ; shift + _G_match_lt_parse_options=false + break + ;; esac + $_G_match_lt_parse_options && _G_rc_lt_parse_options=: done + if $_G_rc_lt_parse_options; then + # save modified positional parameters for caller + func_quote_for_eval ${1+"$@"} + libtool_parse_options_result=$func_quote_for_eval_result + fi - # save modified positional parameters for caller - func_quote_for_eval ${1+"$@"} - libtool_parse_options_result=$func_quote_for_eval_result + $_G_rc_lt_parse_options } func_add_hook func_parse_options libtool_parse_options @@ -7272,10 +7364,14 @@ func_mode_link () # -tp=* Portland pgcc target processor selection # --sysroot=* for sysroot support # -O*, -g*, -flto*, -fwhopr*, -fuse-linker-plugin GCC link-time optimization + # -specs=* GCC specs files # -stdlib=* select c++ std lib with clang + # -fsanitize=* Clang/GCC memory and address sanitizer + # -fuse-ld=* Linker select flags for GCC -64|-mips[0-9]|-r[0-9][0-9]*|-xarch=*|-xtarget=*|+DA*|+DD*|-q*|-m*| \ -t[45]*|-txscale*|-p|-pg|--coverage|-fprofile-*|-F*|@*|-tp=*|--sysroot=*| \ - -O*|-g*|-flto*|-fwhopr*|-fuse-linker-plugin|-fstack-protector*|-stdlib=*) + -O*|-g*|-flto*|-fwhopr*|-fuse-linker-plugin|-fstack-protector*|-stdlib=*| \ + -specs=*|-fsanitize=*|-fuse-ld=*) func_quote_for_eval "$arg" arg=$func_quote_for_eval_result func_append compile_command " $arg" @@ -7568,7 +7664,10 @@ func_mode_link () case $pass in dlopen) libs=$dlfiles ;; dlpreopen) libs=$dlprefiles ;; - link) libs="$deplibs %DEPLIBS% $dependency_libs" ;; + link) + libs="$deplibs %DEPLIBS%" + test "X$link_all_deplibs" != Xno && libs="$libs $dependency_libs" + ;; esac fi if test lib,dlpreopen = "$linkmode,$pass"; then @@ -7887,19 +7986,19 @@ func_mode_link () # It is a libtool convenience library, so add in its objects. func_append convenience " $ladir/$objdir/$old_library" func_append old_convenience " $ladir/$objdir/$old_library" + tmp_libs= + for deplib in $dependency_libs; do + deplibs="$deplib $deplibs" + if $opt_preserve_dup_deps; then + case "$tmp_libs " in + *" $deplib "*) func_append specialdeplibs " $deplib" ;; + esac + fi + func_append tmp_libs " $deplib" + done elif test prog != "$linkmode" && test lib != "$linkmode"; then func_fatal_error "'$lib' is not a convenience library" fi - tmp_libs= - for deplib in $dependency_libs; do - deplibs="$deplib $deplibs" - if $opt_preserve_dup_deps; then - case "$tmp_libs " in - *" $deplib "*) func_append specialdeplibs " $deplib" ;; - esac - fi - func_append tmp_libs " $deplib" - done continue fi # $pass = conv @@ -8823,6 +8922,9 @@ func_mode_link () revision=$number_minor lt_irix_increment=no ;; + *) + func_fatal_configuration "$modename: unknown library version type '$version_type'" + ;; esac ;; no) ===================================== m4/libtool.m4 ===================================== @@ -728,7 +728,6 @@ _LT_CONFIG_SAVE_COMMANDS([ cat <<_LT_EOF >> "$cfgfile" #! $SHELL # Generated automatically by $as_me ($PACKAGE) $VERSION -# Libtool was configured on host `(hostname || uname -n) 2>/dev/null | sed 1q`: # NOTE: Changes made to this file will be lost: look at ltmain.sh. # Provide generalized library-building support services. 
@@ -2887,6 +2886,18 @@ linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) dynamic_linker='GNU/Linux ld.so' ;; +netbsdelf*-gnu) + version_type=linux + need_lib_prefix=no + need_version=no + library_names_spec='${libname}${release}${shared_ext}$versuffix ${libname}${release}${shared_ext}$major ${libname}${shared_ext}' + soname_spec='${libname}${release}${shared_ext}$major' + shlibpath_var=LD_LIBRARY_PATH + shlibpath_overrides_runpath=no + hardcode_into_libs=yes + dynamic_linker='NetBSD ld.elf_so' + ;; + netbsd*) version_type=sunos need_lib_prefix=no @@ -3546,7 +3557,7 @@ linux* | k*bsd*-gnu | kopensolaris*-gnu | gnu*) lt_cv_deplibs_check_method=pass_all ;; -netbsd*) +netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ > /dev/null; then lt_cv_deplibs_check_method='match_pattern /lib[[^/]]+(\.so\.[[0-9]]+\.[[0-9]]+|_pic\.a)$' else @@ -4052,7 +4063,8 @@ _LT_EOF if AC_TRY_EVAL(ac_compile); then # Now try to grab the symbols. nlist=conftest.nm - if AC_TRY_EVAL(NM conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist) && test -s "$nlist"; then + $ECHO "$as_me:$LINENO: $NM conftest.$ac_objext | $lt_cv_sys_global_symbol_pipe > $nlist" >&AS_MESSAGE_LOG_FD + if eval "$NM" conftest.$ac_objext \| "$lt_cv_sys_global_symbol_pipe" \> $nlist 2>&AS_MESSAGE_LOG_FD && test -s "$nlist"; then # Try sorting and uniquifying the output. if sort "$nlist" | uniq > "$nlist"T; then mv -f "$nlist"T "$nlist" @@ -4424,7 +4436,7 @@ m4_if([$1], [CXX], [ ;; esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) ;; *qnx* | *nto*) # QNX uses GNU C++, but need to define -shared option too, otherwise @@ -4692,6 +4704,12 @@ m4_if([$1], [CXX], [ _LT_TAGVAR(lt_prog_compiler_pic, $1)='-KPIC' _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' ;; + # flang / f18. f95 an alias for gfortran or flang on Debian + flang* | f18* | f95*) + _LT_TAGVAR(lt_prog_compiler_wl, $1)='-Wl,' + _LT_TAGVAR(lt_prog_compiler_pic, $1)='-fPIC' + _LT_TAGVAR(lt_prog_compiler_static, $1)='-static' + ;; # icc used to be incompatible with GCC. # ICC 10 doesn't accept -KPIC any more. icc* | ifort*) @@ -4936,6 +4954,9 @@ m4_if([$1], [CXX], [ ;; esac ;; + linux* | k*bsd*-gnu | gnu*) + _LT_TAGVAR(link_all_deplibs, $1)=no + ;; *) _LT_TAGVAR(export_symbols_cmds, $1)='$NM $libobjs $convenience | $global_symbol_pipe | $SED '\''s/.* //'\'' | sort | uniq > $export_symbols' ;; @@ -4998,6 +5019,9 @@ dnl Note also adjust exclude_expsyms for C++ above. 
openbsd* | bitrig*) with_gnu_ld=no ;; + linux* | k*bsd*-gnu | gnu*) + _LT_TAGVAR(link_all_deplibs, $1)=no + ;; esac _LT_TAGVAR(ld_shlibs, $1)=yes @@ -5252,7 +5276,7 @@ _LT_EOF fi ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable $libobjs $deplibs $linker_flags -o $lib' wlarc= @@ -5773,6 +5797,7 @@ _LT_EOF if test yes = "$lt_cv_irix_exported_symbol"; then _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $pic_flag $libobjs $deplibs $compiler_flags $wl-soname $wl$soname `test -n "$verstring" && func_echo_all "$wl-set_version $wl$verstring"` $wl-update_registry $wl$output_objdir/so_locations $wl-exports_file $wl$export_symbols -o $lib' fi + _LT_TAGVAR(link_all_deplibs, $1)=no else _LT_TAGVAR(archive_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -o $lib' _LT_TAGVAR(archive_expsym_cmds, $1)='$CC -shared $libobjs $deplibs $compiler_flags -soname $soname `test -n "$verstring" && func_echo_all "-set_version $verstring"` -update_registry $output_objdir/so_locations -exports_file $export_symbols -o $lib' @@ -5794,7 +5819,7 @@ _LT_EOF esac ;; - netbsd*) + netbsd* | netbsdelf*-gnu) if echo __ELF__ | $CC -E - | $GREP __ELF__ >/dev/null; then _LT_TAGVAR(archive_cmds, $1)='$LD -Bshareable -o $lib $libobjs $deplibs $linker_flags' # a.out else @@ -6420,7 +6445,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' else GXX=no @@ -6795,7 +6820,7 @@ if test yes != "$_lt_caught_CXX_error"; then # explicitly linking system object files so we need to strip them # from the output so that they don't get included in the library # dependencies. - output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $EGREP "\-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $EGREP " \-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' ;; *) if test yes = "$GXX"; then @@ -6860,7 +6885,7 @@ if test yes != "$_lt_caught_CXX_error"; then # explicitly linking system object files so we need to strip them # from the output so that they don't get included in the library # dependencies. 
- output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $GREP "\-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' + output_verbose_link_cmd='templist=`($CC -b $CFLAGS -v conftest.$objext 2>&1) | $GREP " \-L"`; list= ; for z in $templist; do case $z in conftest.$objext) list="$list $z";; *.$objext);; *) list="$list $z";;esac; done; func_echo_all "$list"' ;; *) if test yes = "$GXX"; then @@ -7199,7 +7224,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' else # FIXME: insert proper C++ library support @@ -7283,7 +7308,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -shared $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' else # g++ 2.7 appears to require '-G' NOT '-shared' on this # platform. @@ -7294,7 +7319,7 @@ if test yes != "$_lt_caught_CXX_error"; then # Commands to make compiler produce verbose output that lists # what "hidden" libraries, object files and flags are used when # linking a shared library. - output_verbose_link_cmd='$CC -G $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP "\-L"' + output_verbose_link_cmd='$CC -G $CFLAGS -v conftest.$objext 2>&1 | $GREP -v "^Configured with:" | $GREP " \-L"' fi _LT_TAGVAR(hardcode_libdir_flag_spec, $1)='$wl-R $wl$libdir' ===================================== nf03_test/Makefile.in ===================================== @@ -547,6 +547,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== nf03_test4/CMakeLists.txt ===================================== @@ -49,10 +49,10 @@ if (USE_NETCDF4) # This is the netCDF-4 F90 large file test. IF (LARGE_FILE_TESTS) - SET(nc4_largefile_PROGRAMS tst_flarge) + SET(nc4_largefile_PROGRAMS f90tst_flarge) SET(check_PROGRAMS ${check_PROGRAMS} ${nc4_largefile_PROGRAMS}) SET(TESTS ${TESTS} ${nc4_largefile_PROGRAMS}) - SET(tst_flarge_SOURCES tst_flarge.f90) + SET(f90tst_flarge_SOURCES f90tst_flarge.f90) ENDIF(LARGE_FILE_TESTS) # This is an f90 benchmark. ===================================== nf03_test4/Makefile.am ===================================== @@ -12,7 +12,7 @@ AM_FCFLAGS = -I$(top_builddir)/fortran # All tests need to link to fortran library. LDADD = ${top_builddir}/fortran/libnetcdff.la -# tst_f90_nc4 +# tst_f90_nc4 NC4_F90_TESTS = f90tst_vars f90tst_vars_vlen f90tst_grps f90tst_fill \ f90tst_fill2 f90tst_vars2 f90tst_vars3 f90tst_vars4 f90tst_path \ f90tst_rengrps f90tst_nc4 f90tst_types f90tst_types2 @@ -65,7 +65,7 @@ endif # TEST_PARALLEL # check-valgrind target, which runs all tests with valgrind. 
@VALGRIND_CHECK_RULES@ -EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh +EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh f90tst_flarge.f90 # Cleaning up files created during the testing. CLEANFILES = f90tst_*.nc fort.* ===================================== nf03_test4/Makefile.in ===================================== @@ -686,6 +686,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ @@ -707,7 +708,7 @@ AM_FCFLAGS = -I$(top_builddir)/fortran # All tests need to link to fortran library. LDADD = ${top_builddir}/fortran/libnetcdff.la -# tst_f90_nc4 +# tst_f90_nc4 NC4_F90_TESTS = f90tst_vars f90tst_vars_vlen f90tst_grps f90tst_fill \ f90tst_fill2 f90tst_vars2 f90tst_vars3 f90tst_vars4 f90tst_path \ f90tst_rengrps f90tst_nc4 f90tst_types f90tst_types2 @@ -734,7 +735,7 @@ f90tst_types2_SOURCES = f90tst_types2.f90 @TEST_PARALLEL_TRUE at f90tst_parallel3_SOURCES = f90tst_parallel3.F90 @TEST_PARALLEL_TRUE at f90tst_nc4_par_SOURCES = f90tst_nc4_par.F90 @TEST_PARALLEL_TRUE at f90tst_parallel_fill_SOURCES = f90tst_parallel_fill.f90 -EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh +EXTRA_DIST = CMakeLists.txt run_f90_par_test.sh f90tst_flarge.f90 # Cleaning up files created during the testing. CLEANFILES = f90tst_*.nc fort.* ===================================== nf_test/Makefile.in ===================================== @@ -644,6 +644,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ ===================================== nf_test4/Makefile.in ===================================== @@ -731,6 +731,7 @@ pdfdir = @pdfdir@ prefix = @prefix@ program_transform_name = @program_transform_name@ psdir = @psdir@ +runstatedir = @runstatedir@ sbindir = @sbindir@ sharedstatedir = @sharedstatedir@ srcdir = @srcdir@ View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/32dff86c97c5526d246ab8649f79b90c0beff225 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/32dff86c97c5526d246ab8649f79b90c0beff225 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Thu Sep 19 06:29:29 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:29:29 +0000 Subject: Processing of netcdf-fortran_4.5.2+ds-1~exp1_source.changes Message-ID: netcdf-fortran_4.5.2+ds-1~exp1_source.changes uploaded successfully to localhost along with the files: netcdf-fortran_4.5.2+ds-1~exp1.dsc netcdf-fortran_4.5.2+ds.orig.tar.xz netcdf-fortran_4.5.2+ds-1~exp1.debian.tar.xz netcdf-fortran_4.5.2+ds-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Thu Sep 19 06:34:01 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:34:01 +0000 Subject: netcdf-fortran_4.5.2+ds-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 19 Sep 2019 07:09:46 +0200 Source: netcdf-fortran Architecture: source Version: 4.5.2+ds-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: netcdf-fortran (4.5.2+ds-1~exp1) experimental; urgency=medium . * New upstream release. Checksums-Sha1: ddff9a423a25a3a65338b0ea2a97f67234ca5df2 2399 netcdf-fortran_4.5.2+ds-1~exp1.dsc 92aec757f7c24bce54e41b629c209243978d9986 670732 netcdf-fortran_4.5.2+ds.orig.tar.xz 9568045c4eb19d78ce6732f2334674ffe91d8d20 10228 netcdf-fortran_4.5.2+ds-1~exp1.debian.tar.xz 9c531e5b48b0ea346605d7c6eb7ed7de603f549f 10110 netcdf-fortran_4.5.2+ds-1~exp1_amd64.buildinfo Checksums-Sha256: 9a42a3567bbecd849e31964ccae7a50bd401679fc345c9094bd4d3199e8db0d3 2399 netcdf-fortran_4.5.2+ds-1~exp1.dsc 92ba524271b825c0f26ed89e1396bd4a9d365df21ddf0d23aaecb7a2852ce41f 670732 netcdf-fortran_4.5.2+ds.orig.tar.xz 5946e838e93ced033cb2a4c4f25db79bd7f350761039f4241f94eb7112b07b88 10228 netcdf-fortran_4.5.2+ds-1~exp1.debian.tar.xz 8088eae40ed987cd2ca400b08e218ffe97dd9d454bf11b3c1b009e27cf4465b9 10110 netcdf-fortran_4.5.2+ds-1~exp1_amd64.buildinfo Files: 8ecb602307f525d4b6525d74bdc6da41 2399 science optional netcdf-fortran_4.5.2+ds-1~exp1.dsc 9d2239dc27f40140606482fceaee9375 670732 science optional netcdf-fortran_4.5.2+ds.orig.tar.xz b7da9731477bb632a18cd9fbb1e62520 10228 science optional netcdf-fortran_4.5.2+ds-1~exp1.debian.tar.xz 1e1d0eac274103fff12f210375899dff 10110 science optional netcdf-fortran_4.5.2+ds-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2DD00ACgkQZ1DxCuiN SvEL0Q//W0gVuQA8BLP/9Fwtx49I//8VckYscdvITOEoTLBaqJDYGo39GgI3YwYc jgYT/PM/uGazzHB3ubyLRIcjphACIbI6DUKdoeZOjld7VmmX0R5flaRTb3ZDM0t/ eXqMX0USuiJWlolWUV/ve1MrpmqZ2UFgejvkqTV5Botxv3h50cJlPHigAQKZ5oiE HUaFj/0E0rywSKd516ut6ER0sAnUdeDaZ07RgHZad5LOxQexD/vOLJKX305SJN53 cRjB5JPrccWrcW4oQJqHhs3EtKpFiDjqHSsq1LEASc6XlWucN2RJxCM0MGwm1Qp8 9dH0J2lWcco7Tk2KJIcfiZBG+G0PZSeUKQJqt0jyHV5U0jXcc5mNRlBLcej47vX+ Ag/70PgfrbAxY+SW4nH9mJO3ttwABW/NxGZeO59aLCInVixmyHHiLsU6RuErFteo QdCRdwRC0dFCtmxC5aaj/y/A3r6WOKsdZB61eZvhb3uc5WGmewmNbYxMlXtv+fOr IyJ2yCvtw0lq+D+s1Cbv5JK2wMmXtcXwSXBPLuBEwINoObbkT6Sz/UyKb+5VcTG6 GVRgRFVxifDsPl999Sclpx92PCTVmZZU4P0+vVdpfHKy0ma/Bk+X3uVcaM/Ha55w VBCrEnZdbdTKNJYEIpNqp2IHq5ouWZrD3eWJPrcK5W3kRkprOaI= =nEa9 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
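[Editorial sketch, not part of the archived mail.] The ltmain.sh hunks in the netcdf-fortran commit above (Debian's patches to libtool 2.4.6) change the option-parsing hook protocol: a hook now reports through its exit status whether it actually consumed or rewrote any positional parameters, and func_run_hooks only reads the hook's _result variable when that status is success. The sketch below shows roughly what a hook written against that convention could look like; it is only an illustration, assuming the usual ltmain.sh helpers ($debug_cmd, func_quote_for_eval, func_add_hook), and the --frobnicate option and every my_* name are invented here and do not come from the package.

    # Minimal sketch of a libtool option-parsing hook under the patched
    # convention shown above: return success only when "$@" was modified
    # and my_frobnicate_option_result has been set; return failure so the
    # caller ignores the result variable otherwise.
    my_frobnicate_option ()
    {
        $debug_cmd

        _my_args_changed=false
        while test $# -gt 0; do
          _my_opt=$1; shift
          case $_my_opt in
            --frobnicate) opt_frobnicate=: ; _my_args_changed=: ;;
            *) # put the unrecognised argument back and stop scanning
               set dummy "$_my_opt" ${1+"$@"}; shift; break ;;
          esac
        done

        if $_my_args_changed; then
          func_quote_for_eval ${1+"$@"}
          my_frobnicate_option_result=$func_quote_for_eval_result
        fi

        $_my_args_changed
    }
    func_add_hook func_parse_options my_frobnicate_option

Under the unpatched convention the hook would unconditionally quote "$@" into its result variable; with the patch, a failure status means "arguments left untouched", which is the same pattern the rewritten libtool_parse_options hunk above expresses with _G_rc_lt_parse_options.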
From ftpmaster at ftp-master.debian.org Thu Sep 19 06:35:01 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 05:35:01 +0000 Subject: routino_3.3.2-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 19 Sep 2019 06:42:17 +0200 Source: routino Architecture: source Version: 3.3.2-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: routino (3.3.2-1) unstable; urgency=medium . * New upstream release. Checksums-Sha1: 3fd270734d6934b2ccf4d98138691263e490b371 2349 routino_3.3.2-1.dsc 96c7c5001d16ba88430b399853a67780fbb86892 2542029 routino_3.3.2.orig.tar.gz 2f65e6b1d6a710330a0f43937543bf2c6e917641 29792 routino_3.3.2-1.debian.tar.xz 94361b4f7b423d47e9cecfaca37483c411dbaa39 10715 routino_3.3.2-1_amd64.buildinfo Checksums-Sha256: d2d1ca863f099f730b232ba370f551c39c6951b1822ec82cf05dd227a51f00ee 2349 routino_3.3.2-1.dsc 4b7174d76955e058986d343635e0a184385f2756fa4ffc02eb5e2399921e9db1 2542029 routino_3.3.2.orig.tar.gz 74dc27ad31deea0917576bd2efc86164d136e8560c5e30722ee36631b3cb1e2b 29792 routino_3.3.2-1.debian.tar.xz 341e34463dfdb73773aa2f5a413fcb6221cec9825fb697c1e1f5a11d259857da 10715 routino_3.3.2-1_amd64.buildinfo Files: aed4b8c86b32f13dfb177e768f75e749 2349 misc optional routino_3.3.2-1.dsc 98bf4512f7fa40c78772fd1a66259c11 2542029 misc optional routino_3.3.2.orig.tar.gz c89cb1670d31594231409ef801d7bb63 29792 misc optional routino_3.3.2-1.debian.tar.xz 0b787c900acb6978deb623020aa6b743 10715 misc optional routino_3.3.2-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2DDIcACgkQZ1DxCuiN SvGI3BAAwo1Eioz7QrdCRHHvG8GuxofXqt0Iw7EPNAkkOew8XFtbfFjdDxGg0+4e kU0+OC3t9mZSrD85oFZQVtDv49HMdLFa3dsLDvveXUafBbJCI6g1FBE8kpGwTLxZ BcAtlKTF1z29Ma4Dxw+nZjY13qx6LeWiPrG/nwqAvGN4XUrq6KVYEzLwmkJ00ViG qWIe4wpzQhx+/DFp19BD3xFzw/43KMa5Xce5lMIKgZwFyUMa/mx57Xvs4sXEhFVh Q2s9HwbLl87ImWrK8EWmUN6tyDy4wkbAopLD6JJ8REzun33v7fnnlysASqWurvxQ w7L4sXVrU1dokld6RHhUpEbtMjlt803/YD65Iiy2ko53qJ/EdySIYJ1ta3O+7L1x Sd2bYMWMJTnwWkSvZHOob0Gns1KG/XsFzF9RFRLwZgyvAEv0RZrZFUlkXxihxP5h Kg8WKB42Kl83REfEgTwfhHUi4WhQ+Uhk57VY7062hv/vbY68aww20epqlkRIPVn4 1DeRmLwsBMpOSerDZEVzNa5k5XYCm7kf866EB36rwS1qMbqqvvAd54NrMH+FuUx6 OiyKz6IXMtVZt/TnAk1oZZvtzNRY21ric6/PZpdeMLgvYV/Ub+vQvwI2JIfHRbmG qtFA6rs831TTe8m6dfLAN1c3Sjgxm3Zb8WcMRKj195iqRUArVDE= =KoOf -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
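[Editorial sketch, not part of the archived mail.] The ACCEPTED mails above carry Checksums-Sha1/Checksums-Sha256/Files blocks in .changes style, one hash, size and file name per entry, and those hashes can be replayed directly through sha256sum against local copies of the files. The snippet below does that for two of the netcdf-fortran files; the hashes and file names are copied verbatim from the mail, while the surrounding shell (and the assumption that the files sit in the current directory) is added here purely for illustration.

    # Cross-check local copies of an upload against the Checksums-Sha256
    # block of the ACCEPTED mail; sha256sum -c expects "<hash>  <filename>".
    {
      echo "92ba524271b825c0f26ed89e1396bd4a9d365df21ddf0d23aaecb7a2852ce41f  netcdf-fortran_4.5.2+ds.orig.tar.xz"
      echo "5946e838e93ced033cb2a4c4f25db79bd7f350761039f4241f94eb7112b07b88  netcdf-fortran_4.5.2+ds-1~exp1.debian.tar.xz"
    } | sha256sum -c -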
From gitlab at salsa.debian.org Thu Sep 19 07:21:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 06:21:25 +0000 Subject: [Git][debian-gis-team/python-pyproj][pristine-tar] pristine-tar data for python-pyproj_2.4~rc0+ds.orig.tar.xz Message-ID: <5d831e6524f6a_73483fbbb8398858154102a@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pyproj Commits: b16ac903 by Bas Couwenberg at 2019-09-19T05:24:50Z pristine-tar data for python-pyproj_2.4~rc0+ds.orig.tar.xz - - - - - 2 changed files: - + python-pyproj_2.4~rc0+ds.orig.tar.xz.delta - + python-pyproj_2.4~rc0+ds.orig.tar.xz.id Changes: ===================================== python-pyproj_2.4~rc0+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/python-pyproj_2.4~rc0+ds.orig.tar.xz.delta differ ===================================== python-pyproj_2.4~rc0+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +5d61237d42110e4ff445cf0c4d2a58247c6d876a View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/b16ac903dacb29bb9dfcb4b83016ec842bf92783 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/b16ac903dacb29bb9dfcb4b83016ec842bf92783 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 07:21:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 06:21:27 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new branch experimental Message-ID: <5d831e673241c_73483fbbbf05478015414db@godard.mail> Bas Couwenberg pushed new branch experimental at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/experimental You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 07:21:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 06:21:27 +0000 Subject: [Git][debian-gis-team/python-pyproj][upstream] New upstream version 2.4~rc0+ds Message-ID: <5d831e6713f13_73483fbbb4751b4c154125a@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pyproj Commits: 59620342 by Bas Couwenberg at 2019-09-19T05:24:49Z New upstream version 2.4~rc0+ds - - - - - 30 changed files: - .all-contributorsrc - .github/ISSUE_TEMPLATE/bug_report.md - .travis.yml - + HOW_TO_RELEASE.md - MANIFEST.in - README.md - appveyor.yml - ci/travis/proj-dl-and-compile - pyproj/__init__.py - + pyproj/__main__.py - pyproj/_crs.pxd - pyproj/_crs.pyx - pyproj/_datadir.pxd - pyproj/_datadir.pyx - pyproj/_geod.pxd - pyproj/_geod.pyx - pyproj/_list.pyx - pyproj/_proj.pxd - pyproj/_proj.pyx - pyproj/_show_versions.py - pyproj/_transformer.pxd - pyproj/_transformer.pyx - pyproj/base.pxi - pyproj/crs.py - pyproj/datadir.py - pyproj/enums.py - pyproj/geod.py - pyproj/proj.pxi - pyproj/proj.py - pyproj/transformer.py The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/5962034205e007d4eccf18be892e6fa8eafb22dc -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/5962034205e007d4eccf18be892e6fa8eafb22dc You're receiving this email because of your account on salsa.debian.org. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 07:21:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 06:21:31 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag debian/2.4_rc0+ds-1_exp1 Message-ID: <5d831e6b3d8cd_73483fbbb4751b4c154161e@godard.mail> Bas Couwenberg pushed new tag debian/2.4_rc0+ds-1_exp1 at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/debian/2.4_rc0+ds-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Thu Sep 19 07:21:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 06:21:32 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag upstream/2.4_rc0+ds Message-ID: <5d831e6c87dec_73483fbbb839885815418d6@godard.mail> Bas Couwenberg pushed new tag upstream/2.4_rc0+ds at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/upstream/2.4_rc0+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Thu Sep 19 07:32:53 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 06:32:53 +0000 Subject: Processing of python-pyproj_2.4~rc0+ds-1~exp1_source.changes Message-ID: python-pyproj_2.4~rc0+ds-1~exp1_source.changes uploaded successfully to localhost along with the files: python-pyproj_2.4~rc0+ds-1~exp1.dsc python-pyproj_2.4~rc0+ds.orig.tar.xz python-pyproj_2.4~rc0+ds-1~exp1.debian.tar.xz python-pyproj_2.4~rc0+ds-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Thu Sep 19 07:34:04 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 06:34:04 +0000 Subject: python-pyproj_2.4~rc0+ds-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Thu, 19 Sep 2019 07:25:56 +0200 Source: python-pyproj Architecture: source Version: 2.4~rc0+ds-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pyproj (2.4~rc0+ds-1~exp1) experimental; urgency=medium . * New upstream release candidate. * Update watch file for pre-releases. * Add patch to use python3 interpreter for tests. * Add patch to fix spelling errors. * Use ${python3:Provides} substvar in Provides field. 
Checksums-Sha1: f1c7709c95ceee4b2653f2572ed26379af2b9172 2251 python-pyproj_2.4~rc0+ds-1~exp1.dsc 454a8391b56af9a97501c032c8def03a7da14756 83648 python-pyproj_2.4~rc0+ds.orig.tar.xz e2c3eed1b3ce11a1cf6b809ac20e6c54d352e3cb 6760 python-pyproj_2.4~rc0+ds-1~exp1.debian.tar.xz 9f3c20298f2b485d7584a2f4e4058212377d0c33 8723 python-pyproj_2.4~rc0+ds-1~exp1_amd64.buildinfo Checksums-Sha256: 0e4cab393d60af4a45f89635bd94f0e045cadb150b01d62b5fa34dc67c40c9c8 2251 python-pyproj_2.4~rc0+ds-1~exp1.dsc 46a07729bbbe09fb2c11436b92072eb1cf8d12348d3ec971d88eec62bf983b74 83648 python-pyproj_2.4~rc0+ds.orig.tar.xz f616e06cf67e60f9ab410b72322503a685c4bbac6b6a6211619bd532f4047046 6760 python-pyproj_2.4~rc0+ds-1~exp1.debian.tar.xz 86e395579c2f6af0cfd6cbd21bfa7cf95353f9549d18660cce27374ece6c639d 8723 python-pyproj_2.4~rc0+ds-1~exp1_amd64.buildinfo Files: 978ca5a539559d0a1518e6d07706a642 2251 python optional python-pyproj_2.4~rc0+ds-1~exp1.dsc d1b3f876559eee0b293df3cf8393b4ab 83648 python optional python-pyproj_2.4~rc0+ds.orig.tar.xz 50446c71193c7b1b8aae5d0b5b29261f 6760 python optional python-pyproj_2.4~rc0+ds-1~exp1.debian.tar.xz 7c73494c37e7706b7c7030943f50720e 8723 python optional python-pyproj_2.4~rc0+ds-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2DHkcACgkQZ1DxCuiN SvERLxAAkbWqpHB0QGGAnAdYqDqDBQAhdTzy1EJgs5E4XkZUguDzGV1OGMzPU73Y FaI28lyGWBNfx2XqWgUI0eJsiU5OEAdK1JQBY4gK7Z0MMb/a3Kn8oDwp2RdSCDqt t/W2NDTqMyso1kiVWkn91iYOM5kAye5gJICL9YPkz4vY6rWV+p9ZYuF/0wvlQlay YaYwzSVGpI4vsSSDUlvFYYunOvHIzjZECHF2tlMVmUBhzCINnMSQIg8N1ITUhWso pRqQogb9PQKNAvPGsEgBqFO0CVLBAE+RU8ZUPeic27pwYM/zujSVmCY82yQS88wu rivi4gRHIcN4xGMqta1ih+j0lIs4QxlAz/PyWksYtyR2wjTwMJ2SLGs5WROmvNJA qeAmdVZYxTw/pJJNqw/6raMRU45Dt/jxNHKvQVlf5pvGR4c7q0ayVEzi1pxChU83 lhK5tKSU6NNs3NjWLqvUoOeHtF/88VIgjKqoDwAWU7y/gzqAJg1Ndv+PD0MCPmgi +KzQDKoX9AS5SIiOjoTIACbLyVqHIY3JVMUYIteST5JUqa4HA0VqFGJdwbV8qS8x OTk6Y22L/Bf3U9EQiGvO6Ry+dPtAVyXWv+T0uOOsw4t174s2jSEObCtKT+mPesrX 8DZMkOxhrPYWO092Fm0XF8BSRK1MJI3mJQrsDOZJ+2O3NozJrtk= =vmMK -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Thu Sep 19 08:35:52 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Thu, 19 Sep 2019 07:35:52 +0000 Subject: Bug#940635: Removed package(s) from unstable Message-ID: We believe that the bug you reported is now fixed; the following package(s) have been removed from unstable: osmium-tool | 1.10.0-1 | s390x ------------------- Reason ------------------- ROM; Big endian specific test failure ---------------------------------------------- Note that the package(s) have simply been removed from the tag database and may (or may not) still be in the pool; this is not a bug. The package(s) will be physically removed automatically when no suite references them (and in the case of source, when no binary references it). Please also remember that the changes have been done on the master archive and will not propagate to any mirrors until the next dinstall run at the earliest. Packages are usually not removed from testing by hand. Testing tracks unstable and will automatically remove packages which were removed from unstable when removing them from testing causes no dependency problems. The release team can force a removal from testing if it is really needed, please contact them if this should be the case. Bugs which have been reported against this package are not automatically removed from the Bug Tracking System. 
Please check all open bugs and close them or re-assign them to another package if the removed package was superseded by another one. The version of this package that was in Debian prior to this removal can still be found using http://snapshot.debian.org/. Thank you for reporting the bug, which will now be closed. If you have further comments please address them to 940635 at bugs.debian.org. The full log for this bug can be viewed at https://bugs.debian.org/940635 This message was generated automatically; if you believe that there is a problem with it please contact the archive administrators by mailing ftpmaster at ftp-master.debian.org. Debian distribution maintenance software pp. Scott Kitterman (the ftpmaster behind the curtain) From gitlab at salsa.debian.org Thu Sep 19 13:31:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 12:31:44 +0000 Subject: [Git][debian-gis-team/python-pyproj][experimental] Mark spelling-errors.patch as Applied-Upstream. Message-ID: <5d837530c561b_73482ad96390d50015843d3@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / python-pyproj Commits: 602b3c90 by Bas Couwenberg at 2019-09-19T12:31:37Z Mark spelling-errors.patch as Applied-Upstream. - - - - - 1 changed file: - debian/patches/spelling-errors.patch Changes: ===================================== debian/patches/spelling-errors.patch ===================================== @@ -2,6 +2,7 @@ Description: Fix spelling errors. * intialized -> initialized Author: Bas Couwenberg Forwarded: https://github.com/pyproj4/pyproj/pull/453 +Applied-Upstream: https://github.com/pyproj4/pyproj/commit/9785bf81561ebf03446fb88d159613c4e4391f5a --- a/pyproj/_crs.pyx +++ b/pyproj/_crs.pyx View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/602b3c90919a89bb08d0e1e85c781082fa76559f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/602b3c90919a89bb08d0e1e85c781082fa76559f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From holger at layer-acht.org Thu Sep 19 17:06:48 2019 From: holger at layer-acht.org (Holger Levsen) Date: Thu, 19 Sep 2019 18:06:48 +0200 Subject: Bug#940781: please drop transitional package libsaga from src:saga Message-ID: <20190919160648.x6q2roexjglxsybh@layer-acht.org> Package: saga Version: 2.3.1+dfsg-4 Severity: normal user: qa.debian.org at packages.debian.org usertags: transitional Please drop the transitional package libsaga (from the source package saga) for bullseye, as it has been released with stretch and buster already. Description: SAGA GIS shared libraries - transitional package Package: libsaga Version: 2.3.1+dfsg-3 Version: 2.3.1+dfsg-4 Version: 7.3.0+dfsg-2 Thanks for maintaining saga! -- cheers, Holger ------------------------------------------------------------------------------- holger@(debian|reproducible-builds|layer-acht).org PGP fingerprint: B8BF 5413 7B09 D35C F026 FE9D 091A B856 069A AA1C -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: not available URL: From gitlab at salsa.debian.org Thu Sep 19 17:14:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Thu, 19 Sep 2019 16:14:48 +0000 Subject: [Git][debian-gis-team/saga][master] Drop obsolete libsaga transitional package. 
(closes: #940781) Message-ID: <5d83a97883ecf_73483fbbb4720b6416132cc@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / saga Commits: 84a5a8fc by Bas Couwenberg at 2019-09-19T16:14:35Z Drop obsolete libsaga transitional package. (closes: #940781) - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +saga (7.3.0+dfsg-3) UNRELEASED; urgency=medium + + * Drop obsolete libsaga transitional package. + (closes: #940781) + + -- Bas Couwenberg Thu, 19 Sep 2019 18:14:05 +0200 + saga (7.3.0+dfsg-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -98,20 +98,6 @@ Description: SAGA GIS shared library (graphical models) . This package contains the library files for developing graphical modules. -Package: libsaga -Architecture: all -Section: oldlibs -Depends: libsaga-api-7.3.0, - libsaga-gdi-7.3.0, - ${misc:Depends} -Description: SAGA GIS shared libraries - transitional package - SAGA GIS (System for Automated Geoscientific Analyses) is a geographic - information system used for editing and analysing spatial data. - . - This package is a transitional package from libsaga to - libsaga-api-7.3.0 and libsaga-gdi-7.3.0. - It can safely be removed. - Package: python3-saga Architecture: any Section: python ===================================== debian/control.in ===================================== @@ -98,20 +98,6 @@ Description: SAGA GIS shared library (graphical models) . This package contains the library files for developing graphical modules. -Package: libsaga -Architecture: all -Section: oldlibs -Depends: libsaga-api- at VERSION@, - libsaga-gdi- at VERSION@, - ${misc:Depends} -Description: SAGA GIS shared libraries - transitional package - SAGA GIS (System for Automated Geoscientific Analyses) is a geographic - information system used for editing and analysing spatial data. - . - This package is a transitional package from libsaga to - libsaga-api- at VERSION@ and libsaga-gdi- at VERSION@. - It can safely be removed. - Package: python3-saga Architecture: any Section: python View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/84a5a8fce45171b2b64c82f6b7a0f9675f7a1f28 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/84a5a8fce45171b2b64c82f6b7a0f9675f7a1f28 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From sebastic at xs4all.nl Thu Sep 19 17:15:44 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Thu, 19 Sep 2019 18:15:44 +0200 Subject: Bug#940781: please drop transitional package libsaga from src:saga In-Reply-To: <20190919160648.x6q2roexjglxsybh@layer-acht.org> References: <20190919160648.x6q2roexjglxsybh@layer-acht.org> <20190919160648.x6q2roexjglxsybh@layer-acht.org> Message-ID: <6b681c08-a6cf-f948-bbbc-af7e0190c9fa@xs4all.nl> Control: tags -1 pending On 9/19/19 6:06 PM, Holger Levsen wrote: > Please drop the transitional package libsaga (from the source package saga) for > bullseye, as it has been released with stretch and buster already. Fixed in git. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: OpenPGP digital signature URL: From owner at bugs.debian.org Thu Sep 19 17:27:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Thu, 19 Sep 2019 16:27:03 +0000 Subject: Processed: Re: Bug#940781: please drop transitional package libsaga from src:saga References: <6b681c08-a6cf-f948-bbbc-af7e0190c9fa@xs4all.nl> <20190919160648.x6q2roexjglxsybh@layer-acht.org> Message-ID: Processing control commands: > tags -1 pending Bug #940781 [saga] please drop transitional package libsaga from src:saga Added tag(s) pending. -- 940781: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=940781 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From holger at layer-acht.org Thu Sep 19 17:16:39 2019 From: holger at layer-acht.org (Holger Levsen) Date: Thu, 19 Sep 2019 16:16:39 +0000 Subject: Bug#940781: please drop transitional package libsaga from src:saga In-Reply-To: <6b681c08-a6cf-f948-bbbc-af7e0190c9fa@xs4all.nl> References: <20190919160648.x6q2roexjglxsybh@layer-acht.org> <6b681c08-a6cf-f948-bbbc-af7e0190c9fa@xs4all.nl> <20190919160648.x6q2roexjglxsybh@layer-acht.org> Message-ID: <20190919161639.nmqxv5a4oyotqxo5@layer-acht.org> On Thu, Sep 19, 2019 at 06:15:44PM +0200, Sebastiaan Couwenberg wrote: > Fixed in git. yay, thanks! -- cheers, Holger ------------------------------------------------------------------------------- holger@(debian|reproducible-builds|layer-acht).org PGP fingerprint: B8BF 5413 7B09 D35C F026 FE9D 091A B856 069A AA1C -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: not available URL: From noreply at release.debian.org Fri Sep 20 05:39:12 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 20 Sep 2019 04:39:12 +0000 Subject: osgearth REMOVED from testing Message-ID: FYI: The status of the osgearth source package in Debian's testing distribution has changed. Previous version: 2.10.2+dfsg-1 Current version: (not in testing) Hint: Bug #875075: [openscenegraph] Future Qt4 removal from Buster # in openscenegraph The script that generates this mail tries to extract removal reasons from comments in the britney hint files. Those comments were not originally meant to be machine readable, so if the reason for removing your package seems to be nonsense, it is probably the reporting script that got confused. Please check the actual hints file before you complain about meaningless removals. -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Fri Sep 20 05:39:12 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 20 Sep 2019 04:39:12 +0000 Subject: pyresample 1.13.0-1 MIGRATED to testing Message-ID: FYI: The status of the pyresample source package in Debian's testing distribution has changed. Previous version: 1.12.3-6 Current version: 1.13.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. 
From noreply at release.debian.org Fri Sep 20 05:39:12 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 20 Sep 2019 04:39:12 +0000 Subject: otb REMOVED from testing Message-ID: FYI: The status of the otb source package in Debian's testing distribution has changed. Previous version: 6.6.1+dfsg-3 Current version: (not in testing) Hint: # 875075,935086 in insighttoolkit4,openscenegraph The script that generates this mail tries to extract removal reasons from comments in the britney hint files. Those comments were not originally meant to be machine readable, so if the reason for removing your package seems to be nonsense, it is probably the reporting script that got confused. Please check the actual hints file before you complain about meaningless removals. -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Fri Sep 20 05:58:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 04:58:23 +0000 Subject: [Git][debian-gis-team/python-pdal][pristine-tar] pristine-tar data for python-pdal_2.2.2+ds.orig.tar.xz Message-ID: <5d845c6f2baac_73482ad95cbba4a81664730@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pdal Commits: a57d4bbf by Bas Couwenberg at 2019-09-20T04:48:38Z pristine-tar data for python-pdal_2.2.2+ds.orig.tar.xz - - - - - 2 changed files: - + python-pdal_2.2.2+ds.orig.tar.xz.delta - + python-pdal_2.2.2+ds.orig.tar.xz.id Changes: ===================================== python-pdal_2.2.2+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/python-pdal_2.2.2+ds.orig.tar.xz.delta differ ===================================== python-pdal_2.2.2+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +be1f9e893d1ab069e707bd815fc688e283f1ed0f View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/a57d4bbf975b32c11bdac1859a07b300cc236eee -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/a57d4bbf975b32c11bdac1859a07b300cc236eee You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 20 05:58:50 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 04:58:50 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag debian/2.2.2+ds-1 Message-ID: <5d845c8a49c4_73482ad9617465981665099@godard.mail> Bas Couwenberg pushed new tag debian/2.2.2+ds-1 at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/debian/2.2.2+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 20 05:59:01 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 04:59:01 +0000 Subject: [Git][debian-gis-team/python-pdal] Pushed new tag upstream/2.2.2+ds Message-ID: <5d845c9510793_73483fbbb4720b6416653ac@godard.mail> Bas Couwenberg pushed new tag upstream/2.2.2+ds at Debian GIS Project / python-pdal -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/tree/upstream/2.2.2+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 20 05:59:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 04:59:14 +0000 Subject: [Git][debian-gis-team/python-pdal][master] 4 commits: New upstream version 2.2.2+ds Message-ID: <5d845ca278fe3_73482ad96147edd81665444@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pdal Commits: 461c6250 by Bas Couwenberg at 2019-09-20T04:48:37Z New upstream version 2.2.2+ds - - - - - 6ad2b92c by Bas Couwenberg at 2019-09-20T04:48:38Z Update upstream source from tag 'upstream/2.2.2+ds' Update to upstream version '2.2.2+ds' with Debian dir 8b4e5927f9b4866c7bb1b618f30bb55b0b19dba8 - - - - - 2279192d by Bas Couwenberg at 2019-09-20T04:48:51Z New upstream release. - - - - - af2e6d83 by Bas Couwenberg at 2019-09-20T04:50:13Z Set distribution to unstable. - - - - - 9 changed files: - PKG-INFO - − VERSION.txt - debian/changelog - pdal/PyArray.hpp - pdal/__init__.py - pdal/libpdalpython.cpp - pdal/libpdalpython.pyx - setup.py - test/test_pipeline.py Changes: ===================================== PKG-INFO ===================================== @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: PDAL -Version: 2.2.1 +Version: 2.2.2 Summary: Point cloud data processing Home-page: http://pdal.io Author: Howard Butler ===================================== VERSION.txt deleted ===================================== @@ -1 +0,0 @@ -2.2.1 \ No newline at end of file ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-pdal (2.2.2+ds-1) unstable; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Fri, 20 Sep 2019 06:50:01 +0200 + python-pdal (2.2.1+ds-1) unstable; urgency=medium * New upstream release. 
===================================== pdal/PyArray.hpp ===================================== @@ -74,7 +74,7 @@ private: Array& operator=(Array const& rhs); Fields m_fields; bool m_rowMajor; - Shape m_shape; + Shape m_shape {}; std::vector> m_iterators; }; ===================================== pdal/__init__.py ===================================== @@ -1,4 +1,4 @@ -__version__='2.2.1' +__version__='2.2.2' from .pipeline import Pipeline from .array import Array ===================================== pdal/libpdalpython.cpp ===================================== @@ -28,8 +28,7 @@ ], "language": "c++", "libraries": [ - "pdalcpp", - "pdal_plugin_reader_numpy" + "pdalcpp" ], "library_dirs": [ "/Users/hobu/miniconda3/envs/pdal/lib" @@ -1813,7 +1812,6 @@ int __pyx_module_is_main_pdal__libpdalpython = 0; /* Implementation of 'pdal.libpdalpython' */ static PyObject *__pyx_builtin_TypeError; -static PyObject *__pyx_builtin_print; static PyObject *__pyx_builtin_ValueError; static PyObject *__pyx_builtin_range; static PyObject *__pyx_builtin_RuntimeError; @@ -1829,7 +1827,6 @@ static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_array[] = "array"; static const char __pyx_k_dtype[] = "dtype"; static const char __pyx_k_loads[] = "loads"; -static const char __pyx_k_print[] = "print"; static const char __pyx_k_range[] = "range"; static const char __pyx_k_arrays[] = "arrays"; static const char __pyx_k_c_dims[] = "c_dims"; @@ -1850,7 +1847,6 @@ static const char __pyx_k_description[] = "description"; static const char __pyx_k_RuntimeError[] = "RuntimeError"; static const char __pyx_k_getDimensions[] = "getDimensions"; static const char __pyx_k_reduce_cython[] = "__reduce_cython__"; -static const char __pyx_k_Looping_arrays[] = "Looping arrays\n"; static const char __pyx_k_getVersionMajor[] = "getVersionMajor"; static const char __pyx_k_getVersionMinor[] = "getVersionMinor"; static const char __pyx_k_getVersionPatch[] = "getVersionPatch"; @@ -1875,7 +1871,6 @@ static PyObject *__pyx_kp_u_C_Pipeline_object_not_constructe; static PyObject *__pyx_kp_u_Format_string_allocated_too_shor; static PyObject *__pyx_kp_u_Format_string_allocated_too_shor_2; static PyObject *__pyx_n_s_ImportError; -static PyObject *__pyx_kp_u_Looping_arrays; static PyObject *__pyx_kp_u_Non_native_byte_order_not_suppor; static PyObject *__pyx_n_s_PyArray; static PyObject *__pyx_n_s_PyPipeline; @@ -1914,7 +1909,6 @@ static PyObject *__pyx_kp_u_numpy_core_umath_failed_to_impor; static PyObject *__pyx_n_s_output; static PyObject *__pyx_n_s_pdal_libpdalpython; static PyObject *__pyx_kp_s_pdal_libpdalpython_pyx; -static PyObject *__pyx_n_s_print; static PyObject *__pyx_n_s_ptr; static PyObject *__pyx_n_s_range; static PyObject *__pyx_n_s_reduce; @@ -1965,16 +1959,15 @@ static PyObject *__pyx_tuple__9; static PyObject *__pyx_tuple__10; static PyObject *__pyx_tuple__11; static PyObject *__pyx_tuple__12; -static PyObject *__pyx_tuple__13; -static PyObject *__pyx_tuple__21; +static PyObject *__pyx_tuple__20; +static PyObject *__pyx_codeobj__13; static PyObject *__pyx_codeobj__14; static PyObject *__pyx_codeobj__15; static PyObject *__pyx_codeobj__16; static PyObject *__pyx_codeobj__17; static PyObject *__pyx_codeobj__18; static PyObject *__pyx_codeobj__19; -static PyObject *__pyx_codeobj__20; -static PyObject *__pyx_codeobj__22; +static PyObject *__pyx_codeobj__21; /* Late includes */ /* "pdal/libpdalpython.pyx":24 @@ -3079,8 +3072,8 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob * cdef Array* a * 
* if arrays is not None: # <<<<<<<<<<<<<< - * print("Looping arrays\n") * for array in arrays: + * a = new Array(array) */ __pyx_t_1 = (__pyx_v_arrays != ((PyObject*)Py_None)); __pyx_t_3 = (__pyx_t_1 != 0); @@ -3089,54 +3082,43 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob /* "pdal/libpdalpython.pyx":109 * * if arrays is not None: - * print("Looping arrays\n") # <<<<<<<<<<<<<< - * for array in arrays: - * a = new Array(array) - */ - __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_print, __pyx_tuple__3, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 109, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - - /* "pdal/libpdalpython.pyx":110 - * if arrays is not None: - * print("Looping arrays\n") * for array in arrays: # <<<<<<<<<<<<<< * a = new Array(array) * c_arrays.push_back(a) */ if (unlikely(__pyx_v_arrays == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable"); - __PYX_ERR(1, 110, __pyx_L1_error) + __PYX_ERR(1, 109, __pyx_L1_error) } __pyx_t_4 = __pyx_v_arrays; __Pyx_INCREF(__pyx_t_4); __pyx_t_2 = 0; for (;;) { if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_4)) break; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyList_GET_ITEM(__pyx_t_4, __pyx_t_2); __Pyx_INCREF(__pyx_t_5); __pyx_t_2++; if (unlikely(0 < 0)) __PYX_ERR(1, 110, __pyx_L1_error) + __pyx_t_5 = PyList_GET_ITEM(__pyx_t_4, __pyx_t_2); __Pyx_INCREF(__pyx_t_5); __pyx_t_2++; if (unlikely(0 < 0)) __PYX_ERR(1, 109, __pyx_L1_error) #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_4, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_5)) __PYX_ERR(1, 110, __pyx_L1_error) + __pyx_t_5 = PySequence_ITEM(__pyx_t_4, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_5)) __PYX_ERR(1, 109, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); #endif __Pyx_XDECREF_SET(__pyx_v_array, __pyx_t_5); __pyx_t_5 = 0; - /* "pdal/libpdalpython.pyx":111 - * print("Looping arrays\n") + /* "pdal/libpdalpython.pyx":110 + * if arrays is not None: * for array in arrays: * a = new Array(array) # <<<<<<<<<<<<<< * c_arrays.push_back(a) * */ - if (!(likely(((__pyx_v_array) == Py_None) || likely(__Pyx_TypeTest(__pyx_v_array, __pyx_ptype_5numpy_ndarray))))) __PYX_ERR(1, 111, __pyx_L1_error) + if (!(likely(((__pyx_v_array) == Py_None) || likely(__Pyx_TypeTest(__pyx_v_array, __pyx_ptype_5numpy_ndarray))))) __PYX_ERR(1, 110, __pyx_L1_error) try { __pyx_t_6 = new pdal::python::Array(((PyArrayObject *)__pyx_v_array)); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 111, __pyx_L1_error) + __PYX_ERR(1, 110, __pyx_L1_error) } __pyx_v_a = __pyx_t_6; - /* "pdal/libpdalpython.pyx":112 + /* "pdal/libpdalpython.pyx":111 * for array in arrays: * a = new Array(array) * c_arrays.push_back(a) # <<<<<<<<<<<<<< @@ -3147,12 +3129,12 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob __pyx_v_c_arrays.push_back(__pyx_v_a); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 112, __pyx_L1_error) + __PYX_ERR(1, 111, __pyx_L1_error) } - /* "pdal/libpdalpython.pyx":110 + /* "pdal/libpdalpython.pyx":109 + * * if arrays is not None: - * print("Looping arrays\n") * for array in arrays: # <<<<<<<<<<<<<< * a = new Array(array) * c_arrays.push_back(a) @@ -3160,7 +3142,7 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - /* "pdal/libpdalpython.pyx":114 + /* "pdal/libpdalpython.pyx":113 * c_arrays.push_back(a) * * self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) # <<<<<<<<<<<<<< @@ -3169,16 +3151,16 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob */ if (unlikely(__pyx_v_json == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "encode"); - __PYX_ERR(1, 114, __pyx_L1_error) + __PYX_ERR(1, 113, __pyx_L1_error) } - __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 114, __pyx_L1_error) + __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 113, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); - __pyx_t_7 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_7) && PyErr_Occurred())) __PYX_ERR(1, 114, __pyx_L1_error) + __pyx_t_7 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_7) && PyErr_Occurred())) __PYX_ERR(1, 113, __pyx_L1_error) try { __pyx_t_8 = new pdal::python::Pipeline(__pyx_t_7, __pyx_v_c_arrays); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 114, __pyx_L1_error) + __PYX_ERR(1, 113, __pyx_L1_error) } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_self->thisptr = __pyx_t_8; @@ -3187,13 +3169,13 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob * cdef Array* a * * if arrays is not None: # <<<<<<<<<<<<<< - * print("Looping arrays\n") * for array in arrays: + * a = new Array(array) */ goto __pyx_L4; } - /* "pdal/libpdalpython.pyx":116 + /* "pdal/libpdalpython.pyx":115 * self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) * else: * self.thisptr = new Pipeline(json.encode('UTF-8')) # <<<<<<<<<<<<<< @@ -3203,16 +3185,16 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob /*else*/ { if (unlikely(__pyx_v_json == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "encode"); - __PYX_ERR(1, 116, __pyx_L1_error) + __PYX_ERR(1, 115, __pyx_L1_error) } - __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 116, __pyx_L1_error) + __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 115, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); - __pyx_t_9 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_9) && PyErr_Occurred())) __PYX_ERR(1, 116, __pyx_L1_error) + __pyx_t_9 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_9) && PyErr_Occurred())) __PYX_ERR(1, 115, __pyx_L1_error) try { __pyx_t_8 = new pdal::python::Pipeline(__pyx_t_9); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 116, __pyx_L1_error) + __PYX_ERR(1, 115, __pyx_L1_error) } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_self->thisptr = __pyx_t_8; @@ -3241,7 +3223,7 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob return __pyx_r; } -/* "pdal/libpdalpython.pyx":118 +/* "pdal/libpdalpython.pyx":117 * self.thisptr = new Pipeline(json.encode('UTF-8')) * * def __dealloc__(self): # <<<<<<<<<<<<<< @@ -3264,7 +3246,7 @@ static void __pyx_pf_4pdal_13libpdalpython_10PyPipeline_2__dealloc__(struct __py __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__dealloc__", 0); - /* "pdal/libpdalpython.pyx":119 + /* "pdal/libpdalpython.pyx":118 * * def __dealloc__(self): * del self.thisptr # <<<<<<<<<<<<<< @@ -3273,7 +3255,7 @@ static void __pyx_pf_4pdal_13libpdalpython_10PyPipeline_2__dealloc__(struct __py */ delete __pyx_v_self->thisptr; - /* "pdal/libpdalpython.pyx":118 + /* "pdal/libpdalpython.pyx":117 * self.thisptr = new Pipeline(json.encode('UTF-8')) * * def __dealloc__(self): # <<<<<<<<<<<<<< @@ -3285,7 +3267,7 @@ static void __pyx_pf_4pdal_13libpdalpython_10PyPipeline_2__dealloc__(struct __py __Pyx_RefNannyFinishContext(); } -/* "pdal/libpdalpython.pyx":122 +/* "pdal/libpdalpython.pyx":121 * * property pipeline: * def __get__(self): # <<<<<<<<<<<<<< @@ -3313,7 +3295,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8pipeline___get__(s PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":123 + /* "pdal/libpdalpython.pyx":122 * property pipeline: * def __get__(self): * return self.thisptr.getPipeline() # <<<<<<<<<<<<<< @@ -3325,15 +3307,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8pipeline___get__(s __pyx_t_1 = __pyx_v_self->thisptr->getPipeline(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 123, __pyx_L1_error) + __PYX_ERR(1, 122, __pyx_L1_error) } - __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 123, __pyx_L1_error) + __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 122, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":122 + /* "pdal/libpdalpython.pyx":121 * * property pipeline: * def __get__(self): # <<<<<<<<<<<<<< @@ -3352,7 +3334,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8pipeline___get__(s return __pyx_r; } -/* "pdal/libpdalpython.pyx":126 +/* "pdal/libpdalpython.pyx":125 * * property metadata: * def __get__(self): # <<<<<<<<<<<<<< @@ -3380,7 +3362,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8metadata___get__(s PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":127 + /* "pdal/libpdalpython.pyx":126 * property metadata: * def __get__(self): * return self.thisptr.getMetadata() # <<<<<<<<<<<<<< @@ -3392,15 +3374,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8metadata___get__(s __pyx_t_1 = __pyx_v_self->thisptr->getMetadata(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 127, __pyx_L1_error) + __PYX_ERR(1, 126, __pyx_L1_error) } - __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 127, __pyx_L1_error) + __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 126, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":126 + /* "pdal/libpdalpython.pyx":125 * * property metadata: * def __get__(self): # <<<<<<<<<<<<<< @@ -3419,7 +3401,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8metadata___get__(s return __pyx_r; } -/* "pdal/libpdalpython.pyx":130 +/* "pdal/libpdalpython.pyx":129 * * property loglevel: * def __get__(self): # <<<<<<<<<<<<<< @@ -3446,7 +3428,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel___get__(s PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":131 + /* "pdal/libpdalpython.pyx":130 * property loglevel: * def __get__(self): * return self.thisptr.getLogLevel() # <<<<<<<<<<<<<< @@ -3454,13 +3436,13 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel___get__(s * self.thisptr.setLogLevel(v) */ __Pyx_XDECREF(__pyx_r); - __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->thisptr->getLogLevel()); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 131, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->thisptr->getLogLevel()); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":130 + /* "pdal/libpdalpython.pyx":129 * * property loglevel: * def __get__(self): # <<<<<<<<<<<<<< @@ -3479,7 +3461,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel___get__(s return __pyx_r; } -/* "pdal/libpdalpython.pyx":132 +/* "pdal/libpdalpython.pyx":131 * def __get__(self): * return self.thisptr.getLogLevel() * def __set__(self, v): # <<<<<<<<<<<<<< @@ -3506,17 +3488,17 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel_2__set__(struct int __pyx_t_1; __Pyx_RefNannySetupContext("__set__", 0); - /* "pdal/libpdalpython.pyx":133 + /* "pdal/libpdalpython.pyx":132 * return self.thisptr.getLogLevel() * def __set__(self, v): * self.thisptr.setLogLevel(v) # <<<<<<<<<<<<<< * * property log: */ - __pyx_t_1 = __Pyx_PyInt_As_int(__pyx_v_v); if (unlikely((__pyx_t_1 == (int)-1) && PyErr_Occurred())) __PYX_ERR(1, 133, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyInt_As_int(__pyx_v_v); if (unlikely((__pyx_t_1 == (int)-1) && PyErr_Occurred())) __PYX_ERR(1, 132, __pyx_L1_error) __pyx_v_self->thisptr->setLogLevel(__pyx_t_1); - /* "pdal/libpdalpython.pyx":132 + /* "pdal/libpdalpython.pyx":131 * def __get__(self): * return self.thisptr.getLogLevel() * def __set__(self, v): # <<<<<<<<<<<<<< @@ -3535,7 +3517,7 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel_2__set__(struct return __pyx_r; } -/* "pdal/libpdalpython.pyx":136 +/* "pdal/libpdalpython.pyx":135 * * property log: * def __get__(self): # <<<<<<<<<<<<<< @@ -3563,7 +3545,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_3log___get__(struct PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":138 + /* "pdal/libpdalpython.pyx":137 * def __get__(self): * * return self.thisptr.getLog() # <<<<<<<<<<<<<< @@ -3575,15 +3557,15 @@ static PyObject 
*__pyx_pf_4pdal_13libpdalpython_10PyPipeline_3log___get__(struct __pyx_t_1 = __pyx_v_self->thisptr->getLog(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 138, __pyx_L1_error) + __PYX_ERR(1, 137, __pyx_L1_error) } - __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 138, __pyx_L1_error) + __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 137, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":136 + /* "pdal/libpdalpython.pyx":135 * * property log: * def __get__(self): # <<<<<<<<<<<<<< @@ -3602,7 +3584,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_3log___get__(struct return __pyx_r; } -/* "pdal/libpdalpython.pyx":141 +/* "pdal/libpdalpython.pyx":140 * * property schema: * def __get__(self): # <<<<<<<<<<<<<< @@ -3635,19 +3617,19 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str PyObject *__pyx_t_5 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":142 + /* "pdal/libpdalpython.pyx":141 * property schema: * def __get__(self): * import json # <<<<<<<<<<<<<< * * j = self.thisptr.getSchema() */ - __pyx_t_1 = __Pyx_Import(__pyx_n_s_json, 0, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 142, __pyx_L1_error) + __pyx_t_1 = __Pyx_Import(__pyx_n_s_json, 0, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 141, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_json = __pyx_t_1; __pyx_t_1 = 0; - /* "pdal/libpdalpython.pyx":144 + /* "pdal/libpdalpython.pyx":143 * import json * * j = self.thisptr.getSchema() # <<<<<<<<<<<<<< @@ -3658,11 +3640,11 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str __pyx_t_2 = __pyx_v_self->thisptr->getSchema(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 144, __pyx_L1_error) + __PYX_ERR(1, 143, __pyx_L1_error) } __pyx_v_j = __pyx_t_2; - /* "pdal/libpdalpython.pyx":145 + /* "pdal/libpdalpython.pyx":144 * * j = self.thisptr.getSchema() * return json.loads(j) # <<<<<<<<<<<<<< @@ -3670,9 +3652,9 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str * property arrays: */ __Pyx_XDECREF(__pyx_r); - __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_v_json, __pyx_n_s_loads); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 145, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_v_json, __pyx_n_s_loads); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 144, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); - __pyx_t_4 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_v_j); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 145, __pyx_L1_error) + __pyx_t_4 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_v_j); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 144, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_3))) { @@ -3687,14 +3669,14 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str __pyx_t_1 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_3, __pyx_t_5, __pyx_t_4) : __Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 145, __pyx_L1_error) + if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 144, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":141 + /* "pdal/libpdalpython.pyx":140 * * property schema: * def __get__(self): # <<<<<<<<<<<<<< @@ -3717,7 +3699,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str return __pyx_r; } -/* "pdal/libpdalpython.pyx":149 +/* "pdal/libpdalpython.pyx":148 * property arrays: * * def __get__(self): # <<<<<<<<<<<<<< @@ -3754,7 +3736,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str int __pyx_t_5; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":150 + /* "pdal/libpdalpython.pyx":149 * * def __get__(self): * v = self.thisptr.getArrays() # <<<<<<<<<<<<<< @@ -3765,23 +3747,23 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_t_1 = __pyx_v_self->thisptr->getArrays(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 150, __pyx_L1_error) + __PYX_ERR(1, 149, __pyx_L1_error) } __pyx_v_v = __pyx_t_1; - /* "pdal/libpdalpython.pyx":151 + /* "pdal/libpdalpython.pyx":150 * def __get__(self): * v = self.thisptr.getArrays() * output = [] # <<<<<<<<<<<<<< * cdef vector[Array*].iterator it = v.begin() * cdef Array* a */ - __pyx_t_2 = PyList_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 151, __pyx_L1_error) + __pyx_t_2 = PyList_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 150, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v_output = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; - /* "pdal/libpdalpython.pyx":152 + /* "pdal/libpdalpython.pyx":151 * v = self.thisptr.getArrays() * output = [] * cdef vector[Array*].iterator it = v.begin() # <<<<<<<<<<<<<< @@ -3790,7 +3772,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str */ __pyx_v_it = __pyx_v_v.begin(); - /* "pdal/libpdalpython.pyx":154 + /* "pdal/libpdalpython.pyx":153 * cdef vector[Array*].iterator it = v.begin() * cdef Array* a * while it != v.end(): # <<<<<<<<<<<<<< @@ -3801,7 +3783,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_t_3 = ((__pyx_v_it != __pyx_v_v.end()) != 0); if (!__pyx_t_3) break; - /* "pdal/libpdalpython.pyx":155 + /* "pdal/libpdalpython.pyx":154 * cdef Array* a * while it != v.end(): * ptr = deref(it) # <<<<<<<<<<<<<< @@ -3810,7 +3792,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str */ __pyx_v_ptr = (*__pyx_v_it); - /* "pdal/libpdalpython.pyx":156 + /* "pdal/libpdalpython.pyx":155 * while it != v.end(): * ptr = deref(it) * a = ptr#.get() # <<<<<<<<<<<<<< @@ -3819,7 +3801,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str */ __pyx_v_a = __pyx_v_ptr; - /* "pdal/libpdalpython.pyx":157 + /* "pdal/libpdalpython.pyx":156 * ptr = deref(it) * a = ptr#.get() * o = a.getPythonArray() # <<<<<<<<<<<<<< @@ -3830,20 +3812,20 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_t_4 = __pyx_v_a->getPythonArray(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 157, __pyx_L1_error) + __PYX_ERR(1, 156, __pyx_L1_error) } __pyx_v_o = __pyx_t_4; - /* "pdal/libpdalpython.pyx":158 + /* "pdal/libpdalpython.pyx":157 * a = ptr#.get() * o = a.getPythonArray() * output.append(o) # <<<<<<<<<<<<<< * inc(it) * return output */ - __pyx_t_5 = __Pyx_PyList_Append(__pyx_v_output, ((PyObject *)__pyx_v_o)); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(1, 158, __pyx_L1_error) + __pyx_t_5 = __Pyx_PyList_Append(__pyx_v_output, ((PyObject *)__pyx_v_o)); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(1, 157, __pyx_L1_error) - /* "pdal/libpdalpython.pyx":159 + /* "pdal/libpdalpython.pyx":158 * o = a.getPythonArray() * output.append(o) * inc(it) # <<<<<<<<<<<<<< @@ -3853,7 +3835,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str (void)((++__pyx_v_it)); } - /* "pdal/libpdalpython.pyx":160 + /* "pdal/libpdalpython.pyx":159 * output.append(o) * inc(it) * return output # <<<<<<<<<<<<<< @@ -3865,7 +3847,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_r = __pyx_v_output; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":149 + /* "pdal/libpdalpython.pyx":148 * property arrays: * * def __get__(self): # <<<<<<<<<<<<<< @@ -3885,7 +3867,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str return __pyx_r; } -/* "pdal/libpdalpython.pyx":163 +/* "pdal/libpdalpython.pyx":162 * * * def execute(self): # <<<<<<<<<<<<<< @@ -3914,7 +3896,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p int64_t __pyx_t_3; __Pyx_RefNannySetupContext("execute", 0); - /* "pdal/libpdalpython.pyx":164 + /* "pdal/libpdalpython.pyx":163 * * def execute(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -3924,20 +3906,20 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p __pyx_t_1 = ((!(__pyx_v_self->thisptr != 0)) != 0); if (unlikely(__pyx_t_1)) { - /* "pdal/libpdalpython.pyx":165 + /* "pdal/libpdalpython.pyx":164 * def execute(self): * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") # <<<<<<<<<<<<<< * return self.thisptr.execute() * */ - __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__4, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 165, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__3, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 164, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_Raise(__pyx_t_2, 0, 0, 0); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __PYX_ERR(1, 165, __pyx_L1_error) + __PYX_ERR(1, 164, __pyx_L1_error) - /* "pdal/libpdalpython.pyx":164 + /* "pdal/libpdalpython.pyx":163 * * def execute(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -3946,7 +3928,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p */ } - /* "pdal/libpdalpython.pyx":166 + /* "pdal/libpdalpython.pyx":165 * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") * return self.thisptr.execute() # <<<<<<<<<<<<<< @@ -3958,15 +3940,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p __pyx_t_3 = __pyx_v_self->thisptr->execute(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 166, __pyx_L1_error) + __PYX_ERR(1, 165, __pyx_L1_error) } - __pyx_t_2 = __Pyx_PyInt_From_int64_t(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 166, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyInt_From_int64_t(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 165, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":163 + /* "pdal/libpdalpython.pyx":162 * * * def execute(self): # <<<<<<<<<<<<<< @@ -3985,7 +3967,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p return __pyx_r; } -/* "pdal/libpdalpython.pyx":168 +/* "pdal/libpdalpython.pyx":167 * return self.thisptr.execute() * * def validate(self): # <<<<<<<<<<<<<< @@ -4014,7 +3996,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ bool __pyx_t_3; __Pyx_RefNannySetupContext("validate", 0); - /* "pdal/libpdalpython.pyx":169 + /* "pdal/libpdalpython.pyx":168 * * def validate(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -4024,19 +4006,19 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ __pyx_t_1 = ((!(__pyx_v_self->thisptr != 0)) != 0); if (unlikely(__pyx_t_1)) { - /* "pdal/libpdalpython.pyx":170 + /* "pdal/libpdalpython.pyx":169 * def validate(self): * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") # <<<<<<<<<<<<<< * return self.thisptr.validate() */ - __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__4, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 170, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__3, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 169, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_Raise(__pyx_t_2, 0, 0, 0); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __PYX_ERR(1, 170, __pyx_L1_error) + __PYX_ERR(1, 169, __pyx_L1_error) - /* "pdal/libpdalpython.pyx":169 + /* "pdal/libpdalpython.pyx":168 * * def validate(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -4045,7 +4027,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ */ } - /* "pdal/libpdalpython.pyx":171 + /* "pdal/libpdalpython.pyx":170 * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") * return self.thisptr.validate() # <<<<<<<<<<<<<< @@ -4055,15 +4037,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ __pyx_t_3 = __pyx_v_self->thisptr->validate(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 171, __pyx_L1_error) + __PYX_ERR(1, 170, __pyx_L1_error) } - __pyx_t_2 = __Pyx_PyBool_FromLong(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 171, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyBool_FromLong(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 170, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":168 + /* "pdal/libpdalpython.pyx":167 * return self.thisptr.execute() * * def validate(self): # <<<<<<<<<<<<<< @@ -4113,7 +4095,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8__reduce_cython__( * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ - __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__5, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 2, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__4, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; @@ -4166,7 +4148,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_10__setstate_cython * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ - __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__6, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 4, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__5, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; @@ -4306,7 +4288,7 @@ static int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject *__pyx_v_self, P * * if ((flags & pybuf.PyBUF_F_CONTIGUOUS == pybuf.PyBUF_F_CONTIGUOUS) */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__7, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 272, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__6, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 272, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -4362,7 +4344,7 @@ static int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject *__pyx_v_self, P * * info.buf = PyArray_DATA(self) */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 276, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__7, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 276, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -4620,7 +4602,7 @@ static int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject *__pyx_v_self, P * if t == NPY_BYTE: f = "b" * elif t == NPY_UBYTE: f = "B" */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 306, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 306, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -5500,7 +5482,7 @@ static CYTHON_INLINE char *__pyx_f_5numpy__util_dtypestring(PyArray_Descr *__pyx * * if ((child.byteorder == c'>' and little_endian) or */ - 
__pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__10, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 856, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 856, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -5568,7 +5550,7 @@ static CYTHON_INLINE char *__pyx_f_5numpy__util_dtypestring(PyArray_Descr *__pyx * # One could encode it in the format string and have Cython * # complain instead, BUT: < and > in format strings also imply */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 860, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 860, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -5677,7 +5659,7 @@ static CYTHON_INLINE char *__pyx_f_5numpy__util_dtypestring(PyArray_Descr *__pyx * * # Until ticket #99 is fixed, use integers to avoid warnings */ - __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__11, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(2, 880, __pyx_L1_error) + __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__10, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(2, 880, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_Raise(__pyx_t_4, 0, 0, 0); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; @@ -6305,7 +6287,7 @@ static CYTHON_INLINE int __pyx_f_5numpy_import_array(void) { * * cdef inline int import_umath() except -1: */ - __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__12, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1038, __pyx_L5_except_error) + __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__11, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1038, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; @@ -6434,7 +6416,7 @@ static CYTHON_INLINE int __pyx_f_5numpy_import_umath(void) { * * cdef inline int import_ufunc() except -1: */ - __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__13, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1044, __pyx_L5_except_error) + __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__12, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1044, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; @@ -6560,7 +6542,7 @@ static CYTHON_INLINE int __pyx_f_5numpy_import_ufunc(void) { * except Exception: * raise ImportError("numpy.core.umath failed to import") # <<<<<<<<<<<<<< */ - __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__13, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1050, __pyx_L5_except_error) + __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__12, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1050, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; @@ -7135,7 +7117,6 @@ static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_kp_u_Format_string_allocated_too_shor, __pyx_k_Format_string_allocated_too_shor, sizeof(__pyx_k_Format_string_allocated_too_shor), 0, 1, 0, 0}, {&__pyx_kp_u_Format_string_allocated_too_shor_2, __pyx_k_Format_string_allocated_too_shor_2, 
sizeof(__pyx_k_Format_string_allocated_too_shor_2), 0, 1, 0, 0}, {&__pyx_n_s_ImportError, __pyx_k_ImportError, sizeof(__pyx_k_ImportError), 0, 0, 1, 1}, - {&__pyx_kp_u_Looping_arrays, __pyx_k_Looping_arrays, sizeof(__pyx_k_Looping_arrays), 0, 1, 0, 0}, {&__pyx_kp_u_Non_native_byte_order_not_suppor, __pyx_k_Non_native_byte_order_not_suppor, sizeof(__pyx_k_Non_native_byte_order_not_suppor), 0, 1, 0, 0}, {&__pyx_n_s_PyArray, __pyx_k_PyArray, sizeof(__pyx_k_PyArray), 0, 0, 1, 1}, {&__pyx_n_s_PyPipeline, __pyx_k_PyPipeline, sizeof(__pyx_k_PyPipeline), 0, 0, 1, 1}, @@ -7174,7 +7155,6 @@ static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_n_s_output, __pyx_k_output, sizeof(__pyx_k_output), 0, 0, 1, 1}, {&__pyx_n_s_pdal_libpdalpython, __pyx_k_pdal_libpdalpython, sizeof(__pyx_k_pdal_libpdalpython), 0, 0, 1, 1}, {&__pyx_kp_s_pdal_libpdalpython_pyx, __pyx_k_pdal_libpdalpython_pyx, sizeof(__pyx_k_pdal_libpdalpython_pyx), 0, 0, 1, 0}, - {&__pyx_n_s_print, __pyx_k_print, sizeof(__pyx_k_print), 0, 0, 1, 1}, {&__pyx_n_s_ptr, __pyx_k_ptr, sizeof(__pyx_k_ptr), 0, 0, 1, 1}, {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1}, {&__pyx_n_s_reduce, __pyx_k_reduce, sizeof(__pyx_k_reduce), 0, 0, 1, 1}, @@ -7188,7 +7168,6 @@ static __Pyx_StringTabEntry __pyx_string_tab[] = { }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_TypeError = __Pyx_GetBuiltinName(__pyx_n_s_TypeError); if (!__pyx_builtin_TypeError) __PYX_ERR(0, 2, __pyx_L1_error) - __pyx_builtin_print = __Pyx_GetBuiltinName(__pyx_n_s_print); if (!__pyx_builtin_print) __PYX_ERR(1, 109, __pyx_L1_error) __pyx_builtin_ValueError = __Pyx_GetBuiltinName(__pyx_n_s_ValueError); if (!__pyx_builtin_ValueError) __PYX_ERR(2, 272, __pyx_L1_error) __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(2, 285, __pyx_L1_error) __pyx_builtin_RuntimeError = __Pyx_GetBuiltinName(__pyx_n_s_RuntimeError); if (!__pyx_builtin_RuntimeError) __PYX_ERR(2, 856, __pyx_L1_error) @@ -7221,27 +7200,16 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_GOTREF(__pyx_tuple__2); __Pyx_GIVEREF(__pyx_tuple__2); - /* "pdal/libpdalpython.pyx":109 - * - * if arrays is not None: - * print("Looping arrays\n") # <<<<<<<<<<<<<< - * for array in arrays: - * a = new Array(array) - */ - __pyx_tuple__3 = PyTuple_Pack(1, __pyx_kp_u_Looping_arrays); if (unlikely(!__pyx_tuple__3)) __PYX_ERR(1, 109, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__3); - __Pyx_GIVEREF(__pyx_tuple__3); - - /* "pdal/libpdalpython.pyx":165 + /* "pdal/libpdalpython.pyx":164 * def execute(self): * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") # <<<<<<<<<<<<<< * return self.thisptr.execute() * */ - __pyx_tuple__4 = PyTuple_Pack(1, __pyx_kp_u_C_Pipeline_object_not_constructe); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(1, 165, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__4); - __Pyx_GIVEREF(__pyx_tuple__4); + __pyx_tuple__3 = PyTuple_Pack(1, __pyx_kp_u_C_Pipeline_object_not_constructe); if (unlikely(!__pyx_tuple__3)) __PYX_ERR(1, 164, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__3); + __Pyx_GIVEREF(__pyx_tuple__3); /* "(tree fragment)":2 * def __reduce_cython__(self): @@ -7249,18 +7217,18 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ - __pyx_tuple__5 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__5)) 
__PYX_ERR(0, 2, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__5); - __Pyx_GIVEREF(__pyx_tuple__5); + __pyx_tuple__4 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(0, 2, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__4); + __Pyx_GIVEREF(__pyx_tuple__4); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ - __pyx_tuple__6 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(0, 4, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__6); - __Pyx_GIVEREF(__pyx_tuple__6); + __pyx_tuple__5 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__5)) __PYX_ERR(0, 4, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__5); + __Pyx_GIVEREF(__pyx_tuple__5); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":272 * if ((flags & pybuf.PyBUF_C_CONTIGUOUS == pybuf.PyBUF_C_CONTIGUOUS) @@ -7269,9 +7237,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * if ((flags & pybuf.PyBUF_F_CONTIGUOUS == pybuf.PyBUF_F_CONTIGUOUS) */ - __pyx_tuple__7 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_C_contiguous); if (unlikely(!__pyx_tuple__7)) __PYX_ERR(2, 272, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__7); - __Pyx_GIVEREF(__pyx_tuple__7); + __pyx_tuple__6 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_C_contiguous); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(2, 272, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__6); + __Pyx_GIVEREF(__pyx_tuple__6); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":276 * if ((flags & pybuf.PyBUF_F_CONTIGUOUS == pybuf.PyBUF_F_CONTIGUOUS) @@ -7280,9 +7248,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * info.buf = PyArray_DATA(self) */ - __pyx_tuple__8 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_Fortran_contiguou); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(2, 276, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__8); - __Pyx_GIVEREF(__pyx_tuple__8); + __pyx_tuple__7 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_Fortran_contiguou); if (unlikely(!__pyx_tuple__7)) __PYX_ERR(2, 276, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__7); + __Pyx_GIVEREF(__pyx_tuple__7); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":306 * if ((descr.byteorder == c'>' and little_endian) or @@ -7291,9 +7259,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * if t == NPY_BYTE: f = "b" * elif t == NPY_UBYTE: f = "B" */ - __pyx_tuple__9 = PyTuple_Pack(1, __pyx_kp_u_Non_native_byte_order_not_suppor); if (unlikely(!__pyx_tuple__9)) __PYX_ERR(2, 306, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__9); - __Pyx_GIVEREF(__pyx_tuple__9); + __pyx_tuple__8 = PyTuple_Pack(1, __pyx_kp_u_Non_native_byte_order_not_suppor); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(2, 306, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__8); + __Pyx_GIVEREF(__pyx_tuple__8); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":856 * @@ -7302,9 +7270,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * if ((child.byteorder == c'>' and little_endian) or */ - __pyx_tuple__10 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor); if (unlikely(!__pyx_tuple__10)) __PYX_ERR(2, 856, __pyx_L1_error) - 
__Pyx_GOTREF(__pyx_tuple__10); - __Pyx_GIVEREF(__pyx_tuple__10); + __pyx_tuple__9 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor); if (unlikely(!__pyx_tuple__9)) __PYX_ERR(2, 856, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__9); + __Pyx_GIVEREF(__pyx_tuple__9); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":880 * t = child.type_num @@ -7313,9 +7281,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * # Until ticket #99 is fixed, use integers to avoid warnings */ - __pyx_tuple__11 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor_2); if (unlikely(!__pyx_tuple__11)) __PYX_ERR(2, 880, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__11); - __Pyx_GIVEREF(__pyx_tuple__11); + __pyx_tuple__10 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor_2); if (unlikely(!__pyx_tuple__10)) __PYX_ERR(2, 880, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__10); + __Pyx_GIVEREF(__pyx_tuple__10); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":1038 * _import_array() @@ -7324,9 +7292,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * cdef inline int import_umath() except -1: */ - __pyx_tuple__12 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_multiarray_failed_to); if (unlikely(!__pyx_tuple__12)) __PYX_ERR(2, 1038, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__12); - __Pyx_GIVEREF(__pyx_tuple__12); + __pyx_tuple__11 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_multiarray_failed_to); if (unlikely(!__pyx_tuple__11)) __PYX_ERR(2, 1038, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__11); + __Pyx_GIVEREF(__pyx_tuple__11); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":1044 * _import_umath() @@ -7335,9 +7303,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * cdef inline int import_ufunc() except -1: */ - __pyx_tuple__13 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_umath_failed_to_impor); if (unlikely(!__pyx_tuple__13)) __PYX_ERR(2, 1044, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__13); - __Pyx_GIVEREF(__pyx_tuple__13); + __pyx_tuple__12 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_umath_failed_to_impor); if (unlikely(!__pyx_tuple__12)) __PYX_ERR(2, 1044, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__12); + __Pyx_GIVEREF(__pyx_tuple__12); /* "pdal/libpdalpython.pyx":24 * cdef string versionString() except+ @@ -7346,7 +7314,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionString() * def getVersionMajor(): */ - __pyx_codeobj__14 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionString, 24, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__14)) __PYX_ERR(1, 24, __pyx_L1_error) + __pyx_codeobj__13 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionString, 24, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__13)) __PYX_ERR(1, 24, __pyx_L1_error) /* "pdal/libpdalpython.pyx":26 * def getVersionString(): @@ -7355,7 +7323,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionMajor() * def getVersionMinor(): */ - __pyx_codeobj__15 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, 
CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMajor, 26, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__15)) __PYX_ERR(1, 26, __pyx_L1_error) + __pyx_codeobj__14 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMajor, 26, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__14)) __PYX_ERR(1, 26, __pyx_L1_error) /* "pdal/libpdalpython.pyx":28 * def getVersionMajor(): @@ -7364,7 +7332,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionMinor() * def getVersionPatch(): */ - __pyx_codeobj__16 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMinor, 28, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__16)) __PYX_ERR(1, 28, __pyx_L1_error) + __pyx_codeobj__15 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMinor, 28, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__15)) __PYX_ERR(1, 28, __pyx_L1_error) /* "pdal/libpdalpython.pyx":30 * def getVersionMinor(): @@ -7373,7 +7341,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionPatch() * def getSha1(): */ - __pyx_codeobj__17 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionPatch, 30, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__17)) __PYX_ERR(1, 30, __pyx_L1_error) + __pyx_codeobj__16 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionPatch, 30, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__16)) __PYX_ERR(1, 30, __pyx_L1_error) /* "pdal/libpdalpython.pyx":32 * def getVersionPatch(): @@ -7382,7 +7350,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return sha1() * def getDebugInformation(): */ - __pyx_codeobj__18 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getSha1, 32, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__18)) __PYX_ERR(1, 32, __pyx_L1_error) + __pyx_codeobj__17 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getSha1, 32, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__17)) __PYX_ERR(1, 32, __pyx_L1_error) /* "pdal/libpdalpython.pyx":34 * def getSha1(): @@ -7391,7 +7359,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return debugInformation() * def getPluginInstallPath(): */ - __pyx_codeobj__19 = 
(PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDebugInformation, 34, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__19)) __PYX_ERR(1, 34, __pyx_L1_error) + __pyx_codeobj__18 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDebugInformation, 34, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__18)) __PYX_ERR(1, 34, __pyx_L1_error) /* "pdal/libpdalpython.pyx":36 * def getDebugInformation(): @@ -7400,7 +7368,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return pluginInstallPath() * */ - __pyx_codeobj__20 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getPluginInstallPath, 36, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__20)) __PYX_ERR(1, 36, __pyx_L1_error) + __pyx_codeobj__19 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getPluginInstallPath, 36, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__19)) __PYX_ERR(1, 36, __pyx_L1_error) /* "pdal/libpdalpython.pyx":76 * @@ -7409,10 +7377,10 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * cdef vector[Dimension] c_dims; * c_dims = getValidDimensions() */ - __pyx_tuple__21 = PyTuple_Pack(6, __pyx_n_s_c_dims, __pyx_n_s_output, __pyx_n_s_it, __pyx_n_s_ptr, __pyx_n_s_d, __pyx_n_s_kind); if (unlikely(!__pyx_tuple__21)) __PYX_ERR(1, 76, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__21); - __Pyx_GIVEREF(__pyx_tuple__21); - __pyx_codeobj__22 = (PyObject*)__Pyx_PyCode_New(0, 0, 6, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__21, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDimensions, 76, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__22)) __PYX_ERR(1, 76, __pyx_L1_error) + __pyx_tuple__20 = PyTuple_Pack(6, __pyx_n_s_c_dims, __pyx_n_s_output, __pyx_n_s_it, __pyx_n_s_ptr, __pyx_n_s_d, __pyx_n_s_kind); if (unlikely(!__pyx_tuple__20)) __PYX_ERR(1, 76, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__20); + __Pyx_GIVEREF(__pyx_tuple__20); + __pyx_codeobj__21 = (PyObject*)__Pyx_PyCode_New(0, 0, 6, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__20, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDimensions, 76, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__21)) __PYX_ERR(1, 76, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; ===================================== pdal/libpdalpython.pyx ===================================== @@ -106,7 +106,6 @@ cdef class PyPipeline: cdef Array* a if arrays is not None: - print("Looping arrays\n") for array in arrays: a = new Array(array) c_arrays.push_back(a) ===================================== setup.py ===================================== @@ -91,9 +91,6 @@ open_kwds = {} if sys.version_info >= (3,): open_kwds['encoding'] = 'utf-8' -with open('VERSION.txt', 
'w', **open_kwds) as fp: - fp.write(str(module_version)) - with open('README.rst', 'r', **open_kwds) as fp: readme = fp.read() ===================================== test/test_pipeline.py ===================================== @@ -159,6 +159,39 @@ class TestArrayLoad(PDALTest): self.assertEqual(len(data), 12) self.assertEqual(data['Intensity'].sum(), 1926) + def test_read_arrays(self): + """Can we read and filter data from a list of arrays to PDAL""" + if Version(pdal.info.version) < Version('1.8'): + return True + + # just some dummy data + x_vals = [1.0, 2.0, 3.0, 4.0, 5.0] + y_vals = [6.0, 7.0, 8.0, 9.0, 10.0] + z_vals = [1.5, 3.5, 5.5, 7.5, 9.5] + test_data = np.array( + [(x, y, z) for x, y, z in zip(x_vals, y_vals, z_vals)], + dtype=[('X', np.float), ('Y', np.float), ('Z', np.float)] + ) + + pipeline = """ + { + "pipeline": [ + { + "type":"filters.range", + "limits":"X[2.5:4.5]" + } + ] + } + """ + + p = pdal.Pipeline(pipeline, arrays=[test_data,]) + p.loglevel = 8 + count = p.execute() + arrays = p.arrays + self.assertEqual(count, 2) + self.assertEqual(len(arrays), 1) + + class TestDimensions(PDALTest): def test_fetch_dimensions(self): """Ask PDAL for its valid dimensions list""" View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/compare/7f39bf60d1a62491f62b104706eb7e08131ea801...af2e6d83debd909071143ba4a254ac459ea1c7c2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/compare/7f39bf60d1a62491f62b104706eb7e08131ea801...af2e6d83debd909071143ba4a254ac459ea1c7c2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 20 05:59:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 04:59:24 +0000 Subject: [Git][debian-gis-team/python-pdal][upstream] New upstream version 2.2.2+ds Message-ID: <5d845caca57fa_73482ad95f80969416655e8@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pdal Commits: 461c6250 by Bas Couwenberg at 2019-09-20T04:48:37Z New upstream version 2.2.2+ds - - - - - 8 changed files: - PKG-INFO - − VERSION.txt - pdal/PyArray.hpp - pdal/__init__.py - pdal/libpdalpython.cpp - pdal/libpdalpython.pyx - setup.py - test/test_pipeline.py Changes: ===================================== PKG-INFO ===================================== @@ -1,6 +1,6 @@ Metadata-Version: 1.2 Name: PDAL -Version: 2.2.1 +Version: 2.2.2 Summary: Point cloud data processing Home-page: http://pdal.io Author: Howard Butler ===================================== VERSION.txt deleted ===================================== @@ -1 +0,0 @@ -2.2.1 \ No newline at end of file ===================================== pdal/PyArray.hpp ===================================== @@ -74,7 +74,7 @@ private: Array& operator=(Array const& rhs); Fields m_fields; bool m_rowMajor; - Shape m_shape; + Shape m_shape {}; std::vector> m_iterators; }; ===================================== pdal/__init__.py ===================================== @@ -1,4 +1,4 @@ -__version__='2.2.1' +__version__='2.2.2' from .pipeline import Pipeline from .array import Array ===================================== pdal/libpdalpython.cpp ===================================== @@ -28,8 +28,7 @@ ], "language": "c++", "libraries": [ - "pdalcpp", - "pdal_plugin_reader_numpy" + "pdalcpp" ], "library_dirs": [ "/Users/hobu/miniconda3/envs/pdal/lib" @@ -1813,7 +1812,6 @@ int __pyx_module_is_main_pdal__libpdalpython = 0; /* Implementation of 
'pdal.libpdalpython' */ static PyObject *__pyx_builtin_TypeError; -static PyObject *__pyx_builtin_print; static PyObject *__pyx_builtin_ValueError; static PyObject *__pyx_builtin_range; static PyObject *__pyx_builtin_RuntimeError; @@ -1829,7 +1827,6 @@ static const char __pyx_k_test[] = "__test__"; static const char __pyx_k_array[] = "array"; static const char __pyx_k_dtype[] = "dtype"; static const char __pyx_k_loads[] = "loads"; -static const char __pyx_k_print[] = "print"; static const char __pyx_k_range[] = "range"; static const char __pyx_k_arrays[] = "arrays"; static const char __pyx_k_c_dims[] = "c_dims"; @@ -1850,7 +1847,6 @@ static const char __pyx_k_description[] = "description"; static const char __pyx_k_RuntimeError[] = "RuntimeError"; static const char __pyx_k_getDimensions[] = "getDimensions"; static const char __pyx_k_reduce_cython[] = "__reduce_cython__"; -static const char __pyx_k_Looping_arrays[] = "Looping arrays\n"; static const char __pyx_k_getVersionMajor[] = "getVersionMajor"; static const char __pyx_k_getVersionMinor[] = "getVersionMinor"; static const char __pyx_k_getVersionPatch[] = "getVersionPatch"; @@ -1875,7 +1871,6 @@ static PyObject *__pyx_kp_u_C_Pipeline_object_not_constructe; static PyObject *__pyx_kp_u_Format_string_allocated_too_shor; static PyObject *__pyx_kp_u_Format_string_allocated_too_shor_2; static PyObject *__pyx_n_s_ImportError; -static PyObject *__pyx_kp_u_Looping_arrays; static PyObject *__pyx_kp_u_Non_native_byte_order_not_suppor; static PyObject *__pyx_n_s_PyArray; static PyObject *__pyx_n_s_PyPipeline; @@ -1914,7 +1909,6 @@ static PyObject *__pyx_kp_u_numpy_core_umath_failed_to_impor; static PyObject *__pyx_n_s_output; static PyObject *__pyx_n_s_pdal_libpdalpython; static PyObject *__pyx_kp_s_pdal_libpdalpython_pyx; -static PyObject *__pyx_n_s_print; static PyObject *__pyx_n_s_ptr; static PyObject *__pyx_n_s_range; static PyObject *__pyx_n_s_reduce; @@ -1965,16 +1959,15 @@ static PyObject *__pyx_tuple__9; static PyObject *__pyx_tuple__10; static PyObject *__pyx_tuple__11; static PyObject *__pyx_tuple__12; -static PyObject *__pyx_tuple__13; -static PyObject *__pyx_tuple__21; +static PyObject *__pyx_tuple__20; +static PyObject *__pyx_codeobj__13; static PyObject *__pyx_codeobj__14; static PyObject *__pyx_codeobj__15; static PyObject *__pyx_codeobj__16; static PyObject *__pyx_codeobj__17; static PyObject *__pyx_codeobj__18; static PyObject *__pyx_codeobj__19; -static PyObject *__pyx_codeobj__20; -static PyObject *__pyx_codeobj__22; +static PyObject *__pyx_codeobj__21; /* Late includes */ /* "pdal/libpdalpython.pyx":24 @@ -3079,8 +3072,8 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob * cdef Array* a * * if arrays is not None: # <<<<<<<<<<<<<< - * print("Looping arrays\n") * for array in arrays: + * a = new Array(array) */ __pyx_t_1 = (__pyx_v_arrays != ((PyObject*)Py_None)); __pyx_t_3 = (__pyx_t_1 != 0); @@ -3089,54 +3082,43 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob /* "pdal/libpdalpython.pyx":109 * * if arrays is not None: - * print("Looping arrays\n") # <<<<<<<<<<<<<< - * for array in arrays: - * a = new Array(array) - */ - __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_print, __pyx_tuple__3, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 109, __pyx_L1_error) - __Pyx_GOTREF(__pyx_t_4); - __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - - /* "pdal/libpdalpython.pyx":110 - * if arrays is not None: - * print("Looping arrays\n") * for array in arrays: # <<<<<<<<<<<<<< * a = new 
Array(array) * c_arrays.push_back(a) */ if (unlikely(__pyx_v_arrays == Py_None)) { PyErr_SetString(PyExc_TypeError, "'NoneType' object is not iterable"); - __PYX_ERR(1, 110, __pyx_L1_error) + __PYX_ERR(1, 109, __pyx_L1_error) } __pyx_t_4 = __pyx_v_arrays; __Pyx_INCREF(__pyx_t_4); __pyx_t_2 = 0; for (;;) { if (__pyx_t_2 >= PyList_GET_SIZE(__pyx_t_4)) break; #if CYTHON_ASSUME_SAFE_MACROS && !CYTHON_AVOID_BORROWED_REFS - __pyx_t_5 = PyList_GET_ITEM(__pyx_t_4, __pyx_t_2); __Pyx_INCREF(__pyx_t_5); __pyx_t_2++; if (unlikely(0 < 0)) __PYX_ERR(1, 110, __pyx_L1_error) + __pyx_t_5 = PyList_GET_ITEM(__pyx_t_4, __pyx_t_2); __Pyx_INCREF(__pyx_t_5); __pyx_t_2++; if (unlikely(0 < 0)) __PYX_ERR(1, 109, __pyx_L1_error) #else - __pyx_t_5 = PySequence_ITEM(__pyx_t_4, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_5)) __PYX_ERR(1, 110, __pyx_L1_error) + __pyx_t_5 = PySequence_ITEM(__pyx_t_4, __pyx_t_2); __pyx_t_2++; if (unlikely(!__pyx_t_5)) __PYX_ERR(1, 109, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_5); #endif __Pyx_XDECREF_SET(__pyx_v_array, __pyx_t_5); __pyx_t_5 = 0; - /* "pdal/libpdalpython.pyx":111 - * print("Looping arrays\n") + /* "pdal/libpdalpython.pyx":110 + * if arrays is not None: * for array in arrays: * a = new Array(array) # <<<<<<<<<<<<<< * c_arrays.push_back(a) * */ - if (!(likely(((__pyx_v_array) == Py_None) || likely(__Pyx_TypeTest(__pyx_v_array, __pyx_ptype_5numpy_ndarray))))) __PYX_ERR(1, 111, __pyx_L1_error) + if (!(likely(((__pyx_v_array) == Py_None) || likely(__Pyx_TypeTest(__pyx_v_array, __pyx_ptype_5numpy_ndarray))))) __PYX_ERR(1, 110, __pyx_L1_error) try { __pyx_t_6 = new pdal::python::Array(((PyArrayObject *)__pyx_v_array)); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 111, __pyx_L1_error) + __PYX_ERR(1, 110, __pyx_L1_error) } __pyx_v_a = __pyx_t_6; - /* "pdal/libpdalpython.pyx":112 + /* "pdal/libpdalpython.pyx":111 * for array in arrays: * a = new Array(array) * c_arrays.push_back(a) # <<<<<<<<<<<<<< @@ -3147,12 +3129,12 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob __pyx_v_c_arrays.push_back(__pyx_v_a); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 112, __pyx_L1_error) + __PYX_ERR(1, 111, __pyx_L1_error) } - /* "pdal/libpdalpython.pyx":110 + /* "pdal/libpdalpython.pyx":109 + * * if arrays is not None: - * print("Looping arrays\n") * for array in arrays: # <<<<<<<<<<<<<< * a = new Array(array) * c_arrays.push_back(a) @@ -3160,7 +3142,7 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - /* "pdal/libpdalpython.pyx":114 + /* "pdal/libpdalpython.pyx":113 * c_arrays.push_back(a) * * self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) # <<<<<<<<<<<<<< @@ -3169,16 +3151,16 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob */ if (unlikely(__pyx_v_json == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "encode"); - __PYX_ERR(1, 114, __pyx_L1_error) + __PYX_ERR(1, 113, __pyx_L1_error) } - __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 114, __pyx_L1_error) + __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 113, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); - __pyx_t_7 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_7) && PyErr_Occurred())) __PYX_ERR(1, 114, __pyx_L1_error) + __pyx_t_7 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_7) && PyErr_Occurred())) __PYX_ERR(1, 113, __pyx_L1_error) try { __pyx_t_8 = new pdal::python::Pipeline(__pyx_t_7, __pyx_v_c_arrays); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 114, __pyx_L1_error) + __PYX_ERR(1, 113, __pyx_L1_error) } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_self->thisptr = __pyx_t_8; @@ -3187,13 +3169,13 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob * cdef Array* a * * if arrays is not None: # <<<<<<<<<<<<<< - * print("Looping arrays\n") * for array in arrays: + * a = new Array(array) */ goto __pyx_L4; } - /* "pdal/libpdalpython.pyx":116 + /* "pdal/libpdalpython.pyx":115 * self.thisptr = new Pipeline(json.encode('UTF-8'), c_arrays) * else: * self.thisptr = new Pipeline(json.encode('UTF-8')) # <<<<<<<<<<<<<< @@ -3203,16 +3185,16 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob /*else*/ { if (unlikely(__pyx_v_json == Py_None)) { PyErr_Format(PyExc_AttributeError, "'NoneType' object has no attribute '%.30s'", "encode"); - __PYX_ERR(1, 116, __pyx_L1_error) + __PYX_ERR(1, 115, __pyx_L1_error) } - __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 116, __pyx_L1_error) + __pyx_t_4 = PyUnicode_AsUTF8String(__pyx_v_json); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 115, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); - __pyx_t_9 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_9) && PyErr_Occurred())) __PYX_ERR(1, 116, __pyx_L1_error) + __pyx_t_9 = __Pyx_PyBytes_AsString(__pyx_t_4); if (unlikely((!__pyx_t_9) && PyErr_Occurred())) __PYX_ERR(1, 115, __pyx_L1_error) try { __pyx_t_8 = new pdal::python::Pipeline(__pyx_t_9); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 116, __pyx_L1_error) + __PYX_ERR(1, 115, __pyx_L1_error) } __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; __pyx_v_self->thisptr = __pyx_t_8; @@ -3241,7 +3223,7 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline___cinit__(struct __pyx_ob return __pyx_r; } -/* "pdal/libpdalpython.pyx":118 +/* "pdal/libpdalpython.pyx":117 * self.thisptr = new Pipeline(json.encode('UTF-8')) * * def __dealloc__(self): # <<<<<<<<<<<<<< @@ -3264,7 +3246,7 @@ static void __pyx_pf_4pdal_13libpdalpython_10PyPipeline_2__dealloc__(struct __py __Pyx_RefNannyDeclarations __Pyx_RefNannySetupContext("__dealloc__", 0); - /* "pdal/libpdalpython.pyx":119 + /* "pdal/libpdalpython.pyx":118 * * def __dealloc__(self): * del self.thisptr # <<<<<<<<<<<<<< @@ -3273,7 +3255,7 @@ static void __pyx_pf_4pdal_13libpdalpython_10PyPipeline_2__dealloc__(struct __py */ delete __pyx_v_self->thisptr; - /* "pdal/libpdalpython.pyx":118 + /* "pdal/libpdalpython.pyx":117 * self.thisptr = new Pipeline(json.encode('UTF-8')) * * def __dealloc__(self): # <<<<<<<<<<<<<< @@ -3285,7 +3267,7 @@ static void __pyx_pf_4pdal_13libpdalpython_10PyPipeline_2__dealloc__(struct __py __Pyx_RefNannyFinishContext(); } -/* "pdal/libpdalpython.pyx":122 +/* "pdal/libpdalpython.pyx":121 * * property pipeline: * def __get__(self): # <<<<<<<<<<<<<< @@ -3313,7 +3295,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8pipeline___get__(s PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":123 + /* "pdal/libpdalpython.pyx":122 * property pipeline: * def __get__(self): * return self.thisptr.getPipeline() # <<<<<<<<<<<<<< @@ -3325,15 +3307,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8pipeline___get__(s __pyx_t_1 = __pyx_v_self->thisptr->getPipeline(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 123, __pyx_L1_error) + __PYX_ERR(1, 122, __pyx_L1_error) } - __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 123, __pyx_L1_error) + __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 122, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":122 + /* "pdal/libpdalpython.pyx":121 * * property pipeline: * def __get__(self): # <<<<<<<<<<<<<< @@ -3352,7 +3334,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8pipeline___get__(s return __pyx_r; } -/* "pdal/libpdalpython.pyx":126 +/* "pdal/libpdalpython.pyx":125 * * property metadata: * def __get__(self): # <<<<<<<<<<<<<< @@ -3380,7 +3362,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8metadata___get__(s PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":127 + /* "pdal/libpdalpython.pyx":126 * property metadata: * def __get__(self): * return self.thisptr.getMetadata() # <<<<<<<<<<<<<< @@ -3392,15 +3374,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8metadata___get__(s __pyx_t_1 = __pyx_v_self->thisptr->getMetadata(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 127, __pyx_L1_error) + __PYX_ERR(1, 126, __pyx_L1_error) } - __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 127, __pyx_L1_error) + __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 126, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":126 + /* "pdal/libpdalpython.pyx":125 * * property metadata: * def __get__(self): # <<<<<<<<<<<<<< @@ -3419,7 +3401,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8metadata___get__(s return __pyx_r; } -/* "pdal/libpdalpython.pyx":130 +/* "pdal/libpdalpython.pyx":129 * * property loglevel: * def __get__(self): # <<<<<<<<<<<<<< @@ -3446,7 +3428,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel___get__(s PyObject *__pyx_t_1 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":131 + /* "pdal/libpdalpython.pyx":130 * property loglevel: * def __get__(self): * return self.thisptr.getLogLevel() # <<<<<<<<<<<<<< @@ -3454,13 +3436,13 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel___get__(s * self.thisptr.setLogLevel(v) */ __Pyx_XDECREF(__pyx_r); - __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->thisptr->getLogLevel()); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 131, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyInt_From_int(__pyx_v_self->thisptr->getLogLevel()); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 130, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":130 + /* "pdal/libpdalpython.pyx":129 * * property loglevel: * def __get__(self): # <<<<<<<<<<<<<< @@ -3479,7 +3461,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel___get__(s return __pyx_r; } -/* "pdal/libpdalpython.pyx":132 +/* "pdal/libpdalpython.pyx":131 * def __get__(self): * return self.thisptr.getLogLevel() * def __set__(self, v): # <<<<<<<<<<<<<< @@ -3506,17 +3488,17 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel_2__set__(struct int __pyx_t_1; __Pyx_RefNannySetupContext("__set__", 0); - /* "pdal/libpdalpython.pyx":133 + /* "pdal/libpdalpython.pyx":132 * return self.thisptr.getLogLevel() * def __set__(self, v): * self.thisptr.setLogLevel(v) # <<<<<<<<<<<<<< * * property log: */ - __pyx_t_1 = __Pyx_PyInt_As_int(__pyx_v_v); if (unlikely((__pyx_t_1 == (int)-1) && PyErr_Occurred())) __PYX_ERR(1, 133, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyInt_As_int(__pyx_v_v); if (unlikely((__pyx_t_1 == (int)-1) && PyErr_Occurred())) __PYX_ERR(1, 132, __pyx_L1_error) __pyx_v_self->thisptr->setLogLevel(__pyx_t_1); - /* "pdal/libpdalpython.pyx":132 + /* "pdal/libpdalpython.pyx":131 * def __get__(self): * return self.thisptr.getLogLevel() * def __set__(self, v): # <<<<<<<<<<<<<< @@ -3535,7 +3517,7 @@ static int __pyx_pf_4pdal_13libpdalpython_10PyPipeline_8loglevel_2__set__(struct return __pyx_r; } -/* "pdal/libpdalpython.pyx":136 +/* "pdal/libpdalpython.pyx":135 * * property log: * def __get__(self): # <<<<<<<<<<<<<< @@ -3563,7 +3545,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_3log___get__(struct PyObject *__pyx_t_2 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":138 + /* "pdal/libpdalpython.pyx":137 * def __get__(self): * * return self.thisptr.getLog() # <<<<<<<<<<<<<< @@ -3575,15 +3557,15 @@ static PyObject 
*__pyx_pf_4pdal_13libpdalpython_10PyPipeline_3log___get__(struct __pyx_t_1 = __pyx_v_self->thisptr->getLog(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 138, __pyx_L1_error) + __PYX_ERR(1, 137, __pyx_L1_error) } - __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 138, __pyx_L1_error) + __pyx_t_2 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_t_1); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 137, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":136 + /* "pdal/libpdalpython.pyx":135 * * property log: * def __get__(self): # <<<<<<<<<<<<<< @@ -3602,7 +3584,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_3log___get__(struct return __pyx_r; } -/* "pdal/libpdalpython.pyx":141 +/* "pdal/libpdalpython.pyx":140 * * property schema: * def __get__(self): # <<<<<<<<<<<<<< @@ -3635,19 +3617,19 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str PyObject *__pyx_t_5 = NULL; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":142 + /* "pdal/libpdalpython.pyx":141 * property schema: * def __get__(self): * import json # <<<<<<<<<<<<<< * * j = self.thisptr.getSchema() */ - __pyx_t_1 = __Pyx_Import(__pyx_n_s_json, 0, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 142, __pyx_L1_error) + __pyx_t_1 = __Pyx_Import(__pyx_n_s_json, 0, 0); if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 141, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __pyx_v_json = __pyx_t_1; __pyx_t_1 = 0; - /* "pdal/libpdalpython.pyx":144 + /* "pdal/libpdalpython.pyx":143 * import json * * j = self.thisptr.getSchema() # <<<<<<<<<<<<<< @@ -3658,11 +3640,11 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str __pyx_t_2 = __pyx_v_self->thisptr->getSchema(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 144, __pyx_L1_error) + __PYX_ERR(1, 143, __pyx_L1_error) } __pyx_v_j = __pyx_t_2; - /* "pdal/libpdalpython.pyx":145 + /* "pdal/libpdalpython.pyx":144 * * j = self.thisptr.getSchema() * return json.loads(j) # <<<<<<<<<<<<<< @@ -3670,9 +3652,9 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str * property arrays: */ __Pyx_XDECREF(__pyx_r); - __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_v_json, __pyx_n_s_loads); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 145, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_GetAttrStr(__pyx_v_json, __pyx_n_s_loads); if (unlikely(!__pyx_t_3)) __PYX_ERR(1, 144, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); - __pyx_t_4 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_v_j); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 145, __pyx_L1_error) + __pyx_t_4 = __pyx_convert_PyUnicode_string_to_py_std__in_string(__pyx_v_j); if (unlikely(!__pyx_t_4)) __PYX_ERR(1, 144, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __pyx_t_5 = NULL; if (CYTHON_UNPACK_METHODS && likely(PyMethod_Check(__pyx_t_3))) { @@ -3687,14 +3669,14 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str __pyx_t_1 = (__pyx_t_5) ? 
__Pyx_PyObject_Call2Args(__pyx_t_3, __pyx_t_5, __pyx_t_4) : __Pyx_PyObject_CallOneArg(__pyx_t_3, __pyx_t_4); __Pyx_XDECREF(__pyx_t_5); __pyx_t_5 = 0; __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; - if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 145, __pyx_L1_error) + if (unlikely(!__pyx_t_1)) __PYX_ERR(1, 144, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; __pyx_r = __pyx_t_1; __pyx_t_1 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":141 + /* "pdal/libpdalpython.pyx":140 * * property schema: * def __get__(self): # <<<<<<<<<<<<<< @@ -3717,7 +3699,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6schema___get__(str return __pyx_r; } -/* "pdal/libpdalpython.pyx":149 +/* "pdal/libpdalpython.pyx":148 * property arrays: * * def __get__(self): # <<<<<<<<<<<<<< @@ -3754,7 +3736,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str int __pyx_t_5; __Pyx_RefNannySetupContext("__get__", 0); - /* "pdal/libpdalpython.pyx":150 + /* "pdal/libpdalpython.pyx":149 * * def __get__(self): * v = self.thisptr.getArrays() # <<<<<<<<<<<<<< @@ -3765,23 +3747,23 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_t_1 = __pyx_v_self->thisptr->getArrays(); } catch(...) { __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 150, __pyx_L1_error) + __PYX_ERR(1, 149, __pyx_L1_error) } __pyx_v_v = __pyx_t_1; - /* "pdal/libpdalpython.pyx":151 + /* "pdal/libpdalpython.pyx":150 * def __get__(self): * v = self.thisptr.getArrays() * output = [] # <<<<<<<<<<<<<< * cdef vector[Array*].iterator it = v.begin() * cdef Array* a */ - __pyx_t_2 = PyList_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 151, __pyx_L1_error) + __pyx_t_2 = PyList_New(0); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 150, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_v_output = ((PyObject*)__pyx_t_2); __pyx_t_2 = 0; - /* "pdal/libpdalpython.pyx":152 + /* "pdal/libpdalpython.pyx":151 * v = self.thisptr.getArrays() * output = [] * cdef vector[Array*].iterator it = v.begin() # <<<<<<<<<<<<<< @@ -3790,7 +3772,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str */ __pyx_v_it = __pyx_v_v.begin(); - /* "pdal/libpdalpython.pyx":154 + /* "pdal/libpdalpython.pyx":153 * cdef vector[Array*].iterator it = v.begin() * cdef Array* a * while it != v.end(): # <<<<<<<<<<<<<< @@ -3801,7 +3783,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_t_3 = ((__pyx_v_it != __pyx_v_v.end()) != 0); if (!__pyx_t_3) break; - /* "pdal/libpdalpython.pyx":155 + /* "pdal/libpdalpython.pyx":154 * cdef Array* a * while it != v.end(): * ptr = deref(it) # <<<<<<<<<<<<<< @@ -3810,7 +3792,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str */ __pyx_v_ptr = (*__pyx_v_it); - /* "pdal/libpdalpython.pyx":156 + /* "pdal/libpdalpython.pyx":155 * while it != v.end(): * ptr = deref(it) * a = ptr#.get() # <<<<<<<<<<<<<< @@ -3819,7 +3801,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str */ __pyx_v_a = __pyx_v_ptr; - /* "pdal/libpdalpython.pyx":157 + /* "pdal/libpdalpython.pyx":156 * ptr = deref(it) * a = ptr#.get() * o = a.getPythonArray() # <<<<<<<<<<<<<< @@ -3830,20 +3812,20 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_t_4 = __pyx_v_a->getPythonArray(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 157, __pyx_L1_error) + __PYX_ERR(1, 156, __pyx_L1_error) } __pyx_v_o = __pyx_t_4; - /* "pdal/libpdalpython.pyx":158 + /* "pdal/libpdalpython.pyx":157 * a = ptr#.get() * o = a.getPythonArray() * output.append(o) # <<<<<<<<<<<<<< * inc(it) * return output */ - __pyx_t_5 = __Pyx_PyList_Append(__pyx_v_output, ((PyObject *)__pyx_v_o)); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(1, 158, __pyx_L1_error) + __pyx_t_5 = __Pyx_PyList_Append(__pyx_v_output, ((PyObject *)__pyx_v_o)); if (unlikely(__pyx_t_5 == ((int)-1))) __PYX_ERR(1, 157, __pyx_L1_error) - /* "pdal/libpdalpython.pyx":159 + /* "pdal/libpdalpython.pyx":158 * o = a.getPythonArray() * output.append(o) * inc(it) # <<<<<<<<<<<<<< @@ -3853,7 +3835,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str (void)((++__pyx_v_it)); } - /* "pdal/libpdalpython.pyx":160 + /* "pdal/libpdalpython.pyx":159 * output.append(o) * inc(it) * return output # <<<<<<<<<<<<<< @@ -3865,7 +3847,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str __pyx_r = __pyx_v_output; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":149 + /* "pdal/libpdalpython.pyx":148 * property arrays: * * def __get__(self): # <<<<<<<<<<<<<< @@ -3885,7 +3867,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6arrays___get__(str return __pyx_r; } -/* "pdal/libpdalpython.pyx":163 +/* "pdal/libpdalpython.pyx":162 * * * def execute(self): # <<<<<<<<<<<<<< @@ -3914,7 +3896,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p int64_t __pyx_t_3; __Pyx_RefNannySetupContext("execute", 0); - /* "pdal/libpdalpython.pyx":164 + /* "pdal/libpdalpython.pyx":163 * * def execute(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -3924,20 +3906,20 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p __pyx_t_1 = ((!(__pyx_v_self->thisptr != 0)) != 0); if (unlikely(__pyx_t_1)) { - /* "pdal/libpdalpython.pyx":165 + /* "pdal/libpdalpython.pyx":164 * def execute(self): * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") # <<<<<<<<<<<<<< * return self.thisptr.execute() * */ - __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__4, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 165, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__3, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 164, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_Raise(__pyx_t_2, 0, 0, 0); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __PYX_ERR(1, 165, __pyx_L1_error) + __PYX_ERR(1, 164, __pyx_L1_error) - /* "pdal/libpdalpython.pyx":164 + /* "pdal/libpdalpython.pyx":163 * * def execute(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -3946,7 +3928,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p */ } - /* "pdal/libpdalpython.pyx":166 + /* "pdal/libpdalpython.pyx":165 * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") * return self.thisptr.execute() # <<<<<<<<<<<<<< @@ -3958,15 +3940,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p __pyx_t_3 = __pyx_v_self->thisptr->execute(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 166, __pyx_L1_error) + __PYX_ERR(1, 165, __pyx_L1_error) } - __pyx_t_2 = __Pyx_PyInt_From_int64_t(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 166, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyInt_From_int64_t(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 165, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":163 + /* "pdal/libpdalpython.pyx":162 * * * def execute(self): # <<<<<<<<<<<<<< @@ -3985,7 +3967,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_4execute(struct __p return __pyx_r; } -/* "pdal/libpdalpython.pyx":168 +/* "pdal/libpdalpython.pyx":167 * return self.thisptr.execute() * * def validate(self): # <<<<<<<<<<<<<< @@ -4014,7 +3996,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ bool __pyx_t_3; __Pyx_RefNannySetupContext("validate", 0); - /* "pdal/libpdalpython.pyx":169 + /* "pdal/libpdalpython.pyx":168 * * def validate(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -4024,19 +4006,19 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ __pyx_t_1 = ((!(__pyx_v_self->thisptr != 0)) != 0); if (unlikely(__pyx_t_1)) { - /* "pdal/libpdalpython.pyx":170 + /* "pdal/libpdalpython.pyx":169 * def validate(self): * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") # <<<<<<<<<<<<<< * return self.thisptr.validate() */ - __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__4, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 170, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyObject_Call(((PyObject *)(&((PyTypeObject*)PyExc_Exception)[0])), __pyx_tuple__3, NULL); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 169, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __Pyx_Raise(__pyx_t_2, 0, 0, 0); __Pyx_DECREF(__pyx_t_2); __pyx_t_2 = 0; - __PYX_ERR(1, 170, __pyx_L1_error) + __PYX_ERR(1, 169, __pyx_L1_error) - /* "pdal/libpdalpython.pyx":169 + /* "pdal/libpdalpython.pyx":168 * * def validate(self): * if not self.thisptr: # <<<<<<<<<<<<<< @@ -4045,7 +4027,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ */ } - /* "pdal/libpdalpython.pyx":171 + /* "pdal/libpdalpython.pyx":170 * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") * return self.thisptr.validate() # <<<<<<<<<<<<<< @@ -4055,15 +4037,15 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_6validate(struct __ __pyx_t_3 = __pyx_v_self->thisptr->validate(); } catch(...) 
{ __Pyx_CppExn2PyErr(); - __PYX_ERR(1, 171, __pyx_L1_error) + __PYX_ERR(1, 170, __pyx_L1_error) } - __pyx_t_2 = __Pyx_PyBool_FromLong(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 171, __pyx_L1_error) + __pyx_t_2 = __Pyx_PyBool_FromLong(__pyx_t_3); if (unlikely(!__pyx_t_2)) __PYX_ERR(1, 170, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_2); __pyx_r = __pyx_t_2; __pyx_t_2 = 0; goto __pyx_L0; - /* "pdal/libpdalpython.pyx":168 + /* "pdal/libpdalpython.pyx":167 * return self.thisptr.execute() * * def validate(self): # <<<<<<<<<<<<<< @@ -4113,7 +4095,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_8__reduce_cython__( * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ - __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__5, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 2, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__4, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 2, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; @@ -4166,7 +4148,7 @@ static PyObject *__pyx_pf_4pdal_13libpdalpython_10PyPipeline_10__setstate_cython * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ - __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__6, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 4, __pyx_L1_error) + __pyx_t_1 = __Pyx_PyObject_Call(__pyx_builtin_TypeError, __pyx_tuple__5, NULL); if (unlikely(!__pyx_t_1)) __PYX_ERR(0, 4, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_1); __Pyx_Raise(__pyx_t_1, 0, 0, 0); __Pyx_DECREF(__pyx_t_1); __pyx_t_1 = 0; @@ -4306,7 +4288,7 @@ static int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject *__pyx_v_self, P * * if ((flags & pybuf.PyBUF_F_CONTIGUOUS == pybuf.PyBUF_F_CONTIGUOUS) */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__7, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 272, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__6, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 272, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -4362,7 +4344,7 @@ static int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject *__pyx_v_self, P * * info.buf = PyArray_DATA(self) */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 276, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__7, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 276, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -4620,7 +4602,7 @@ static int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject *__pyx_v_self, P * if t == NPY_BYTE: f = "b" * elif t == NPY_UBYTE: f = "B" */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 306, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 306, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -5500,7 +5482,7 @@ static CYTHON_INLINE char *__pyx_f_5numpy__util_dtypestring(PyArray_Descr *__pyx * * if ((child.byteorder == c'>' and little_endian) or */ - 
__pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__10, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 856, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 856, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -5568,7 +5550,7 @@ static CYTHON_INLINE char *__pyx_f_5numpy__util_dtypestring(PyArray_Descr *__pyx * # One could encode it in the format string and have Cython * # complain instead, BUT: < and > in format strings also imply */ - __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__9, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 860, __pyx_L1_error) + __pyx_t_3 = __Pyx_PyObject_Call(__pyx_builtin_ValueError, __pyx_tuple__8, NULL); if (unlikely(!__pyx_t_3)) __PYX_ERR(2, 860, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_3); __Pyx_Raise(__pyx_t_3, 0, 0, 0); __Pyx_DECREF(__pyx_t_3); __pyx_t_3 = 0; @@ -5677,7 +5659,7 @@ static CYTHON_INLINE char *__pyx_f_5numpy__util_dtypestring(PyArray_Descr *__pyx * * # Until ticket #99 is fixed, use integers to avoid warnings */ - __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__11, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(2, 880, __pyx_L1_error) + __pyx_t_4 = __Pyx_PyObject_Call(__pyx_builtin_RuntimeError, __pyx_tuple__10, NULL); if (unlikely(!__pyx_t_4)) __PYX_ERR(2, 880, __pyx_L1_error) __Pyx_GOTREF(__pyx_t_4); __Pyx_Raise(__pyx_t_4, 0, 0, 0); __Pyx_DECREF(__pyx_t_4); __pyx_t_4 = 0; @@ -6305,7 +6287,7 @@ static CYTHON_INLINE int __pyx_f_5numpy_import_array(void) { * * cdef inline int import_umath() except -1: */ - __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__12, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1038, __pyx_L5_except_error) + __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__11, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1038, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; @@ -6434,7 +6416,7 @@ static CYTHON_INLINE int __pyx_f_5numpy_import_umath(void) { * * cdef inline int import_ufunc() except -1: */ - __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__13, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1044, __pyx_L5_except_error) + __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__12, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1044, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; @@ -6560,7 +6542,7 @@ static CYTHON_INLINE int __pyx_f_5numpy_import_ufunc(void) { * except Exception: * raise ImportError("numpy.core.umath failed to import") # <<<<<<<<<<<<<< */ - __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__13, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1050, __pyx_L5_except_error) + __pyx_t_8 = __Pyx_PyObject_Call(__pyx_builtin_ImportError, __pyx_tuple__12, NULL); if (unlikely(!__pyx_t_8)) __PYX_ERR(2, 1050, __pyx_L5_except_error) __Pyx_GOTREF(__pyx_t_8); __Pyx_Raise(__pyx_t_8, 0, 0, 0); __Pyx_DECREF(__pyx_t_8); __pyx_t_8 = 0; @@ -7135,7 +7117,6 @@ static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_kp_u_Format_string_allocated_too_shor, __pyx_k_Format_string_allocated_too_shor, sizeof(__pyx_k_Format_string_allocated_too_shor), 0, 1, 0, 0}, {&__pyx_kp_u_Format_string_allocated_too_shor_2, __pyx_k_Format_string_allocated_too_shor_2, 
sizeof(__pyx_k_Format_string_allocated_too_shor_2), 0, 1, 0, 0}, {&__pyx_n_s_ImportError, __pyx_k_ImportError, sizeof(__pyx_k_ImportError), 0, 0, 1, 1}, - {&__pyx_kp_u_Looping_arrays, __pyx_k_Looping_arrays, sizeof(__pyx_k_Looping_arrays), 0, 1, 0, 0}, {&__pyx_kp_u_Non_native_byte_order_not_suppor, __pyx_k_Non_native_byte_order_not_suppor, sizeof(__pyx_k_Non_native_byte_order_not_suppor), 0, 1, 0, 0}, {&__pyx_n_s_PyArray, __pyx_k_PyArray, sizeof(__pyx_k_PyArray), 0, 0, 1, 1}, {&__pyx_n_s_PyPipeline, __pyx_k_PyPipeline, sizeof(__pyx_k_PyPipeline), 0, 0, 1, 1}, @@ -7174,7 +7155,6 @@ static __Pyx_StringTabEntry __pyx_string_tab[] = { {&__pyx_n_s_output, __pyx_k_output, sizeof(__pyx_k_output), 0, 0, 1, 1}, {&__pyx_n_s_pdal_libpdalpython, __pyx_k_pdal_libpdalpython, sizeof(__pyx_k_pdal_libpdalpython), 0, 0, 1, 1}, {&__pyx_kp_s_pdal_libpdalpython_pyx, __pyx_k_pdal_libpdalpython_pyx, sizeof(__pyx_k_pdal_libpdalpython_pyx), 0, 0, 1, 0}, - {&__pyx_n_s_print, __pyx_k_print, sizeof(__pyx_k_print), 0, 0, 1, 1}, {&__pyx_n_s_ptr, __pyx_k_ptr, sizeof(__pyx_k_ptr), 0, 0, 1, 1}, {&__pyx_n_s_range, __pyx_k_range, sizeof(__pyx_k_range), 0, 0, 1, 1}, {&__pyx_n_s_reduce, __pyx_k_reduce, sizeof(__pyx_k_reduce), 0, 0, 1, 1}, @@ -7188,7 +7168,6 @@ static __Pyx_StringTabEntry __pyx_string_tab[] = { }; static CYTHON_SMALL_CODE int __Pyx_InitCachedBuiltins(void) { __pyx_builtin_TypeError = __Pyx_GetBuiltinName(__pyx_n_s_TypeError); if (!__pyx_builtin_TypeError) __PYX_ERR(0, 2, __pyx_L1_error) - __pyx_builtin_print = __Pyx_GetBuiltinName(__pyx_n_s_print); if (!__pyx_builtin_print) __PYX_ERR(1, 109, __pyx_L1_error) __pyx_builtin_ValueError = __Pyx_GetBuiltinName(__pyx_n_s_ValueError); if (!__pyx_builtin_ValueError) __PYX_ERR(2, 272, __pyx_L1_error) __pyx_builtin_range = __Pyx_GetBuiltinName(__pyx_n_s_range); if (!__pyx_builtin_range) __PYX_ERR(2, 285, __pyx_L1_error) __pyx_builtin_RuntimeError = __Pyx_GetBuiltinName(__pyx_n_s_RuntimeError); if (!__pyx_builtin_RuntimeError) __PYX_ERR(2, 856, __pyx_L1_error) @@ -7221,27 +7200,16 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { __Pyx_GOTREF(__pyx_tuple__2); __Pyx_GIVEREF(__pyx_tuple__2); - /* "pdal/libpdalpython.pyx":109 - * - * if arrays is not None: - * print("Looping arrays\n") # <<<<<<<<<<<<<< - * for array in arrays: - * a = new Array(array) - */ - __pyx_tuple__3 = PyTuple_Pack(1, __pyx_kp_u_Looping_arrays); if (unlikely(!__pyx_tuple__3)) __PYX_ERR(1, 109, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__3); - __Pyx_GIVEREF(__pyx_tuple__3); - - /* "pdal/libpdalpython.pyx":165 + /* "pdal/libpdalpython.pyx":164 * def execute(self): * if not self.thisptr: * raise Exception("C++ Pipeline object not constructed!") # <<<<<<<<<<<<<< * return self.thisptr.execute() * */ - __pyx_tuple__4 = PyTuple_Pack(1, __pyx_kp_u_C_Pipeline_object_not_constructe); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(1, 165, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__4); - __Pyx_GIVEREF(__pyx_tuple__4); + __pyx_tuple__3 = PyTuple_Pack(1, __pyx_kp_u_C_Pipeline_object_not_constructe); if (unlikely(!__pyx_tuple__3)) __PYX_ERR(1, 164, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__3); + __Pyx_GIVEREF(__pyx_tuple__3); /* "(tree fragment)":2 * def __reduce_cython__(self): @@ -7249,18 +7217,18 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") */ - __pyx_tuple__5 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__5)) 
__PYX_ERR(0, 2, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__5); - __Pyx_GIVEREF(__pyx_tuple__5); + __pyx_tuple__4 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__4)) __PYX_ERR(0, 2, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__4); + __Pyx_GIVEREF(__pyx_tuple__4); /* "(tree fragment)":4 * raise TypeError("no default __reduce__ due to non-trivial __cinit__") * def __setstate_cython__(self, __pyx_state): * raise TypeError("no default __reduce__ due to non-trivial __cinit__") # <<<<<<<<<<<<<< */ - __pyx_tuple__6 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(0, 4, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__6); - __Pyx_GIVEREF(__pyx_tuple__6); + __pyx_tuple__5 = PyTuple_Pack(1, __pyx_kp_s_no_default___reduce___due_to_non); if (unlikely(!__pyx_tuple__5)) __PYX_ERR(0, 4, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__5); + __Pyx_GIVEREF(__pyx_tuple__5); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":272 * if ((flags & pybuf.PyBUF_C_CONTIGUOUS == pybuf.PyBUF_C_CONTIGUOUS) @@ -7269,9 +7237,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * if ((flags & pybuf.PyBUF_F_CONTIGUOUS == pybuf.PyBUF_F_CONTIGUOUS) */ - __pyx_tuple__7 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_C_contiguous); if (unlikely(!__pyx_tuple__7)) __PYX_ERR(2, 272, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__7); - __Pyx_GIVEREF(__pyx_tuple__7); + __pyx_tuple__6 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_C_contiguous); if (unlikely(!__pyx_tuple__6)) __PYX_ERR(2, 272, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__6); + __Pyx_GIVEREF(__pyx_tuple__6); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":276 * if ((flags & pybuf.PyBUF_F_CONTIGUOUS == pybuf.PyBUF_F_CONTIGUOUS) @@ -7280,9 +7248,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * info.buf = PyArray_DATA(self) */ - __pyx_tuple__8 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_Fortran_contiguou); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(2, 276, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__8); - __Pyx_GIVEREF(__pyx_tuple__8); + __pyx_tuple__7 = PyTuple_Pack(1, __pyx_kp_u_ndarray_is_not_Fortran_contiguou); if (unlikely(!__pyx_tuple__7)) __PYX_ERR(2, 276, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__7); + __Pyx_GIVEREF(__pyx_tuple__7); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":306 * if ((descr.byteorder == c'>' and little_endian) or @@ -7291,9 +7259,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * if t == NPY_BYTE: f = "b" * elif t == NPY_UBYTE: f = "B" */ - __pyx_tuple__9 = PyTuple_Pack(1, __pyx_kp_u_Non_native_byte_order_not_suppor); if (unlikely(!__pyx_tuple__9)) __PYX_ERR(2, 306, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__9); - __Pyx_GIVEREF(__pyx_tuple__9); + __pyx_tuple__8 = PyTuple_Pack(1, __pyx_kp_u_Non_native_byte_order_not_suppor); if (unlikely(!__pyx_tuple__8)) __PYX_ERR(2, 306, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__8); + __Pyx_GIVEREF(__pyx_tuple__8); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":856 * @@ -7302,9 +7270,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * if ((child.byteorder == c'>' and little_endian) or */ - __pyx_tuple__10 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor); if (unlikely(!__pyx_tuple__10)) __PYX_ERR(2, 856, __pyx_L1_error) - 
__Pyx_GOTREF(__pyx_tuple__10); - __Pyx_GIVEREF(__pyx_tuple__10); + __pyx_tuple__9 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor); if (unlikely(!__pyx_tuple__9)) __PYX_ERR(2, 856, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__9); + __Pyx_GIVEREF(__pyx_tuple__9); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":880 * t = child.type_num @@ -7313,9 +7281,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * # Until ticket #99 is fixed, use integers to avoid warnings */ - __pyx_tuple__11 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor_2); if (unlikely(!__pyx_tuple__11)) __PYX_ERR(2, 880, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__11); - __Pyx_GIVEREF(__pyx_tuple__11); + __pyx_tuple__10 = PyTuple_Pack(1, __pyx_kp_u_Format_string_allocated_too_shor_2); if (unlikely(!__pyx_tuple__10)) __PYX_ERR(2, 880, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__10); + __Pyx_GIVEREF(__pyx_tuple__10); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":1038 * _import_array() @@ -7324,9 +7292,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * cdef inline int import_umath() except -1: */ - __pyx_tuple__12 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_multiarray_failed_to); if (unlikely(!__pyx_tuple__12)) __PYX_ERR(2, 1038, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__12); - __Pyx_GIVEREF(__pyx_tuple__12); + __pyx_tuple__11 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_multiarray_failed_to); if (unlikely(!__pyx_tuple__11)) __PYX_ERR(2, 1038, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__11); + __Pyx_GIVEREF(__pyx_tuple__11); /* "../../../miniconda3/envs/pdal/lib/python3.7/site-packages/Cython/Includes/numpy/__init__.pxd":1044 * _import_umath() @@ -7335,9 +7303,9 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * * cdef inline int import_ufunc() except -1: */ - __pyx_tuple__13 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_umath_failed_to_impor); if (unlikely(!__pyx_tuple__13)) __PYX_ERR(2, 1044, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__13); - __Pyx_GIVEREF(__pyx_tuple__13); + __pyx_tuple__12 = PyTuple_Pack(1, __pyx_kp_u_numpy_core_umath_failed_to_impor); if (unlikely(!__pyx_tuple__12)) __PYX_ERR(2, 1044, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__12); + __Pyx_GIVEREF(__pyx_tuple__12); /* "pdal/libpdalpython.pyx":24 * cdef string versionString() except+ @@ -7346,7 +7314,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionString() * def getVersionMajor(): */ - __pyx_codeobj__14 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionString, 24, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__14)) __PYX_ERR(1, 24, __pyx_L1_error) + __pyx_codeobj__13 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionString, 24, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__13)) __PYX_ERR(1, 24, __pyx_L1_error) /* "pdal/libpdalpython.pyx":26 * def getVersionString(): @@ -7355,7 +7323,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionMajor() * def getVersionMinor(): */ - __pyx_codeobj__15 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, 
CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMajor, 26, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__15)) __PYX_ERR(1, 26, __pyx_L1_error) + __pyx_codeobj__14 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMajor, 26, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__14)) __PYX_ERR(1, 26, __pyx_L1_error) /* "pdal/libpdalpython.pyx":28 * def getVersionMajor(): @@ -7364,7 +7332,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionMinor() * def getVersionPatch(): */ - __pyx_codeobj__16 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMinor, 28, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__16)) __PYX_ERR(1, 28, __pyx_L1_error) + __pyx_codeobj__15 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionMinor, 28, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__15)) __PYX_ERR(1, 28, __pyx_L1_error) /* "pdal/libpdalpython.pyx":30 * def getVersionMinor(): @@ -7373,7 +7341,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return versionPatch() * def getSha1(): */ - __pyx_codeobj__17 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionPatch, 30, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__17)) __PYX_ERR(1, 30, __pyx_L1_error) + __pyx_codeobj__16 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getVersionPatch, 30, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__16)) __PYX_ERR(1, 30, __pyx_L1_error) /* "pdal/libpdalpython.pyx":32 * def getVersionPatch(): @@ -7382,7 +7350,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return sha1() * def getDebugInformation(): */ - __pyx_codeobj__18 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getSha1, 32, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__18)) __PYX_ERR(1, 32, __pyx_L1_error) + __pyx_codeobj__17 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getSha1, 32, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__17)) __PYX_ERR(1, 32, __pyx_L1_error) /* "pdal/libpdalpython.pyx":34 * def getSha1(): @@ -7391,7 +7359,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return debugInformation() * def getPluginInstallPath(): */ - __pyx_codeobj__19 = 
(PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDebugInformation, 34, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__19)) __PYX_ERR(1, 34, __pyx_L1_error) + __pyx_codeobj__18 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDebugInformation, 34, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__18)) __PYX_ERR(1, 34, __pyx_L1_error) /* "pdal/libpdalpython.pyx":36 * def getDebugInformation(): @@ -7400,7 +7368,7 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * return pluginInstallPath() * */ - __pyx_codeobj__20 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getPluginInstallPath, 36, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__20)) __PYX_ERR(1, 36, __pyx_L1_error) + __pyx_codeobj__19 = (PyObject*)__Pyx_PyCode_New(0, 0, 0, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getPluginInstallPath, 36, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__19)) __PYX_ERR(1, 36, __pyx_L1_error) /* "pdal/libpdalpython.pyx":76 * @@ -7409,10 +7377,10 @@ static CYTHON_SMALL_CODE int __Pyx_InitCachedConstants(void) { * cdef vector[Dimension] c_dims; * c_dims = getValidDimensions() */ - __pyx_tuple__21 = PyTuple_Pack(6, __pyx_n_s_c_dims, __pyx_n_s_output, __pyx_n_s_it, __pyx_n_s_ptr, __pyx_n_s_d, __pyx_n_s_kind); if (unlikely(!__pyx_tuple__21)) __PYX_ERR(1, 76, __pyx_L1_error) - __Pyx_GOTREF(__pyx_tuple__21); - __Pyx_GIVEREF(__pyx_tuple__21); - __pyx_codeobj__22 = (PyObject*)__Pyx_PyCode_New(0, 0, 6, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__21, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDimensions, 76, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__22)) __PYX_ERR(1, 76, __pyx_L1_error) + __pyx_tuple__20 = PyTuple_Pack(6, __pyx_n_s_c_dims, __pyx_n_s_output, __pyx_n_s_it, __pyx_n_s_ptr, __pyx_n_s_d, __pyx_n_s_kind); if (unlikely(!__pyx_tuple__20)) __PYX_ERR(1, 76, __pyx_L1_error) + __Pyx_GOTREF(__pyx_tuple__20); + __Pyx_GIVEREF(__pyx_tuple__20); + __pyx_codeobj__21 = (PyObject*)__Pyx_PyCode_New(0, 0, 6, 0, CO_OPTIMIZED|CO_NEWLOCALS, __pyx_empty_bytes, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_tuple__20, __pyx_empty_tuple, __pyx_empty_tuple, __pyx_kp_s_pdal_libpdalpython_pyx, __pyx_n_s_getDimensions, 76, __pyx_empty_bytes); if (unlikely(!__pyx_codeobj__21)) __PYX_ERR(1, 76, __pyx_L1_error) __Pyx_RefNannyFinishContext(); return 0; __pyx_L1_error:; ===================================== pdal/libpdalpython.pyx ===================================== @@ -106,7 +106,6 @@ cdef class PyPipeline: cdef Array* a if arrays is not None: - print("Looping arrays\n") for array in arrays: a = new Array(array) c_arrays.push_back(a) ===================================== setup.py ===================================== @@ -91,9 +91,6 @@ open_kwds = {} if sys.version_info >= (3,): open_kwds['encoding'] = 'utf-8' -with open('VERSION.txt', 
'w', **open_kwds) as fp: - fp.write(str(module_version)) - with open('README.rst', 'r', **open_kwds) as fp: readme = fp.read() ===================================== test/test_pipeline.py ===================================== @@ -159,6 +159,39 @@ class TestArrayLoad(PDALTest): self.assertEqual(len(data), 12) self.assertEqual(data['Intensity'].sum(), 1926) + def test_read_arrays(self): + """Can we read and filter data from a list of arrays to PDAL""" + if Version(pdal.info.version) < Version('1.8'): + return True + + # just some dummy data + x_vals = [1.0, 2.0, 3.0, 4.0, 5.0] + y_vals = [6.0, 7.0, 8.0, 9.0, 10.0] + z_vals = [1.5, 3.5, 5.5, 7.5, 9.5] + test_data = np.array( + [(x, y, z) for x, y, z in zip(x_vals, y_vals, z_vals)], + dtype=[('X', np.float), ('Y', np.float), ('Z', np.float)] + ) + + pipeline = """ + { + "pipeline": [ + { + "type":"filters.range", + "limits":"X[2.5:4.5]" + } + ] + } + """ + + p = pdal.Pipeline(pipeline, arrays=[test_data,]) + p.loglevel = 8 + count = p.execute() + arrays = p.arrays + self.assertEqual(count, 2) + self.assertEqual(len(arrays), 1) + + class TestDimensions(PDALTest): def test_fetch_dimensions(self): """Ask PDAL for its valid dimensions list""" View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/461c6250bad81f9e4a2a4a2a9a368c1f701ba4ef -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/461c6250bad81f9e4a2a4a2a9a368c1f701ba4ef You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 20 06:10:02 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 20 Sep 2019 05:10:02 +0000 Subject: Processing of python-pdal_2.2.2+ds-1_source.changes Message-ID: python-pdal_2.2.2+ds-1_source.changes uploaded successfully to localhost along with the files: python-pdal_2.2.2+ds-1.dsc python-pdal_2.2.2+ds.orig.tar.xz python-pdal_2.2.2+ds-1.debian.tar.xz python-pdal_2.2.2+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 20 06:19:40 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 20 Sep 2019 05:19:40 +0000 Subject: python-pdal_2.2.2+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 20 Sep 2019 06:50:01 +0200 Source: python-pdal Architecture: source Version: 2.2.2+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pdal (2.2.2+ds-1) unstable; urgency=medium . * New upstream release. 
Checksums-Sha1: 99cc4687ab6924cba4edc4001f09e54a733deea1 2103 python-pdal_2.2.2+ds-1.dsc 9c41aeae9cdabd6d7936ca1dc045e5d5005dc946 53668 python-pdal_2.2.2+ds.orig.tar.xz 3283dd75e425af914fd11fd61a51ff3925ff4472 4228 python-pdal_2.2.2+ds-1.debian.tar.xz df45310edd416e1858f6cd10ec1f0165600eb77f 13760 python-pdal_2.2.2+ds-1_amd64.buildinfo Checksums-Sha256: 541d2e6a89952710ba348806862c05982a6f6a58cbde0a74e0529fde24db131b 2103 python-pdal_2.2.2+ds-1.dsc 5b06953522844de6372fb71c8e6e67b8d92ee9d86138ac85fb07a260e5d7e66d 53668 python-pdal_2.2.2+ds.orig.tar.xz 984eac3ab5f84e26c1f9e622abe193ab665d952d72670d5a482082276d6e3190 4228 python-pdal_2.2.2+ds-1.debian.tar.xz 3bc3282736a1146e70a036097b06e35a9dd7b19f1ad23cfcad29762f1830c081 13760 python-pdal_2.2.2+ds-1_amd64.buildinfo Files: eedda4242a0c0247f1516075855feffb 2103 science optional python-pdal_2.2.2+ds-1.dsc a75b5c5512ce5a40a5d220f7cc362d22 53668 science optional python-pdal_2.2.2+ds.orig.tar.xz 20792639e0738b2b22e764f0a4e3b786 4228 science optional python-pdal_2.2.2+ds-1.debian.tar.xz 3b77ae7071b80094a4a2d45aadf7f288 13760 science optional python-pdal_2.2.2+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2EXEMACgkQZ1DxCuiN SvHSwxAAimhRlFburDJPHjS2rqcR4TE5kDsBY54WKsb5K4VhdZ5Xxt0zJDmIqF+a lS2iYENnQxDAadXyr4eR2PhPbFVbUmDZ0JzM5GfbXdQRXSKUmXKSZBX0j/IqRq4j x8HSg8eeNhSawmk4MP3WVibnGZHq5fSAp0FJhdL2DQ6FhF8ltE4XJ6wXl/7Lk3k5 GkPBGWYPeTK+KrSsIizRcARVJPDuitdfUNfr8rs5fw9TmkAsCkj5Q9Jt3TJMz0Td qMegl7PbSoFBl0tv7DOkrEFlADWSKkSBKrXtHw5iCqF1/jROa8/cFpvjG1u0WIw6 XUvBQW4Ul44lnr1ZUiivFm0E9srNQynPOa/sbajHD5OgyhhE7eeQQYAghV4z7lyp q5Gs27Ut0R8Y+brStIk+ckQwtHAV9vlpWdpci4BZX4rMJ65/LMhW+npqsH32iHc8 iqi0XQPgvmAN+ajXdK+2wb1crkD9whHqfNy/VQyb8NOMxoGNdV/f0DJiUpgafTqQ DOJ8iEQZk0GjoIXz4Xv411Ba3imkMA6ssUI2FyhN2aavn1BviOljdGNeW5GdBXVw 1n6rDumyw1jYwRXOf9KSm8yzQJOz08SzhbCKQaVycoykhpBEnf5ITXZeCMjYT7cR etL0TNUfNQIqPvJ1eNSRypz+1RdV6XXm+q1hC7kiG53por7qZ5o= =xpwy -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Fri Sep 20 13:56:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 12:56:06 +0000 Subject: [Git][debian-gis-team/python-pyproj][pristine-tar] pristine-tar data for python-pyproj_2.4~rc1+ds.orig.tar.xz Message-ID: <5d84cc662a9c7_73483fbbb4720b641701857@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pyproj Commits: 1edb1095 by Bas Couwenberg at 2019-09-20T12:48:28Z pristine-tar data for python-pyproj_2.4~rc1+ds.orig.tar.xz - - - - - 2 changed files: - + python-pyproj_2.4~rc1+ds.orig.tar.xz.delta - + python-pyproj_2.4~rc1+ds.orig.tar.xz.id Changes: ===================================== python-pyproj_2.4~rc1+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/python-pyproj_2.4~rc1+ds.orig.tar.xz.delta differ ===================================== python-pyproj_2.4~rc1+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +6f0f8ebbe2eeeeb8119472f6ce2c047c178eb45a View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/1edb1095f5235fb697778f0004731d0a12a1195c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/1edb1095f5235fb697778f0004731d0a12a1195c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 20 13:56:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 12:56:06 +0000 Subject: [Git][debian-gis-team/python-pyproj][experimental] 5 commits: New upstream version 2.4~rc1+ds Message-ID: <5d84cc6630820_73483fbbbf1573bc1701922@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / python-pyproj Commits: fadcc549 by Bas Couwenberg at 2019-09-20T12:48:27Z New upstream version 2.4~rc1+ds - - - - - f3acd63c by Bas Couwenberg at 2019-09-20T12:48:28Z Update upstream source from tag 'upstream/2.4_rc1+ds' Update to upstream version '2.4~rc1+ds' with Debian dir 1a473cf13ddb2480d395dd0fe371baf123e0c604 - - - - - 7466dce0 by Bas Couwenberg at 2019-09-20T12:49:23Z New upstream release candidate. - - - - - 1086f419 by Bas Couwenberg at 2019-09-20T12:50:18Z Drop patches, applied/fixed upstream. - - - - - 683cef97 by Bas Couwenberg at 2019-09-20T12:50:36Z Set distribution to experimental. - - - - - 7 changed files: - debian/changelog - − debian/patches/python3.patch - − debian/patches/series - − debian/patches/spelling-errors.patch - pyproj/__init__.py - pyproj/_crs.pyx - test/test__main__.py Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +python-pyproj (2.4~rc1+ds-1~exp1) experimental; urgency=medium + + * New upstream release candidate. + * Drop patches, applied/fixed upstream. + + -- Bas Couwenberg Fri, 20 Sep 2019 14:50:23 +0200 + python-pyproj (2.4~rc0+ds-1~exp1) experimental; urgency=medium * New upstream release candidate. ===================================== debian/patches/python3.patch deleted ===================================== @@ -1,25 +0,0 @@ -Description: Use python3 interpreter. -Author: Bas Couwenberg -Bug: https://github.com/pyproj4/pyproj/issues/451 -Forwarded: https://github.com/pyproj4/pyproj/pull/452 - ---- a/test/test__main__.py -+++ b/test/test__main__.py -@@ -4,7 +4,7 @@ import pytest - - - def test_main(): -- output = subprocess.check_output(["python", "-m", "pyproj"]).decode("utf-8") -+ output = subprocess.check_output(["python3", "-m", "pyproj"]).decode("utf-8") - assert "pyproj version:" in output - assert "PROJ version:" in output - assert "-v, --verbose Show verbose debugging version information." in output -@@ -12,7 +12,7 @@ def test_main(): - - @pytest.mark.parametrize("option", ["-v", "--verbose"]) - def test_main__verbose(option): -- output = subprocess.check_output(["python", "-m", "pyproj", option]).decode("utf-8") -+ output = subprocess.check_output(["python3", "-m", "pyproj", option]).decode("utf-8") - assert "pyproj:" in output - assert "PROJ:" in output - assert "data dir" in output ===================================== debian/patches/series deleted ===================================== @@ -1,2 +0,0 @@ -python3.patch -spelling-errors.patch ===================================== debian/patches/spelling-errors.patch deleted ===================================== @@ -1,44 +0,0 @@ -Description: Fix spelling errors. - * intialized -> initialized -Author: Bas Couwenberg -Forwarded: https://github.com/pyproj4/pyproj/pull/453 -Applied-Upstream: https://github.com/pyproj4/pyproj/commit/9785bf81561ebf03446fb88d159613c4e4391f5a - ---- a/pyproj/_crs.pyx -+++ b/pyproj/_crs.pyx -@@ -563,7 +563,7 @@ cdef class Ellipsoid(Base): - - def __init__(self): - raise RuntimeError( -- "Ellipsoid can only be intialized like 'Ellipsoid.from_*()'." -+ "Ellipsoid can only be initialized like 'Ellipsoid.from_*()'." 
- ) - - @staticmethod -@@ -777,7 +777,7 @@ cdef class PrimeMeridian(Base): - - def __init__(self): - raise RuntimeError( -- "PrimeMeridian can only be intialized like 'PrimeMeridian.from_*()'." -+ "PrimeMeridian can only be initialized like 'PrimeMeridian.from_*()'." - ) - - @staticmethod -@@ -951,7 +951,7 @@ cdef class Datum(Base): - - def __init__(self): - raise RuntimeError( -- "Datum can only be intialized like 'Datum.from_*()'." -+ "Datum can only be initialized like 'Datum.from_*()'." - ) - - @staticmethod -@@ -1368,7 +1368,7 @@ cdef class CoordinateOperation(Base): - - def __init__(self): - raise RuntimeError( -- "CoordinateOperation can only be intialized like 'CoordinateOperation.from_*()'." -+ "CoordinateOperation can only be initialized like 'CoordinateOperation.from_*()'." - ) - - @staticmethod ===================================== pyproj/__init__.py ===================================== @@ -47,7 +47,7 @@ CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. """ -__version__ = "2.4.rc0" +__version__ = "2.4.rc1" __all__ = [ "Proj", "Geod", ===================================== pyproj/_crs.pyx ===================================== @@ -563,7 +563,7 @@ cdef class Ellipsoid(Base): def __init__(self): raise RuntimeError( - "Ellipsoid can only be intialized like 'Ellipsoid.from_*()'." + "Ellipsoid can only be initialized like 'Ellipsoid.from_*()'." ) @staticmethod @@ -777,7 +777,7 @@ cdef class PrimeMeridian(Base): def __init__(self): raise RuntimeError( - "PrimeMeridian can only be intialized like 'PrimeMeridian.from_*()'." + "PrimeMeridian can only be initialized like 'PrimeMeridian.from_*()'." ) @staticmethod @@ -951,7 +951,7 @@ cdef class Datum(Base): def __init__(self): raise RuntimeError( - "Datum can only be intialized like 'Datum.from_*()'." + "Datum can only be initialized like 'Datum.from_*()'." ) @staticmethod @@ -1368,7 +1368,7 @@ cdef class CoordinateOperation(Base): def __init__(self): raise RuntimeError( - "CoordinateOperation can only be intialized like 'CoordinateOperation.from_*()'." + "CoordinateOperation can only be initialized like 'CoordinateOperation.from_*()'." ) @staticmethod ===================================== test/test__main__.py ===================================== @@ -1,18 +1,42 @@ +import contextlib +import os import subprocess +import sys import pytest -def test_main(): - output = subprocess.check_output(["python", "-m", "pyproj"]).decode("utf-8") + at contextlib.contextmanager +def tmp_chdir(new_dir): + """ + This temporarily changes directories when running the tests. + Useful for when testing wheels in the pyproj directory + when pyproj has not been build and prevents conflicts. + """ + curdir = os.getcwd() + try: + os.chdir(new_dir) + yield + finally: + os.chdir(curdir) + + +def test_main(tmpdir): + with tmp_chdir(str(tmpdir)): + output = subprocess.check_output( + [sys.executable, "-m", "pyproj"], stderr=subprocess.STDOUT + ).decode("utf-8") assert "pyproj version:" in output assert "PROJ version:" in output assert "-v, --verbose Show verbose debugging version information." 
in output @pytest.mark.parametrize("option", ["-v", "--verbose"]) -def test_main__verbose(option): - output = subprocess.check_output(["python", "-m", "pyproj", option]).decode("utf-8") +def test_main__verbose(option, tmpdir): + with tmp_chdir(str(tmpdir)): + output = subprocess.check_output( + [sys.executable, "-m", "pyproj", option], stderr=subprocess.STDOUT + ).decode("utf-8") assert "pyproj:" in output assert "PROJ:" in output assert "data dir" in output View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/compare/602b3c90919a89bb08d0e1e85c781082fa76559f...683cef972b79ed9d2a5a63ffbceb6dc9df670791 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/compare/602b3c90919a89bb08d0e1e85c781082fa76559f...683cef972b79ed9d2a5a63ffbceb6dc9df670791 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 20 13:56:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 12:56:08 +0000 Subject: [Git][debian-gis-team/python-pyproj][upstream] New upstream version 2.4~rc1+ds Message-ID: <5d84cc68b18dd_73482ad961746598170219d@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pyproj Commits: fadcc549 by Bas Couwenberg at 2019-09-20T12:48:27Z New upstream version 2.4~rc1+ds - - - - - 3 changed files: - pyproj/__init__.py - pyproj/_crs.pyx - test/test__main__.py Changes: ===================================== pyproj/__init__.py ===================================== @@ -47,7 +47,7 @@ CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. """ -__version__ = "2.4.rc0" +__version__ = "2.4.rc1" __all__ = [ "Proj", "Geod", ===================================== pyproj/_crs.pyx ===================================== @@ -563,7 +563,7 @@ cdef class Ellipsoid(Base): def __init__(self): raise RuntimeError( - "Ellipsoid can only be intialized like 'Ellipsoid.from_*()'." + "Ellipsoid can only be initialized like 'Ellipsoid.from_*()'." ) @staticmethod @@ -777,7 +777,7 @@ cdef class PrimeMeridian(Base): def __init__(self): raise RuntimeError( - "PrimeMeridian can only be intialized like 'PrimeMeridian.from_*()'." + "PrimeMeridian can only be initialized like 'PrimeMeridian.from_*()'." ) @staticmethod @@ -951,7 +951,7 @@ cdef class Datum(Base): def __init__(self): raise RuntimeError( - "Datum can only be intialized like 'Datum.from_*()'." + "Datum can only be initialized like 'Datum.from_*()'." ) @staticmethod @@ -1368,7 +1368,7 @@ cdef class CoordinateOperation(Base): def __init__(self): raise RuntimeError( - "CoordinateOperation can only be intialized like 'CoordinateOperation.from_*()'." + "CoordinateOperation can only be initialized like 'CoordinateOperation.from_*()'." ) @staticmethod ===================================== test/test__main__.py ===================================== @@ -1,18 +1,42 @@ +import contextlib +import os import subprocess +import sys import pytest -def test_main(): - output = subprocess.check_output(["python", "-m", "pyproj"]).decode("utf-8") + at contextlib.contextmanager +def tmp_chdir(new_dir): + """ + This temporarily changes directories when running the tests. + Useful for when testing wheels in the pyproj directory + when pyproj has not been build and prevents conflicts. 
+ """ + curdir = os.getcwd() + try: + os.chdir(new_dir) + yield + finally: + os.chdir(curdir) + + +def test_main(tmpdir): + with tmp_chdir(str(tmpdir)): + output = subprocess.check_output( + [sys.executable, "-m", "pyproj"], stderr=subprocess.STDOUT + ).decode("utf-8") assert "pyproj version:" in output assert "PROJ version:" in output assert "-v, --verbose Show verbose debugging version information." in output @pytest.mark.parametrize("option", ["-v", "--verbose"]) -def test_main__verbose(option): - output = subprocess.check_output(["python", "-m", "pyproj", option]).decode("utf-8") +def test_main__verbose(option, tmpdir): + with tmp_chdir(str(tmpdir)): + output = subprocess.check_output( + [sys.executable, "-m", "pyproj", option], stderr=subprocess.STDOUT + ).decode("utf-8") assert "pyproj:" in output assert "PROJ:" in output assert "data dir" in output View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/fadcc5493f6b942ccd7104517fa65a63502bb2df -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/fadcc5493f6b942ccd7104517fa65a63502bb2df You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 20 13:56:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 12:56:15 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag debian/2.4_rc1+ds-1_exp1 Message-ID: <5d84cc6f3522_73482ad9617465981702386@godard.mail> Bas Couwenberg pushed new tag debian/2.4_rc1+ds-1_exp1 at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/debian/2.4_rc1+ds-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 20 13:56:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 20 Sep 2019 12:56:15 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag upstream/2.4_rc1+ds Message-ID: <5d84cc6fb7807_73483fbbba934fbc1702510@godard.mail> Bas Couwenberg pushed new tag upstream/2.4_rc1+ds at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/upstream/2.4_rc1+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Fri Sep 20 14:05:45 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 20 Sep 2019 13:05:45 +0000 Subject: Processing of python-pyproj_2.4~rc1+ds-1~exp1_source.changes Message-ID: python-pyproj_2.4~rc1+ds-1~exp1_source.changes uploaded successfully to localhost along with the files: python-pyproj_2.4~rc1+ds-1~exp1.dsc python-pyproj_2.4~rc1+ds.orig.tar.xz python-pyproj_2.4~rc1+ds-1~exp1.debian.tar.xz python-pyproj_2.4~rc1+ds-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 20 14:19:16 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 20 Sep 2019 13:19:16 +0000 Subject: python-pyproj_2.4~rc1+ds-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 20 Sep 2019 14:50:23 +0200 Source: python-pyproj Architecture: source Version: 2.4~rc1+ds-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pyproj (2.4~rc1+ds-1~exp1) experimental; urgency=medium . * New upstream release candidate. * Drop patches, applied/fixed upstream. Checksums-Sha1: 76e6911954db4574cf05f195d295f5b930ee87dc 2251 python-pyproj_2.4~rc1+ds-1~exp1.dsc b017d5700380769eea137e1669980101c7c098dd 83776 python-pyproj_2.4~rc1+ds.orig.tar.xz d83903349241e51ddd93c64d6e1fdfd03996a667 6132 python-pyproj_2.4~rc1+ds-1~exp1.debian.tar.xz d33b269bcb494045c8a8e91119185c328f6b8866 8723 python-pyproj_2.4~rc1+ds-1~exp1_amd64.buildinfo Checksums-Sha256: ebb4b27e3a52f4c667620783dc53903f72da7f4b86f0496458e91fdd0d982f47 2251 python-pyproj_2.4~rc1+ds-1~exp1.dsc 4f3ac6647d5ad19ae8e416812b23c025ef7d046c3edc1a60737c2a03be8c3b8b 83776 python-pyproj_2.4~rc1+ds.orig.tar.xz f0dddc590f9d19fb75df07acb11ae79f7f0924cd88449ae418c84f44ca334c3e 6132 python-pyproj_2.4~rc1+ds-1~exp1.debian.tar.xz dc300a8c089dec50397b4a721f59805cebaedd015d9072c09615a05369ba99b2 8723 python-pyproj_2.4~rc1+ds-1~exp1_amd64.buildinfo Files: d693a26eafb4ccb0541353217558c791 2251 python optional python-pyproj_2.4~rc1+ds-1~exp1.dsc 1c4624ccd1a05eb2fc2e7c262e53d5b3 83776 python optional python-pyproj_2.4~rc1+ds.orig.tar.xz cbfee489d487a6e7b061d78809652d0b 6132 python optional python-pyproj_2.4~rc1+ds-1~exp1.debian.tar.xz 98a6aafa1662ee519c6bb68a88ba88f4 8723 python optional python-pyproj_2.4~rc1+ds-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2EzEIACgkQZ1DxCuiN SvGKEg//b+tqaL5Y5WOMd6G5E98x8AVwwIFPxY/qiq9LyKhxaqKvaNeqG3ZPWNGD jxzXa+yJJrKwo9psoWWoIRCuw8egAhIdlDek6OiQ6EcEo3UIoOLOy/ppzYLU1QBt TC/nHxv+EE0s350teGP+VuFPneNKzhsXmaPk9SkUmAFhJRvhZ9RrBpAFza7I9rN1 rpqVbSq4f+8KmOqO03eXaT/Swx20a5xqB7tKk3W2x/x+exDsP0B2acgt888yCeAb CGvOWX5jN+yq5AubNoQ7oXzCAoSMyI5jzJ+yj+Pup7WR6GMvk1saEjeIu5nFLwA+ HhGdB023Jwr2i4Nr7Mroq3x8D1FBgBm3lVmjfNdJvDq1kOQYewlwwJpLg4Vt2Y8k jGiMcNLIJKtmX7kZT+KyeBOmmrOV84tJS84r1VJxfi2X4pqjeiDCI6FceH9TonrK DwWdFjV3tz6BBNcCAcdsaoX3UrYG9tLU0xxi1iRwrS2KJYYqgTWdPmB7937a/pz+ D39O6UpiI41W8DcWN1o268Z9qiLU3cfjOfRSrA/cFhpaJDK1DBBNvTYblro8JIn/ TPioR47QY2SciO9cghLTci8rDYiKzpFBycCzP3rDdsPCzyrDrZ+KNqyvbSzIxbFe rHhu9sEL+Exomy19KSUzRlqFiPLHu0kcMFzWLKZugATTBrBY0ss= =SOso -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
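Editor's note on the python3.patch dropped above: the upstream fix builds the test subprocess command with sys.executable instead of a hard-coded "python" name, so the test__main__.py tests run under whichever interpreter invoked them. A minimal sketch of that pattern, assuming pyproj 2.4 is installed (the exact text printed is version-dependent):

    import subprocess
    import sys

    # Use the interpreter that is currently running this code instead of
    # relying on a particular "python" binary being present on PATH.
    output = subprocess.check_output(
        [sys.executable, "-m", "pyproj"], stderr=subprocess.STDOUT
    ).decode("utf-8")

    # "python -m pyproj" prints version information, e.g. "pyproj version: ..."
    print(output)
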
From noreply at release.debian.org Sat Sep 21 05:39:15 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 21 Sep 2019 04:39:15 +0000 Subject: spatialite-gui 2.1.0~beta0+really2.0.0~devel2-5 MIGRATED to testing Message-ID: FYI: The status of the spatialite-gui source package in Debian's testing distribution has changed. Previous version: 2.1.0~beta0+really2.0.0~devel2-4 Current version: 2.1.0~beta0+really2.0.0~devel2-5 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information.
From noreply at release.debian.org Sat Sep 21 05:39:06 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Sat, 21 Sep 2019 04:39:06 +0000 Subject: pywps is marked for autoremoval from testing Message-ID: pywps 4.2.1-3 is marked for autoremoval from testing on 2019-10-12 It is affected by these RC bugs: 940185: pywps: Debian/copyright needs update From noreply at release.debian.org Sat Sep 21 05:39:20 2019 From: noreply at release.debian.org (Debian testing autoremoval watch) Date: Sat, 21 Sep 2019 04:39:20 +0000 Subject: h5utils is marked for autoremoval from testing Message-ID: h5utils 1.13.1-3 is marked for autoremoval from testing on 2019-10-21 It (build-)depends on packages with these RC bugs: 885212: libmatheval: please migrate to guile-2.2 From gitlab at salsa.debian.org Sat Sep 21 05:39:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 04:39:29 +0000 Subject: [Git][debian-gis-team/python-pyproj][master] 20 commits: New upstream version 2.4~rc0+ds Message-ID: <5d85a981ed770_56b2aeb2f55ea0c119740@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pyproj Commits: 59620342 by Bas Couwenberg at 2019-09-19T05:24:49Z New upstream version 2.4~rc0+ds - - - - - f694b1e7 by Bas Couwenberg at 2019-09-19T05:28:44Z Update branch in gbp.conf & Vcs-Git URL. - - - - - e11333be by Bas Couwenberg at 2019-09-19T05:29:43Z Update watch file for pre-releases. - - - - - 6ddc19b9 by Bas Couwenberg at 2019-09-19T05:29:54Z Merge tag 'upstream/2.4_rc0+ds' into experimental Upstream version 2.4~rc0+ds - - - - - cc85c2c8 by Bas Couwenberg at 2019-09-19T05:30:16Z New upstream release candidate. - - - - - de94942e by Bas Couwenberg at 2019-09-19T06:07:57Z Add patch to use python3 interpreter for tests. - - - - - 1048389b by Bas Couwenberg at 2019-09-19T06:07:57Z Add patch to fix spelling errors. - - - - - 39c5d27c by Bas Couwenberg at 2019-09-19T06:15:10Z Use ${python3:Provides} substvar in Provides field. - - - - - 507d0c56 by Bas Couwenberg at 2019-09-19T06:15:10Z Set distribution to experimental. - - - - - 602b3c90 by Bas Couwenberg at 2019-09-19T12:31:37Z Mark spelling-errors.patch as Applied-Upstream. - - - - - fadcc549 by Bas Couwenberg at 2019-09-20T12:48:27Z New upstream version 2.4~rc1+ds - - - - - f3acd63c by Bas Couwenberg at 2019-09-20T12:48:28Z Update upstream source from tag 'upstream/2.4_rc1+ds' Update to upstream version '2.4~rc1+ds' with Debian dir 1a473cf13ddb2480d395dd0fe371baf123e0c604 - - - - - 7466dce0 by Bas Couwenberg at 2019-09-20T12:49:23Z New upstream release candidate. - - - - - 1086f419 by Bas Couwenberg at 2019-09-20T12:50:18Z Drop patches, applied/fixed upstream. - - - - - 683cef97 by Bas Couwenberg at 2019-09-20T12:50:36Z Set distribution to experimental. - - - - - d0f0ca1e by Bas Couwenberg at 2019-09-21T04:25:55Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit f694b1e72b93b2fa21269727626af9c85737fe8c. - - - - - 8a50040c by Bas Couwenberg at 2019-09-21T04:26:24Z New upstream version 2.4.0+ds - - - - - 92ca16d6 by Bas Couwenberg at 2019-09-21T04:26:25Z Update upstream source from tag 'upstream/2.4.0+ds' Update to upstream version '2.4.0+ds' with Debian dir f4757529e06f5afcd909b81adedd9e6065e477b1 - - - - - a06057fe by Bas Couwenberg at 2019-09-21T04:26:38Z New upstream release. - - - - - 6c5ecb10 by Bas Couwenberg at 2019-09-21T04:27:26Z Set distribution to unstable. 
- - - - - 30 changed files: - .all-contributorsrc - .github/ISSUE_TEMPLATE/bug_report.md - .travis.yml - + HOW_TO_RELEASE.md - MANIFEST.in - README.md - appveyor.yml - ci/travis/proj-dl-and-compile - debian/changelog - debian/control - debian/watch - pyproj/__init__.py - + pyproj/__main__.py - pyproj/_crs.pxd - pyproj/_crs.pyx - pyproj/_datadir.pxd - pyproj/_datadir.pyx - pyproj/_geod.pxd - pyproj/_geod.pyx - pyproj/_list.pyx - pyproj/_proj.pxd - pyproj/_proj.pyx - pyproj/_show_versions.py - pyproj/_transformer.pxd - pyproj/_transformer.pyx - pyproj/base.pxi - pyproj/crs.py - pyproj/datadir.py - pyproj/enums.py - pyproj/geod.py The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/compare/5220dae973d1767f0c171504500ddf279b3876fc...6c5ecb106d0f59fe0562c350d82fb14151494cfd -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/compare/5220dae973d1767f0c171504500ddf279b3876fc...6c5ecb106d0f59fe0562c350d82fb14151494cfd You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From noreply at release.debian.org Sat Sep 21 05:39:14 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sat, 21 Sep 2019 04:39:14 +0000 Subject: saga 7.3.0+dfsg-2 MIGRATED to testing Message-ID: FYI: The status of the saga source package in Debian's testing distribution has changed. Previous version: 7.3.0+dfsg-1 Current version: 7.3.0+dfsg-2 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Sat Sep 21 05:39:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 04:39:34 +0000 Subject: [Git][debian-gis-team/python-pyproj][upstream] New upstream version 2.4.0+ds Message-ID: <5d85a98635e38_56b2aeb2f2d12581201cd@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pyproj Commits: 8a50040c by Bas Couwenberg at 2019-09-21T04:26:24Z New upstream version 2.4.0+ds - - - - - 2 changed files: - pyproj/__init__.py - pyproj/crs.py Changes: ===================================== pyproj/__init__.py ===================================== @@ -47,7 +47,7 @@ CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. """ -__version__ = "2.4.rc1" +__version__ = "2.4.0" __all__ = [ "Proj", "Geod", ===================================== pyproj/crs.py ===================================== @@ -438,7 +438,10 @@ class CRS(_CRS): @classmethod def from_authority(cls, auth_name, code): - """Make a CRS from an authority name and authority code + """ + .. versionadded:: 2.2.0 + + Make a CRS from an authority name and authority code Parameters ---------- @@ -470,7 +473,10 @@ class CRS(_CRS): @classmethod def from_proj4(cls, in_proj_string): - """Make a CRS from a PROJ string + """ + .. versionadded:: 2.2.0 + + Make a CRS from a PROJ string Parameters ---------- @@ -487,7 +493,10 @@ class CRS(_CRS): @classmethod def from_wkt(cls, in_wkt_string): - """Make a CRS from a WKT string + """ + .. 
versionadded:: 2.2.0 + + Make a CRS from a WKT string Parameters ---------- @@ -504,7 +513,8 @@ class CRS(_CRS): @classmethod def from_string(cls, in_crs_string): - """Make a CRS from: + """ + Make a CRS from: Initialize a CRS class instance with: - PROJ string @@ -524,7 +534,10 @@ class CRS(_CRS): return cls(_prepare_from_string(in_crs_string)) def to_string(self): - """Convert the CRS to a string. + """ + .. versionadded:: 2.2.0 + + Convert the CRS to a string. It attempts to convert it to the authority string. Otherwise, it uses the string format of the user @@ -585,7 +598,10 @@ class CRS(_CRS): @classmethod def from_dict(cls, proj_dict): - """Make a CRS from a dictionary of PROJ parameters. + """ + .. versionadded:: 2.2.0 + + Make a CRS from a dictionary of PROJ parameters. Parameters ---------- @@ -601,10 +617,10 @@ class CRS(_CRS): @classmethod def from_json(cls, crs_json): """ - Create CRS from a CRS JSON string. - .. versionadded:: 2.4.0 + Create CRS from a CRS JSON string. + Parameters ---------- crs_json: str @@ -619,10 +635,10 @@ class CRS(_CRS): @classmethod def from_json_dict(cls, crs_dict): """ - Create CRS from a JSON dictionary. - .. versionadded:: 2.4.0 + Create CRS from a JSON dictionary. + Parameters ---------- crs_dict: dict View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/8a50040ce42e961f4415de2987eaf0b36ea659b3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/8a50040ce42e961f4415de2987eaf0b36ea659b3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 21 05:39:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 04:39:35 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag debian/2.4.0+ds-1 Message-ID: <5d85a9875e500_56b2aeb2f55ea0c120357@godard.mail> Bas Couwenberg pushed new tag debian/2.4.0+ds-1 at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/debian/2.4.0+ds-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 21 05:39:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 04:39:32 +0000 Subject: [Git][debian-gis-team/python-pyproj][pristine-tar] pristine-tar data for python-pyproj_2.4.0+ds.orig.tar.xz Message-ID: <5d85a9841ce5d_56b2aeb2fadc7e0119999@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-pyproj Commits: a5cb14b4 by Bas Couwenberg at 2019-09-21T04:26:25Z pristine-tar data for python-pyproj_2.4.0+ds.orig.tar.xz - - - - - 2 changed files: - + python-pyproj_2.4.0+ds.orig.tar.xz.delta - + python-pyproj_2.4.0+ds.orig.tar.xz.id Changes: ===================================== python-pyproj_2.4.0+ds.orig.tar.xz.delta ===================================== Binary files /dev/null and b/python-pyproj_2.4.0+ds.orig.tar.xz.delta differ ===================================== python-pyproj_2.4.0+ds.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +0f285420cdcd036d71e165572830c811951e8793 View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/a5cb14b4a4d62a98c7fdfce3f9cb2deaada810e4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/a5cb14b4a4d62a98c7fdfce3f9cb2deaada810e4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 21 05:39:36 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 04:39:36 +0000 Subject: [Git][debian-gis-team/python-pyproj] Pushed new tag upstream/2.4.0+ds Message-ID: <5d85a98843c96_56b2aeb2fadc7e01205b0@godard.mail> Bas Couwenberg pushed new tag upstream/2.4.0+ds at Debian GIS Project / python-pyproj -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/tree/upstream/2.4.0+ds You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 21 05:47:13 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 21 Sep 2019 04:47:13 +0000 Subject: Processing of python-pyproj_2.4.0+ds-1_source.changes Message-ID: python-pyproj_2.4.0+ds-1_source.changes uploaded successfully to localhost along with the files: python-pyproj_2.4.0+ds-1.dsc python-pyproj_2.4.0+ds.orig.tar.xz python-pyproj_2.4.0+ds-1.debian.tar.xz python-pyproj_2.4.0+ds-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 21 05:49:13 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 21 Sep 2019 04:49:13 +0000 Subject: python-pyproj_2.4.0+ds-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 21 Sep 2019 06:27:00 +0200 Source: python-pyproj Architecture: source Version: 2.4.0+ds-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-pyproj (2.4.0+ds-1) unstable; urgency=medium . * New upstream release. * Move from experimental to unstable. 
Checksums-Sha1: c65ab52a897efc0385cf9e09bbb82c47d09dc578 2201 python-pyproj_2.4.0+ds-1.dsc c53205e083a7206fe50743038f4806fd878e9039 83824 python-pyproj_2.4.0+ds.orig.tar.xz c95a9c6c86e153caa818133ca3701479d1066464 6156 python-pyproj_2.4.0+ds-1.debian.tar.xz 81a1158930837fd63aa02dd80593299bbf823213 8651 python-pyproj_2.4.0+ds-1_amd64.buildinfo Checksums-Sha256: 21cd1dfb2638e37ab16fb7c859f25da54bed9c97386f2b8b71f9a6d879a75112 2201 python-pyproj_2.4.0+ds-1.dsc 6d098135c127149d4492b0697b6e8d25bec41dd2581a84a5924cec47387c24a0 83824 python-pyproj_2.4.0+ds.orig.tar.xz 37bf6607264f224222557d1701b1dc1a1b9d01f21981e4f0762094331de4ee93 6156 python-pyproj_2.4.0+ds-1.debian.tar.xz 5b9b325fbe1f6e7424f0eb592b46c4b817a1a236117ae56d575113a4a5884dc5 8651 python-pyproj_2.4.0+ds-1_amd64.buildinfo Files: 1a8ed55583dbe85ff400e5c9d76129ac 2201 python optional python-pyproj_2.4.0+ds-1.dsc 9adb1a941a7be64acd5ca0eb6133b9ff 83824 python optional python-pyproj_2.4.0+ds.orig.tar.xz 5bc8f130690608637995c8ab4ce8d630 6156 python optional python-pyproj_2.4.0+ds-1.debian.tar.xz 0c17a0ebd30e9aec0d533eabdf40ebe8 8651 python optional python-pyproj_2.4.0+ds-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2FqWkACgkQZ1DxCuiN SvFDsw/8CHTi8w/BQYR4/mrk4ln63VXkeVVci1QiZ9zxX2XCGh1Aktv1N3gGxNPj o5AOBE/p2AKFJuQtsPD/QUAbrJpE6+Gq/dXQprOZMaMvaCXU992+Y5R6/gVnzxOA Kh6De3Of5OKFs4pfU7Ls8hYtemsstZEZ3Nq/tA/bHAj8C2TQfd8Oh9eLEHE3wiM3 AxWHqUh8JTWljqOsFfrY67wc0UHCawSDx6dvQu+LdLZZNJmRt5NjQZI6Cyk9l+Gz FKkv17r+XjV0TNV5gGEAfGxodzscZVeWENeTnHkmXFn1m3YkE1g+ca/mDJD6gT/u KX7vJRb+Kp4+sFN63q/UMmJh09BwN0Z3pVoOFigLatQ2s2zwgt4+RL12r2azkN87 c/6UckdVK8ivSyi/WClP9IPfWfnDf0hwq9BRa9WsD7s8waQ9VWz2r8tVWcgdKwli 4HF95CV87GCEKg5r63UKXuCnzSm95DhAPnThqi+ZlGLx3TwsnPku+nrIqO0D+KdX 8KkIdh9Nh+yvRtAqsZtesikEAFSPPBxX3FGhYk+Iu3k2ybZ6zGvlLMn1pVnJ8WD/ xcwNy79rEIz54Fg0ZQLOXGxctksAFeAHot7HUq7QiAAfzW3kE0W/o89aH6tMQ9gi SGqswwFvd8vOAmBJ985N3ipHARpUEmMLP3GpQnVYWUrD0v/cwLo= =uw6D -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 21 10:42:27 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 21 Sep 2019 09:42:27 +0000 Subject: [Git][debian-gis-team/pyninjotiff][pristine-tar] pristine-tar data for pyninjotiff_0.2.0.orig.tar.gz Message-ID: <5d85f0837a8c_56b2aeb2f2d1258135958@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / pyninjotiff Commits: 2b24779a by Antonio Valentino at 2019-09-21T07:52:09Z pristine-tar data for pyninjotiff_0.2.0.orig.tar.gz - - - - - 2 changed files: - + pyninjotiff_0.2.0.orig.tar.gz.delta - + pyninjotiff_0.2.0.orig.tar.gz.id Changes: ===================================== pyninjotiff_0.2.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/pyninjotiff_0.2.0.orig.tar.gz.delta differ ===================================== pyninjotiff_0.2.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +6eb9c8a7ac2a863a28d783d898a34c7d3a8590b7 View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/commit/2b24779ade75bc80dd4563fbdce57f9eeb1a561c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/commit/2b24779ade75bc80dd4563fbdce57f9eeb1a561c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 21 10:42:51 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 21 Sep 2019 09:42:51 +0000 Subject: [Git][debian-gis-team/pyninjotiff] Pushed new tag upstream/0.2.0 Message-ID: <5d85f09bce7d0_56b2aeb2f2d12581365c2@godard.mail> Antonio Valentino pushed new tag upstream/0.2.0 at Debian GIS Project / pyninjotiff -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/tree/upstream/0.2.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 21 10:42:50 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 21 Sep 2019 09:42:50 +0000 Subject: [Git][debian-gis-team/pyninjotiff][master] 9 commits: New upstream version 0.2.0 Message-ID: <5d85f09ab8dc5_56b2aeb2fadc7e0136335@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyninjotiff Commits: 27575121 by Antonio Valentino at 2019-09-21T07:52:09Z New upstream version 0.2.0 - - - - - 70d5b60a by Antonio Valentino at 2019-09-21T07:52:09Z Update upstream source from tag 'upstream/0.2.0' Update to upstream version '0.2.0' with Debian dir bb04c6a19d6537b0fa1300b1f5a09ff1078fdcac - - - - - c153f87c by Antonio Valentino at 2019-09-21T07:57:34Z New upstream release - - - - - 08340838 by Antonio Valentino at 2019-09-21T08:00:08Z Bump debhelper from old 11 to 12. Fixes lintian: package-uses-old-debhelper-compat-version See https://lintian.debian.org/tags/package-uses-old-debhelper-compat-version.html for more details. - - - - - 3ee2a063 by Antonio Valentino at 2019-09-21T08:00:18Z Remove obsolete fields Name from debian/upstream/metadata. - - - - - 20e9d94f by Antonio Valentino at 2019-09-21T08:11:29Z Update copyright file - - - - - 14a90248 by Antonio Valentino at 2019-09-21T08:22:20Z Drop all patches - - - - - ebb4c7e1 by Antonio Valentino at 2019-09-21T09:30:03Z Enable testing - - - - - 64a19fe9 by Antonio Valentino at 2019-09-21T09:38:42Z Set distribution to unstable - - - - - 22 changed files: - .bumpversion.cfg - + .stickler.yml - + .travis.yml - changelog.rst - debian/changelog - − debian/compat - debian/control - debian/copyright - − debian/patches/0001-Python-3-compatibility.patch - − debian/patches/0002-Disable-pointless-warning.patch - − debian/patches/series - debian/rules - debian/upstream/metadata - pyninjotiff/ninjotiff.py - + pyninjotiff/ninjotiff_config-file_satpy_example.py - pyninjotiff/ninjotiff_satpy_example - + pyninjotiff/rgb_ninjotiff_satpy_example - + pyninjotiff/tests/test_ninjotiff.py - pyninjotiff/tifffile.py - pyninjotiff/version.py - + setup.cfg - setup.py Changes: ===================================== .bumpversion.cfg ===================================== @@ -1,5 +1,5 @@ [bumpversion] -current_version = 0.1.0 +current_version = 0.2.0 commit = True tag = True ===================================== .stickler.yml ===================================== @@ -0,0 +1,7 @@ +linters: + flake8: + python: 3 + fixer: true + max-line-length: 120 +fixers: + enable: true ===================================== .travis.yml ===================================== @@ -0,0 +1,15 @@ +language: python +python: + - "2.7" + - "3.6" + - "3.7" + +install: + - pip install codecov pytest pytest-cov trollimage xarray dask[array] + - pip install -e . 
+ +script: + - pytest --cov=./ + +after_success: + - codecov ===================================== changelog.rst ===================================== @@ -2,6 +2,85 @@ Changelog ========= +v0.2.0 (2019-09-19) +------------------- +- update changelog. [Martin Raspaud] +- Bump version: 0.1.0 → 0.2.0. [Martin Raspaud] +- Merge pull request #18 from mraspaud/fix-user-home-path. [Martin + Raspaud] + + Fix user home path +- Fix travis to improve coverage. [Martin Raspaud] +- Expand the config filename in case ~ is used. [Martin Raspaud] +- Merge pull request #17 from mraspaud/fix-python3-configparser. [Martin + Raspaud] + + Fix python2-only configparser import +- Fix python2-only configparser import. [Martin Raspaud] +- Merge pull request #16 from mraspaud/fix-tests. [Martin Raspaud] + + Fix test dependencies +- Fix area definitions in the tests. [Martin Raspaud] +- Add pyresample to setup dependencies. [Martin Raspaud] +- Add pyproj to setup dependencies. [Martin Raspaud] +- Fix dask array dependencies. [Martin Raspaud] +- Fix test dependencies. [Martin Raspaud] +- Fix .travis.yml file. [Martin Raspaud] +- Merge pull request #14 from pytroll/feature-python3. [Martin Raspaud] + + Support for python3 and unittests +- Fix P test. [Martin Raspaud] +- Add draft test for P mode. [Martin Raspaud] +- Use _FillValue to mask integer arrays. [Martin Raspaud] +- Add trollimage to test dependencies. [Martin Raspaud] +- Add codecov to travis. [Martin Raspaud] +- Fix channel in vis tests. [Martin Raspaud] +- Fix stickler line length. [Martin Raspaud] +- Fixing style errors. [stickler-ci] +- Add tests. [Martin Raspaud] +- Fix scaling bw images. [Martin Raspaud] +- Fix style. [Martin Raspaud] +- Fixing style errors. [stickler-ci] +- Start supporting python3. [Martin Raspaud] +- Merge pull request #13 from pytroll/add-stickler-config. [Martin + Raspaud] + + Adding .stickler.yml configuration file +- Adding .stickler.yml. [stickler-ci] +- Merge pull request #9 from pytroll/develop. [David Hoese] + + Merge the develop branch in to master +- Merge pull request #3 from goodsonr/compatability-python3. [Martin + Raspaud] + + change all occurences of xrange to range for compatability with Python3 +- change all occurences of xrange to range for compatability with + Python3. [ron goodson] +- Add zero seconds option to zero the seconds of the DateID. [Martin + Raspaud] +- Fix package description. [Martin Raspaud] +- Merge pull request #5 from loreclem/master. [David Hoese] + + WIP|PCW: first attempt to make pyninjotiff xarray compatible. +- Merge pull request #2 from vgiuffrida/master. [lorenzo clementi] + + fix not assigned fill_value and a config file loader issue +- fix not assigned fill_value and config file loader. [root] +- fix fill_value and config file loader. [root] +- Merge pull request #1 from vgiuffrida/master. [lorenzo clementi] + + Add new parameter to configure the ninjotiff config file to use +- Add ninjotiff configuration file loading. [root] +- Typos corrected and removed is_masked. [cll] +- Bugfix (is_masked computed twice) [cll] +- WIP Improvements here and there. [cll] +- Using finalize instead of fill_or_alpha. [cll] +- It now can handle also RGB images. [cll] +- WIP: first attempt to make pyninjotiff xarray compatible. For the + moment, only the 'L' case (1 band) has been upgraded. Still to be + verified. [cll] + + v0.1.0 (2017-10-16) ------------------- - update changelog. 
[Martin Raspaud] ===================================== debian/changelog ===================================== @@ -1,10 +1,19 @@ -pyninjotiff (0.1.0-2) UNRELEASED; urgency=medium +pyninjotiff (0.2.0-1) unstable; urgency=medium - * Team upload. + [ Bas Couwenberg ] * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. - -- Bas Couwenberg Sun, 07 Jul 2019 09:40:37 +0200 + [ Antonio Valentino ] + * New upstream release. + * Bump debhelper from old 11 to 12. + * Remove obsolete fields Name from debian/upstream/metadata. + * Update copyright file. + * debian/patches: + - remove all patches (no longer needed) + * Enable testing (add dependencies on pytest, xarray and trollimage). + + -- Antonio Valentino Sat, 21 Sep 2019 09:38:30 +0000 pyninjotiff (0.1.0-1) unstable; urgency=medium ===================================== debian/compat deleted ===================================== @@ -1 +0,0 @@ -11 ===================================== debian/control ===================================== @@ -4,15 +4,19 @@ Uploaders: Antonio Valentino Section: python Priority: optional Testsuite: autopkgtest-pkg-python -Build-Depends: debhelper (>= 11), +Build-Depends: debhelper-compat (= 12), dh-python, python3-all, + python3-dask, python3-matplotlib, python3-numpy, python3-pyproj, python3-pyresample, + python3-pytest, python3-setuptools, - python3-six + python3-six, + python3-trollimage, + python3-xarray Standards-Version: 4.4.0 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyninjotiff Vcs-Git: https://salsa.debian.org/debian-gis-team/pyninjotiff.git ===================================== debian/copyright ===================================== @@ -4,7 +4,7 @@ Upstream-Contact: Martin Raspaud Source: https://github.com/pytroll/pyninjotiff Files: * -Copyright: 2017 Martin Raspaud +Copyright: 2017-2019 Martin Raspaud 2013 Space Science and Engineering Center (SSEC), University of Wisconsin-Madison. Lars Ørum Rasmussen, DMI. 
License: GPL-3+ @@ -14,7 +14,7 @@ Copyright: 2008-2014 Christoph Gohlke License: BSD-3-clause Files: debian/* -Copyright: 2018 Antonio Valentino +Copyright: 2018-2019 Antonio Valentino License: GPL-3+ License: GPL-3+ ===================================== debian/patches/0001-Python-3-compatibility.patch deleted ===================================== @@ -1,61 +0,0 @@ -From: Antonio Valentino -Date: Mon, 31 Dec 2018 17:15:28 +0000 -Subject: Python 3 compatibility - ---- - pyninjotiff/ninjotiff.py | 17 +++++++++++------ - 1 file changed, 11 insertions(+), 6 deletions(-) - -diff --git a/pyninjotiff/ninjotiff.py b/pyninjotiff/ninjotiff.py -index 16b8374..813f672 100644 ---- a/pyninjotiff/ninjotiff.py -+++ b/pyninjotiff/ninjotiff.py -@@ -35,6 +35,8 @@ Edited by Christian Kliche (Ernst Basler + Partner) to replace pylibtiff with - a modified version of tifffile.py (created by Christoph Gohlke) - """ - -+from __future__ import print_function -+ - import calendar - import logging - import os -@@ -176,7 +178,10 @@ class ProductConfigs(object): - return sorted(self._products.keys()) - - def read_config(self): -- from ConfigParser import ConfigParser -+ try: -+ from ConfigParser import ConfigParser -+ except ImportError: -+ from configparser import ConfigParser - - def _eval(val): - try: -@@ -1060,9 +1065,9 @@ if __name__ == '__main__': - try: - filename = args[0] - except IndexError: -- print >> sys.stderr, """usage: python ninjotiff.py [<-p page-number>] [-c] -+ print("""usage: python ninjotiff.py [<-p page-number>] [-c] - -p : print page number (default are all pages). -- -c: print color maps (default is not to print color maps).""" -+ -c: print color maps (default is not to print color maps).""", file=sys.stderr) - sys.exit(2) - - pages = read_tags(filename) -@@ -1070,12 +1075,12 @@ if __name__ == '__main__': - try: - pages = [pages[page_no]] - except IndexError: -- print >>sys.stderr, "Invalid page number '%d'" % page_no -+ print("Invalid page number '%d'" % page_no, file=sys.stderr) - sys.exit(2) - for page in pages: - names = sorted(page.keys()) -- print "" -+ print("") - for name in names: - if not print_color_maps and name == "color_map": - continue -- print name, page[name] -+ print(name, page[name]) ===================================== debian/patches/0002-Disable-pointless-warning.patch deleted ===================================== @@ -1,33 +0,0 @@ -From: Antonio Valentino -Date: Mon, 31 Dec 2018 17:33:47 +0000 -Subject: Disable pointless warning - ---- - pyninjotiff/tifffile.py | 14 +++++++------- - 1 file changed, 7 insertions(+), 7 deletions(-) - -diff --git a/pyninjotiff/tifffile.py b/pyninjotiff/tifffile.py -index 3e0cf23..29ebadb 100644 ---- a/pyninjotiff/tifffile.py -+++ b/pyninjotiff/tifffile.py -@@ -148,13 +148,13 @@ from xml.etree import cElementTree as etree - - import numpy - --try: -- import _tifffile --except ImportError: -- warnings.warn( -- "failed to import the optional _tifffile C extension module.\n" -- "Loading of some compressed images will be slow.\n" -- "Tifffile.c can be obtained at http://www.lfd.uci.edu/~gohlke/") -+# try: -+# import _tifffile -+# except ImportError: -+# warnings.warn( -+# "failed to import the optional _tifffile C extension module.\n" -+# "Loading of some compressed images will be slow.\n" -+# "Tifffile.c can be obtained at http://www.lfd.uci.edu/~gohlke/") - - __version__ = '2014.08.24' - __docformat__ = 'restructuredtext en' ===================================== debian/patches/series deleted ===================================== @@ -1,2 +0,0 @@ 
-0001-Python-3-compatibility.patch -0002-Disable-pointless-warning.patch ===================================== debian/rules ===================================== @@ -5,7 +5,7 @@ #export DH_VERBOSE=1 export PYBUILD_NAME=pyninjotiff -export PYBUILD_DISABLE=test +export PYBUILD_BEFORE_TEST=cp -r {dir}/pyninjotiff/tests {build_dir} %: dh $@ --with python3 --buildsystem=pybuild ===================================== debian/upstream/metadata ===================================== @@ -1,6 +1,4 @@ ---- Bug-Database: https://github.com/pytroll/pyninjotiff/issues Bug-Submit: https://github.com/pytroll/pyninjotiff/issues/new -Name: PyNinjoTiff Repository: https://github.com/pytroll/pyninjotiff.git Repository-Browse: https://github.com/pytroll/pyninjotiff ===================================== pyninjotiff/ninjotiff.py ===================================== @@ -1,39 +1,38 @@ # -*- coding: utf-8 -*- -""" -ninjotiff.py - -Created on Mon Apr 15 13:41:55 2013 - -A big amount of the tiff writer are (PFE) from -https://github.com/davidh-ssec/polar2grid by David Hoese - -License: -Copyright (C) 2013 Space Science and Engineering Center (SSEC), - University of Wisconsin-Madison. - Lars Ørum Rasmussen, DMI. - - This program is free software: you can redistribute it and/or modify - it under the terms of the GNU General Public License as published by - the Free Software Foundation, either version 3 of the License, or - (at your option) any later version. - - This program is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - GNU General Public License for more details. - - You should have received a copy of the GNU General Public License - along with this program. If not, see . - -Original scripts and automation included as part of this package are -distributed under the GNU GENERAL PUBLIC LICENSE agreement version 3. -Binary executable files included as part of this software package are -copyrighted and licensed by their respective organizations, and -distributed consistent with their licensing terms. - -Edited by Christian Kliche (Ernst Basler + Partner) to replace pylibtiff with -a modified version of tifffile.py (created by Christoph Gohlke) -""" +# ninjotiff.py +# +# Created on Mon Apr 15 13:41:55 2013 +# +# A big amount of the tiff writer are (PFE) from +# https://github.com/davidh-ssec/polar2grid by David Hoese +# +# License: +# Copyright (C) 2013 Space Science and Engineering Center (SSEC), +# University of Wisconsin-Madison. +# Lars Ørum Rasmussen, DMI. +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . +# +# Original scripts and automation included as part of this package are +# distributed under the GNU GENERAL PUBLIC LICENSE agreement version 3. 
+# Binary executable files included as part of this software package are +# copyrighted and licensed by their respective organizations, and +# distributed consistent with their licensing terms. +# +# Edited by Christian Kliche (Ernst Basler + Partner) to replace pylibtiff with +# a modified version of tifffile.py (created by Christoph Gohlke) +"""Ninjotiff writing utility.""" import calendar import logging @@ -46,16 +45,16 @@ import numpy as np from pyproj import Proj from pyresample.utils import proj4_radius_parameters -#import mpop.imageo.formats.writer_options as write_opts from pyninjotiff import tifffile log = logging.getLogger(__name__) -#------------------------------------------------------------------------- + +# ------------------------------------------------------------------------- # # Ninjo tiff tags from DWD # -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # Geotiff tags. GTF_ModelPixelScale = 33550 GTF_ModelTiepoint = 33922 @@ -123,12 +122,34 @@ MODEL_PIXEL_SCALE_COUNT = int(os.environ.get( "GEOTIFF_MODEL_PIXEL_SCALE_COUNT", 3)) -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # # Read Ninjo products config file. # -#------------------------------------------------------------------------- -def get_product_config(product_name, force_read=False): +# ------------------------------------------------------------------------- +def get_writer_config(config_fname, prod, single_product_config, scn_metadata): + """Writer_config function for Trollflow_sat: calls the get_product_config function. + + :Parameters: + config_fname: str + Name of the Ninjo product configuration file + + prod: str + Name of Ninjo product. + + single_product_config: dict + config params for the current product + + scn_metadata: dict + Satpy satellite data + """ + ninjo_product = prod + if 'ninjo_product_name' in single_product_config: + ninjo_product = single_product_config['ninjo_product_name'] + return get_product_config(ninjo_product, True, config_fname) + + +def get_product_config(product_name, force_read=False, config_filename=None): """Read Ninjo configuration entry for a given product name. :Parameters: @@ -145,74 +166,95 @@ def get_product_config(product_name, force_read=False): * As an example, see *ninjotiff_products.cfg.template* in MPOP's *etc* directory. 
""" - return ProductConfigs()(product_name, force_read) + return ProductConfigs()(product_name, force_read, config_filename) class _Singleton(type): - def __init__(cls, name_, bases_, dict_): - super(_Singleton, cls).__init__(name_, bases_, dict_) - cls.instance = None + def __init__(self, name_, bases_, dict_): + """Init the singleton.""" + super(_Singleton, self).__init__(name_, bases_, dict_) + self.instance = None - def __call__(cls, *args, **kwargs): - if cls.instance is None: - cls.instance = super(_Singleton, cls).__call__(*args, **kwargs) - return cls.instance + def __call__(self, *args, **kwargs): + """Call the singleton.""" + if self.instance is None: + self.instance = super(_Singleton, self).__call__(*args, **kwargs) + return self.instance class ProductConfigs(object): - __metaclass__ = _Singleton + """Product config.""" + + __metaclass__ = _Singleton # noqa def __init__(self): + """Init the product config.""" self.read_config() - def __call__(self, product_name, force_read=False): + def __call__(self, product_name, force_read=False, config_filename=None): + """Call the product config.""" if force_read: - self.read_config() - return self._products[product_name] + self.read_config(config_filename) + if product_name in self._products: + return self._products[product_name] + else: + return {} @property def product_names(self): + """Get the product names.""" return sorted(self._products.keys()) - def read_config(self): - from ConfigParser import ConfigParser + def read_config(self, config_filename=None): + """Read the ninjo products config file.""" + from six.moves.configparser import RawConfigParser + import ast def _eval(val): try: - return eval(val) - except: + return ast.literal_eval(val) + except (ValueError, SyntaxError): return str(val) - filename = self._find_a_config_file() + if config_filename is not None: + filename = self._find_a_config_file(config_filename) + else: + filename = self._find_a_config_file('ninjotiff_products.cfg') log.info("Reading Ninjo config file: '%s'" % filename) - cfg = ConfigParser() - cfg.read(filename) + cfg = RawConfigParser() products = {} - for sec in cfg.sections(): - prd = {} - for key, val in cfg.items(sec): - prd[key] = _eval(val) - products[sec] = prd + if filename is not None: + cfg.read(filename) + for sec in cfg.sections(): + prd = {} + for key, val in cfg.items(sec): + prd[key] = _eval(val) + products[sec] = prd self._products = products @staticmethod - def _find_a_config_file(): - name_ = 'ninjotiff_products.cfg' - home_ = os.path.dirname(os.path.abspath(__file__)) - penv_ = os.environ.get('PPP_CONFIG_DIR', '') - for fname_ in [os.path.join(x, name_) for x in (home_, penv_)]: - if os.path.isfile(fname_): - return fname_ - raise ValueError("Could not find a Ninjo tiff config file") + def _find_a_config_file(fname): + # if config file (fname) is not found as absolute path: look for the + # config file in the PPP_CONFIG_DIR or current dir + name_ = os.path.abspath(os.path.expanduser(fname)) + if os.path.isfile(name_): + return name_ + else: + home_ = os.path.dirname(os.path.abspath(__file__)) + penv_ = os.environ.get('PPP_CONFIG_DIR', '') + for fname_ in [os.path.join(x, name_) for x in (home_, penv_)]: + if os.path.isfile(fname_): + return fname_ + # raise ValueError("Could not find a Ninjo tiff config file") -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # # Write Ninjo Products # 
-#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- def _get_physic_value(physic_unit): # return Ninjo's physics unit and value. if physic_unit.upper() in ('K', 'KELVIN'): @@ -281,9 +323,11 @@ def _get_satellite_altitude(filename): return None -def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_scaled_01=True): - """Finalize a mpop GeoImage for Ninjo. Specialy take care of phycical scale - and offset. +def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, + data_is_scaled_01=True, fill_value=None): + """Finalize a mpop GeoImage for Ninjo. + + Specialy take care of phycical scale and offset. :Parameters: img : mpop.imageo.img.GeoImage @@ -308,14 +352,34 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc **Notes**: physic_val = image*scale + offset Example values for value_range_measurement_unit are (0, 125) or (40.0, -87.5) + + ***Warning*** + Only the 'L' and 'RGB' cases are compatible with xarray.XRImage. + They still have to be tested thoroughly. """ if img.mode == 'L': # PFE: mpop.satout.cfscene - data = img.channels[0] - fill_value = np.iinfo(dtype).min - log.debug("Transparent pixel are forced to be %d" % fill_value) + if isinstance(img, np.ma.MaskedArray): + data = img.channels[0] + else: + # TODO: check what is the correct fill value for NinJo! + if fill_value is not None: + log.debug("Forcing fill value to %s", fill_value) + # Go back to the masked_array for compatibility + # with the following part of the code. + if (np.issubdtype(img.data[0].dtype, np.integer) + and '_FillValue' in img.data[0].attrs): + nodata_value = img.data[0].attrs['_FillValue'] + data = img.data[0].values + data = np.ma.array(data, mask=(data == nodata_value)) + else: + data = img.data[0].to_masked_array() + + fill_value = fill_value if fill_value is not None else np.iinfo(dtype).min + log.debug("Before scaling: %.2f, %.2f, %.2f" % (data.min(), data.mean(), data.max())) + if np.ma.count_masked(data) == data.size: # All data is masked data = np.ones(data.shape, dtype=dtype) * fill_value @@ -328,26 +392,20 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc # value_range_measurement_unit[0] and 1.0 as # value_range_measurement_unit[1] - # Make room for transparent pixel. - scale_fill_value = ( - (np.iinfo(dtype).max) / (np.iinfo(dtype).max + 1.0)) - img = deepcopy(img) - img.channels[0] *= scale_fill_value - - img.channels[0] += 1 / (np.iinfo(dtype).max + 1.0) + # Make room for the transparent pixel value. + data = data.clip(0, 1) + data *= (np.iinfo(dtype).max - 1) + data += 1 - channels, fill_value = img._finalize(dtype) - data = channels[0] - - scale = ((value_range_measurement_unit[1] - - value_range_measurement_unit[0]) / - (np.iinfo(dtype).max)) + scale = ((value_range_measurement_unit[1] + - value_range_measurement_unit[0]) + / (np.iinfo(dtype).max - 1)) # Handle the case where all data has the same value. 
scale = scale or 1 offset = value_range_measurement_unit[0] mask = data.mask - + data = np.round(data.data).astype(dtype) offset -= scale if fill_value is None: @@ -397,8 +455,17 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, scale, offset, fill_value elif img.mode == 'RGB': - channels, fill_value = img._finalize(dtype) - if fill_value is None: + if isinstance(img, np.ma.MaskedArray): + channels, fill_value = img._finalize(dtype) + else: + data, mode = img.finalize(fill_value=fill_value, dtype=dtype) + # Go back to the masked_array for compatibility with + # the rest of the code. + channels = data.to_masked_array() + # Is this fill_value ok or what should it be? + fill_value = (0, 0, 0, 0) + + if isinstance(img, np.ma.MaskedArray) and fill_value is None: mask = (np.ma.getmaskarray(channels[0]) & np.ma.getmaskarray(channels[1]) & np.ma.getmaskarray(channels[2])) @@ -411,6 +478,8 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, 1.0, 0.0, fill_value[0] elif img.mode == 'RGBA': + if not isinstance(img, np.ma.MaskedArray): + raise NotImplementedError("The 'RGBA' case has not been updated to xarray") channels, fill_value = img._finalize(dtype) fill_value = fill_value or (0, 0, 0, 0) data = np.dstack((channels[0].filled(fill_value[0]), @@ -420,6 +489,8 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, 1.0, 0.0, fill_value[0] elif img.mode == 'P': + if not isinstance(img, np.ma.MaskedArray): + raise NotImplementedError("The 'P' case has not been updated to xarray") fill_value = 0 data = img.channels[0] if isinstance(data, np.ma.core.MaskedArray): @@ -430,11 +501,11 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, 1.0, 0.0, fill_value else: - raise ValueError("Don't known how til handle image mode '%s'" % + raise ValueError("Don't know how to handle image mode '%s'" % str(img.mode)) -def save(img, filename, ninjo_product_name=None, writer_options=None, +def save(img, filename, ninjo_product_name=None, writer_options=None, data_is_scaled_01=True, **kwargs): """Ninjo TIFF writer. @@ -460,7 +531,6 @@ def save(img, filename, ninjo_product_name=None, writer_options=None, * min value will be reserved for transparent color. * If possible mpop.imageo.image's standard finalize will be used. 
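To make the scale and offset bookkeeping concrete, here is a small worked example for the default 8-bit depth with a (0, 120) measurement range (the values used in the tests further down); it mirrors the arithmetic of the 'L' branch in _finalize above, with grey count 0 reserved for transparency, and the numbers are illustrative only:

    import numpy as np

    vmin, vmax = 0.0, 120.0                 # ch_min/ch_max_measurement_unit
    dmax = np.iinfo(np.uint8).max           # 255: grey counts run 1..255, 0 is transparent

    x = 0.5                                 # a pixel already scaled to [0, 1]
    count = int(round(x * (dmax - 1) + 1))  # 128
    gradient = (vmax - vmin) / (dmax - 1)   # ~0.4724 physical units per count
    offset = vmin - gradient                # shifts count 1 back onto vmin

    # physic_val = image * scale + offset, as stated in the _finalize docstring
    physic_val = count * gradient + offset  # 60.0
    assert abs(physic_val - 60.0) < 1e-9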
""" - if writer_options: # add writer_options kwargs.update(writer_options) @@ -473,21 +543,31 @@ def save(img, filename, ninjo_product_name=None, writer_options=None, if nbits == 16: dtype = np.uint16 + fill_value = None + if 'fill_value' in kwargs and kwargs['fill_value'] is not None: + fill_value = int(kwargs['fill_value']) + try: value_range_measurement_unit = (float(kwargs["ch_min_measurement_unit"]), float(kwargs["ch_max_measurement_unit"])) except KeyError: value_range_measurement_unit = None - data_is_scaled_01 = bool(kwargs.get("data_is_scaled_01", True)) + # In case we are working on a trollimage.xrimage.XRImage, + # a conversion to the previously used masked_array is needed data, scale, offset, fill_value = _finalize(img, dtype=dtype, data_is_scaled_01=data_is_scaled_01, - value_range_measurement_unit=value_range_measurement_unit,) + value_range_measurement_unit=value_range_measurement_unit, + fill_value=fill_value,) - area_def = img.info['area'] - time_slot = img.info['start_time'] + if isinstance(img, np.ma.MaskedArray): + area_def = img.info['area'] + time_slot = img.info['start_time'] + else: + area_def = img.data.area + time_slot = img.data.start_time # Some Ninjo tiff names kwargs['gradient'] = scale @@ -505,14 +585,13 @@ def save(img, filename, ninjo_product_name=None, writer_options=None, g += [0] * (256 - len(g)) b += [0] * (256 - len(b)) kwargs['cmap'] = r, g, b - write(data, filename, area_def, ninjo_product_name, **kwargs) def write(image_data, output_fn, area_def, product_name=None, **kwargs): - """Generic Ninjo TIFF writer. + """Write a Generic Ninjo TIFF. - If 'prodcut_name' is given, it will load corresponding Ninjo tiff metadata + If 'product_name' is given, it will load corresponding Ninjo tiff metadata from '${PPP_CONFIG_DIR}/ninjotiff.cfg'. Else, all Ninjo tiff metadata should be passed by '**kwargs'. A mixture is allowed, where passed arguments overwrite config file. 
@@ -531,9 +610,7 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): kwargs : dict See _write """ - - - proj = Proj(area_def.proj_dict) + proj = Proj(area_def.proj_dict) upper_left = proj( area_def.area_extent[0], area_def.area_extent[3], @@ -541,8 +618,8 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): lower_right = proj( area_def.area_extent[2], area_def.area_extent[1], - inverse=True) - + inverse=True) + if len(image_data.shape) == 3: if image_data.shape[2] == 4: shape = (area_def.y_size, area_def.x_size, 4) @@ -585,7 +662,11 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): kwargs['altitude'] = altitude if product_name: - options = deepcopy(get_product_config(product_name)) + # If ninjo_product_file in kwargs, load ninjo_product_file as config file + if 'ninjo_product_file' in kwargs: + options = deepcopy(get_product_config(product_name, True, kwargs['ninjo_product_file'])) + else: + options = deepcopy(get_product_config(product_name)) else: options = {} @@ -601,9 +682,8 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): options['ref_lat2'] = 0 if 'lon_0' in area_def.proj_dict: options['central_meridian'] = area_def.proj_dict['lon_0'] - - a,b = proj4_radius_parameters(area_def.proj_dict) + a, b = proj4_radius_parameters(area_def.proj_dict) options['radius_a'] = a options['radius_b'] = b options['origin_lon'] = upper_left[0] @@ -621,11 +701,12 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): # # ----------------------------------------------------------------------------- def _write(image_data, output_fn, write_rgb=False, **kwargs): - """Proudly Found Elsewhere (PFE) https://github.com/davidh-ssec/polar2grid + """Create a NinJo compatible TIFF file. + + Proudly Found Elsewhere (PFE) https://github.com/davidh-ssec/polar2grid by David Hoese. - Create a NinJo compatible TIFF file with the tags used - by the DWD's version of NinJo. Also stores the image as tiles on disk + Also stores the image as tiles on disk and creates a multi-resolution/pyramid/overview set of images (deresolution: 2,4,8,16). @@ -769,6 +850,7 @@ def _write(image_data, output_fn, write_rgb=False, **kwargs): origin_lat = float(kwargs.pop("origin_lat")) origin_lon = float(kwargs.pop("origin_lon")) image_dt = kwargs.pop("image_dt") + zero_seconds = kwargs.pop("zero_seconds", False) projection = str(kwargs.pop("projection")) meridian_west = float(kwargs.pop("meridian_west", 0.0)) meridian_east = float(kwargs.pop("meridian_east", 0.0)) @@ -842,7 +924,13 @@ def _write(image_data, output_fn, write_rgb=False, **kwargs): file_dt = datetime.utcnow() file_epoch = calendar.timegm(file_dt.timetuple()) - image_epoch = calendar.timegm(image_dt.timetuple()) + if zero_seconds: + log.debug("Applying zero seconds correction") + image_dt_corr = datetime(image_dt.year, image_dt.month, image_dt.day, + image_dt.hour, image_dt.minute) + else: + image_dt_corr = image_dt + image_epoch = calendar.timegm(image_dt_corr.timetuple()) compression = _eval_or_default("compression", int, 6) @@ -1045,6 +1133,7 @@ def read_tags(filename): pages.append(tags) return pages + if __name__ == '__main__': import sys import getopt @@ -1060,9 +1149,9 @@ if __name__ == '__main__': try: filename = args[0] except IndexError: - print >> sys.stderr, """usage: python ninjotiff.py [<-p page-number>] [-c] + print("""usage: python ninjotiff.py [<-p page-number>] [-c] -p : print page number (default are all pages). 
- -c: print color maps (default is not to print color maps).""" + -c: print color maps (default is not to print color maps).""", sys.stderr) sys.exit(2) pages = read_tags(filename) @@ -1070,12 +1159,12 @@ if __name__ == '__main__': try: pages = [pages[page_no]] except IndexError: - print >>sys.stderr, "Invalid page number '%d'" % page_no + print("Invalid page number '%d'" % page_no, sys.stderr) sys.exit(2) for page in pages: names = sorted(page.keys()) - print "" + print("") for name in names: if not print_color_maps and name == "color_map": continue - print name, page[name] + print(name, page[name]) ===================================== pyninjotiff/ninjotiff_config-file_satpy_example.py ===================================== @@ -0,0 +1,24 @@ +import os +from satpy import Scene +from datetime import datetime +from satpy.utils import debug_on +import pyninjotiff +from glob import glob +from pyresample.utils import load_area +import copy +debug_on() + + +chn = "IR_108" +ninjoRegion = load_area("areas.def", "nrEURO3km") + +filenames = glob("data/*__") +global_scene = Scene(reader="hrit_msg", filenames=filenames) +global_scene.load([chn]) +local_scene = global_scene.resample(ninjoRegion) +local_scene.save_dataset(chn, filename="msg.tif", writer='ninjotiff', + # ninjo product name to look for in .cfg file + ninjo_product_name="IR_108", + # custom configuration file for ninjo tiff products + # if not specified PPP_CONFIG_DIR is used as config file directory + ninjo_product_file="/config_dir/ninjotiff_products.cfg") ===================================== pyninjotiff/ninjotiff_satpy_example ===================================== @@ -3,19 +3,23 @@ from satpy import Scene from datetime import datetime from satpy.utils import debug_on import pyninjotiff +from glob import glob +from pyresample.utils import load_area +import copy debug_on() -chn=10.8 -time_slot = datetime(2017, 1, 27, 7, 45) -global_data = Scene(platform_name="Meteosat-10", sensor="seviri", reader="hrit_msg", start_time=time_slot) +chn = "IR_108" +ninjoRegion = load_area("areas.def", "nrEURO3km") -global_data.load([chn]) -local_scene = global_data.resample("NinJoRegion") +filenames = glob("data/*__") +global_scene = Scene(reader="hrit_msg", filenames=filenames) +global_scene.load([chn]) +local_scene = global_scene.resample(ninjoRegion) local_scene.save_dataset(chn, filename="msg.tif", writer='ninjotiff', - sat_id=1234, - chan_id=5678, + sat_id=6300014, + chan_id=900015, data_cat='GORN', - data_source='EUMETSAT/MeteoSwiss', + data_source='EUMCAST', physic_unit='K', nbits=8) ===================================== pyninjotiff/rgb_ninjotiff_satpy_example ===================================== @@ -0,0 +1,23 @@ +import os +from satpy import Scene +from datetime import datetime +from satpy.utils import debug_on +import pyninjotiff +from glob import glob +from pyresample.utils import load_area +debug_on() + + +chn = "airmass" +ninjoRegion = load_area("areas.def", "nrEURO3km") + +filenames = glob("data/*__") +global_scene = Scene(reader="hrit_msg", filenames=filenames) +global_scene.load([chn]) +local_scene = global_scene.resample(ninjoRegion) +local_scene.save_dataset(chn, filename="airmass.tif", writer='ninjotiff', + sat_id=6300014, + chan_id=6500015, + data_cat='GPRN', + data_source='EUMCAST', + nbits=8) ===================================== pyninjotiff/tests/test_ninjotiff.py ===================================== @@ -0,0 +1,456 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- + +# Copyright (c) 2019 Martin Raspaud + +# Author(s): + +# Martin 
Raspaud + +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +"""Test the ninjotiff writing.""" + +import numpy as np +import datetime +import tempfile +import xarray as xr +import dask.array as da +import colorsys +import pytest + +TIME = datetime.datetime.utcnow() +DELETE_FILES = True + + +class FakeImage(object): + """Fake Image object for testing purposes.""" + + def __init__(self, data): + """Initialize the image.""" + self.mode = ''.join(data.bands.values) + self.data = data + + def finalize(self, fill_value=None, dtype=None): + if dtype is None: + dtype = np.uint8 + if np.issubdtype(self.data.dtype, np.floating) and np.issubdtype(dtype, np.integer): + res = self.data.clip(0, 1) * np.iinfo(dtype).max + res = res.astype(dtype) + else: + res = self.data + return [res.astype(dtype)] + + +class FakeArea(object): + def __init__(self, proj_dict, extent, y_size, x_size): + self.proj_dict = proj_dict + self.area_extent = extent + self.x_size, self.y_size = x_size, y_size + self.pixel_size_x = (extent[2] - extent[0]) / x_size + self.pixel_size_y = (extent[3] - extent[1]) / y_size + + +def test_write_bw(): + """Test saving a BW image.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 0.0 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', '%'), + ('name', '1'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=5)), + ('end_time', TIME), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([0]), + 'ch_max_measurement_unit': np.array([120]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 100015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': '%', 'nbits': 8} + + data = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 1024).reshape((1, 1024, 1024)) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([256, 22016, 43520, 65280]))) + + +def test_write_bw_inverted_ir(): + """Test saving a BW image.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, 
-4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 70.0 / 120 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', 'K'), + ('name', '4'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=15)), + ('end_time', TIME - datetime.timedelta(minutes=10)), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([-70]), + 'ch_max_measurement_unit': np.array([50]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 900015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': 'C', 'nbits': 8} + + data = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 1024).reshape((1, 1024, 1024)) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([65024, 43264, 21760, 0]))) + + +def test_write_bw_fill(): + """Test saving a BW image with transparency.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 0.0 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', '%'), + ('name', '1'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=25)), + ('end_time', TIME - datetime.timedelta(minutes=20)), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([0]), + 'ch_max_measurement_unit': np.array([120]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 100015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': '%', 'nbits': 8} + + data1 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 256).reshape((1, 256, 1024)) + datanan = da.ones((1, 256, 1024), chunks=1024) * np.nan + data2 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 512).reshape((1, 512, 1024)) + data = da.concatenate((data1, datanan, data2), axis=1) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([256, 22016, 43520, 65280]))) + + +def test_write_bw_inverted_ir_fill(): + """Test saving a BW image with transparency.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 
'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 70.0 / 120 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', 'K'), + ('name', '4'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=35)), + ('end_time', TIME - datetime.timedelta(minutes=30)), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([-70]), + 'ch_max_measurement_unit': np.array([50]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 900015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': 'C', 'nbits': 8} + + data1 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 256).reshape((1, 256, 1024)) + datanan = da.ones((1, 256, 1024), chunks=1024) * np.nan + data2 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 512).reshape((1, 512, 1024)) + data = da.concatenate((data1, datanan, data2), axis=1) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([65024, 43264, 21760, 0]))) + + +def test_write_rgb(): + """Test saving a non-trasparent RGB.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + + x_size, y_size = 1024, 1024 + arr = np.zeros((3, y_size, x_size)) + radius = min(x_size, y_size) / 2.0 + centre = x_size / 2, y_size / 2 + + for x in range(x_size): + for y in range(y_size): + rx = x - centre[0] + ry = y - centre[1] + s = ((x - centre[0])**2.0 + (y - centre[1])**2.0)**0.5 / radius + if s <= 1.0: + h = ((np.arctan2(ry, rx) / np.pi) + 1.0) / 2.0 + rgb = colorsys.hsv_to_rgb(h, s, 1.0) + arr[:, y, x] = np.array(rgb) + + attrs = dict([('platform_name', 'NOAA-18'), + ('resolution', 1050), + ('polarization', None), + ('level', None), + ('sensor', 'avhrr-3'), + ('ancillary_variables', []), + ('area', area), + ('start_time', TIME - datetime.timedelta(minutes=45)), + ('end_time', TIME - datetime.timedelta(minutes=40)), + ('wavelength', None), + ('optional_datasets', []), + ('standard_name', 'overview'), + ('name', 'overview'), + ('prerequisites', [0.6, 0.8, 10.8]), + ('optional_prerequisites', []), + ('calibration', None), + ('modifiers', None), + ('mode', 'RGB'), + ('enhancement_history', [{'scale': np.array([1, 1, -1]), 'offset': np.array([0, 0, 1])}, + {'scale': np.array([0.0266347, 0.03559078, 0.01329783]), + 'offset': np.array([-0.02524969, -0.01996642, 3.8918446])}, + {'gamma': 1.6}])]) + + kwargs = {'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 6500015, 'data_cat': 'PPRN', 'data_source': 'SMHI', 'nbits': 8} + data = da.from_array(arr.clip(0, 1), chunks=1024) + data = xr.DataArray(data, coords={'bands': ['R', 'G', 'B']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + + from trollimage.xrimage import XRImage + img = XRImage(data) + + 
with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=False, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + for idx in range(3): + np.testing.assert_allclose(res[:, :, idx], np.round( + arr[idx, :, :] * 255).astype(np.uint8)) + + +def test_write_rgb_with_a(): + """Test saving a transparent RGB.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + + x_size, y_size = 1024, 1024 + arr = np.zeros((3, y_size, x_size)) + radius = min(x_size, y_size) / 2.0 + centre = x_size / 2, y_size / 2 + + for x in range(x_size): + for y in range(y_size): + rx = x - centre[0] + ry = y - centre[1] + s = ((x - centre[0])**2.0 + (y - centre[1])**2.0)**0.5 / radius + if s <= 1.0: + h = ((np.arctan2(ry, rx) / np.pi) + 1.0) / 2.0 + rgb = colorsys.hsv_to_rgb(h, s, 1.0) + arr[:, y, x] = np.array(rgb) + else: + arr[:, y, x] = np.nan + + attrs = dict([('platform_name', 'NOAA-18'), + ('resolution', 1050), + ('polarization', None), + ('start_time', TIME - datetime.timedelta(minutes=55)), + ('end_time', TIME - datetime.timedelta(minutes=50)), + ('level', None), + ('sensor', 'avhrr-3'), + ('ancillary_variables', []), + ('area', area), + ('wavelength', None), + ('optional_datasets', []), + ('standard_name', 'overview'), + ('name', 'overview'), + ('prerequisites', [0.6, 0.8, 10.8]), + ('optional_prerequisites', []), + ('calibration', None), + ('modifiers', None), + ('mode', 'RGB'), + ('enhancement_history', [{'scale': np.array([1, 1, -1]), 'offset': np.array([0, 0, 1])}, + {'scale': np.array([0.0266347, 0.03559078, 0.01329783]), + 'offset': np.array([-0.02524969, -0.01996642, 3.8918446])}, + {'gamma': 1.6}])]) + + kwargs = {'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 6500015, 'data_cat': 'PPRN', 'data_source': 'SMHI', 'nbits': 8} + data = da.from_array(arr.clip(0, 1), chunks=1024) + + data = xr.DataArray(data, coords={'bands': ['R', 'G', 'B']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + from trollimage.xrimage import XRImage + img = XRImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + for idx in range(3): + np.testing.assert_allclose(res[:, :, idx], np.round( + np.nan_to_num(arr[idx, :, :]) * 255).astype(np.uint8)) + np.testing.assert_allclose(res[:, :, 3] == 0, np.isnan(arr[0, :, :])) + + + at pytest.mark.skip(reason="this is no implemented yet.") +def test_write_rgb_classified(): + """Test saving a transparent RGB.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + + x_size, y_size = 1024, 1024 + arr = np.zeros((3, y_size, x_size)) + + attrs = dict([('platform_name', 'NOAA-18'), + ('resolution', 1050), + ('polarization', None), + ('start_time', TIME - datetime.timedelta(minutes=55)), + ('end_time', TIME - datetime.timedelta(minutes=50)), + ('level', None), + ('sensor', 'avhrr-3'), + ('ancillary_variables', []), + ('area', area), + ('wavelength', None), + 
('optional_datasets', []), + ('standard_name', 'overview'), + ('name', 'overview'), + ('prerequisites', [0.6, 0.8, 10.8]), + ('optional_prerequisites', []), + ('calibration', None), + ('modifiers', None), + ('mode', 'P')]) + + kwargs = {'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 1700015, 'data_cat': 'PPRN', 'data_source': 'SMHI', 'nbits': 8} + + data1 = da.tile(da.repeat(da.arange(4, chunks=1024), 256), 256).reshape((1, 256, 1024)) + datanan = da.ones((1, 256, 1024), chunks=1024) * 4 + data2 = da.tile(da.repeat(da.arange(4, chunks=1024), 256), 512).reshape((1, 512, 1024)) + data = da.concatenate((data1, datanan, data2), axis=1) + data = xr.DataArray(data, coords={'bands': ['P']}, dims=['bands', 'y', 'x'], attrs=attrs) + + from trollimage.xrimage import XRImage + img = XRImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + for idx in range(3): + np.testing.assert_allclose(res[:, :, idx], np.round( + np.nan_to_num(arr[idx, :, :]) * 255).astype(np.uint8)) + np.testing.assert_allclose(res[:, :, 3] == 0, np.isnan(arr[0, :, :])) ===================================== pyninjotiff/tifffile.py ===================================== @@ -337,7 +337,7 @@ class TiffWriter(object): def save(self, data, photometric=None, planarconfig=None, resolution=None, description=None, volume=False, writeshape=False, compress=0, - colormap=None, extrasamples_type=1, tile_width=None, + colormap=None, extrasamples_type=1, tile_width=None, tile_length=None, extratags=()): """Write image data to TIFF file. @@ -711,8 +711,8 @@ class TiffWriter(object): # reset and use compress sizes strip_byte_counts = [] for plane in data[pageindex]: - for ty in xrange(0, tiles_y): - for tx in xrange(0, tiles_x): + for ty in range(0, tiles_y): + for tx in range(0, tiles_x): # allocate fixed size tile filled with zeros tile = numpy.zeros((tile_width * tile_length, shape[-1]), data.dtype) @@ -724,7 +724,7 @@ class TiffWriter(object): itw = min(tile_width, shape[3] - tx*tile_width) ioffs = tx*tile_width - for tl in xrange(0, itl): + for tl in range(0, itl): # copy data to tile line ir = ty*tile_length+tl tile[tl*tile_width:tl*tile_width+itw] \ @@ -4989,4 +4989,3 @@ if sys.version_info[0] > 2: if __name__ == "__main__": sys.exit(main()) - ===================================== pyninjotiff/version.py ===================================== @@ -22,4 +22,4 @@ """Version file.""" -__version__ = "v0.1.0" +__version__ = "v0.2.0" ===================================== setup.cfg ===================================== @@ -0,0 +1,2 @@ +[flake8] +max-line-length = 120 ===================================== setup.py ===================================== @@ -31,7 +31,7 @@ version = imp.load_source('pyninjotiff.version', 'pyninjotiff/version.py') setup(name="pyninjotiff", version=version.__version__, - description='Pytroll imaging library', + description='Python Ninjo TIFF writing library', author='Martin Raspaud', author_email='martin.raspaud at smhi.se', classifiers=["Development Status :: 5 - Production/Stable", @@ -44,6 +44,6 @@ setup(name="pyninjotiff", url="https://github.com/pytroll/pyninjotiff", packages=['pyninjotiff'], zip_safe=False, - install_requires=['numpy >=1.6', 'six'], + install_requires=['numpy >=1.6', 'six', 'pyproj', 'pyresample'], # test_suite='pyninjotiff.tests.suite', ) View it on GitLab: 
https://salsa.debian.org/debian-gis-team/pyninjotiff/compare/8e6d804c56c1ac3abf37a8c08cd47b5739430f24...64a19fe966789a6558e1fd54c56b727f154c3b82 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/compare/8e6d804c56c1ac3abf37a8c08cd47b5739430f24...64a19fe966789a6558e1fd54c56b727f154c3b82 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 21 10:43:08 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sat, 21 Sep 2019 09:43:08 +0000 Subject: [Git][debian-gis-team/pyninjotiff][upstream] New upstream version 0.2.0 Message-ID: <5d85f0ac52893_56b2aeb2b8df2c0136663@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / pyninjotiff Commits: 27575121 by Antonio Valentino at 2019-09-21T07:52:09Z New upstream version 0.2.0 - - - - - 13 changed files: - .bumpversion.cfg - + .stickler.yml - + .travis.yml - changelog.rst - pyninjotiff/ninjotiff.py - + pyninjotiff/ninjotiff_config-file_satpy_example.py - pyninjotiff/ninjotiff_satpy_example - + pyninjotiff/rgb_ninjotiff_satpy_example - + pyninjotiff/tests/test_ninjotiff.py - pyninjotiff/tifffile.py - pyninjotiff/version.py - + setup.cfg - setup.py Changes: ===================================== .bumpversion.cfg ===================================== @@ -1,5 +1,5 @@ [bumpversion] -current_version = 0.1.0 +current_version = 0.2.0 commit = True tag = True ===================================== .stickler.yml ===================================== @@ -0,0 +1,7 @@ +linters: + flake8: + python: 3 + fixer: true + max-line-length: 120 +fixers: + enable: true ===================================== .travis.yml ===================================== @@ -0,0 +1,15 @@ +language: python +python: + - "2.7" + - "3.6" + - "3.7" + +install: + - pip install codecov pytest pytest-cov trollimage xarray dask[array] + - pip install -e . + +script: + - pytest --cov=./ + +after_success: + - codecov ===================================== changelog.rst ===================================== @@ -2,6 +2,85 @@ Changelog ========= +v0.2.0 (2019-09-19) +------------------- +- update changelog. [Martin Raspaud] +- Bump version: 0.1.0 → 0.2.0. [Martin Raspaud] +- Merge pull request #18 from mraspaud/fix-user-home-path. [Martin + Raspaud] + + Fix user home path +- Fix travis to improve coverage. [Martin Raspaud] +- Expand the config filename in case ~ is used. [Martin Raspaud] +- Merge pull request #17 from mraspaud/fix-python3-configparser. [Martin + Raspaud] + + Fix python2-only configparser import +- Fix python2-only configparser import. [Martin Raspaud] +- Merge pull request #16 from mraspaud/fix-tests. [Martin Raspaud] + + Fix test dependencies +- Fix area definitions in the tests. [Martin Raspaud] +- Add pyresample to setup dependencies. [Martin Raspaud] +- Add pyproj to setup dependencies. [Martin Raspaud] +- Fix dask array dependencies. [Martin Raspaud] +- Fix test dependencies. [Martin Raspaud] +- Fix .travis.yml file. [Martin Raspaud] +- Merge pull request #14 from pytroll/feature-python3. [Martin Raspaud] + + Support for python3 and unittests +- Fix P test. [Martin Raspaud] +- Add draft test for P mode. [Martin Raspaud] +- Use _FillValue to mask integer arrays. [Martin Raspaud] +- Add trollimage to test dependencies. [Martin Raspaud] +- Add codecov to travis. [Martin Raspaud] +- Fix channel in vis tests. [Martin Raspaud] +- Fix stickler line length. 
[Martin Raspaud] +- Fixing style errors. [stickler-ci] +- Add tests. [Martin Raspaud] +- Fix scaling bw images. [Martin Raspaud] +- Fix style. [Martin Raspaud] +- Fixing style errors. [stickler-ci] +- Start supporting python3. [Martin Raspaud] +- Merge pull request #13 from pytroll/add-stickler-config. [Martin + Raspaud] + + Adding .stickler.yml configuration file +- Adding .stickler.yml. [stickler-ci] +- Merge pull request #9 from pytroll/develop. [David Hoese] + + Merge the develop branch in to master +- Merge pull request #3 from goodsonr/compatability-python3. [Martin + Raspaud] + + change all occurences of xrange to range for compatability with Python3 +- change all occurences of xrange to range for compatability with + Python3. [ron goodson] +- Add zero seconds option to zero the seconds of the DateID. [Martin + Raspaud] +- Fix package description. [Martin Raspaud] +- Merge pull request #5 from loreclem/master. [David Hoese] + + WIP|PCW: first attempt to make pyninjotiff xarray compatible. +- Merge pull request #2 from vgiuffrida/master. [lorenzo clementi] + + fix not assigned fill_value and a config file loader issue +- fix not assigned fill_value and config file loader. [root] +- fix fill_value and config file loader. [root] +- Merge pull request #1 from vgiuffrida/master. [lorenzo clementi] + + Add new parameter to configure the ninjotiff config file to use +- Add ninjotiff configuration file loading. [root] +- Typos corrected and removed is_masked. [cll] +- Bugfix (is_masked computed twice) [cll] +- WIP Improvements here and there. [cll] +- Using finalize instead of fill_or_alpha. [cll] +- It now can handle also RGB images. [cll] +- WIP: first attempt to make pyninjotiff xarray compatible. For the + moment, only the 'L' case (1 band) has been upgraded. Still to be + verified. [cll] + + v0.1.0 (2017-10-16) ------------------- - update changelog. [Martin Raspaud] ===================================== pyninjotiff/ninjotiff.py ===================================== @@ -1,39 +1,38 @@ # -*- coding: utf-8 -*- -""" -ninjotiff.py - -Created on Mon Apr 15 13:41:55 2013 - -A big amount of the tiff writer are (PFE) from -https://github.com/davidh-ssec/polar2grid by David Hoese - -License: -Copyright (C) 2013 Space Science and Engineering Center (SSEC), - University of Wisconsin-Madison. - Lars Ørum Rasmussen, DMI. - - This program is free software: you can redistribute it and/or modify - it under the terms of the GNU General Public License as published by - the Free Software Foundation, either version 3 of the License, or - (at your option) any later version. - - This program is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - GNU General Public License for more details. - - You should have received a copy of the GNU General Public License - along with this program. If not, see . - -Original scripts and automation included as part of this package are -distributed under the GNU GENERAL PUBLIC LICENSE agreement version 3. -Binary executable files included as part of this software package are -copyrighted and licensed by their respective organizations, and -distributed consistent with their licensing terms. 
- -Edited by Christian Kliche (Ernst Basler + Partner) to replace pylibtiff with -a modified version of tifffile.py (created by Christoph Gohlke) -""" +# ninjotiff.py +# +# Created on Mon Apr 15 13:41:55 2013 +# +# A big amount of the tiff writer are (PFE) from +# https://github.com/davidh-ssec/polar2grid by David Hoese +# +# License: +# Copyright (C) 2013 Space Science and Engineering Center (SSEC), +# University of Wisconsin-Madison. +# Lars Ørum Rasmussen, DMI. +# +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . +# +# Original scripts and automation included as part of this package are +# distributed under the GNU GENERAL PUBLIC LICENSE agreement version 3. +# Binary executable files included as part of this software package are +# copyrighted and licensed by their respective organizations, and +# distributed consistent with their licensing terms. +# +# Edited by Christian Kliche (Ernst Basler + Partner) to replace pylibtiff with +# a modified version of tifffile.py (created by Christoph Gohlke) +"""Ninjotiff writing utility.""" import calendar import logging @@ -46,16 +45,16 @@ import numpy as np from pyproj import Proj from pyresample.utils import proj4_radius_parameters -#import mpop.imageo.formats.writer_options as write_opts from pyninjotiff import tifffile log = logging.getLogger(__name__) -#------------------------------------------------------------------------- + +# ------------------------------------------------------------------------- # # Ninjo tiff tags from DWD # -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # Geotiff tags. GTF_ModelPixelScale = 33550 GTF_ModelTiepoint = 33922 @@ -123,12 +122,34 @@ MODEL_PIXEL_SCALE_COUNT = int(os.environ.get( "GEOTIFF_MODEL_PIXEL_SCALE_COUNT", 3)) -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # # Read Ninjo products config file. # -#------------------------------------------------------------------------- -def get_product_config(product_name, force_read=False): +# ------------------------------------------------------------------------- +def get_writer_config(config_fname, prod, single_product_config, scn_metadata): + """Writer_config function for Trollflow_sat: calls the get_product_config function. + + :Parameters: + config_fname: str + Name of the Ninjo product configuration file + + prod: str + Name of Ninjo product. 
+ + single_product_config: dict + config params for the current product + + scn_metadata: dict + Satpy satellite data + """ + ninjo_product = prod + if 'ninjo_product_name' in single_product_config: + ninjo_product = single_product_config['ninjo_product_name'] + return get_product_config(ninjo_product, True, config_fname) + + +def get_product_config(product_name, force_read=False, config_filename=None): """Read Ninjo configuration entry for a given product name. :Parameters: @@ -145,74 +166,95 @@ def get_product_config(product_name, force_read=False): * As an example, see *ninjotiff_products.cfg.template* in MPOP's *etc* directory. """ - return ProductConfigs()(product_name, force_read) + return ProductConfigs()(product_name, force_read, config_filename) class _Singleton(type): - def __init__(cls, name_, bases_, dict_): - super(_Singleton, cls).__init__(name_, bases_, dict_) - cls.instance = None + def __init__(self, name_, bases_, dict_): + """Init the singleton.""" + super(_Singleton, self).__init__(name_, bases_, dict_) + self.instance = None - def __call__(cls, *args, **kwargs): - if cls.instance is None: - cls.instance = super(_Singleton, cls).__call__(*args, **kwargs) - return cls.instance + def __call__(self, *args, **kwargs): + """Call the singleton.""" + if self.instance is None: + self.instance = super(_Singleton, self).__call__(*args, **kwargs) + return self.instance class ProductConfigs(object): - __metaclass__ = _Singleton + """Product config.""" + + __metaclass__ = _Singleton # noqa def __init__(self): + """Init the product config.""" self.read_config() - def __call__(self, product_name, force_read=False): + def __call__(self, product_name, force_read=False, config_filename=None): + """Call the product config.""" if force_read: - self.read_config() - return self._products[product_name] + self.read_config(config_filename) + if product_name in self._products: + return self._products[product_name] + else: + return {} @property def product_names(self): + """Get the product names.""" return sorted(self._products.keys()) - def read_config(self): - from ConfigParser import ConfigParser + def read_config(self, config_filename=None): + """Read the ninjo products config file.""" + from six.moves.configparser import RawConfigParser + import ast def _eval(val): try: - return eval(val) - except: + return ast.literal_eval(val) + except (ValueError, SyntaxError): return str(val) - filename = self._find_a_config_file() + if config_filename is not None: + filename = self._find_a_config_file(config_filename) + else: + filename = self._find_a_config_file('ninjotiff_products.cfg') log.info("Reading Ninjo config file: '%s'" % filename) - cfg = ConfigParser() - cfg.read(filename) + cfg = RawConfigParser() products = {} - for sec in cfg.sections(): - prd = {} - for key, val in cfg.items(sec): - prd[key] = _eval(val) - products[sec] = prd + if filename is not None: + cfg.read(filename) + for sec in cfg.sections(): + prd = {} + for key, val in cfg.items(sec): + prd[key] = _eval(val) + products[sec] = prd self._products = products @staticmethod - def _find_a_config_file(): - name_ = 'ninjotiff_products.cfg' - home_ = os.path.dirname(os.path.abspath(__file__)) - penv_ = os.environ.get('PPP_CONFIG_DIR', '') - for fname_ in [os.path.join(x, name_) for x in (home_, penv_)]: - if os.path.isfile(fname_): - return fname_ - raise ValueError("Could not find a Ninjo tiff config file") + def _find_a_config_file(fname): + # if config file (fname) is not found as absolute path: look for the + # config file in the 
PPP_CONFIG_DIR or current dir + name_ = os.path.abspath(os.path.expanduser(fname)) + if os.path.isfile(name_): + return name_ + else: + home_ = os.path.dirname(os.path.abspath(__file__)) + penv_ = os.environ.get('PPP_CONFIG_DIR', '') + for fname_ in [os.path.join(x, name_) for x in (home_, penv_)]: + if os.path.isfile(fname_): + return fname_ + # raise ValueError("Could not find a Ninjo tiff config file") -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- # # Write Ninjo Products # -#------------------------------------------------------------------------- +# ------------------------------------------------------------------------- def _get_physic_value(physic_unit): # return Ninjo's physics unit and value. if physic_unit.upper() in ('K', 'KELVIN'): @@ -281,9 +323,11 @@ def _get_satellite_altitude(filename): return None -def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_scaled_01=True): - """Finalize a mpop GeoImage for Ninjo. Specialy take care of phycical scale - and offset. +def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, + data_is_scaled_01=True, fill_value=None): + """Finalize a mpop GeoImage for Ninjo. + + Specialy take care of phycical scale and offset. :Parameters: img : mpop.imageo.img.GeoImage @@ -308,14 +352,34 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc **Notes**: physic_val = image*scale + offset Example values for value_range_measurement_unit are (0, 125) or (40.0, -87.5) + + ***Warning*** + Only the 'L' and 'RGB' cases are compatible with xarray.XRImage. + They still have to be tested thoroughly. """ if img.mode == 'L': # PFE: mpop.satout.cfscene - data = img.channels[0] - fill_value = np.iinfo(dtype).min - log.debug("Transparent pixel are forced to be %d" % fill_value) + if isinstance(img, np.ma.MaskedArray): + data = img.channels[0] + else: + # TODO: check what is the correct fill value for NinJo! + if fill_value is not None: + log.debug("Forcing fill value to %s", fill_value) + # Go back to the masked_array for compatibility + # with the following part of the code. + if (np.issubdtype(img.data[0].dtype, np.integer) + and '_FillValue' in img.data[0].attrs): + nodata_value = img.data[0].attrs['_FillValue'] + data = img.data[0].values + data = np.ma.array(data, mask=(data == nodata_value)) + else: + data = img.data[0].to_masked_array() + + fill_value = fill_value if fill_value is not None else np.iinfo(dtype).min + log.debug("Before scaling: %.2f, %.2f, %.2f" % (data.min(), data.mean(), data.max())) + if np.ma.count_masked(data) == data.size: # All data is masked data = np.ones(data.shape, dtype=dtype) * fill_value @@ -328,26 +392,20 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc # value_range_measurement_unit[0] and 1.0 as # value_range_measurement_unit[1] - # Make room for transparent pixel. - scale_fill_value = ( - (np.iinfo(dtype).max) / (np.iinfo(dtype).max + 1.0)) - img = deepcopy(img) - img.channels[0] *= scale_fill_value - - img.channels[0] += 1 / (np.iinfo(dtype).max + 1.0) + # Make room for the transparent pixel value. 
+ data = data.clip(0, 1) + data *= (np.iinfo(dtype).max - 1) + data += 1 - channels, fill_value = img._finalize(dtype) - data = channels[0] - - scale = ((value_range_measurement_unit[1] - - value_range_measurement_unit[0]) / - (np.iinfo(dtype).max)) + scale = ((value_range_measurement_unit[1] + - value_range_measurement_unit[0]) + / (np.iinfo(dtype).max - 1)) # Handle the case where all data has the same value. scale = scale or 1 offset = value_range_measurement_unit[0] mask = data.mask - + data = np.round(data.data).astype(dtype) offset -= scale if fill_value is None: @@ -397,8 +455,17 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, scale, offset, fill_value elif img.mode == 'RGB': - channels, fill_value = img._finalize(dtype) - if fill_value is None: + if isinstance(img, np.ma.MaskedArray): + channels, fill_value = img._finalize(dtype) + else: + data, mode = img.finalize(fill_value=fill_value, dtype=dtype) + # Go back to the masked_array for compatibility with + # the rest of the code. + channels = data.to_masked_array() + # Is this fill_value ok or what should it be? + fill_value = (0, 0, 0, 0) + + if isinstance(img, np.ma.MaskedArray) and fill_value is None: mask = (np.ma.getmaskarray(channels[0]) & np.ma.getmaskarray(channels[1]) & np.ma.getmaskarray(channels[2])) @@ -411,6 +478,8 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, 1.0, 0.0, fill_value[0] elif img.mode == 'RGBA': + if not isinstance(img, np.ma.MaskedArray): + raise NotImplementedError("The 'RGBA' case has not been updated to xarray") channels, fill_value = img._finalize(dtype) fill_value = fill_value or (0, 0, 0, 0) data = np.dstack((channels[0].filled(fill_value[0]), @@ -420,6 +489,8 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, 1.0, 0.0, fill_value[0] elif img.mode == 'P': + if not isinstance(img, np.ma.MaskedArray): + raise NotImplementedError("The 'P' case has not been updated to xarray") fill_value = 0 data = img.channels[0] if isinstance(data, np.ma.core.MaskedArray): @@ -430,11 +501,11 @@ def _finalize(img, dtype=np.uint8, value_range_measurement_unit=None, data_is_sc return data, 1.0, 0.0, fill_value else: - raise ValueError("Don't known how til handle image mode '%s'" % + raise ValueError("Don't know how to handle image mode '%s'" % str(img.mode)) -def save(img, filename, ninjo_product_name=None, writer_options=None, +def save(img, filename, ninjo_product_name=None, writer_options=None, data_is_scaled_01=True, **kwargs): """Ninjo TIFF writer. @@ -460,7 +531,6 @@ def save(img, filename, ninjo_product_name=None, writer_options=None, * min value will be reserved for transparent color. * If possible mpop.imageo.image's standard finalize will be used. 
""" - if writer_options: # add writer_options kwargs.update(writer_options) @@ -473,21 +543,31 @@ def save(img, filename, ninjo_product_name=None, writer_options=None, if nbits == 16: dtype = np.uint16 + fill_value = None + if 'fill_value' in kwargs and kwargs['fill_value'] is not None: + fill_value = int(kwargs['fill_value']) + try: value_range_measurement_unit = (float(kwargs["ch_min_measurement_unit"]), float(kwargs["ch_max_measurement_unit"])) except KeyError: value_range_measurement_unit = None - data_is_scaled_01 = bool(kwargs.get("data_is_scaled_01", True)) + # In case we are working on a trollimage.xrimage.XRImage, + # a conversion to the previously used masked_array is needed data, scale, offset, fill_value = _finalize(img, dtype=dtype, data_is_scaled_01=data_is_scaled_01, - value_range_measurement_unit=value_range_measurement_unit,) + value_range_measurement_unit=value_range_measurement_unit, + fill_value=fill_value,) - area_def = img.info['area'] - time_slot = img.info['start_time'] + if isinstance(img, np.ma.MaskedArray): + area_def = img.info['area'] + time_slot = img.info['start_time'] + else: + area_def = img.data.area + time_slot = img.data.start_time # Some Ninjo tiff names kwargs['gradient'] = scale @@ -505,14 +585,13 @@ def save(img, filename, ninjo_product_name=None, writer_options=None, g += [0] * (256 - len(g)) b += [0] * (256 - len(b)) kwargs['cmap'] = r, g, b - write(data, filename, area_def, ninjo_product_name, **kwargs) def write(image_data, output_fn, area_def, product_name=None, **kwargs): - """Generic Ninjo TIFF writer. + """Write a Generic Ninjo TIFF. - If 'prodcut_name' is given, it will load corresponding Ninjo tiff metadata + If 'product_name' is given, it will load corresponding Ninjo tiff metadata from '${PPP_CONFIG_DIR}/ninjotiff.cfg'. Else, all Ninjo tiff metadata should be passed by '**kwargs'. A mixture is allowed, where passed arguments overwrite config file. 
@@ -531,9 +610,7 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): kwargs : dict See _write """ - - - proj = Proj(area_def.proj_dict) + proj = Proj(area_def.proj_dict) upper_left = proj( area_def.area_extent[0], area_def.area_extent[3], @@ -541,8 +618,8 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): lower_right = proj( area_def.area_extent[2], area_def.area_extent[1], - inverse=True) - + inverse=True) + if len(image_data.shape) == 3: if image_data.shape[2] == 4: shape = (area_def.y_size, area_def.x_size, 4) @@ -585,7 +662,11 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): kwargs['altitude'] = altitude if product_name: - options = deepcopy(get_product_config(product_name)) + # If ninjo_product_file in kwargs, load ninjo_product_file as config file + if 'ninjo_product_file' in kwargs: + options = deepcopy(get_product_config(product_name, True, kwargs['ninjo_product_file'])) + else: + options = deepcopy(get_product_config(product_name)) else: options = {} @@ -601,9 +682,8 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): options['ref_lat2'] = 0 if 'lon_0' in area_def.proj_dict: options['central_meridian'] = area_def.proj_dict['lon_0'] - - a,b = proj4_radius_parameters(area_def.proj_dict) + a, b = proj4_radius_parameters(area_def.proj_dict) options['radius_a'] = a options['radius_b'] = b options['origin_lon'] = upper_left[0] @@ -621,11 +701,12 @@ def write(image_data, output_fn, area_def, product_name=None, **kwargs): # # ----------------------------------------------------------------------------- def _write(image_data, output_fn, write_rgb=False, **kwargs): - """Proudly Found Elsewhere (PFE) https://github.com/davidh-ssec/polar2grid + """Create a NinJo compatible TIFF file. + + Proudly Found Elsewhere (PFE) https://github.com/davidh-ssec/polar2grid by David Hoese. - Create a NinJo compatible TIFF file with the tags used - by the DWD's version of NinJo. Also stores the image as tiles on disk + Also stores the image as tiles on disk and creates a multi-resolution/pyramid/overview set of images (deresolution: 2,4,8,16). @@ -769,6 +850,7 @@ def _write(image_data, output_fn, write_rgb=False, **kwargs): origin_lat = float(kwargs.pop("origin_lat")) origin_lon = float(kwargs.pop("origin_lon")) image_dt = kwargs.pop("image_dt") + zero_seconds = kwargs.pop("zero_seconds", False) projection = str(kwargs.pop("projection")) meridian_west = float(kwargs.pop("meridian_west", 0.0)) meridian_east = float(kwargs.pop("meridian_east", 0.0)) @@ -842,7 +924,13 @@ def _write(image_data, output_fn, write_rgb=False, **kwargs): file_dt = datetime.utcnow() file_epoch = calendar.timegm(file_dt.timetuple()) - image_epoch = calendar.timegm(image_dt.timetuple()) + if zero_seconds: + log.debug("Applying zero seconds correction") + image_dt_corr = datetime(image_dt.year, image_dt.month, image_dt.day, + image_dt.hour, image_dt.minute) + else: + image_dt_corr = image_dt + image_epoch = calendar.timegm(image_dt_corr.timetuple()) compression = _eval_or_default("compression", int, 6) @@ -1045,6 +1133,7 @@ def read_tags(filename): pages.append(tags) return pages + if __name__ == '__main__': import sys import getopt @@ -1060,9 +1149,9 @@ if __name__ == '__main__': try: filename = args[0] except IndexError: - print >> sys.stderr, """usage: python ninjotiff.py [<-p page-number>] [-c] + print("""usage: python ninjotiff.py [<-p page-number>] [-c] -p : print page number (default are all pages). 
- -c: print color maps (default is not to print color maps).""" + -c: print color maps (default is not to print color maps).""", sys.stderr) sys.exit(2) pages = read_tags(filename) @@ -1070,12 +1159,12 @@ if __name__ == '__main__': try: pages = [pages[page_no]] except IndexError: - print >>sys.stderr, "Invalid page number '%d'" % page_no + print("Invalid page number '%d'" % page_no, sys.stderr) sys.exit(2) for page in pages: names = sorted(page.keys()) - print "" + print("") for name in names: if not print_color_maps and name == "color_map": continue - print name, page[name] + print(name, page[name]) ===================================== pyninjotiff/ninjotiff_config-file_satpy_example.py ===================================== @@ -0,0 +1,24 @@ +import os +from satpy import Scene +from datetime import datetime +from satpy.utils import debug_on +import pyninjotiff +from glob import glob +from pyresample.utils import load_area +import copy +debug_on() + + +chn = "IR_108" +ninjoRegion = load_area("areas.def", "nrEURO3km") + +filenames = glob("data/*__") +global_scene = Scene(reader="hrit_msg", filenames=filenames) +global_scene.load([chn]) +local_scene = global_scene.resample(ninjoRegion) +local_scene.save_dataset(chn, filename="msg.tif", writer='ninjotiff', + # ninjo product name to look for in .cfg file + ninjo_product_name="IR_108", + # custom configuration file for ninjo tiff products + # if not specified PPP_CONFIG_DIR is used as config file directory + ninjo_product_file="/config_dir/ninjotiff_products.cfg") ===================================== pyninjotiff/ninjotiff_satpy_example ===================================== @@ -3,19 +3,23 @@ from satpy import Scene from datetime import datetime from satpy.utils import debug_on import pyninjotiff +from glob import glob +from pyresample.utils import load_area +import copy debug_on() -chn=10.8 -time_slot = datetime(2017, 1, 27, 7, 45) -global_data = Scene(platform_name="Meteosat-10", sensor="seviri", reader="hrit_msg", start_time=time_slot) +chn = "IR_108" +ninjoRegion = load_area("areas.def", "nrEURO3km") -global_data.load([chn]) -local_scene = global_data.resample("NinJoRegion") +filenames = glob("data/*__") +global_scene = Scene(reader="hrit_msg", filenames=filenames) +global_scene.load([chn]) +local_scene = global_scene.resample(ninjoRegion) local_scene.save_dataset(chn, filename="msg.tif", writer='ninjotiff', - sat_id=1234, - chan_id=5678, + sat_id=6300014, + chan_id=900015, data_cat='GORN', - data_source='EUMETSAT/MeteoSwiss', + data_source='EUMCAST', physic_unit='K', nbits=8) ===================================== pyninjotiff/rgb_ninjotiff_satpy_example ===================================== @@ -0,0 +1,23 @@ +import os +from satpy import Scene +from datetime import datetime +from satpy.utils import debug_on +import pyninjotiff +from glob import glob +from pyresample.utils import load_area +debug_on() + + +chn = "airmass" +ninjoRegion = load_area("areas.def", "nrEURO3km") + +filenames = glob("data/*__") +global_scene = Scene(reader="hrit_msg", filenames=filenames) +global_scene.load([chn]) +local_scene = global_scene.resample(ninjoRegion) +local_scene.save_dataset(chn, filename="airmass.tif", writer='ninjotiff', + sat_id=6300014, + chan_id=6500015, + data_cat='GPRN', + data_source='EUMCAST', + nbits=8) ===================================== pyninjotiff/tests/test_ninjotiff.py ===================================== @@ -0,0 +1,456 @@ +#!/usr/bin/env python +# -*- coding: utf-8 -*- + +# Copyright (c) 2019 Martin Raspaud + +# Author(s): + +# Martin 
Raspaud + +# This program is free software: you can redistribute it and/or modify +# it under the terms of the GNU General Public License as published by +# the Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. + +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU General Public License for more details. + +# You should have received a copy of the GNU General Public License +# along with this program. If not, see . + +"""Test the ninjotiff writing.""" + +import numpy as np +import datetime +import tempfile +import xarray as xr +import dask.array as da +import colorsys +import pytest + +TIME = datetime.datetime.utcnow() +DELETE_FILES = True + + +class FakeImage(object): + """Fake Image object for testing purposes.""" + + def __init__(self, data): + """Initialize the image.""" + self.mode = ''.join(data.bands.values) + self.data = data + + def finalize(self, fill_value=None, dtype=None): + if dtype is None: + dtype = np.uint8 + if np.issubdtype(self.data.dtype, np.floating) and np.issubdtype(dtype, np.integer): + res = self.data.clip(0, 1) * np.iinfo(dtype).max + res = res.astype(dtype) + else: + res = self.data + return [res.astype(dtype)] + + +class FakeArea(object): + def __init__(self, proj_dict, extent, y_size, x_size): + self.proj_dict = proj_dict + self.area_extent = extent + self.x_size, self.y_size = x_size, y_size + self.pixel_size_x = (extent[2] - extent[0]) / x_size + self.pixel_size_y = (extent[3] - extent[1]) / y_size + + +def test_write_bw(): + """Test saving a BW image.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 0.0 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', '%'), + ('name', '1'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=5)), + ('end_time', TIME), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([0]), + 'ch_max_measurement_unit': np.array([120]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 100015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': '%', 'nbits': 8} + + data = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 1024).reshape((1, 1024, 1024)) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([256, 22016, 43520, 65280]))) + + +def test_write_bw_inverted_ir(): + """Test saving a BW image.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, 
-4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 70.0 / 120 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', 'K'), + ('name', '4'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=15)), + ('end_time', TIME - datetime.timedelta(minutes=10)), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([-70]), + 'ch_max_measurement_unit': np.array([50]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 900015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': 'C', 'nbits': 8} + + data = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 1024).reshape((1, 1024, 1024)) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([65024, 43264, 21760, 0]))) + + +def test_write_bw_fill(): + """Test saving a BW image with transparency.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 0.0 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', '%'), + ('name', '1'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=25)), + ('end_time', TIME - datetime.timedelta(minutes=20)), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([0]), + 'ch_max_measurement_unit': np.array([120]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 100015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': '%', 'nbits': 8} + + data1 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 256).reshape((1, 256, 1024)) + datanan = da.ones((1, 256, 1024), chunks=1024) * np.nan + data2 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 512).reshape((1, 512, 1024)) + data = da.concatenate((data1, datanan, data2), axis=1) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([256, 22016, 43520, 65280]))) + + +def test_write_bw_inverted_ir_fill(): + """Test saving a BW image with transparency.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 
'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + scale = 1.0 / 120 + offset = 70.0 / 120 + attrs = dict([('resolution', 1050), + ('polarization', None), + ('platform_name', 'NOAA-18'), + ('sensor', 'avhrr-3'), + ('units', 'K'), + ('name', '4'), + ('level', None), + ('modifiers', ()), + ('wavelength', (10.3, 10.8, 11.3)), + ('calibration', 'brightness_temperature'), + ('start_time', TIME - datetime.timedelta(minutes=35)), + ('end_time', TIME - datetime.timedelta(minutes=30)), + ('area', area), + ('ancillary_variables', []), + ('enhancement_history', [{'offset': offset, 'scale': scale}])]) + + kwargs = {'ch_min_measurement_unit': np.array([-70]), + 'ch_max_measurement_unit': np.array([50]), + 'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 900015, 'data_cat': 'PORN', 'data_source': 'SMHI', + 'physic_unit': 'C', 'nbits': 8} + + data1 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 256).reshape((1, 256, 1024)) + datanan = da.ones((1, 256, 1024), chunks=1024) * np.nan + data2 = da.tile(da.repeat(da.arange(4, chunks=1024) / + 3.0, 256), 512).reshape((1, 512, 1024)) + data = da.concatenate((data1, datanan, data2), axis=1) + data = xr.DataArray(data, coords={'bands': ['L']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + img = FakeImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + assert(np.allclose(res[0, 0, ::256], + np.array([65024, 43264, 21760, 0]))) + + +def test_write_rgb(): + """Test saving a non-trasparent RGB.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + + x_size, y_size = 1024, 1024 + arr = np.zeros((3, y_size, x_size)) + radius = min(x_size, y_size) / 2.0 + centre = x_size / 2, y_size / 2 + + for x in range(x_size): + for y in range(y_size): + rx = x - centre[0] + ry = y - centre[1] + s = ((x - centre[0])**2.0 + (y - centre[1])**2.0)**0.5 / radius + if s <= 1.0: + h = ((np.arctan2(ry, rx) / np.pi) + 1.0) / 2.0 + rgb = colorsys.hsv_to_rgb(h, s, 1.0) + arr[:, y, x] = np.array(rgb) + + attrs = dict([('platform_name', 'NOAA-18'), + ('resolution', 1050), + ('polarization', None), + ('level', None), + ('sensor', 'avhrr-3'), + ('ancillary_variables', []), + ('area', area), + ('start_time', TIME - datetime.timedelta(minutes=45)), + ('end_time', TIME - datetime.timedelta(minutes=40)), + ('wavelength', None), + ('optional_datasets', []), + ('standard_name', 'overview'), + ('name', 'overview'), + ('prerequisites', [0.6, 0.8, 10.8]), + ('optional_prerequisites', []), + ('calibration', None), + ('modifiers', None), + ('mode', 'RGB'), + ('enhancement_history', [{'scale': np.array([1, 1, -1]), 'offset': np.array([0, 0, 1])}, + {'scale': np.array([0.0266347, 0.03559078, 0.01329783]), + 'offset': np.array([-0.02524969, -0.01996642, 3.8918446])}, + {'gamma': 1.6}])]) + + kwargs = {'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 6500015, 'data_cat': 'PPRN', 'data_source': 'SMHI', 'nbits': 8} + data = da.from_array(arr.clip(0, 1), chunks=1024) + data = xr.DataArray(data, coords={'bands': ['R', 'G', 'B']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + + from trollimage.xrimage import XRImage + img = XRImage(data) + + 
with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=False, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + for idx in range(3): + np.testing.assert_allclose(res[:, :, idx], np.round( + arr[idx, :, :] * 255).astype(np.uint8)) + + +def test_write_rgb_with_a(): + """Test saving a transparent RGB.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + + x_size, y_size = 1024, 1024 + arr = np.zeros((3, y_size, x_size)) + radius = min(x_size, y_size) / 2.0 + centre = x_size / 2, y_size / 2 + + for x in range(x_size): + for y in range(y_size): + rx = x - centre[0] + ry = y - centre[1] + s = ((x - centre[0])**2.0 + (y - centre[1])**2.0)**0.5 / radius + if s <= 1.0: + h = ((np.arctan2(ry, rx) / np.pi) + 1.0) / 2.0 + rgb = colorsys.hsv_to_rgb(h, s, 1.0) + arr[:, y, x] = np.array(rgb) + else: + arr[:, y, x] = np.nan + + attrs = dict([('platform_name', 'NOAA-18'), + ('resolution', 1050), + ('polarization', None), + ('start_time', TIME - datetime.timedelta(minutes=55)), + ('end_time', TIME - datetime.timedelta(minutes=50)), + ('level', None), + ('sensor', 'avhrr-3'), + ('ancillary_variables', []), + ('area', area), + ('wavelength', None), + ('optional_datasets', []), + ('standard_name', 'overview'), + ('name', 'overview'), + ('prerequisites', [0.6, 0.8, 10.8]), + ('optional_prerequisites', []), + ('calibration', None), + ('modifiers', None), + ('mode', 'RGB'), + ('enhancement_history', [{'scale': np.array([1, 1, -1]), 'offset': np.array([0, 0, 1])}, + {'scale': np.array([0.0266347, 0.03559078, 0.01329783]), + 'offset': np.array([-0.02524969, -0.01996642, 3.8918446])}, + {'gamma': 1.6}])]) + + kwargs = {'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 6500015, 'data_cat': 'PPRN', 'data_source': 'SMHI', 'nbits': 8} + data = da.from_array(arr.clip(0, 1), chunks=1024) + + data = xr.DataArray(data, coords={'bands': ['R', 'G', 'B']}, dims=[ + 'bands', 'y', 'x'], attrs=attrs) + from trollimage.xrimage import XRImage + img = XRImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + for idx in range(3): + np.testing.assert_allclose(res[:, :, idx], np.round( + np.nan_to_num(arr[idx, :, :]) * 255).astype(np.uint8)) + np.testing.assert_allclose(res[:, :, 3] == 0, np.isnan(arr[0, :, :])) + + + at pytest.mark.skip(reason="this is no implemented yet.") +def test_write_rgb_classified(): + """Test saving a transparent RGB.""" + from pyninjotiff.ninjotiff import save + from pyninjotiff.tifffile import TiffFile + + area = FakeArea({'ellps': 'WGS84', 'lat_0': 90.0, 'lat_ts': 60.0, 'lon_0': 0.0, 'proj': 'stere'}, + (-1000000.0, -4500000.0, 2072000.0, -1428000.0), + 1024, 1024) + + x_size, y_size = 1024, 1024 + arr = np.zeros((3, y_size, x_size)) + + attrs = dict([('platform_name', 'NOAA-18'), + ('resolution', 1050), + ('polarization', None), + ('start_time', TIME - datetime.timedelta(minutes=55)), + ('end_time', TIME - datetime.timedelta(minutes=50)), + ('level', None), + ('sensor', 'avhrr-3'), + ('ancillary_variables', []), + ('area', area), + ('wavelength', None), + 
('optional_datasets', []), + ('standard_name', 'overview'), + ('name', 'overview'), + ('prerequisites', [0.6, 0.8, 10.8]), + ('optional_prerequisites', []), + ('calibration', None), + ('modifiers', None), + ('mode', 'P')]) + + kwargs = {'compute': True, 'fill_value': None, 'sat_id': 6300014, + 'chan_id': 1700015, 'data_cat': 'PPRN', 'data_source': 'SMHI', 'nbits': 8} + + data1 = da.tile(da.repeat(da.arange(4, chunks=1024), 256), 256).reshape((1, 256, 1024)) + datanan = da.ones((1, 256, 1024), chunks=1024) * 4 + data2 = da.tile(da.repeat(da.arange(4, chunks=1024), 256), 512).reshape((1, 512, 1024)) + data = da.concatenate((data1, datanan, data2), axis=1) + data = xr.DataArray(data, coords={'bands': ['P']}, dims=['bands', 'y', 'x'], attrs=attrs) + + from trollimage.xrimage import XRImage + img = XRImage(data) + with tempfile.NamedTemporaryFile(delete=DELETE_FILES) as tmpfile: + filename = tmpfile.name + if not DELETE_FILES: + print(filename) + save(img, filename, data_is_scaled_01=True, **kwargs) + tif = TiffFile(filename) + res = tif[0].asarray() + for idx in range(3): + np.testing.assert_allclose(res[:, :, idx], np.round( + np.nan_to_num(arr[idx, :, :]) * 255).astype(np.uint8)) + np.testing.assert_allclose(res[:, :, 3] == 0, np.isnan(arr[0, :, :])) ===================================== pyninjotiff/tifffile.py ===================================== @@ -337,7 +337,7 @@ class TiffWriter(object): def save(self, data, photometric=None, planarconfig=None, resolution=None, description=None, volume=False, writeshape=False, compress=0, - colormap=None, extrasamples_type=1, tile_width=None, + colormap=None, extrasamples_type=1, tile_width=None, tile_length=None, extratags=()): """Write image data to TIFF file. @@ -711,8 +711,8 @@ class TiffWriter(object): # reset and use compress sizes strip_byte_counts = [] for plane in data[pageindex]: - for ty in xrange(0, tiles_y): - for tx in xrange(0, tiles_x): + for ty in range(0, tiles_y): + for tx in range(0, tiles_x): # allocate fixed size tile filled with zeros tile = numpy.zeros((tile_width * tile_length, shape[-1]), data.dtype) @@ -724,7 +724,7 @@ class TiffWriter(object): itw = min(tile_width, shape[3] - tx*tile_width) ioffs = tx*tile_width - for tl in xrange(0, itl): + for tl in range(0, itl): # copy data to tile line ir = ty*tile_length+tl tile[tl*tile_width:tl*tile_width+itw] \ @@ -4989,4 +4989,3 @@ if sys.version_info[0] > 2: if __name__ == "__main__": sys.exit(main()) - ===================================== pyninjotiff/version.py ===================================== @@ -22,4 +22,4 @@ """Version file.""" -__version__ = "v0.1.0" +__version__ = "v0.2.0" ===================================== setup.cfg ===================================== @@ -0,0 +1,2 @@ +[flake8] +max-line-length = 120 ===================================== setup.py ===================================== @@ -31,7 +31,7 @@ version = imp.load_source('pyninjotiff.version', 'pyninjotiff/version.py') setup(name="pyninjotiff", version=version.__version__, - description='Pytroll imaging library', + description='Python Ninjo TIFF writing library', author='Martin Raspaud', author_email='martin.raspaud at smhi.se', classifiers=["Development Status :: 5 - Production/Stable", @@ -44,6 +44,6 @@ setup(name="pyninjotiff", url="https://github.com/pytroll/pyninjotiff", packages=['pyninjotiff'], zip_safe=False, - install_requires=['numpy >=1.6', 'six'], + install_requires=['numpy >=1.6', 'six', 'pyproj', 'pyresample'], # test_suite='pyninjotiff.tests.suite', ) View it on GitLab: 
https://salsa.debian.org/debian-gis-team/pyninjotiff/commit/27575121dd399ceb37b16132dfa334a74e054022 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/commit/27575121dd399ceb37b16132dfa334a74e054022 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 21 12:12:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 11:12:35 +0000 Subject: [Git][debian-gis-team/pyninjotiff] Pushed new tag debian/0.2.0-1 Message-ID: <5d8605a3a7f8c_56b2aeb2fadc7e0144441@godard.mail> Bas Couwenberg pushed new tag debian/0.2.0-1 at Debian GIS Project / pyninjotiff -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/tree/debian/0.2.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 21 12:25:01 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 21 Sep 2019 11:25:01 +0000 Subject: Processing of pyninjotiff_0.2.0-1_source.changes Message-ID: pyninjotiff_0.2.0-1_source.changes uploaded successfully to localhost along with the files: pyninjotiff_0.2.0-1.dsc pyninjotiff_0.2.0.orig.tar.gz pyninjotiff_0.2.0-1.debian.tar.xz pyninjotiff_0.2.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 21 12:35:37 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 21 Sep 2019 11:35:37 +0000 Subject: pyninjotiff_0.2.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 21 Sep 2019 09:38:30 +0000 Source: pyninjotiff Architecture: source Version: 0.2.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: pyninjotiff (0.2.0-1) unstable; urgency=medium . [ Bas Couwenberg ] * Update gbp.conf to use --source-only-changes by default. * Bump Standards-Version to 4.4.0, no changes. . [ Antonio Valentino ] * New upstream release. * Bump debhelper from old 11 to 12. * Remove obsolete fields Name from debian/upstream/metadata. * Update copyright file. * debian/patches: - remove all patches (no longer needed) * Enable testing (add dependencies on pytest, xarray and trollimage). 
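The test suite enabled here (pyninjotiff/tests/test_ninjotiff.py, added in the commit above) verifies written files by reading them back with the bundled tifffile module. The same round-trip check can be done interactively with read_tags() from ninjotiff.py; a short sketch, where "msg.tif" is a placeholder for any NinJo TIFF produced as in the examples above:

    from pyninjotiff.ninjotiff import read_tags
    from pyninjotiff.tifffile import TiffFile

    filename = "msg.tif"  # placeholder: a file written by the ninjotiff writer

    # Pixel values of the first (full resolution) page, as the tests check them.
    tif = TiffFile(filename)
    page = tif[0].asarray()
    print(page.shape, page.dtype)

    # NinJo-specific TIFF tags, as printed by `python ninjotiff.py <file>`.
    for tags in read_tags(filename):
        for name in sorted(tags):
            if name != "color_map":  # the colour map is long, skip it here
                print(name, tags[name])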
Checksums-Sha1: 3972406af17e5c0f8da75925d80ba66d9f507d95 2206 pyninjotiff_0.2.0-1.dsc 5b7da034fbfbddbff8a4b5661d4a109ca73bdbdc 79726 pyninjotiff_0.2.0.orig.tar.gz 3dc70898509b546dd184b6f92cdb272f1bf60b81 3440 pyninjotiff_0.2.0-1.debian.tar.xz 4eb0f7d539379329ddb3fcd17279a4154db3c0a6 10793 pyninjotiff_0.2.0-1_amd64.buildinfo Checksums-Sha256: e00ac4078fe1a8bb20d9d6ebe65d0865c021fb48ebf90095d41079cd65e7f369 2206 pyninjotiff_0.2.0-1.dsc d4d656884388de6c80ac5eef4448f2240fae8a0d6762ea65f51d3e8c800f83df 79726 pyninjotiff_0.2.0.orig.tar.gz e9e31018fd2a729ecf3074e0f8acd1fc5013e71b4db4c2cc6942d9ce18e8820d 3440 pyninjotiff_0.2.0-1.debian.tar.xz 3b190957f23734c621ca26c8c6d3dcf1fba81175efe1622c49db04cffd54b1dc 10793 pyninjotiff_0.2.0-1_amd64.buildinfo Files: ce6066f68caa28ea1bd7584de9ecd90d 2206 python optional pyninjotiff_0.2.0-1.dsc cec73ed41c1e4c8c737550a09a71e889 79726 python optional pyninjotiff_0.2.0.orig.tar.gz 30877fb232112af3432b58b05fde049c 3440 python optional pyninjotiff_0.2.0-1.debian.tar.xz 1c4192412998bf0fc12ae207c0bbcf99 10793 python optional pyninjotiff_0.2.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2GBXEACgkQZ1DxCuiN SvFzSA/6ApSk25qIuuqpA43eMAae2p6jkjC5wSDj6wz2D4j3QgxEP5BSmhyntAcy LwIBrkcbzHoZNai+jnXUfQwrAPcuE8FojNlaxcc1eFJA+tfiW6SG03fvLzjOtIFd oOd7SIWn5kLlSi/CZZin5dT8xcrCuNRmTQlDQQv57ZvwxDrZKbY2dJv9qVRcHvJu Cb7E3Iy6BWObraEH0Tf4fGnuTR2H8nOS9qHdReZrw24p4b68bFU+J52KYBgo9i9p SqSFUY43aT+UtJYZnOOX9gC/t04Bj6S0UgB/WUsuJMV8RJxONH4v3nwFMV1aS7kE IZDsQAN9EQl04ypdtmUHP8JTIrs9crXfEjx/AalhxX5pbEC9rW3rIHEvmwKGi37L SiwRT73WyOZhjc+Dlx6FEXPvOo72vdKN2WJL9D/hyZbIwc6XlfYIIURRVEBkqorw Df9LQx2oJpdN0CYBOWUrZud1D9hS5rRuGAO/nBLTdKi+NGmetqaPVZFG5uYF2xga Wr5Hh154kjmOEadyr2mIciXHHyumokDpPTYScokSxcMhUgTIrg9YvC/jKUjGtbyw Hu3U70W9jDVSJqhB9iUMj+hRaf00ItvmmRTh3rnKi2js/LaDRUrbbcmhhIe28OyD Z2hcDKM/SqCnW7gWQsrN6S7WSO0b9/71SiqZJqTEFeE0mFWXuWI= =91zh -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From sebastic at xs4all.nl Sat Sep 21 13:00:10 2019 From: sebastic at xs4all.nl (Sebastiaan Couwenberg) Date: Sat, 21 Sep 2019 14:00:10 +0200 Subject: qgis testing migration prevented by piuparts Message-ID: <86ecc679-fde1-b929-abde-25d66235137d@xs4all.nl> Testing migration of qgis is prevented by piuparts. libqgis-native3.4.12 failed-testing in sid, according to the log because it was Unable to locate package libqgis-native3.4.12. The package is available in unstable and works fine on my system and a clear chroot. It seems like an issue with the piuparts environment. Please retry the test to unblock testing migration for qgis. Kind Regards, Bas -- GPG Key ID: 4096R/6750F10AE88D4AF1 Fingerprint: 8182 DE41 7056 408D 6146 50D1 6750 F10A E88D 4AF1 From holger at layer-acht.org Sat Sep 21 13:19:30 2019 From: holger at layer-acht.org (Holger Levsen) Date: Sat, 21 Sep 2019 12:19:30 +0000 Subject: [Piuparts-devel] qgis testing migration prevented by piuparts In-Reply-To: <86ecc679-fde1-b929-abde-25d66235137d@xs4all.nl> References: <86ecc679-fde1-b929-abde-25d66235137d@xs4all.nl> Message-ID: <20190921121930.zxvnbn3geprpincl@layer-acht.org> On Sat, Sep 21, 2019 at 02:00:10PM +0200, Sebastiaan Couwenberg wrote: > Testing migration of qgis is prevented by piuparts. libqgis-native3.4.12 [...] > Please retry the test to unblock testing migration for qgis. done, results should become available in the next 18h. 
-- cheers, Holger ------------------------------------------------------------------------------- holger@(debian|reproducible-builds|layer-acht).org PGP fingerprint: B8BF 5413 7B09 D35C F026 FE9D 091A B856 069A AA1C -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 833 bytes Desc: not available URL: From gitlab at salsa.debian.org Sat Sep 21 16:54:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 15:54:00 +0000 Subject: [Git][debian-gis-team/pywps][master] 2 commits: Add license & copyright for geojson schemas. (closes: #940185) Message-ID: <5d8647985e6b3_56b2aeb2f2d125816294e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pywps Commits: c69a6d7d by Bas Couwenberg at 2019-09-21T15:48:56Z Add license & copyright for geojson schemas. (closes: #940185) - - - - - 14c60e0f by Bas Couwenberg at 2019-09-21T15:48:56Z Set distribution to unstable. - - - - - 2 changed files: - debian/changelog - debian/copyright Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +pywps (4.2.1-4) unstable; urgency=medium + + * Add license & copyright for geojson schemas. + (closes: #940185) + + -- Bas Couwenberg Sat, 21 Sep 2019 17:34:45 +0200 + pywps (4.2.1-3) unstable; urgency=medium * Add Breaks/Replaces to fix upgrade to python3-pywps. ===================================== debian/copyright ===================================== @@ -8,6 +8,10 @@ Copyright: 2018, Open Source Geospatial Foundation and others 2014-2016, PyWPS Development Team, represented by PyWPS Project Steering Committee License: Expat +Files: pywps/schemas/geojson/* +Copyright: Francis Galiegue +License: AFL-1.1 or BSD-3-Clause + Files: debian/* Copyright: 2006, Jáchym Čepický License: GPL-2+ @@ -56,3 +60,115 @@ License: GPL-2+ On Debian systems, the complete text of version 2 of the GNU General Public License can be found in `/usr/share/common-licenses/GPL-2'. +License: AFL-1.1 + Academic Free License + . + Version 1.1 + . + The Academic Free License applies to any original work of authorship + (the "Original Work") whose owner (the "Licensor") has placed the + following notice immediately following the copyright notice for the + Original Work: + . + "Licensed under the Academic Free License version 1.1." + . + Grant of License. Licensor hereby grants to any person obtaining a copy + of the Original Work ("You") a world-wide, royalty-free, non-exclusive, + perpetual, non-sublicenseable license + . + (1) to use, copy, modify, merge, publish, perform, distribute and/or + sell copies of the Original Work and derivative works thereof, and + (2) under patent claims owned or controlled by the Licensor that are + embodied in the Original Work as furnished by the Licensor, to make, + use, sell and offer for sale the Original Work and derivative works + thereof, subject to the following conditions. + . + Right of Attribution. Redistributions of the Original Work must reproduce + all copyright notices in the Original Work as furnished by the Licensor, + both in the Original Work itself and in any documentation and/or other + materials provided with the distribution of the Original Work in executable + form. + . + Exclusions from License Grant. 
Neither the names of Licensor, nor the names + of any contributors to the Original Work, nor any of their trademarks or + service marks, may be used to endorse or promote products derived from this + Original Work without express prior written permission of the Licensor. + . + WARRANTY AND DISCLAIMERS. LICENSOR WARRANTS THAT THE COPYRIGHT IN AND TO THE + ORIGINAL WORK IS OWNED BY THE LICENSOR OR THAT THE ORIGINAL WORK IS + DISTRIBUTED BY LICENSOR UNDER A VALID CURRENT LICENSE FROM THE COPYRIGHT + OWNER. EXCEPT AS EXPRESSLY STATED IN THE IMMEDIATELY PRECEDING SENTENCE, + THE ORIGINAL WORK IS PROVIDED UNDER THIS LICENSE ON AN "AS IS" BASIS, + WITHOUT WARRANTY, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, + THE WARRANTY OF NON-INFRINGEMENT AND WARRANTIES THAT THE ORIGINAL WORK IS + MERCHANTABLE OR FIT FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE + QUALITY OF THE ORIGINAL WORK IS WITH YOU. THIS DISCLAIMER OF WARRANTY + CONSTITUTES AN ESSENTIAL PART OF THIS LICENSE. NO LICENSE TO ORIGINAL WORK + IS GRANTED HEREUNDER EXCEPT UNDER THIS DISCLAIMER. + . + LIMITATION OF LIABILITY. UNDER NO CIRCUMSTANCES AND UNDER NO LEGAL THEORY, + WHETHER TORT (INCLUDING NEGLIGENCE), CONTRACT, OR OTHERWISE, SHALL THE + LICENSOR BE LIABLE TO ANY PERSON FOR ANY DIRECT, INDIRECT, SPECIAL, + INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY CHARACTER ARISING AS A RESULT + OF THIS LICENSE OR THE USE OF THE ORIGINAL WORK INCLUDING, WITHOUT + LIMITATION, DAMAGES FOR LOSS OF GOODWILL, WORK STOPPAGE, COMPUTER FAILURE + OR MALFUNCTION, OR ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSSES, EVEN IF + SUCH PERSON SHALL HAVE BEEN INFORMED OF THE POSSIBILITY OF SUCH DAMAGES. + THIS LIMITATION OF LIABILITY SHALL NOT APPLY TO LIABILITY FOR DEATH OR + PERSONAL INJURY RESULTING FROM SUCH PARTY'S NEGLIGENCE TO THE EXTENT + APPLICABLE LAW PROHIBITS SUCH LIMITATION. SOME JURISDICTIONS DO NOT ALLOW + THE EXCLUSION OR LIMITATION OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THIS + EXCLUSION AND LIMITATION MAY NOT APPLY TO YOU. + . + License to Source Code. The term "Source Code" means the preferred form of + the Original Work for making modifications to it and all available + documentation describing how to access and modify the Original Work. + Licensor hereby agrees to provide a machine-readable copy of the Source + Code of the Original Work along with each copy of the Original Work that + Licensor distributes. Licensor reserves the right to satisfy this obligation + by placing a machine-readable copy of the Source Code in an information + repository reasonably calculated to permit inexpensive and convenient + access by You for as long as Licensor continues to distribute the Original + Work, and by publishing the address of that information repository in a + notice immediately following the copyright notice that applies to the + Original Work. + . + Mutual Termination for Patent Action. This License shall terminate + automatically and You may no longer exercise any of the rights granted to + You by this License if You file a lawsuit in any court alleging that any + OSI Certified open source software that is licensed under any license + containing this "Mutual Termination for Patent Action" clause infringes + any patent claims that are essential to use that software. + . + This license is Copyright (C) 2002 Lawrence E. Rosen. All rights reserved. + . + Permission is hereby granted to copy and distribute this license without + modification. 
This license may not be modified without the express written + permission of its copyright owner. + +License: BSD-3-Clause + Redistribution and use in source and binary forms, with or without + modification, are permitted provided that the following conditions are met: + . + 1) Redistributions of source code must retain the above copyright notice, + this list of conditions and the following disclaimer. + . + 2) Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + . + 3) Neither the name of the ORGANIZATION nor the names of its contributors may + be used to endorse or promote products derived from this software without + specific prior written permission. + . + THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" + AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE + IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE + ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE + LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR + CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF + SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS + INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN + CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) + ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE + POSSIBILITY OF SUCH DAMAGE. View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/de8f0a46a53f1d66ff18e53f542d04a22fb4cceb...14c60e0f2e892f2b7331ef863a36b5b44afa7796 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/de8f0a46a53f1d66ff18e53f542d04a22fb4cceb...14c60e0f2e892f2b7331ef863a36b5b44afa7796 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 21 16:54:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 21 Sep 2019 15:54:04 +0000 Subject: [Git][debian-gis-team/pywps] Pushed new tag debian/4.2.1-4 Message-ID: <5d86479c59c9f_56b2aeb2f2d125816311d@godard.mail> Bas Couwenberg pushed new tag debian/4.2.1-4 at Debian GIS Project / pywps -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/tree/debian/4.2.1-4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sat Sep 21 17:03:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 21 Sep 2019 16:03:03 +0000 Subject: Processing of pywps_4.2.1-4_source.changes Message-ID: pywps_4.2.1-4_source.changes uploaded successfully to localhost along with the files: pywps_4.2.1-4.dsc pywps_4.2.1-4.debian.tar.xz pywps_4.2.1-4_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 21 17:05:33 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 21 Sep 2019 16:05:33 +0000 Subject: pywps_4.2.1-4_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sat, 21 Sep 2019 17:34:45 +0200 Source: pywps Architecture: source Version: 4.2.1-4 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Closes: 940185 Changes: pywps (4.2.1-4) unstable; urgency=medium . * Add license & copyright for geojson schemas. (closes: #940185) Checksums-Sha1: f073e314c7a42cc9992b12aa457f7ae1d243c0be 2356 pywps_4.2.1-4.dsc 5ccbe51861b9e134c05bf2d268af666f3d220b1f 12268 pywps_4.2.1-4.debian.tar.xz 2debaad8b224fe542e121f346d3ddb54a6e23c11 11692 pywps_4.2.1-4_amd64.buildinfo Checksums-Sha256: 22c3ae2db9dbe27ac22fcbb5a07c8c4f977a5cab25c062d6a6dfaa737b02de99 2356 pywps_4.2.1-4.dsc 0df47e4a8879cf68d9b6b2bc7bb4b018b2070673e2d97b258a1c0ede66c44547 12268 pywps_4.2.1-4.debian.tar.xz fcf90591e2bace1c27ef3db531f63a33b995cd05c8799629a61bffff5275e5a1 11692 pywps_4.2.1-4_amd64.buildinfo Files: 6051c6037fa702db6b12dd0f92f7e44b 2356 python optional pywps_4.2.1-4.dsc e581971c18972aa9b4f1fec3e7feeece 12268 python optional pywps_4.2.1-4.debian.tar.xz f7381bf8cba4fc9e6a0af655e9a92f6e 11692 python optional pywps_4.2.1-4_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2GR3QACgkQZ1DxCuiN SvHY9g/+LVMuyjIOQ7j4DU4AMB8MakvhgbC9TrTatYdBTgSMFxqPoK7MccNhxEXU Gn5nhnokrQbBgPGerFpznWsoN7IFi8IQas0Vw6D1zUN/npyC/hHom9oJ9iUIWC1O 1fqsAZiSVp84wP+UabG/N9IBjWzdYCIj0JGAvHHfpeF6d00/gJRPwG1emJCnqbAg 1OCDdvkzHRrBCAtd4hr74JqshoQf7RXr+KDR8pby+Cx9/PLzANGfEBN39UaC69MC JP4qMlWyGm4UEEV7ei0ZM33pikp1v4ynpBBSgV6PByTBIdTuiQuiIJwSyz7evfq6 WVoEQCvkIHUl4zDaIZrKv33SemEExHajj7hQz3OZSTzIIZ0xDc0tY5Ba26umFiBj FClpTcJQJeP1YQUR6ln/ZD+9atJgYeYoIM832MP4cDpOqMyVlV+AUOruqSgRDXzQ hqf5B94/629d10+bN09ah3OrA6kBJh5uvJyKCYfv92lH3bLH8Ztm74x7wUXiyQy3 hmuheerDGfLC8lwDFeio7L22SwysJJBzITqFvQGGZ+FnmINSRsW1d6vMvYIX6tjp T3z+jnEFlA6rlqBIHeVCY9Elbfl+0BffdvuCXIJ2PFsorl2pIv0fhOsF2Rd/CPrk 7HVYloWXksCcxwr/xQYSkoWgRWWwwRVIe0SC+psgwTjqWt9xhbc= =vpx+ -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From owner at bugs.debian.org Sat Sep 21 17:09:07 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Sat, 21 Sep 2019 16:09:07 +0000 Subject: Bug#940185: marked as done (src:pywps: Debian/copyright needs update) References: <156838818187.29795.15334712978001753322.reportbug@l5580.kitterman.com> Message-ID: Your message dated Sat, 21 Sep 2019 16:05:33 +0000 with message-id and subject line Bug#940185: fixed in pywps 4.2.1-4 has caused the Debian Bug report #940185, regarding src:pywps: Debian/copyright needs update to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. 
(NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 940185: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=940185 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: Scott Kitterman Subject: src:pywps: Debian/copyright needs update Date: Fri, 13 Sep 2019 11:23:01 -0400 Size: 4053 URL: -------------- next part -------------- An embedded message was scrubbed... From: Bas Couwenberg Subject: Bug#940185: fixed in pywps 4.2.1-4 Date: Sat, 21 Sep 2019 16:05:33 +0000 Size: 5019 URL: From noreply at release.debian.org Sun Sep 22 05:39:18 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sun, 22 Sep 2019 04:39:18 +0000 Subject: libosmium 2.15.3-1 MIGRATED to testing Message-ID: FYI: The status of the libosmium source package in Debian's testing distribution has changed. Previous version: 2.15.2-1 Current version: 2.15.3-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sun Sep 22 05:39:19 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sun, 22 Sep 2019 04:39:19 +0000 Subject: osmium-tool 1.11.0-1 MIGRATED to testing Message-ID: FYI: The status of the osmium-tool source package in Debian's testing distribution has changed. Previous version: 1.10.0-1 Current version: 1.11.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Sun Sep 22 05:39:20 2019 From: noreply at release.debian.org (Debian testing watch) Date: Sun, 22 Sep 2019 04:39:20 +0000 Subject: qgis 3.4.12+dfsg-1 MIGRATED to testing Message-ID: FYI: The status of the qgis source package in Debian's testing distribution has changed. Previous version: 3.4.11+dfsg-2 Current version: 3.4.12+dfsg-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Sun Sep 22 05:58:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 04:58:59 +0000 Subject: [Git][debian-gis-team/libosmium][buster-backports] 6 commits: New upstream version 2.15.3 Message-ID: <5d86ff9321d35_15123faf6a6fc2ec949df@godard.mail> Bas Couwenberg pushed to branch buster-backports at Debian GIS Project / libosmium Commits: 49940fd7 by Bas Couwenberg at 2019-09-17T03:54:01Z New upstream version 2.15.3 - - - - - 9a471f9d by Bas Couwenberg at 2019-09-17T03:54:04Z Update upstream source from tag 'upstream/2.15.3' Update to upstream version '2.15.3' with Debian dir 97b79a64cfc951fe5b65325cbf5727ab93e507b3 - - - - - 3f2183ca by Bas Couwenberg at 2019-09-17T03:54:14Z New upstream release. - - - - - 491ac059 by Bas Couwenberg at 2019-09-17T03:54:49Z Set distribution to unstable. 
- - - - - e8797482 by Bas Couwenberg at 2019-09-22T04:42:51Z Merge tag 'debian/2.15.3-1' into buster-backports releasing package libosmium version 2.15.3-1 - - - - - 9df134ac by Bas Couwenberg at 2019-09-22T04:43:05Z Rebuild for buster-backports. - - - - - 7 changed files: - CHANGELOG.md - CMakeLists.txt - debian/changelog - include/osmium/io/detail/pbf_decoder.hpp - include/osmium/io/detail/pbf_output_format.hpp - include/osmium/io/detail/xml_input_format.hpp - include/osmium/version.hpp Changes: ===================================== CHANGELOG.md ===================================== @@ -13,6 +13,24 @@ This project adheres to [Semantic Versioning](https://semver.org/). ### Fixed +## [2.15.3] - 2019-09-16 + +### Added + +* New header option "sorting" when reading and writing PBFs. If the header + option "sorting" is set to `Type_then_ID`, the optional header property + `Sort.Type_then_ID` is set on writing to PBF files. When reading PBF files + with this header property, the "sorting" header option is set accordingly. + +### Fixed + +* Do not propagate C++ exception through C code. We are using the Expat + XML parser, a C library. It calls callbacks in our code. When those + callbacks throw, the exception was propagated through the C code. This + did work in the tests, but that behaviour isn't guaranteed (C++ + standard says it is implementation defined). This fixes it by catching + the exception and rethrowing it later. + ## [2.15.2] - 2019-08-16 ### Added @@ -956,7 +974,8 @@ This project adheres to [Semantic Versioning](https://semver.org/). Doxygen (up to version 1.8.8). This version contains a workaround to fix this. -[unreleased]: https://github.com/osmcode/libosmium/compare/v2.15.2...HEAD +[unreleased]: https://github.com/osmcode/libosmium/compare/v2.15.3...HEAD +[2.15.3]: https://github.com/osmcode/libosmium/compare/v2.15.2...v2.15.3 [2.15.2]: https://github.com/osmcode/libosmium/compare/v2.15.1...v2.15.2 [2.15.1]: https://github.com/osmcode/libosmium/compare/v2.15.0...v2.15.1 [2.15.0]: https://github.com/osmcode/libosmium/compare/v2.14.2...v2.15.0 ===================================== CMakeLists.txt ===================================== @@ -40,7 +40,7 @@ project(libosmium) set(LIBOSMIUM_VERSION_MAJOR 2) set(LIBOSMIUM_VERSION_MINOR 15) -set(LIBOSMIUM_VERSION_PATCH 2) +set(LIBOSMIUM_VERSION_PATCH 3) set(LIBOSMIUM_VERSION "${LIBOSMIUM_VERSION_MAJOR}.${LIBOSMIUM_VERSION_MINOR}.${LIBOSMIUM_VERSION_PATCH}") ===================================== debian/changelog ===================================== @@ -1,3 +1,15 @@ +libosmium (2.15.3-1~bpo10+1) buster-backports; urgency=medium + + * Rebuild for buster-backports. + + -- Bas Couwenberg Sun, 22 Sep 2019 06:42:55 +0200 + +libosmium (2.15.3-1) unstable; urgency=medium + + * New upstream release. + + -- Bas Couwenberg Tue, 17 Sep 2019 05:54:40 +0200 + libosmium (2.15.2-1~bpo10+1) buster-backports; urgency=medium * Rebuild for buster-backports. 
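The new "sorting" header option described in the changelog above is wired up in the decoder and PBF writer hunks that follow. A sketch of how application code could use it; this is an illustration against the usual libosmium reader/writer API, not code from the package (file names are placeholders, and the program is typically linked with -lz -lbz2 -lexpat -pthread):

    #include <iostream>
    #include <utility>

    #include <osmium/io/any_input.hpp>
    #include <osmium/io/any_output.hpp>
    #include <osmium/io/header.hpp>
    #include <osmium/io/reader.hpp>
    #include <osmium/io/writer.hpp>
    #include <osmium/memory/buffer.hpp>

    int main() {
        osmium::io::Reader reader{"in.osm.pbf"};        // placeholder input
        osmium::io::Header header = reader.header();

        // Filled in by the decoder when the input declares the optional
        // feature "Sort.Type_then_ID"; empty string otherwise.
        std::cout << "sorting: " << header.get("sorting") << "\n";

        // Declare the ordering for the output; the PBF writer then adds
        // "Sort.Type_then_ID" to the optional features of the new file.
        header.set("sorting", "Type_then_ID");
        osmium::io::Writer writer{"out.osm.pbf", header};   // must not exist yet

        while (osmium::memory::Buffer buffer = reader.read()) {
            writer(std::move(buffer));
        }
        writer.close();
        reader.close();
        return 0;
    }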
===================================== include/osmium/io/detail/pbf_decoder.hpp ===================================== @@ -847,8 +847,13 @@ namespace osmium { } } break; - case protozero::tag_and_type(OSMFormat::HeaderBlock::repeated_string_optional_features, protozero::pbf_wire_type::length_delimited): - header.set("pbf_optional_feature_" + std::to_string(i++), pbf_header_block.get_string()); + case protozero::tag_and_type(OSMFormat::HeaderBlock::repeated_string_optional_features, protozero::pbf_wire_type::length_delimited): { + const auto opt = pbf_header_block.get_string(); + header.set("pbf_optional_feature_" + std::to_string(i++), opt); + if (opt == "Sort.Type_then_ID") { + header.set("sorting", "Type_then_ID"); + } + } break; case protozero::tag_and_type(OSMFormat::HeaderBlock::optional_string_writingprogram, protozero::pbf_wire_type::length_delimited): header.set("generator", pbf_header_block.get_string()); ===================================== include/osmium/io/detail/pbf_output_format.hpp ===================================== @@ -577,6 +577,10 @@ namespace osmium { pbf_header_block.add_string(OSMFormat::HeaderBlock::repeated_string_optional_features, "LocationsOnWays"); } + if (header.get("sorting") == "Type_then_ID") { + pbf_header_block.add_string(OSMFormat::HeaderBlock::repeated_string_optional_features, "Sort.Type_then_ID"); + } + pbf_header_block.add_string(OSMFormat::HeaderBlock::optional_string_writingprogram, header.get("generator")); const std::string osmosis_replication_timestamp{header.get("osmosis_replication_timestamp")}; ===================================== include/osmium/io/detail/xml_input_format.hpp ===================================== @@ -60,6 +60,7 @@ DEALINGS IN THE SOFTWARE. #include #include +#include #include #include #include @@ -177,17 +178,44 @@ namespace osmium { class ExpatXMLParser { XML_Parser m_parser; + std::exception_ptr m_exception_ptr{}; - static void XMLCALL start_element_wrapper(void* data, const XML_Char* element, const XML_Char** attrs) { - static_cast(data)->start_element(element, attrs); + template + void member_wrap(XMLParser& xml_parser, TFunc&& func) noexcept { + if (m_exception_ptr) { + return; + } + try { + std::forward(func)(xml_parser); + } catch (...) { + m_exception_ptr = std::current_exception(); + XML_StopParser(m_parser, 0); + } + } + + template + static void wrap(void* data, TFunc&& func) noexcept { + assert(data); + auto& xml_parser = *static_cast(data); + xml_parser.m_expat_xml_parser->member_wrap(xml_parser, std::forward(func)); + } + + static void XMLCALL start_element_wrapper(void* data, const XML_Char* element, const XML_Char** attrs) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.start_element(element, attrs); + }); } - static void XMLCALL end_element_wrapper(void* data, const XML_Char* element) { - static_cast(data)->end_element(element); + static void XMLCALL end_element_wrapper(void* data, const XML_Char* element) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.end_element(element); + }); } - static void XMLCALL character_data_wrapper(void* data, const XML_Char* text, int len) { - static_cast(data)->characters(text, len); + static void XMLCALL character_data_wrapper(void* data, const XML_Char* text, int len) noexcept { + wrap(data, [&](XMLParser& xml_parser) { + xml_parser.characters(text, len); + }); } // This handler is called when there are any XML entities @@ -195,7 +223,7 @@ namespace osmium { // but they can be misused. 
See // https://en.wikipedia.org/wiki/Billion_laughs // The handler will just throw an error. - static void entity_declaration_handler(void* /*userData*/, + static void entity_declaration_handler(void* data, const XML_Char* /*entityName*/, int /*is_parameter_entity*/, const XML_Char* /*value*/, @@ -203,8 +231,10 @@ namespace osmium { const XML_Char* /*base*/, const XML_Char* /*systemId*/, const XML_Char* /*publicId*/, - const XML_Char* /*notationName*/) { - throw osmium::xml_error{"XML entities are not supported"}; + const XML_Char* /*notationName*/) noexcept { + wrap(data, [&](XMLParser& /*xml_parser*/) { + throw osmium::xml_error{"XML entities are not supported"}; + }); } public: @@ -233,12 +263,17 @@ namespace osmium { void operator()(const std::string& data, bool last) { assert(data.size() < std::numeric_limits::max()); if (XML_Parse(m_parser, data.data(), static_cast(data.size()), last) == XML_STATUS_ERROR) { + if (m_exception_ptr) { + std::rethrow_exception(m_exception_ptr); + } throw osmium::xml_error{m_parser}; } } }; // class ExpatXMLParser + ExpatXMLParser* m_expat_xml_parser{nullptr}; + template static void check_attributes(const XML_Char** attrs, T&& check) { while (*attrs) { @@ -739,6 +774,7 @@ namespace osmium { osmium::thread::set_thread_name("_osmium_xml_in"); ExpatXMLParser parser{this}; + m_expat_xml_parser = &parser; while (!input_done()) { const std::string data{get_input()}; ===================================== include/osmium/version.hpp ===================================== @@ -35,8 +35,8 @@ DEALINGS IN THE SOFTWARE. #define LIBOSMIUM_VERSION_MAJOR 2 #define LIBOSMIUM_VERSION_MINOR 15 -#define LIBOSMIUM_VERSION_PATCH 2 +#define LIBOSMIUM_VERSION_PATCH 3 -#define LIBOSMIUM_VERSION_STRING "2.15.2" +#define LIBOSMIUM_VERSION_STRING "2.15.3" #endif // OSMIUM_VERSION_HPP View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/compare/e9c3a68faa5c36282543c1c5922214a7600ddf7d...9df134ac1fac12cccb4518df73de6d229f95dcd7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/compare/e9c3a68faa5c36282543c1c5922214a7600ddf7d...9df134ac1fac12cccb4518df73de6d229f95dcd7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 22 05:59:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 04:59:00 +0000 Subject: [Git][debian-gis-team/libosmium] Pushed new tag debian/2.15.3-1_bpo10+1 Message-ID: <5d86ff94a4cec_15123faf6a7171dc951c@godard.mail> Bas Couwenberg pushed new tag debian/2.15.3-1_bpo10+1 at Debian GIS Project / libosmium -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/tree/debian/2.15.3-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sun Sep 22 06:09:09 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 05:09:09 +0000 Subject: Processing of libosmium_2.15.3-1~bpo10+1_source.changes Message-ID: libosmium_2.15.3-1~bpo10+1_source.changes uploaded successfully to localhost along with the files: libosmium_2.15.3-1~bpo10+1.dsc libosmium_2.15.3-1~bpo10+1.debian.tar.xz libosmium_2.15.3-1~bpo10+1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Sun Sep 22 06:18:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 05:18:53 +0000 Subject: [Git][debian-gis-team/osmium-tool] Pushed new branch buster-backports Message-ID: <5d87043da724b_15123faf6a2a96189551c@godard.mail> Bas Couwenberg pushed new branch buster-backports at Debian GIS Project / osmium-tool -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/tree/buster-backports You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 22 06:18:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 05:18:59 +0000 Subject: [Git][debian-gis-team/osmium-tool] Pushed new tag debian/1.11.0-1_bpo10+1 Message-ID: <5d8704436f954_15123faf6a6fc2ec95791@godard.mail> Bas Couwenberg pushed new tag debian/1.11.0-1_bpo10+1 at Debian GIS Project / osmium-tool -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/tree/debian/1.11.0-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 22 06:19:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 05:19:25 +0000 Subject: libosmium_2.15.3-1~bpo10+1_source.changes ACCEPTED into buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 22 Sep 2019 06:42:55 +0200 Source: libosmium Architecture: source Version: 2.15.3-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: libosmium (2.15.3-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. . libosmium (2.15.3-1) unstable; urgency=medium . * New upstream release. 
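The main fix in this release (the xml_input_format.hpp hunks above) is about not letting C++ exceptions unwind through C code such as Expat: the callback catches everything, remembers the exception in a std::exception_ptr, asks the C driver to stop, and the exception is rethrown once control is back in C++. A compact, self-contained illustration of that pattern; walk_records below is a made-up C-style API standing in for the real parser, not anything from libosmium or Expat:

    #include <exception>
    #include <iostream>
    #include <stdexcept>

    // Hypothetical C-style driver: calls the callback for each record and
    // stops early if the callback returns non-zero.
    using record_cb = int (*)(void* user, int record);

    static void walk_records(record_cb cb, void* user) {
        for (int i = 0; i < 10; ++i) {
            if (cb(user, i) != 0) {
                return;
            }
        }
    }

    struct Handler {
        std::exception_ptr error{};

        static int callback(void* user, int record) noexcept {
            auto& self = *static_cast<Handler*>(user);
            try {
                self.process(record);            // may throw
                return 0;
            } catch (...) {
                // Never let the exception cross the C frames: remember it and
                // ask the driver to stop (the fix does this with XML_StopParser).
                self.error = std::current_exception();
                return 1;
            }
        }

        void process(int record) {
            if (record == 3) {
                throw std::runtime_error{"bad record 3"};
            }
            std::cout << "record " << record << "\n";
        }

        void run() {
            walk_records(&Handler::callback, this);
            if (error) {
                std::rethrow_exception(error);   // rethrown in pure C++ context
            }
        }
    };

    int main() {
        try {
            Handler{}.run();
        } catch (const std::exception& e) {
            std::cout << "caught: " << e.what() << "\n";
        }
        return 0;
    }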
Checksums-Sha1: 3d4213444c2d65771b6a15e9b9915e73add7f9c2 2205 libosmium_2.15.3-1~bpo10+1.dsc 5ad297b64bffa48997df2fe45cf38bab2e0f9984 6364 libosmium_2.15.3-1~bpo10+1.debian.tar.xz 291390c44bc60dca804718fef954ff55fb6d2abd 12742 libosmium_2.15.3-1~bpo10+1_amd64.buildinfo Checksums-Sha256: 77498a0ebe4223ce875b1ffbcb873c7ad9ab71dce643eb63f28f8554795796b0 2205 libosmium_2.15.3-1~bpo10+1.dsc 9f44b958846c1fbf098ee4149d0baa92bd03f943f10e6136621347df84069910 6364 libosmium_2.15.3-1~bpo10+1.debian.tar.xz c4015547d9c4720aef9d28362035f3d9981d7b68174c3933161b8b283c47ea56 12742 libosmium_2.15.3-1~bpo10+1_amd64.buildinfo Files: 42b65f9d7ce1bce27fde91206b0c8dff 2205 science optional libosmium_2.15.3-1~bpo10+1.dsc dc0a496e6921cdfb684547f7211b39a4 6364 science optional libosmium_2.15.3-1~bpo10+1.debian.tar.xz 4551fd7c5cdeef694f26e47e0f169941 12742 science optional libosmium_2.15.3-1~bpo10+1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2G/yEACgkQZ1DxCuiN SvGqsg//cbCqXGxr5gMaNSYIe5QtF01MI8PefT0yj4S1wcViNtZVXCb1HdU9PMux hi4OClQnoCBQFVPKUlHcoQFTYDBH/jgT+szE4bjMzvmZUiaU+Lzhx5AfZfUTVRBy RoFLd6thz+2FF3fP2bCEscSK+BWXD+XKqerheeBsmmPTOnvU6eRP9/CSJ60qS6F9 Cf+D0lXMQQwQAk4mh5uARdHvbUpdIN21pC4OILY7zFoa746emmqgfBdNDPkKYxb3 GUKv6Jc8GXp7jOne1NekcJwL5KDjQHeoOrODHOj7vz6ZH06sEyGzX6ne/1kHy34s 9q7WXkzvj0cV/amcjxY1RUZHRSKfSiIxMGsqr3wATVycETFRRKL+gDrx171ndMQ9 2EteW4SSNYY7ChNQuqUlDtg+1BxcYekzcazPoA2XTysz27zbT1xtlRpAtNO6mu3G 6Nu11UQFLgYOwojI3t7aX0FrQzk+aNKFmNBvOaqjHSEMpAhDeV89iAeG3kKyFb0O FgUr11a44QzJF0qAkCbfwUnxaHSJtVBuy1HVuzXfVRjqgN3K7EL3ZFhxj0ida5ej Vp/mBRIdG4a0cZz+J3IzDO69MmxKzvhyYb7YGqEFuCwXk5DZESlbW7BueoI0MuTc nQnvZv3Zyp0S1RcZkQH9Cq3JW7mVOnRUPPkF1yMZCy8J7pL2EMU= =/Kvb -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Sun Sep 22 06:24:23 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 05:24:23 +0000 Subject: Processing of osmium-tool_1.11.0-1~bpo10+1_amd64.changes Message-ID: osmium-tool_1.11.0-1~bpo10+1_amd64.changes uploaded successfully to localhost along with the files: osmium-tool_1.11.0-1~bpo10+1.dsc osmium-tool_1.11.0-1~bpo10+1.debian.tar.xz osmium-tool-dbgsym_1.11.0-1~bpo10+1_amd64.deb osmium-tool_1.11.0-1~bpo10+1_amd64.buildinfo osmium-tool_1.11.0-1~bpo10+1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 22 06:34:28 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 05:34:28 +0000 Subject: osmium-tool_1.11.0-1~bpo10+1_amd64.changes is NEW Message-ID: binary:osmium-tool is NEW. binary:osmium-tool is NEW. source:osmium-tool is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. 
[1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From gitlab at salsa.debian.org Sun Sep 22 08:18:06 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 22 Sep 2019 07:18:06 +0000 Subject: [Git][debian-gis-team/trollimage][pristine-tar] pristine-tar data for trollimage_1.10.0.orig.tar.gz Message-ID: <5d87202e49c3c_15123faf6a6fc2ec98819@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / trollimage Commits: 293dcb9c by Antonio Valentino at 2019-09-22T06:57:10Z pristine-tar data for trollimage_1.10.0.orig.tar.gz - - - - - 2 changed files: - + trollimage_1.10.0.orig.tar.gz.delta - + trollimage_1.10.0.orig.tar.gz.id Changes: ===================================== trollimage_1.10.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/trollimage_1.10.0.orig.tar.gz.delta differ ===================================== trollimage_1.10.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +0aa01c4e27bd4831ee97109d165d1a302150a842 View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/293dcb9c69c39c7194b50aa74b2cbe03297bd807 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/293dcb9c69c39c7194b50aa74b2cbe03297bd807 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 22 08:18:32 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 22 Sep 2019 07:18:32 +0000 Subject: [Git][debian-gis-team/trollimage] Pushed new tag upstream/1.10.0 Message-ID: <5d8720484daf2_15123faf6a4c9a74992af@godard.mail> Antonio Valentino pushed new tag upstream/1.10.0 at Debian GIS Project / trollimage -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/tree/upstream/1.10.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 22 08:18:29 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 22 Sep 2019 07:18:29 +0000 Subject: [Git][debian-gis-team/trollimage][master] 6 commits: New upstream version 1.10.0 Message-ID: <5d872045e68a_15122b1bbd89b1fc990a2@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / trollimage Commits: b2240f1c by Antonio Valentino at 2019-09-22T06:57:07Z New upstream version 1.10.0 - - - - - e237c503 by Antonio Valentino at 2019-09-22T06:57:10Z Update upstream source from tag 'upstream/1.10.0' Update to upstream version '1.10.0' with Debian dir 4dd150bf59c33418444e99ebb99acdf6657ab925 - - - - - d73135bf by Antonio Valentino at 2019-09-22T06:58:53Z New upstream release - - - - - 1092646e by Antonio Valentino at 2019-09-22T07:02:56Z Refresh all patches - - - - - 03fadfaa by Antonio Valentino at 2019-09-22T07:13:37Z Explicit specification of Rules-Requires-Root in d/control - - - - - e5d9fe50 by Antonio Valentino at 2019-09-22T07:14:02Z Set distribution to unstable - - - - - 9 changed files: - + .pre-commit-config.yaml - CHANGELOG.md - debian/changelog - debian/control - debian/patches/0001-No-display.patch - trollimage/colormap.py - trollimage/tests/test_image.py - trollimage/version.py - trollimage/xrimage.py Changes: ===================================== .pre-commit-config.yaml ===================================== @@ -0,0 +1,8 @@ +exclude: '^$' +fail_fast: false +repos: +- repo: https://github.com/pre-commit/pre-commit-hooks + rev: v1.2.3 + hooks: + - id: flake8 + additional_dependencies: [flake8-docstrings, flake8-debugger, flake8-bugbear] ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,20 @@ +## Version 1.10.0 (2019/09/20) + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 53](https://github.com/pytroll/trollimage/pull/53) - Fix double format passing in saving functions + +#### Features added + +* [PR 55](https://github.com/pytroll/trollimage/pull/55) - Add enhancement-history to the image +* [PR 54](https://github.com/pytroll/trollimage/pull/54) - Add ability to use AreaDefinitions new "crs" property +* [PR 52](https://github.com/pytroll/trollimage/pull/52) - Add 'colors' and 'values' keyword arguments to Colormap + +In this release 4 pull requests were closed. + + ## Version 1.9.0 (2019/06/18) ### Pull Requests Merged ===================================== debian/changelog ===================================== @@ -1,3 +1,13 @@ +trollimage (1.10.0-1) unstable; urgency=medium + + * New upstream release. + * debian/patches: + - refresh all patches + * debian/control: + - explicit specification of Rules-Requires-Root + + -- Antonio Valentino Sun, 22 Sep 2019 07:13:42 +0000 + trollimage (1.9.0-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -3,6 +3,7 @@ Maintainer: Debian GIS Project Uploaders: Antonio Valentino Section: python Testsuite: autopkgtest-pkg-python +Rules-Requires-Root: no Priority: optional Build-Depends: debhelper-compat (= 12), dh-python, ===================================== debian/patches/0001-No-display.patch ===================================== @@ -8,14 +8,14 @@ Skip tests that require display. 
1 file changed, 1 insertion(+) diff --git a/trollimage/tests/test_image.py b/trollimage/tests/test_image.py -index 4e7f34e..1b8cc74 100644 +index 06f5fd0..890bd85 100644 --- a/trollimage/tests/test_image.py +++ b/trollimage/tests/test_image.py -@@ -1859,6 +1859,7 @@ class TestXRImage(unittest.TestCase): - def test_putalpha(self): +@@ -1863,6 +1863,7 @@ class TestXRImage(unittest.TestCase): + """Test putalpha.""" pass + @unittest.skip("no display") def test_show(self): - """Test that the show commands calls PIL.show""" + """Test that the show commands calls PIL.show.""" import xarray as xr ===================================== trollimage/colormap.py ===================================== @@ -88,9 +88,13 @@ class Colormap(object): """ - def __init__(self, *tuples): - values = [a for (a, b) in tuples] - colors = [b for (a, b) in tuples] + def __init__(self, *tuples, **kwargs): + if 'colors' in kwargs and 'values' in kwargs: + values = kwargs['values'] + colors = kwargs['colors'] + else: + values = [a for (a, b) in tuples] + colors = [b for (a, b) in tuples] self.values = np.array(values) self.colors = np.array(colors) ===================================== trollimage/tests/test_image.py ===================================== @@ -21,8 +21,7 @@ # You should have received a copy of the GNU General Public License # along with mpop. If not, see . -"""Module for testing the imageo.image module. -""" +"""Module for testing the image and xrimage modules.""" import os import sys import random @@ -58,18 +57,15 @@ class CustomScheduler(object): class TestEmptyImage(unittest.TestCase): - """Class for testing the mpop.imageo.image module - """ + """Class for testing the mpop.imageo.image module.""" def setUp(self): - """Setup the test. - """ + """Set up the test case.""" self.img = image.Image() self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] def test_shape(self): - """Shape of an empty image. - """ + """Shape of an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -77,13 +73,11 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_is_empty(self): - """Test if an image is empty. - """ + """Test if an image is empty.""" self.assertEqual(self.img.is_empty(), True) def test_clip(self): - """Clip an empty image. - """ + """Clip an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -91,8 +85,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_convert(self): - """Convert an empty image. - """ + """Convert an empty image.""" for mode1 in self.modes: for mode2 in self.modes: self.img.convert(mode1) @@ -108,8 +101,7 @@ class TestEmptyImage(unittest.TestCase): self.assertRaises(ValueError, self.img.convert, randstr) def test_stretch(self): - """Stretch an empty image - """ + """Stretch an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -136,8 +128,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_gamma(self): - """Gamma correction on an empty image. - """ + """Gamma correction on an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -158,8 +149,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_invert(self): - """Invert an empty image. 
- """ + """Invert an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -174,8 +164,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_pil_image(self): - """Return an empty PIL image. - """ + """Return an empty PIL image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -189,8 +178,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_putalpha(self): - """Add an alpha channel to en empty image - """ + """Add an alpha channel to en empty image.""" # Putting alpha channel to an empty image should not do anything except # change the mode if necessary. oldmode = self.img.mode @@ -212,8 +200,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_save(self): - """Save an empty image. - """ + """Save an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -222,8 +209,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_replace_luminance(self): - """Replace luminance in an empty image. - """ + """Replace luminance in an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -234,13 +220,11 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_resize(self): - """Resize an empty image. - """ + """Resize an empty image.""" self.assertRaises(ValueError, self.img.resize, (10, 10)) def test_merge(self): - """Merging of an empty image with another. - """ + """Merging of an empty image with another.""" newimg = image.Image() self.assertRaises(ValueError, self.img.merge, newimg) newimg = image.Image(np.array([[1, 2], [3, 4]])) @@ -250,20 +234,16 @@ class TestEmptyImage(unittest.TestCase): class TestImageCreation(unittest.TestCase): - """Class for testing the mpop.imageo.image module - """ + """Class for testing the mpop.imageo.image module.""" def setUp(self): - """Setup the test. - """ + """Set up the test case.""" self.img = {} self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] self.modes_len = [1, 2, 3, 4, 3, 4, 1, 2] def test_creation(self): - """Creation of an image. - """ - + """Test creation of an image.""" self.assertRaises(TypeError, image.Image, channels=random.randint(1, 1000)) self.assertRaises(TypeError, image.Image, @@ -338,12 +318,10 @@ class TestImageCreation(unittest.TestCase): class TestRegularImage(unittest.TestCase): - """Class for testing the mpop.imageo.image module - """ + """Class for testing the mpop.imageo.image module.""" def setUp(self): - """Setup the test. - """ + """Set up the test case.""" one_channel = np.random.rand(random.randint(1, 10), random.randint(1, 10)) self.rand_img = image.Image(channels=[one_channel] * 3, @@ -370,8 +348,7 @@ class TestRegularImage(unittest.TestCase): os.chmod(self.tempdir, 0o444) def test_shape(self): - """Shape of an image. - """ + """Shape of an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -381,13 +358,11 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_is_empty(self): - """Test if an image is empty. - """ + """Test if an image is empty.""" self.assertEqual(self.img.is_empty(), False) def test_clip(self): - """Clip an image. - """ + """Clip an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -399,8 +374,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_convert(self): - """Convert an image. 
- """ + """Convert an image.""" i = 0 for mode1 in self.modes: j = 0 @@ -437,8 +411,7 @@ class TestRegularImage(unittest.TestCase): self.assertRaises(ValueError, self.img.convert, randstr) def test_stretch(self): - """Stretch an image. - """ + """Stretch an image.""" oldmode = self.img.mode for mode in "L": @@ -479,8 +452,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_gamma(self): - """Gamma correction on an image. - """ + """Gamma correction on an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -519,8 +491,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_invert(self): - """Invert an image. - """ + """Invert an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -543,11 +514,8 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_pil_image(self): - """Return an PIL image. - """ - + """Return an PIL image.""" # FIXME: Should test on palette images - oldmode = self.img.mode for mode in self.modes: if (mode == "YCbCr" or @@ -561,8 +529,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_putalpha(self): - """Add an alpha channel. - """ + """Add an alpha channel.""" # Putting alpha channel to an image should not do anything except # change the mode if necessary. oldmode = self.img.mode @@ -590,8 +557,7 @@ class TestRegularImage(unittest.TestCase): @unittest.skipIf(sys.platform.startswith('win'), "Read-only tmp dir not working under Windows") def test_save(self): - """Save an image. - """ + """Save an image.""" oldmode = self.img.mode for mode in self.modes: if (mode == "YCbCr" or @@ -614,8 +580,7 @@ class TestRegularImage(unittest.TestCase): @unittest.skipIf(sys.platform.startswith('win'), "Read-only tmp dir not working under Windows") def test_save_jpeg(self): - """Save a jpeg image. - """ + """Save a jpeg image.""" oldmode = self.img.mode self.img.convert('L') self.img.save("test.jpg") @@ -630,8 +595,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_replace_luminance(self): - """Replace luminance in an image. - """ + """Replace luminance in an image.""" oldmode = self.img.mode for mode in self.modes: if (mode == "P" or @@ -651,8 +615,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_resize(self): - """Resize an image. - """ + """Resize an image.""" self.img.resize((6, 6)) res = np.array([[0, 0, 0.5, 0.5, 0.5, 0.5], [0, 0, 0.5, 0.5, 0.5, 0.5], @@ -667,8 +630,7 @@ class TestRegularImage(unittest.TestCase): self.assertTrue(np.all(res == self.img.channels[0])) def test_merge(self): - """Merging of an image with another. - """ + """Merging of an image with another.""" newimg = image.Image() self.assertRaises(ValueError, self.img.merge, newimg) newimg = image.Image(np.array([[1, 2], [3, 4]])) @@ -686,17 +648,16 @@ class TestRegularImage(unittest.TestCase): EPSILON)) def tearDown(self): - """Clean up the mess. - """ + """Clean up the mess.""" os.chmod(self.tempdir, 0o777) os.rmdir(self.tempdir) class TestFlatImage(unittest.TestCase): - """Test a flat image, ie an image where min == max. 
- """ + """Test a flat image, ie an image where min == max.""" def setUp(self): + """Set up the test case.""" channel = np.ma.array([[0, 0.5, 0.5], [0.5, 0.25, 0.25]], mask=[[1, 1, 1], [1, 1, 0]]) self.img = image.Image(channels=[channel] * 3, @@ -704,8 +665,7 @@ class TestFlatImage(unittest.TestCase): self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] def test_stretch(self): - """Stretch a flat image. - """ + """Stretch a flat image.""" self.img.stretch() self.assertTrue(self.img.channels[0].shape == (2, 3) and np.ma.count_masked(self.img.channels[0]) == 5) @@ -724,10 +684,10 @@ class TestFlatImage(unittest.TestCase): class TestNoDataImage(unittest.TestCase): - """Test an image filled with no data. - """ + """Test an image filled with no data.""" def setUp(self): + """Set up the test case.""" channel = np.ma.array([[0, 0.5, 0.5], [0.5, 0.25, 0.25]], mask=[[1, 1, 1], [1, 1, 1]]) self.img = image.Image(channels=[channel] * 3, @@ -735,8 +695,7 @@ class TestNoDataImage(unittest.TestCase): self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] def test_stretch(self): - """Stretch a no data image. - """ + """Stretch a no data image.""" self.img.stretch() self.assertTrue(self.img.channels[0].shape == (2, 3)) self.img.stretch("crude") @@ -752,16 +711,16 @@ class TestNoDataImage(unittest.TestCase): def random_string(length, choices="abcdefghijklmnopqrstuvwxyz" "ABCDEFGHIJKLMNOPQRSTUVWXYZ"): - """Generates a random string with elements from *set* of the specified - *length*. - """ + """Generate a random string with elements from *set* of the specified *length*.""" return "".join([random.choice(choices) for dummy in range(length)]) class TestXRImage(unittest.TestCase): + """Test XRImage objects.""" def test_init(self): + """Test object initialization.""" import xarray as xr from trollimage import xrimage data = xr.DataArray([[0, 0.5, 0.5], [0.5, 0.25, 0.25]], dims=['y', 'x']) @@ -793,9 +752,23 @@ class TestXRImage(unittest.TestCase): img = xrimage.XRImage(data) self.assertEqual(img.mode, 'YCbCrA') + def test_regression_double_format_save(self): + """Test that double format information isn't passed to save.""" + import xarray as xr + from trollimage import xrimage + + data = xr.DataArray(np.arange(75).reshape(5, 5, 3) / 74., dims=[ + 'y', 'x', 'bands'], coords={'bands': ['R', 'G', 'B']}) + with mock.patch.object(xrimage.XRImage, 'pil_save') as pil_save: + img = xrimage.XRImage(data) + + img.save(filename='bla.png', fformat='png', format='png') + self.assertNotIn('format', pil_save.call_args_list[0][1]) + @unittest.skipIf(sys.platform.startswith('win'), "'NamedTemporaryFile' not supported on Windows") def test_save(self): + """Test saving.""" import xarray as xr import dask.array as da from dask.delayed import Delayed @@ -1236,8 +1209,10 @@ class TestXRImage(unittest.TestCase): img = xrimage.XRImage(data) img.gamma(.5) self.assertTrue(np.allclose(img.data.values, arr ** 2)) + self.assertDictEqual(img.data.attrs['enhancement_history'][0], {'gamma': 0.5}) img.gamma([2., 2., 2.]) + self.assertEqual(len(img.data.attrs['enhancement_history']), 2) self.assertTrue(np.allclose(img.data.values, arr)) def test_crude_stretch(self): @@ -1253,6 +1228,11 @@ class TestXRImage(unittest.TestCase): red = img.data.sel(bands='R') green = img.data.sel(bands='G') blue = img.data.sel(bands='B') + enhs = img.data.attrs['enhancement_history'][0] + scale_expected = np.array([0.01388889, 0.01388889, 0.01388889]) + offset_expected = np.array([0., -0.01388889, -0.02777778]) + 
np.testing.assert_allclose(enhs['scale'].values, scale_expected) + np.testing.assert_allclose(enhs['offset'].values, offset_expected) np.testing.assert_allclose(red, arr[:, :, 0] / 72.) np.testing.assert_allclose(green, (arr[:, :, 1] - 1.) / (73. - 1.)) np.testing.assert_allclose(blue, (arr[:, :, 2] - 2.) / (74. - 2.)) @@ -1275,7 +1255,8 @@ class TestXRImage(unittest.TestCase): img = xrimage.XRImage(data) img.invert(True) - + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'scale': -1, 'offset': 1}) self.assertTrue(np.allclose(img.data.values, 1 - arr)) data = xr.DataArray(arr.copy(), dims=['y', 'x', 'bands'], @@ -1299,6 +1280,9 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch_linear() + enhs = img.data.attrs['enhancement_history'][0] + np.testing.assert_allclose(enhs['scale'].values, np.array([1.03815937, 1.03815937, 1.03815937])) + np.testing.assert_allclose(enhs['offset'].values, np.array([-0.00505051, -0.01907969, -0.03310887]), atol=1e-8) res = np.array([[[-0.005051, -0.005051, -0.005051], [0.037037, 0.037037, 0.037037], [0.079125, 0.079125, 0.079125], @@ -1328,6 +1312,7 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_histogram_stretch(self): + """Test histogram stretching.""" import xarray as xr from trollimage import xrimage @@ -1336,6 +1321,8 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch('histogram') + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'hist_equalize': True}) res = np.array([[[0., 0., 0.], [0.04166667, 0.04166667, 0.04166667], [0.08333333, 0.08333333, 0.08333333], @@ -1369,6 +1356,7 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_logarithmic_stretch(self): + """Test logarithmic strecthing.""" import xarray as xr from trollimage import xrimage @@ -1377,6 +1365,8 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch(stretch='logarithmic') + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'log_factor': 100.0}) res = np.array([[[0., 0., 0.], [0.35484693, 0.35484693, 0.35484693], [0.48307087, 0.48307087, 0.48307087], @@ -1410,7 +1400,7 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_weber_fechner_stretch(self): - """S=2.3klog10I+C """ + """Test applying S=2.3klog10I+C to the data.""" import xarray as xr from trollimage import xrimage @@ -1419,6 +1409,8 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch_weber_fechner(2.5, 0.2) + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'weber_fechner': (2.5, 0.2)}) res = np.array([[[-np.inf, -6.73656795, -5.0037], [-3.99003723, -3.27083205, -2.71297317], [-2.25716928, -1.87179258, -1.5379641], @@ -1452,27 +1444,35 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_jpeg_save(self): + """Test saving to jpeg.""" pass def test_gtiff_save(self): + """Test saving to geotiff.""" pass def test_save_masked(self): + """Test saving masked data.""" pass def test_LA_save(self): + """Test LA saving.""" pass def test_L_save(self): + """Test L saving.""" pass def test_P_save(self): + """Test P saving.""" pass def 
test_PA_save(self): + """Test PA saving.""" pass def test_convert_modes(self): + """Test modes convertions.""" import dask import xarray as xr from trollimage import xrimage @@ -1782,7 +1782,7 @@ class TestXRImage(unittest.TestCase): self.assertTupleEqual((2, 4), bw.colors.shape) def test_stack(self): - + """Test stack.""" import xarray as xr from trollimage import xrimage @@ -1810,9 +1810,11 @@ class TestXRImage(unittest.TestCase): np.testing.assert_allclose(bkg.data, res.data, rtol=1e-05) def test_merge(self): + """Test merge.""" pass def test_blend(self): + """Test blend.""" import xarray as xr from trollimage import xrimage @@ -1854,13 +1856,15 @@ class TestXRImage(unittest.TestCase): img1.blend(wrongimg) def test_replace_luminance(self): + """Test luminance replacement.""" pass def test_putalpha(self): + """Test putalpha.""" pass def test_show(self): - """Test that the show commands calls PIL.show""" + """Test that the show commands calls PIL.show.""" import xarray as xr from trollimage import xrimage @@ -1873,7 +1877,7 @@ class TestXRImage(unittest.TestCase): def suite(): - """The suite for test_image.""" + """Create the suite for test_image.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() mysuite.addTest(loader.loadTestsFromTestCase(TestEmptyImage)) ===================================== trollimage/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (tag: v1.9.0)" - git_full = "63fa32f2d40bb65ebc39c4be1fb1baf8f163db98" - git_date = "2019-06-18 06:13:40 -0500" + git_refnames = " (HEAD -> master, tag: v1.10.0)" + git_full = "b1fb06cbf6ef8b23e5816c423df1eeaf8e76d606" + git_date = "2019-09-20 09:41:02 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== trollimage/xrimage.py ===================================== @@ -21,12 +21,14 @@ # # You should have received a copy of the GNU General Public License # along with this program. If not, see . -"""This module defines the XRImage class. It overlaps largely with the PIL -library, but has the advantage of using :class:`~xarray.DataArray` objects -backed by :class:`dask arrays ` as pixel arrays. This -allows for invalid values to be tracked, metadata to be assigned, and -stretching to be lazy evaluated. With the optional ``rasterio`` library -installed dask array chunks can be saved in parallel. +"""This module defines the XRImage class. + +It overlaps largely with the PIL library, but has the advantage of using +:class:`~xarray.DataArray` objects backed by :class:`dask arrays +` as pixel arrays. This allows for invalid values to +be tracked, metadata to be assigned, and stretching to be lazy +evaluated. With the optional ``rasterio`` library installed dask array +chunks can be saved in parallel. 
""" @@ -96,12 +98,14 @@ class RIOFile(object): indexes=indexes) def open(self, mode=None): + """Open the file.""" mode = mode or self.mode if self._closed: self.rfile = rasterio.open(self.path, mode, **self.kwargs) self._closed = False def close(self): + """Close the file.""" if not self._closed: if self.overviews: logger.debug('Building overviews %s', str(self.overviews)) @@ -119,6 +123,7 @@ class RIOFile(object): self.close() def __del__(self): + """Delete the instance.""" try: self.close() except (IOError, OSError): @@ -168,8 +173,13 @@ def color_interp(data): class XRImage(object): """Image class using an :class:`xarray.DataArray` as internal storage. - It can be saved to a variety of image formats, but if Rasterio is installed, - it can save to geotiff and jpeg2000 with geographical information. + It can be saved to a variety of image formats, but if Rasterio is + installed, it can save to geotiff and jpeg2000 with geographical + information. + + The enhancements functions are recording some parameters in the image's + data attribute called `enhancement_history`. + """ def __init__(self, data): @@ -252,7 +262,8 @@ class XRImage(object): saving with rasterio, used with keep_palette=True. Should be uint8. format_kwargs: Additional format options to pass to `rasterio` - or `PIL` saving methods. + or `PIL` saving methods. Any format argument passed + at this stage would be superseeded by `fformat`. Returns: Either `None` if `compute` is True or a `dask.Delayed` object or @@ -263,7 +274,8 @@ class XRImage(object): the caller. """ - fformat = fformat or os.path.splitext(filename)[1][1:4] + kwformat = format_kwargs.pop('format', None) + fformat = fformat or kwformat or os.path.splitext(filename)[1][1:] if fformat in ('tif', 'jp2') and rasterio: return self.rio_save(filename, fformat=fformat, fill_value=fill_value, compute=compute, @@ -284,7 +296,7 @@ class XRImage(object): img.rio_save('myfile.tif', overviews=[2, 4, 8, 16]) """ - fformat = fformat or os.path.splitext(filename)[1][1:4] + fformat = fformat or os.path.splitext(filename)[1][1:] drivers = {'jpg': 'JPEG', 'png': 'PNG', 'tif': 'GTiff', @@ -318,7 +330,11 @@ class XRImage(object): photometric_map[mode.upper()]) try: - crs = rasterio.crs.CRS(data.attrs['area'].proj_dict) + area = data.attrs['area'] + if hasattr(area, 'crs'): + crs = rasterio.crs.CRS.from_wkt(area.crs.to_wkt()) + else: + crs = rasterio.crs.CRS(data.attrs['area'].proj_dict) west, south, east, north = data.attrs['area'].area_extent height, width = data.sizes['y'], data.sizes['x'] transform = rasterio.transform.from_bounds(west, south, @@ -380,10 +396,11 @@ class XRImage(object): compute=True, **format_kwargs): """Save the image to the given *filename* using PIL. - For now, the compression level [0-9] is ignored, due to PIL's lack of - support. See also :meth:`save`. + For now, the compression level [0-9] is ignored, due to PIL's + lack of support. See also :meth:`save`. + """ - fformat = fformat or os.path.splitext(filename)[1][1:4] + fformat = fformat or os.path.splitext(filename)[1][1:] fformat = check_image_format(fformat) if fformat == 'png': @@ -402,6 +419,7 @@ class XRImage(object): Inspired by: public domain, Nick Galbreath http://blog.modp.com/2007/08/python-pil-and-png-metadata-take-2.html + """ reserved = ('interlace', 'gamma', 'dpi', 'transparency', 'aspect') @@ -531,8 +549,7 @@ class XRImage(object): return data def _l2rgb(self, mode): - """Convert from L (black and white) to RGB. 
- """ + """Convert from L (black and white) to RGB.""" self._check_modes(("L", "LA")) bands = ["L"] * 3 @@ -543,6 +560,7 @@ class XRImage(object): return data def convert(self, mode): + """Convert image to *mode*.""" if mode == self.mode: return self.__class__(self.data) @@ -588,7 +606,7 @@ class XRImage(object): return new_img def _finalize(self, fill_value=None, dtype=np.uint8, keep_palette=False, cmap=None): - """Wrapper around 'finalize' method for backwards compatibility.""" + """Wrap around 'finalize' method for backwards compatibility.""" import warnings warnings.warn("'_finalize' is deprecated, use 'finalize' instead.", DeprecationWarning) @@ -597,13 +615,14 @@ class XRImage(object): def finalize(self, fill_value=None, dtype=np.uint8, keep_palette=False, cmap=None): """Finalize the image to be written to an output file. - This adds an alpha band or fills data with a fill_value (if specified). - It also scales float data to the output range of the data type (0-255 - for uint8, default). For integer input data this method assumes the - data is already scaled to the proper desired range. It will still fill - in invalid values and add an alpha band if needed. Integer input - data's fill value is determined by a special ``_FillValue`` attribute - in the ``DataArray`` ``.attrs`` dictionary. + This adds an alpha band or fills data with a fill_value (if + specified). It also scales float data to the output range of the + data type (0-255 for uint8, default). For integer input data + this method assumes the data is already scaled to the proper + desired range. It will still fill in invalid values and add an + alpha band if needed. Integer input data's fill value is + determined by a special ``_FillValue`` attribute in the + ``DataArray`` ``.attrs`` dictionary. """ if keep_palette and not self.mode.startswith('P'): @@ -674,19 +693,20 @@ class XRImage(object): dims=['bands'], coords={'bands': self.data['bands']}) - def gamma(self, gamma=1.0): + def gamma(self, gamma=None): """Apply gamma correction to the channels of the image. - If *gamma* is a - tuple, then it should have as many elements as the channels of the - image, and the gamma correction is applied elementwise. If *gamma* is a - number, the same gamma correction is applied on every channel, if there - are several channels in the image. The behaviour of :func:`gamma` is - undefined outside the normal [0,1] range of the channels. + If *gamma* is a tuple, then it should have as many elements as + the channels of the image, and the gamma correction is applied + elementwise. If *gamma* is a number, the same gamma correction + is applied on every channel, if there are several channels in + the image. The behaviour of :func:`gamma` is undefined outside + the normal [0,1] range of the channels. + """ if isinstance(gamma, (list, tuple)): gamma = self.xrify_tuples(gamma) - elif gamma == 1.0: + elif gamma is None or gamma == 1.0: return logger.debug("Applying gamma %s", str(gamma)) @@ -694,18 +714,21 @@ class XRImage(object): self.data = self.data.clip(min=0) self.data **= 1.0 / gamma self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'gamma': gamma}) def stretch(self, stretch="crude", **kwargs): """Apply stretching to the current image. - The value of *stretch* sets the type of stretching applied. The values - "histogram", "linear", "crude" (or "crude-stretch") perform respectively - histogram equalization, contrast stretching (with 5% cutoff on both - sides), and contrast stretching without cutoff. 
The value "logarithmic" - or "log" will do a logarithmic enhancement towards white. If a tuple or - a list of two values is given as input, then a contrast stretching is - performed with the values as cutoff. These values should be normalized - in the range [0.0,1.0]. + The value of *stretch* sets the type of stretching applied. The + values "histogram", "linear", "crude" (or "crude-stretch") + perform respectively histogram equalization, contrast stretching + (with 5% cutoff on both sides), and contrast stretching without + cutoff. The value "logarithmic" or "log" will do a logarithmic + enhancement towards white. If a tuple or a list of two values is + given as input, then a contrast stretching is performed with the + values as cutoff. These values should be normalized in the range + [0.0,1.0]. + """ logger.debug("Applying stretch %s with parameters %s", stretch, str(kwargs)) @@ -735,7 +758,7 @@ class XRImage(object): @staticmethod def _compute_quantile(data, dims, cutoffs): - """Helper method for stretch_linear. + """Compute quantile for stretch_linear. Dask delayed functions need to be non-internal functions (created inside a function) to be serializable on a multi-process scheduler. @@ -756,6 +779,7 @@ class XRImage(object): """Stretch linearly the contrast of the current image. Use *cutoffs* for left and right trimming. + """ logger.debug("Perform a linear contrast stretch.") @@ -786,8 +810,9 @@ class XRImage(object): def crude_stretch(self, min_stretch=None, max_stretch=None): """Perform simple linear stretching. - This is done without any cutoff on the current image and normalizes to - the [0,1] range. + This is done without any cutoff on the current image and + normalizes to the [0,1] range. + """ if min_stretch is None: non_band_dims = tuple(x for x in self.data.dims if x != 'bands') @@ -808,9 +833,12 @@ class XRImage(object): else: scale_factor = 1.0 / delta attrs = self.data.attrs - self.data -= min_stretch + offset = -min_stretch * scale_factor self.data *= scale_factor + self.data += offset self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'scale': scale_factor, + 'offset': offset}) def stretch_hist_equalize(self, approximate=False): """Stretch the current image's colors through histogram equalization. @@ -858,6 +886,7 @@ class XRImage(object): band_results.append(self.data.sel(bands='A')) self.data.data = da.stack(band_results, axis=self.data.dims.index('bands')) + self.data.attrs.setdefault('enhancement_history', []).append({'hist_equalize': True}) def stretch_logarithmic(self, factor=100.): """Move data into range [1:factor] through normalized logarithm.""" @@ -885,6 +914,7 @@ class XRImage(object): band_results.append(self.data.sel(bands='A')) self.data.data = da.stack(band_results, axis=self.data.dims.index('bands')) + self.data.attrs.setdefault('enhancement_history', []).append({'log_factor': factor}) def stretch_weber_fechner(self, k, s0): """Stretch according to the Weber-Fechner law. @@ -892,10 +922,12 @@ class XRImage(object): p = k.ln(S/S0) p is perception, S is the stimulus, S0 is the stimulus threshold (the highest unpercieved stimulus), and k is the factor. + """ attrs = self.data.attrs self.data = k * xu.log(self.data / s0) self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'weber_fechner': (k, s0)}) def invert(self, invert=True): """Inverts all the channels of a image according to *invert*. 
@@ -905,6 +937,7 @@ class XRImage(object): Note: 'Inverting' means that black becomes white, and vice-versa, not that the values are negated ! + """ logger.debug("Applying invert with parameters %s", str(invert)) if isinstance(invert, (tuple, list)): @@ -917,10 +950,11 @@ class XRImage(object): attrs = self.data.attrs self.data = self.data * scale + offset self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'scale': scale, + 'offset': offset}) def stack(self, img): - """Stack the provided image on top of the current image. - """ + """Stack the provided image on top of the current image.""" # TODO: Conversions between different modes with notification # to the user, i.e. proper logging if self.mode != img.mode: @@ -929,8 +963,10 @@ class XRImage(object): self.data = self.data.where(img.data.isnull(), img.data) def merge(self, img): - """Use the provided image as background for the current *img* image, - that is if the current image has missing data. + """Use the provided image as background for the current *img* image. + + That is if the current image has missing data. + """ raise NotImplementedError("This method has not be implemented for " "xarray support.") @@ -966,7 +1002,6 @@ class XRImage(object): Works only on "L" or "LA" images. """ - if self.mode not in ("L", "LA"): raise ValueError("Image should be grayscale to colorize") @@ -997,7 +1032,7 @@ class XRImage(object): @staticmethod def _palettize(data, colormap): - """Helper for dask-friendly palettize operation.""" + """Operate in a dask-friendly manner.""" # returns data and palette, only need data return colormap.palettize(data)[0] @@ -1009,7 +1044,6 @@ class XRImage(object): Works only on "L" or "LA" images. """ - if self.mode not in ("L", "LA"): raise ValueError("Image should be grayscale to colorize") View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/compare/57d3249a227f615b25b3dce3f2895222e5026410...e5d9fe50bab42c71a3c76ce1409589efee173fb8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/compare/57d3249a227f615b25b3dce3f2895222e5026410...e5d9fe50bab42c71a3c76ce1409589efee173fb8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 22 08:18:33 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Sun, 22 Sep 2019 07:18:33 +0000 Subject: [Git][debian-gis-team/trollimage][upstream] New upstream version 1.10.0 Message-ID: <5d8720491edf8_15123faf6a2a9618993b0@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / trollimage Commits: b2240f1c by Antonio Valentino at 2019-09-22T06:57:07Z New upstream version 1.10.0 - - - - - 6 changed files: - + .pre-commit-config.yaml - CHANGELOG.md - trollimage/colormap.py - trollimage/tests/test_image.py - trollimage/version.py - trollimage/xrimage.py Changes: ===================================== .pre-commit-config.yaml ===================================== @@ -0,0 +1,8 @@ +exclude: '^$' +fail_fast: false +repos: +- repo: https://github.com/pre-commit/pre-commit-hooks + rev: v1.2.3 + hooks: + - id: flake8 + additional_dependencies: [flake8-docstrings, flake8-debugger, flake8-bugbear] ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,20 @@ +## Version 1.10.0 (2019/09/20) + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 53](https://github.com/pytroll/trollimage/pull/53) - Fix double format passing in saving functions + +#### Features added + +* [PR 55](https://github.com/pytroll/trollimage/pull/55) - Add enhancement-history to the image +* [PR 54](https://github.com/pytroll/trollimage/pull/54) - Add ability to use AreaDefinitions new "crs" property +* [PR 52](https://github.com/pytroll/trollimage/pull/52) - Add 'colors' and 'values' keyword arguments to Colormap + +In this release 4 pull requests were closed. + + ## Version 1.9.0 (2019/06/18) ### Pull Requests Merged ===================================== trollimage/colormap.py ===================================== @@ -88,9 +88,13 @@ class Colormap(object): """ - def __init__(self, *tuples): - values = [a for (a, b) in tuples] - colors = [b for (a, b) in tuples] + def __init__(self, *tuples, **kwargs): + if 'colors' in kwargs and 'values' in kwargs: + values = kwargs['values'] + colors = kwargs['colors'] + else: + values = [a for (a, b) in tuples] + colors = [b for (a, b) in tuples] self.values = np.array(values) self.colors = np.array(colors) ===================================== trollimage/tests/test_image.py ===================================== @@ -21,8 +21,7 @@ # You should have received a copy of the GNU General Public License # along with mpop. If not, see . -"""Module for testing the imageo.image module. -""" +"""Module for testing the image and xrimage modules.""" import os import sys import random @@ -58,18 +57,15 @@ class CustomScheduler(object): class TestEmptyImage(unittest.TestCase): - """Class for testing the mpop.imageo.image module - """ + """Class for testing the mpop.imageo.image module.""" def setUp(self): - """Setup the test. - """ + """Set up the test case.""" self.img = image.Image() self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] def test_shape(self): - """Shape of an empty image. - """ + """Shape of an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -77,13 +73,11 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_is_empty(self): - """Test if an image is empty. - """ + """Test if an image is empty.""" self.assertEqual(self.img.is_empty(), True) def test_clip(self): - """Clip an empty image. 
- """ + """Clip an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -91,8 +85,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_convert(self): - """Convert an empty image. - """ + """Convert an empty image.""" for mode1 in self.modes: for mode2 in self.modes: self.img.convert(mode1) @@ -108,8 +101,7 @@ class TestEmptyImage(unittest.TestCase): self.assertRaises(ValueError, self.img.convert, randstr) def test_stretch(self): - """Stretch an empty image - """ + """Stretch an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -136,8 +128,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_gamma(self): - """Gamma correction on an empty image. - """ + """Gamma correction on an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -158,8 +149,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_invert(self): - """Invert an empty image. - """ + """Invert an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -174,8 +164,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_pil_image(self): - """Return an empty PIL image. - """ + """Return an empty PIL image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -189,8 +178,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_putalpha(self): - """Add an alpha channel to en empty image - """ + """Add an alpha channel to en empty image.""" # Putting alpha channel to an empty image should not do anything except # change the mode if necessary. oldmode = self.img.mode @@ -212,8 +200,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_save(self): - """Save an empty image. - """ + """Save an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -222,8 +209,7 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_replace_luminance(self): - """Replace luminance in an empty image. - """ + """Replace luminance in an empty image.""" oldmode = self.img.mode for mode in self.modes: self.img.convert(mode) @@ -234,13 +220,11 @@ class TestEmptyImage(unittest.TestCase): self.img.convert(oldmode) def test_resize(self): - """Resize an empty image. - """ + """Resize an empty image.""" self.assertRaises(ValueError, self.img.resize, (10, 10)) def test_merge(self): - """Merging of an empty image with another. - """ + """Merging of an empty image with another.""" newimg = image.Image() self.assertRaises(ValueError, self.img.merge, newimg) newimg = image.Image(np.array([[1, 2], [3, 4]])) @@ -250,20 +234,16 @@ class TestEmptyImage(unittest.TestCase): class TestImageCreation(unittest.TestCase): - """Class for testing the mpop.imageo.image module - """ + """Class for testing the mpop.imageo.image module.""" def setUp(self): - """Setup the test. - """ + """Set up the test case.""" self.img = {} self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] self.modes_len = [1, 2, 3, 4, 3, 4, 1, 2] def test_creation(self): - """Creation of an image. 
- """ - + """Test creation of an image.""" self.assertRaises(TypeError, image.Image, channels=random.randint(1, 1000)) self.assertRaises(TypeError, image.Image, @@ -338,12 +318,10 @@ class TestImageCreation(unittest.TestCase): class TestRegularImage(unittest.TestCase): - """Class for testing the mpop.imageo.image module - """ + """Class for testing the mpop.imageo.image module.""" def setUp(self): - """Setup the test. - """ + """Set up the test case.""" one_channel = np.random.rand(random.randint(1, 10), random.randint(1, 10)) self.rand_img = image.Image(channels=[one_channel] * 3, @@ -370,8 +348,7 @@ class TestRegularImage(unittest.TestCase): os.chmod(self.tempdir, 0o444) def test_shape(self): - """Shape of an image. - """ + """Shape of an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -381,13 +358,11 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_is_empty(self): - """Test if an image is empty. - """ + """Test if an image is empty.""" self.assertEqual(self.img.is_empty(), False) def test_clip(self): - """Clip an image. - """ + """Clip an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -399,8 +374,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_convert(self): - """Convert an image. - """ + """Convert an image.""" i = 0 for mode1 in self.modes: j = 0 @@ -437,8 +411,7 @@ class TestRegularImage(unittest.TestCase): self.assertRaises(ValueError, self.img.convert, randstr) def test_stretch(self): - """Stretch an image. - """ + """Stretch an image.""" oldmode = self.img.mode for mode in "L": @@ -479,8 +452,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_gamma(self): - """Gamma correction on an image. - """ + """Gamma correction on an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -519,8 +491,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_invert(self): - """Invert an image. - """ + """Invert an image.""" oldmode = self.img.mode for mode in self.modes: if mode == "P" or mode == "PA": @@ -543,11 +514,8 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_pil_image(self): - """Return an PIL image. - """ - + """Return an PIL image.""" # FIXME: Should test on palette images - oldmode = self.img.mode for mode in self.modes: if (mode == "YCbCr" or @@ -561,8 +529,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_putalpha(self): - """Add an alpha channel. - """ + """Add an alpha channel.""" # Putting alpha channel to an image should not do anything except # change the mode if necessary. oldmode = self.img.mode @@ -590,8 +557,7 @@ class TestRegularImage(unittest.TestCase): @unittest.skipIf(sys.platform.startswith('win'), "Read-only tmp dir not working under Windows") def test_save(self): - """Save an image. - """ + """Save an image.""" oldmode = self.img.mode for mode in self.modes: if (mode == "YCbCr" or @@ -614,8 +580,7 @@ class TestRegularImage(unittest.TestCase): @unittest.skipIf(sys.platform.startswith('win'), "Read-only tmp dir not working under Windows") def test_save_jpeg(self): - """Save a jpeg image. - """ + """Save a jpeg image.""" oldmode = self.img.mode self.img.convert('L') self.img.save("test.jpg") @@ -630,8 +595,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_replace_luminance(self): - """Replace luminance in an image. 
- """ + """Replace luminance in an image.""" oldmode = self.img.mode for mode in self.modes: if (mode == "P" or @@ -651,8 +615,7 @@ class TestRegularImage(unittest.TestCase): self.img.convert(oldmode) def test_resize(self): - """Resize an image. - """ + """Resize an image.""" self.img.resize((6, 6)) res = np.array([[0, 0, 0.5, 0.5, 0.5, 0.5], [0, 0, 0.5, 0.5, 0.5, 0.5], @@ -667,8 +630,7 @@ class TestRegularImage(unittest.TestCase): self.assertTrue(np.all(res == self.img.channels[0])) def test_merge(self): - """Merging of an image with another. - """ + """Merging of an image with another.""" newimg = image.Image() self.assertRaises(ValueError, self.img.merge, newimg) newimg = image.Image(np.array([[1, 2], [3, 4]])) @@ -686,17 +648,16 @@ class TestRegularImage(unittest.TestCase): EPSILON)) def tearDown(self): - """Clean up the mess. - """ + """Clean up the mess.""" os.chmod(self.tempdir, 0o777) os.rmdir(self.tempdir) class TestFlatImage(unittest.TestCase): - """Test a flat image, ie an image where min == max. - """ + """Test a flat image, ie an image where min == max.""" def setUp(self): + """Set up the test case.""" channel = np.ma.array([[0, 0.5, 0.5], [0.5, 0.25, 0.25]], mask=[[1, 1, 1], [1, 1, 0]]) self.img = image.Image(channels=[channel] * 3, @@ -704,8 +665,7 @@ class TestFlatImage(unittest.TestCase): self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] def test_stretch(self): - """Stretch a flat image. - """ + """Stretch a flat image.""" self.img.stretch() self.assertTrue(self.img.channels[0].shape == (2, 3) and np.ma.count_masked(self.img.channels[0]) == 5) @@ -724,10 +684,10 @@ class TestFlatImage(unittest.TestCase): class TestNoDataImage(unittest.TestCase): - """Test an image filled with no data. - """ + """Test an image filled with no data.""" def setUp(self): + """Set up the test case.""" channel = np.ma.array([[0, 0.5, 0.5], [0.5, 0.25, 0.25]], mask=[[1, 1, 1], [1, 1, 1]]) self.img = image.Image(channels=[channel] * 3, @@ -735,8 +695,7 @@ class TestNoDataImage(unittest.TestCase): self.modes = ["L", "LA", "RGB", "RGBA", "YCbCr", "YCbCrA", "P", "PA"] def test_stretch(self): - """Stretch a no data image. - """ + """Stretch a no data image.""" self.img.stretch() self.assertTrue(self.img.channels[0].shape == (2, 3)) self.img.stretch("crude") @@ -752,16 +711,16 @@ class TestNoDataImage(unittest.TestCase): def random_string(length, choices="abcdefghijklmnopqrstuvwxyz" "ABCDEFGHIJKLMNOPQRSTUVWXYZ"): - """Generates a random string with elements from *set* of the specified - *length*. 
- """ + """Generate a random string with elements from *set* of the specified *length*.""" return "".join([random.choice(choices) for dummy in range(length)]) class TestXRImage(unittest.TestCase): + """Test XRImage objects.""" def test_init(self): + """Test object initialization.""" import xarray as xr from trollimage import xrimage data = xr.DataArray([[0, 0.5, 0.5], [0.5, 0.25, 0.25]], dims=['y', 'x']) @@ -793,9 +752,23 @@ class TestXRImage(unittest.TestCase): img = xrimage.XRImage(data) self.assertEqual(img.mode, 'YCbCrA') + def test_regression_double_format_save(self): + """Test that double format information isn't passed to save.""" + import xarray as xr + from trollimage import xrimage + + data = xr.DataArray(np.arange(75).reshape(5, 5, 3) / 74., dims=[ + 'y', 'x', 'bands'], coords={'bands': ['R', 'G', 'B']}) + with mock.patch.object(xrimage.XRImage, 'pil_save') as pil_save: + img = xrimage.XRImage(data) + + img.save(filename='bla.png', fformat='png', format='png') + self.assertNotIn('format', pil_save.call_args_list[0][1]) + @unittest.skipIf(sys.platform.startswith('win'), "'NamedTemporaryFile' not supported on Windows") def test_save(self): + """Test saving.""" import xarray as xr import dask.array as da from dask.delayed import Delayed @@ -1236,8 +1209,10 @@ class TestXRImage(unittest.TestCase): img = xrimage.XRImage(data) img.gamma(.5) self.assertTrue(np.allclose(img.data.values, arr ** 2)) + self.assertDictEqual(img.data.attrs['enhancement_history'][0], {'gamma': 0.5}) img.gamma([2., 2., 2.]) + self.assertEqual(len(img.data.attrs['enhancement_history']), 2) self.assertTrue(np.allclose(img.data.values, arr)) def test_crude_stretch(self): @@ -1253,6 +1228,11 @@ class TestXRImage(unittest.TestCase): red = img.data.sel(bands='R') green = img.data.sel(bands='G') blue = img.data.sel(bands='B') + enhs = img.data.attrs['enhancement_history'][0] + scale_expected = np.array([0.01388889, 0.01388889, 0.01388889]) + offset_expected = np.array([0., -0.01388889, -0.02777778]) + np.testing.assert_allclose(enhs['scale'].values, scale_expected) + np.testing.assert_allclose(enhs['offset'].values, offset_expected) np.testing.assert_allclose(red, arr[:, :, 0] / 72.) np.testing.assert_allclose(green, (arr[:, :, 1] - 1.) / (73. - 1.)) np.testing.assert_allclose(blue, (arr[:, :, 2] - 2.) / (74. 
- 2.)) @@ -1275,7 +1255,8 @@ class TestXRImage(unittest.TestCase): img = xrimage.XRImage(data) img.invert(True) - + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'scale': -1, 'offset': 1}) self.assertTrue(np.allclose(img.data.values, 1 - arr)) data = xr.DataArray(arr.copy(), dims=['y', 'x', 'bands'], @@ -1299,6 +1280,9 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch_linear() + enhs = img.data.attrs['enhancement_history'][0] + np.testing.assert_allclose(enhs['scale'].values, np.array([1.03815937, 1.03815937, 1.03815937])) + np.testing.assert_allclose(enhs['offset'].values, np.array([-0.00505051, -0.01907969, -0.03310887]), atol=1e-8) res = np.array([[[-0.005051, -0.005051, -0.005051], [0.037037, 0.037037, 0.037037], [0.079125, 0.079125, 0.079125], @@ -1328,6 +1312,7 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_histogram_stretch(self): + """Test histogram stretching.""" import xarray as xr from trollimage import xrimage @@ -1336,6 +1321,8 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch('histogram') + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'hist_equalize': True}) res = np.array([[[0., 0., 0.], [0.04166667, 0.04166667, 0.04166667], [0.08333333, 0.08333333, 0.08333333], @@ -1369,6 +1356,7 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_logarithmic_stretch(self): + """Test logarithmic strecthing.""" import xarray as xr from trollimage import xrimage @@ -1377,6 +1365,8 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch(stretch='logarithmic') + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'log_factor': 100.0}) res = np.array([[[0., 0., 0.], [0.35484693, 0.35484693, 0.35484693], [0.48307087, 0.48307087, 0.48307087], @@ -1410,7 +1400,7 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_weber_fechner_stretch(self): - """S=2.3klog10I+C """ + """Test applying S=2.3klog10I+C to the data.""" import xarray as xr from trollimage import xrimage @@ -1419,6 +1409,8 @@ class TestXRImage(unittest.TestCase): coords={'bands': ['R', 'G', 'B']}) img = xrimage.XRImage(data) img.stretch_weber_fechner(2.5, 0.2) + enhs = img.data.attrs['enhancement_history'][0] + self.assertDictEqual(enhs, {'weber_fechner': (2.5, 0.2)}) res = np.array([[[-np.inf, -6.73656795, -5.0037], [-3.99003723, -3.27083205, -2.71297317], [-2.25716928, -1.87179258, -1.5379641], @@ -1452,27 +1444,35 @@ class TestXRImage(unittest.TestCase): self.assertTrue(np.allclose(img.data.values, res, atol=1.e-6)) def test_jpeg_save(self): + """Test saving to jpeg.""" pass def test_gtiff_save(self): + """Test saving to geotiff.""" pass def test_save_masked(self): + """Test saving masked data.""" pass def test_LA_save(self): + """Test LA saving.""" pass def test_L_save(self): + """Test L saving.""" pass def test_P_save(self): + """Test P saving.""" pass def test_PA_save(self): + """Test PA saving.""" pass def test_convert_modes(self): + """Test modes convertions.""" import dask import xarray as xr from trollimage import xrimage @@ -1782,7 +1782,7 @@ class TestXRImage(unittest.TestCase): self.assertTupleEqual((2, 4), bw.colors.shape) def test_stack(self): - + """Test 
stack.""" import xarray as xr from trollimage import xrimage @@ -1810,9 +1810,11 @@ class TestXRImage(unittest.TestCase): np.testing.assert_allclose(bkg.data, res.data, rtol=1e-05) def test_merge(self): + """Test merge.""" pass def test_blend(self): + """Test blend.""" import xarray as xr from trollimage import xrimage @@ -1854,13 +1856,15 @@ class TestXRImage(unittest.TestCase): img1.blend(wrongimg) def test_replace_luminance(self): + """Test luminance replacement.""" pass def test_putalpha(self): + """Test putalpha.""" pass def test_show(self): - """Test that the show commands calls PIL.show""" + """Test that the show commands calls PIL.show.""" import xarray as xr from trollimage import xrimage @@ -1873,7 +1877,7 @@ class TestXRImage(unittest.TestCase): def suite(): - """The suite for test_image.""" + """Create the suite for test_image.""" loader = unittest.TestLoader() mysuite = unittest.TestSuite() mysuite.addTest(loader.loadTestsFromTestCase(TestEmptyImage)) ===================================== trollimage/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (tag: v1.9.0)" - git_full = "63fa32f2d40bb65ebc39c4be1fb1baf8f163db98" - git_date = "2019-06-18 06:13:40 -0500" + git_refnames = " (HEAD -> master, tag: v1.10.0)" + git_full = "b1fb06cbf6ef8b23e5816c423df1eeaf8e76d606" + git_date = "2019-09-20 09:41:02 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== trollimage/xrimage.py ===================================== @@ -21,12 +21,14 @@ # # You should have received a copy of the GNU General Public License # along with this program. If not, see . -"""This module defines the XRImage class. It overlaps largely with the PIL -library, but has the advantage of using :class:`~xarray.DataArray` objects -backed by :class:`dask arrays ` as pixel arrays. This -allows for invalid values to be tracked, metadata to be assigned, and -stretching to be lazy evaluated. With the optional ``rasterio`` library -installed dask array chunks can be saved in parallel. +"""This module defines the XRImage class. + +It overlaps largely with the PIL library, but has the advantage of using +:class:`~xarray.DataArray` objects backed by :class:`dask arrays +` as pixel arrays. This allows for invalid values to +be tracked, metadata to be assigned, and stretching to be lazy +evaluated. With the optional ``rasterio`` library installed dask array +chunks can be saved in parallel. """ @@ -96,12 +98,14 @@ class RIOFile(object): indexes=indexes) def open(self, mode=None): + """Open the file.""" mode = mode or self.mode if self._closed: self.rfile = rasterio.open(self.path, mode, **self.kwargs) self._closed = False def close(self): + """Close the file.""" if not self._closed: if self.overviews: logger.debug('Building overviews %s', str(self.overviews)) @@ -119,6 +123,7 @@ class RIOFile(object): self.close() def __del__(self): + """Delete the instance.""" try: self.close() except (IOError, OSError): @@ -168,8 +173,13 @@ def color_interp(data): class XRImage(object): """Image class using an :class:`xarray.DataArray` as internal storage. - It can be saved to a variety of image formats, but if Rasterio is installed, - it can save to geotiff and jpeg2000 with geographical information. 
+ It can be saved to a variety of image formats, but if Rasterio is + installed, it can save to geotiff and jpeg2000 with geographical + information. + + The enhancements functions are recording some parameters in the image's + data attribute called `enhancement_history`. + """ def __init__(self, data): @@ -252,7 +262,8 @@ class XRImage(object): saving with rasterio, used with keep_palette=True. Should be uint8. format_kwargs: Additional format options to pass to `rasterio` - or `PIL` saving methods. + or `PIL` saving methods. Any format argument passed + at this stage would be superseeded by `fformat`. Returns: Either `None` if `compute` is True or a `dask.Delayed` object or @@ -263,7 +274,8 @@ class XRImage(object): the caller. """ - fformat = fformat or os.path.splitext(filename)[1][1:4] + kwformat = format_kwargs.pop('format', None) + fformat = fformat or kwformat or os.path.splitext(filename)[1][1:] if fformat in ('tif', 'jp2') and rasterio: return self.rio_save(filename, fformat=fformat, fill_value=fill_value, compute=compute, @@ -284,7 +296,7 @@ class XRImage(object): img.rio_save('myfile.tif', overviews=[2, 4, 8, 16]) """ - fformat = fformat or os.path.splitext(filename)[1][1:4] + fformat = fformat or os.path.splitext(filename)[1][1:] drivers = {'jpg': 'JPEG', 'png': 'PNG', 'tif': 'GTiff', @@ -318,7 +330,11 @@ class XRImage(object): photometric_map[mode.upper()]) try: - crs = rasterio.crs.CRS(data.attrs['area'].proj_dict) + area = data.attrs['area'] + if hasattr(area, 'crs'): + crs = rasterio.crs.CRS.from_wkt(area.crs.to_wkt()) + else: + crs = rasterio.crs.CRS(data.attrs['area'].proj_dict) west, south, east, north = data.attrs['area'].area_extent height, width = data.sizes['y'], data.sizes['x'] transform = rasterio.transform.from_bounds(west, south, @@ -380,10 +396,11 @@ class XRImage(object): compute=True, **format_kwargs): """Save the image to the given *filename* using PIL. - For now, the compression level [0-9] is ignored, due to PIL's lack of - support. See also :meth:`save`. + For now, the compression level [0-9] is ignored, due to PIL's + lack of support. See also :meth:`save`. + """ - fformat = fformat or os.path.splitext(filename)[1][1:4] + fformat = fformat or os.path.splitext(filename)[1][1:] fformat = check_image_format(fformat) if fformat == 'png': @@ -402,6 +419,7 @@ class XRImage(object): Inspired by: public domain, Nick Galbreath http://blog.modp.com/2007/08/python-pil-and-png-metadata-take-2.html + """ reserved = ('interlace', 'gamma', 'dpi', 'transparency', 'aspect') @@ -531,8 +549,7 @@ class XRImage(object): return data def _l2rgb(self, mode): - """Convert from L (black and white) to RGB. - """ + """Convert from L (black and white) to RGB.""" self._check_modes(("L", "LA")) bands = ["L"] * 3 @@ -543,6 +560,7 @@ class XRImage(object): return data def convert(self, mode): + """Convert image to *mode*.""" if mode == self.mode: return self.__class__(self.data) @@ -588,7 +606,7 @@ class XRImage(object): return new_img def _finalize(self, fill_value=None, dtype=np.uint8, keep_palette=False, cmap=None): - """Wrapper around 'finalize' method for backwards compatibility.""" + """Wrap around 'finalize' method for backwards compatibility.""" import warnings warnings.warn("'_finalize' is deprecated, use 'finalize' instead.", DeprecationWarning) @@ -597,13 +615,14 @@ class XRImage(object): def finalize(self, fill_value=None, dtype=np.uint8, keep_palette=False, cmap=None): """Finalize the image to be written to an output file. 
- This adds an alpha band or fills data with a fill_value (if specified). - It also scales float data to the output range of the data type (0-255 - for uint8, default). For integer input data this method assumes the - data is already scaled to the proper desired range. It will still fill - in invalid values and add an alpha band if needed. Integer input - data's fill value is determined by a special ``_FillValue`` attribute - in the ``DataArray`` ``.attrs`` dictionary. + This adds an alpha band or fills data with a fill_value (if + specified). It also scales float data to the output range of the + data type (0-255 for uint8, default). For integer input data + this method assumes the data is already scaled to the proper + desired range. It will still fill in invalid values and add an + alpha band if needed. Integer input data's fill value is + determined by a special ``_FillValue`` attribute in the + ``DataArray`` ``.attrs`` dictionary. """ if keep_palette and not self.mode.startswith('P'): @@ -674,19 +693,20 @@ class XRImage(object): dims=['bands'], coords={'bands': self.data['bands']}) - def gamma(self, gamma=1.0): + def gamma(self, gamma=None): """Apply gamma correction to the channels of the image. - If *gamma* is a - tuple, then it should have as many elements as the channels of the - image, and the gamma correction is applied elementwise. If *gamma* is a - number, the same gamma correction is applied on every channel, if there - are several channels in the image. The behaviour of :func:`gamma` is - undefined outside the normal [0,1] range of the channels. + If *gamma* is a tuple, then it should have as many elements as + the channels of the image, and the gamma correction is applied + elementwise. If *gamma* is a number, the same gamma correction + is applied on every channel, if there are several channels in + the image. The behaviour of :func:`gamma` is undefined outside + the normal [0,1] range of the channels. + """ if isinstance(gamma, (list, tuple)): gamma = self.xrify_tuples(gamma) - elif gamma == 1.0: + elif gamma is None or gamma == 1.0: return logger.debug("Applying gamma %s", str(gamma)) @@ -694,18 +714,21 @@ class XRImage(object): self.data = self.data.clip(min=0) self.data **= 1.0 / gamma self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'gamma': gamma}) def stretch(self, stretch="crude", **kwargs): """Apply stretching to the current image. - The value of *stretch* sets the type of stretching applied. The values - "histogram", "linear", "crude" (or "crude-stretch") perform respectively - histogram equalization, contrast stretching (with 5% cutoff on both - sides), and contrast stretching without cutoff. The value "logarithmic" - or "log" will do a logarithmic enhancement towards white. If a tuple or - a list of two values is given as input, then a contrast stretching is - performed with the values as cutoff. These values should be normalized - in the range [0.0,1.0]. + The value of *stretch* sets the type of stretching applied. The + values "histogram", "linear", "crude" (or "crude-stretch") + perform respectively histogram equalization, contrast stretching + (with 5% cutoff on both sides), and contrast stretching without + cutoff. The value "logarithmic" or "log" will do a logarithmic + enhancement towards white. If a tuple or a list of two values is + given as input, then a contrast stretching is performed with the + values as cutoff. These values should be normalized in the range + [0.0,1.0]. 
+ """ logger.debug("Applying stretch %s with parameters %s", stretch, str(kwargs)) @@ -735,7 +758,7 @@ class XRImage(object): @staticmethod def _compute_quantile(data, dims, cutoffs): - """Helper method for stretch_linear. + """Compute quantile for stretch_linear. Dask delayed functions need to be non-internal functions (created inside a function) to be serializable on a multi-process scheduler. @@ -756,6 +779,7 @@ class XRImage(object): """Stretch linearly the contrast of the current image. Use *cutoffs* for left and right trimming. + """ logger.debug("Perform a linear contrast stretch.") @@ -786,8 +810,9 @@ class XRImage(object): def crude_stretch(self, min_stretch=None, max_stretch=None): """Perform simple linear stretching. - This is done without any cutoff on the current image and normalizes to - the [0,1] range. + This is done without any cutoff on the current image and + normalizes to the [0,1] range. + """ if min_stretch is None: non_band_dims = tuple(x for x in self.data.dims if x != 'bands') @@ -808,9 +833,12 @@ class XRImage(object): else: scale_factor = 1.0 / delta attrs = self.data.attrs - self.data -= min_stretch + offset = -min_stretch * scale_factor self.data *= scale_factor + self.data += offset self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'scale': scale_factor, + 'offset': offset}) def stretch_hist_equalize(self, approximate=False): """Stretch the current image's colors through histogram equalization. @@ -858,6 +886,7 @@ class XRImage(object): band_results.append(self.data.sel(bands='A')) self.data.data = da.stack(band_results, axis=self.data.dims.index('bands')) + self.data.attrs.setdefault('enhancement_history', []).append({'hist_equalize': True}) def stretch_logarithmic(self, factor=100.): """Move data into range [1:factor] through normalized logarithm.""" @@ -885,6 +914,7 @@ class XRImage(object): band_results.append(self.data.sel(bands='A')) self.data.data = da.stack(band_results, axis=self.data.dims.index('bands')) + self.data.attrs.setdefault('enhancement_history', []).append({'log_factor': factor}) def stretch_weber_fechner(self, k, s0): """Stretch according to the Weber-Fechner law. @@ -892,10 +922,12 @@ class XRImage(object): p = k.ln(S/S0) p is perception, S is the stimulus, S0 is the stimulus threshold (the highest unpercieved stimulus), and k is the factor. + """ attrs = self.data.attrs self.data = k * xu.log(self.data / s0) self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'weber_fechner': (k, s0)}) def invert(self, invert=True): """Inverts all the channels of a image according to *invert*. @@ -905,6 +937,7 @@ class XRImage(object): Note: 'Inverting' means that black becomes white, and vice-versa, not that the values are negated ! + """ logger.debug("Applying invert with parameters %s", str(invert)) if isinstance(invert, (tuple, list)): @@ -917,10 +950,11 @@ class XRImage(object): attrs = self.data.attrs self.data = self.data * scale + offset self.data.attrs = attrs + self.data.attrs.setdefault('enhancement_history', []).append({'scale': scale, + 'offset': offset}) def stack(self, img): - """Stack the provided image on top of the current image. - """ + """Stack the provided image on top of the current image.""" # TODO: Conversions between different modes with notification # to the user, i.e. 
proper logging if self.mode != img.mode: @@ -929,8 +963,10 @@ class XRImage(object): self.data = self.data.where(img.data.isnull(), img.data) def merge(self, img): - """Use the provided image as background for the current *img* image, - that is if the current image has missing data. + """Use the provided image as background for the current *img* image. + + That is if the current image has missing data. + """ raise NotImplementedError("This method has not be implemented for " "xarray support.") @@ -966,7 +1002,6 @@ class XRImage(object): Works only on "L" or "LA" images. """ - if self.mode not in ("L", "LA"): raise ValueError("Image should be grayscale to colorize") @@ -997,7 +1032,7 @@ class XRImage(object): @staticmethod def _palettize(data, colormap): - """Helper for dask-friendly palettize operation.""" + """Operate in a dask-friendly manner.""" # returns data and palette, only need data return colormap.palettize(data)[0] @@ -1009,7 +1044,6 @@ class XRImage(object): Works only on "L" or "LA" images. """ - if self.mode not in ("L", "LA"): raise ValueError("Image should be grayscale to colorize") View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/b2240f1cb42c5a8ed6055257d2c41f044b4ab4b0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/b2240f1cb42c5a8ed6055257d2c41f044b4ab4b0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 22 08:29:51 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 07:29:51 +0000 Subject: [Git][debian-gis-team/trollimage] Pushed new tag debian/1.10.0-1 Message-ID: <5d8722ef63a3e_15123faf6a6fc2ec99595@godard.mail> Bas Couwenberg pushed new tag debian/1.10.0-1 at Debian GIS Project / trollimage -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/tree/debian/1.10.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sun Sep 22 08:39:57 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 07:39:57 +0000 Subject: Processing of trollimage_1.10.0-1_source.changes Message-ID: trollimage_1.10.0-1_source.changes uploaded successfully to localhost along with the files: trollimage_1.10.0-1.dsc trollimage_1.10.0.orig.tar.gz trollimage_1.10.0-1.debian.tar.xz trollimage_1.10.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 22 08:52:05 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 07:52:05 +0000 Subject: trollimage_1.10.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 22 Sep 2019 07:13:42 +0000 Source: trollimage Architecture: source Version: 1.10.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: trollimage (1.10.0-1) unstable; urgency=medium . * New upstream release. 
* debian/patches: - refresh all patches * debian/control: - explicit specification of Rules-Requires-Root Checksums-Sha1: fb9582fbcfbca0c02464a8c8466806465b72cff9 2158 trollimage_1.10.0-1.dsc 64052fcf11643d601a1b77dbcfe88eed535a30a1 1418118 trollimage_1.10.0.orig.tar.gz c6eff4a603503cc90cbb4d80602ce27877e2e591 3152 trollimage_1.10.0-1.debian.tar.xz 44c5400e9f308b8a2662e0e01cfa99d2d0d90791 10017 trollimage_1.10.0-1_amd64.buildinfo Checksums-Sha256: df1b7a6caae10357db8cb12d3ecf852ef384085009dfa3f02c1111cf26d023a8 2158 trollimage_1.10.0-1.dsc 4127e3183f7132e999aeeb5ccc68515a310dd9e34755739716adbbbaa89087b6 1418118 trollimage_1.10.0.orig.tar.gz 45655b6b9cf9c4a5d6068d8928413280621d32555140b3b512c4d5a0be1f99d3 3152 trollimage_1.10.0-1.debian.tar.xz 8dccd11b19a157392c0de4bccb7f255cf18fd8160273736046384b19d620e922 10017 trollimage_1.10.0-1_amd64.buildinfo Files: a10d65327ae8b28aac358d03910469f1 2158 python optional trollimage_1.10.0-1.dsc a8c5e795dd1e1f016088499480f0ebda 1418118 python optional trollimage_1.10.0.orig.tar.gz ea3b0dc57c51078b02a19fc6d25a17f1 3152 python optional trollimage_1.10.0-1.debian.tar.xz 3df27ddaf3c82b33b5d26cf73f047862 10017 python optional trollimage_1.10.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2HItcACgkQZ1DxCuiN SvF3Fw//TNLbL5V6j1Wrlt/zkVMSZgrWkJ6ignU+hQIlhI9X6dKjiqdHzzEiBa4e O0jwDixitaW83R6HVo84ZE/WPpiSyJ/gwIEXUY/0X9Yiu6814V5CGZhUtZQUQ8qv UeKQyfftpcV0xN81rDPo8R8N07WEFTLkVyeMo0d7XVorH2dvCp81rM83GbngebQ+ AM2UbK/XR62WhpiF8Qko5CP5V4hC4lZgU9+Afcy/CTeTODNMm2d5IVfFjKkqJX54 lQAG5gvNLHKKfTL4BlZ0mmkWquG5viYoEsa/kLA0XByHsQo5mxqnuTXjWMJh8B4O QX6nFO1ikjwzh5eJE5WPbEZHopbcecdFE9gOStLkXGaNwGFAF+zrQ561LgXwsfNB Ax36bPmMnXCxUIDeuqC78g0AOOJCSAnn13aXApy6xquhuzJZxoK88dc7pHXV49OP ZLFsKwIqo8cIiz9jRUKQ4RC2QsqwME1vOTcEALzzqLt1Zo7F15fuh4hRDDTjMq3+ VA4p6lK76nfaxpYwv8rubcrI8TLPJttlTUrYEX/1yDbOS2uqh60DCL5w98URUzuk sk5B9AYBz1uAskqJXWfFh6fvk/6XYxMIjNoC/3UQQAP/0/XafKvx0BPB5ncoi2fC k1BEF/cewJolgJFr3HHMlLtNIWyMkwTLmvrXkVFmYgFRKdi4LkY= =YBy9 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sun Sep 22 10:38:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 09:38:10 +0000 Subject: [Git][debian-gis-team/qgis][buster-backports] 13 commits: Update packaging for GRASS 7.8.0. Message-ID: <5d874102332ed_15123faf6a4c9a74105925@godard.mail> Bas Couwenberg pushed to branch buster-backports at Debian GIS Project / qgis Commits: ad75e62c by Bas Couwenberg at 2019-09-07T07:28:18Z Update packaging for GRASS 7.8.0. - - - - - f00f865d by Bas Couwenberg at 2019-09-07T08:08:12Z Update symbols for other architectures. - - - - - 7da12bde by Bas Couwenberg at 2019-09-07T08:08:27Z Set distribution to unstable. - - - - - 593fa849 by Bas Couwenberg at 2019-09-13T12:36:50Z Update watch file to use query string for cache avoidance. - - - - - 187dbc06 by Bas Couwenberg at 2019-09-13T12:37:12Z New upstream version 3.4.12+dfsg - - - - - 4dcd4765 by Bas Couwenberg at 2019-09-13T12:40:21Z Update upstream source from tag 'upstream/3.4.12+dfsg' Update to upstream version '3.4.12+dfsg' with Debian dir 042d9c15d9568095f568e449adbc93eb7ef0ece6 - - - - - c547b448 by Bas Couwenberg at 2019-09-13T12:46:31Z New upstream release. - - - - - 1058415e by Bas Couwenberg at 2019-09-13T15:07:52Z Update symbols for amd64. - - - - - 4ba0089a by Bas Couwenberg at 2019-09-13T15:07:52Z Set distribution to experimental. - - - - - baa5486d by Bas Couwenberg at 2019-09-16T04:14:51Z Update symbols for other architectures. 
- - - - - 572f9e88 by Bas Couwenberg at 2019-09-16T04:15:22Z Set distribution to unstable. - - - - - 37f0c78e by Bas Couwenberg at 2019-09-22T05:23:13Z Merge tag 'debian/3.4.12+dfsg-1' into buster-backports releasing package qgis version 3.4.12+dfsg-1 - - - - - 8253b78b by Bas Couwenberg at 2019-09-22T05:23:42Z Rebuild for buster-backports. - - - - - 30 changed files: - .ci/travis/scripts/ctest2travis.py - CMakeLists.txt - ChangeLog - debian/changelog - debian/control - debian/libqgis-3d3.4.11.install → debian/libqgis-3d3.4.12.install - debian/libqgis-3d3.4.11.symbols → debian/libqgis-3d3.4.12.symbols - debian/libqgis-analysis3.4.11.install → debian/libqgis-analysis3.4.12.install - debian/libqgis-analysis3.4.11.symbols → debian/libqgis-analysis3.4.12.symbols - debian/libqgis-app3.4.11.install → debian/libqgis-app3.4.12.install - debian/libqgis-app3.4.11.lintian-overrides → debian/libqgis-app3.4.12.lintian-overrides - debian/libqgis-app3.4.11.symbols → debian/libqgis-app3.4.12.symbols - debian/libqgis-core3.4.11.install → debian/libqgis-core3.4.12.install - debian/libqgis-core3.4.11.lintian-overrides → debian/libqgis-core3.4.12.lintian-overrides - debian/libqgis-core3.4.11.symbols → debian/libqgis-core3.4.12.symbols - debian/libqgis-gui3.4.11.install → debian/libqgis-gui3.4.12.install - debian/libqgis-gui3.4.11.symbols → debian/libqgis-gui3.4.12.symbols - debian/libqgis-native3.4.11.install → debian/libqgis-native3.4.12.install - debian/libqgis-native3.4.11.symbols → debian/libqgis-native3.4.12.symbols - debian/libqgis-server3.4.11.install → debian/libqgis-server3.4.12.install - debian/libqgis-server3.4.11.symbols → debian/libqgis-server3.4.12.symbols - debian/libqgisgrass7-3.4.11.install → debian/libqgisgrass7-3.4.12.install - debian/libqgisgrass7-3.4.11.lintian-overrides → debian/libqgisgrass7-3.4.12.lintian-overrides - debian/libqgisgrass7-3.4.11.symbols → debian/libqgisgrass7-3.4.12.symbols - debian/libqgispython3.4.11.install → debian/libqgispython3.4.12.install - debian/libqgispython3.4.11.symbols → debian/libqgispython3.4.12.symbols - + debian/patches/grass78.patch - debian/patches/series - debian/watch - doc/TRANSLATORS The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/0e550057a25f3d7fe18895e9f2e104549ec291b3...8253b78b6e087857bb8a764e879361b408fc5af4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/compare/0e550057a25f3d7fe18895e9f2e104549ec291b3...8253b78b6e087857bb8a764e879361b408fc5af4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 22 10:38:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 22 Sep 2019 09:38:11 +0000 Subject: [Git][debian-gis-team/qgis] Pushed new tag debian/3.4.12+dfsg-1_bpo10+1 Message-ID: <5d8741037e641_15123faf6a6fc2ec10615f@godard.mail> Bas Couwenberg pushed new tag debian/3.4.12+dfsg-1_bpo10+1 at Debian GIS Project / qgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qgis/tree/debian/3.4.12+dfsg-1_bpo10+1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sun Sep 22 11:07:42 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 10:07:42 +0000 Subject: qgis_3.4.12+dfsg-1~bpo10+1_amd64.changes is NEW Message-ID: binary:libqgis-3d3.4.12 is NEW. binary:libqgis-analysis3.4.12 is NEW. binary:libqgis-app3.4.12 is NEW. binary:libqgis-core3.4.12 is NEW. binary:libqgis-gui3.4.12 is NEW. binary:libqgis-native3.4.12 is NEW. binary:libqgis-server3.4.12 is NEW. binary:libqgisgrass7-3.4.12 is NEW. binary:libqgispython3.4.12 is NEW. binary:libqgispython3.4.12 is NEW. binary:libqgis-analysis3.4.12 is NEW. binary:libqgis-core3.4.12 is NEW. binary:libqgisgrass7-3.4.12 is NEW. binary:libqgis-3d3.4.12 is NEW. binary:libqgis-server3.4.12 is NEW. binary:libqgis-gui3.4.12 is NEW. binary:libqgis-native3.4.12 is NEW. binary:libqgis-app3.4.12 is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. [1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From ftpmaster at ftp-master.debian.org Sun Sep 22 10:52:03 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 22 Sep 2019 09:52:03 +0000 Subject: Processing of qgis_3.4.12+dfsg-1~bpo10+1_amd64.changes Message-ID: qgis_3.4.12+dfsg-1~bpo10+1_amd64.changes uploaded successfully to localhost along with the files: qgis_3.4.12+dfsg-1~bpo10+1.dsc qgis_3.4.12+dfsg-1~bpo10+1.debian.tar.xz libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-3d3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-analysis3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-app3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-core3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-customwidgets_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-dev_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-gui3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-native3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgis-server3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgisgrass7-3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb libqgispython3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb python3-qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb python3-qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb python3-qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-api-doc_3.4.12+dfsg-1~bpo10+1_all.deb qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-plugin-grass-common_3.4.12+dfsg-1~bpo10+1_all.deb qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-plugin-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-provider-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 
qgis-provider-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-providers-common_3.4.12+dfsg-1~bpo10+1_all.deb qgis-providers-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-providers_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-server-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis-server_3.4.12+dfsg-1~bpo10+1_amd64.deb qgis_3.4.12+dfsg-1~bpo10+1_amd64.buildinfo qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 23 09:13:14 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 23 Sep 2019 08:13:14 +0000 Subject: osmium-tool_1.11.0-1~bpo10+1_amd64.changes ACCEPTED into buster-backports, buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 22 Sep 2019 07:00:52 +0200 Source: osmium-tool Binary: osmium-tool osmium-tool-dbgsym Architecture: source amd64 Version: 1.11.0-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: osmium-tool - Command line tool for working with OpenStreetMap data Changes: osmium-tool (1.11.0-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. * Update branch in gbp.conf & Vcs-Git URL. . osmium-tool (1.11.0-1) unstable; urgency=medium . * New upstream release. * Bump Standards-Version to 4.4.0, no changes. * Append -DNDEBUG to CXXFLAGS to remove buildpath from binaries. * Update gbp.conf to use --source-only-changes by default. * Update copyright years for Jochen Topf. * Bump minimum required libosmium2-dev to 2.15.2. * Add patch to fix spelling errors. Checksums-Sha1: 26e8e7cb7355c3a2fce72ec7a462f4f912101bae 2125 osmium-tool_1.11.0-1~bpo10+1.dsc fdd3c7692ed74e109571b507e0c93dd6b6e7d02c 6376 osmium-tool_1.11.0-1~bpo10+1.debian.tar.xz 9f77f0384517935440f3b488d79908bf6a0dd555 27469732 osmium-tool-dbgsym_1.11.0-1~bpo10+1_amd64.deb 0fe2141d65abd49bd3b94fb1fee7b6823f854c4d 7724 osmium-tool_1.11.0-1~bpo10+1_amd64.buildinfo c921196e701d6ebab645d944232fdfd22dcf6f06 696200 osmium-tool_1.11.0-1~bpo10+1_amd64.deb Checksums-Sha256: 26dfe6a242e8cc844128a604f6575055bb1b0871e4458120bd7b5436ffcb082e 2125 osmium-tool_1.11.0-1~bpo10+1.dsc 68cb967233283802ee6b5a2edba2b1f01b23e1e0d67a74a6a083b74c08e57497 6376 osmium-tool_1.11.0-1~bpo10+1.debian.tar.xz 76488f7aa448c3954b21dd237277da6a9257b18efa5dc475583f8bac2441f03d 27469732 osmium-tool-dbgsym_1.11.0-1~bpo10+1_amd64.deb 63cff975b7458a96120de086fb113088203b40fc58a4e589a0a47db8d7e5582a 7724 osmium-tool_1.11.0-1~bpo10+1_amd64.buildinfo e2cf2b772694b47e4b87f93e89020a08a16e80b58e7b5af506fa58c79c6150c5 696200 osmium-tool_1.11.0-1~bpo10+1_amd64.deb Files: 543fe844339dd626a8f6ff56d8f5dc6b 2125 science optional osmium-tool_1.11.0-1~bpo10+1.dsc d0ec7e980d40317c1bc9b7f449322626 6376 science optional osmium-tool_1.11.0-1~bpo10+1.debian.tar.xz 29a3fff4fa2a422664c12cde8536d7b2 27469732 debug optional osmium-tool-dbgsym_1.11.0-1~bpo10+1_amd64.deb 839733d41ff41b1021d9d4010a2dd9bf 7724 science optional osmium-tool_1.11.0-1~bpo10+1_amd64.buildinfo 39112060129e2c127e01bb911f61f3fd 696200 utils optional osmium-tool_1.11.0-1~bpo10+1_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2HAzsACgkQZ1DxCuiN SvH4yBAAmCBW1bxKdjHWtirsKd/EFCggG5Ocd7jYiiMrsI2cmKX8cuuKWEOkE3Mh TUJB7sGDv26DSnM/aqzPWAqdFBgFaWcNHzzq4Cir0889aaJLovhhqEShqld9waef YCwvn0yltYK6YWBkWTvcLfUlwz0JVgx2tK0IFLXAWjbp1i7jLnaUjCj5291f9IUz Al/FmHCPw6D3IB7veHiG4ej0fydniyWvetZBRv4mUmcXpvNLjBDa1GVfUyvL1Y6W 
jHxON+iYg/F3m7edc6jk5Z1yfeTXYQCFWtFSg4UeRIx39G5zgW2vB8z5r8SGV6jv PBRmKN2y7UzbHO7eqymcT7N3m35O1qPWVolAypcyN7P4Gopn9Lj4KoW8qXJNP4QT 2+p+tYdhjkO2uCiVNVs021ABc1yoZjuX+h6bPZT1Caa5WsEEC033BjL70LYD5yMd J8P/sPBWQlApdUDbbNyCtFkLKL65DtaHfFn6DQqdklbscXcIjfFmvhE66ZAiNO2K pcyWjQH7pMCM9q5z0R0VTtDX7xiA4UJzllVMEw2418zt93oiIB4a0pSwJoqaZ147 lqWdLjLkWW/Mgf4/PCe/EoPTLXYZ262uuOgpOChvhfuNSQY4wG17Whl7RAaL7m+r GALsIl8EBuE8FDdiKxCqvslS6HK4PRBKtm5baskmnWznaXGOB2g= =dOuQ -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From ftpmaster at ftp-master.debian.org Mon Sep 23 09:13:23 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 23 Sep 2019 08:13:23 +0000 Subject: qgis_3.4.12+dfsg-1~bpo10+1_amd64.changes ACCEPTED into buster-backports, buster-backports Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 22 Sep 2019 07:23:39 +0200 Source: qgis Binary: libqgis-3d3.4.12 libqgis-3d3.4.12-dbgsym libqgis-analysis3.4.12 libqgis-analysis3.4.12-dbgsym libqgis-app3.4.12 libqgis-app3.4.12-dbgsym libqgis-core3.4.12 libqgis-core3.4.12-dbgsym libqgis-customwidgets libqgis-customwidgets-dbgsym libqgis-dev libqgis-gui3.4.12 libqgis-gui3.4.12-dbgsym libqgis-native3.4.12 libqgis-native3.4.12-dbgsym libqgis-server3.4.12 libqgis-server3.4.12-dbgsym libqgisgrass7-3.4.12 libqgisgrass7-3.4.12-dbgsym libqgispython3.4.12 libqgispython3.4.12-dbgsym python3-qgis python3-qgis-common python3-qgis-dbgsym qgis qgis-api-doc qgis-common qgis-dbgsym qgis-plugin-grass qgis-plugin-grass-common qgis-plugin-grass-dbgsym qgis-provider-grass qgis-provider-grass-dbgsym qgis-providers qgis-providers-common qgis-providers-dbgsym qgis-server qgis-server-dbgsym Architecture: source amd64 all Version: 3.4.12+dfsg-1~bpo10+1 Distribution: buster-backports Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Description: libqgis-3d3.4.12 - QGIS - shared 3d library libqgis-analysis3.4.12 - QGIS - shared analysis library libqgis-app3.4.12 - QGIS - shared app library libqgis-core3.4.12 - QGIS - shared core library libqgis-customwidgets - QGIS custom widgets for Qt Designer libqgis-dev - QGIS - development files libqgis-gui3.4.12 - QGIS - shared gui library libqgis-native3.4.12 - QGIS - shared native gui library libqgis-server3.4.12 - QGIS - shared server library libqgisgrass7-3.4.12 - QGIS - shared grass library libqgispython3.4.12 - QGIS - shared Python library python3-qgis - Python bindings to QGIS python3-qgis-common - Python bindings to QGIS - architecture-independent files qgis - Geographic Information System (GIS) qgis-api-doc - QGIS API documentation qgis-common - QGIS - architecture-independent data qgis-plugin-grass - GRASS plugin for QGIS qgis-plugin-grass-common - GRASS plugin for QGIS - architecture-independent data qgis-provider-grass - GRASS provider for QGIS qgis-providers - collection of data providers to QGIS qgis-providers-common - collection of data providers to QGIS - architecture-independent f qgis-server - QGIS server providing various OGC services Changes: qgis (3.4.12+dfsg-1~bpo10+1) buster-backports; urgency=medium . * Rebuild for buster-backports. . qgis (3.4.12+dfsg-1) unstable; urgency=medium . * Update symbols for other architectures. * Move from experimental to unstable. . qgis (3.4.12+dfsg-1~exp1) experimental; urgency=medium . * New upstream release. * Update symbols for amd64. . qgis (3.4.11+dfsg-2) unstable; urgency=medium . * Update packaging for GRASS 7.8.0. * Update symbols for other architectures. 
* Update watch file to use query string for cache avoidance. Checksums-Sha1: c0aa5323f9bd33a0de6d4c13a2fb4eed9bf9c245 4707 qgis_3.4.12+dfsg-1~bpo10+1.dsc 2a2f1671b7524be5a16cbaa27356f4c5f4093ca8 267104 qgis_3.4.12+dfsg-1~bpo10+1.debian.tar.xz 1305815111e0b8bb566a2670b4ff1d9bbc70480d 10435500 libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 62884aaa850bfa94b9f1e33c5e1c945361255bc2 2142116 libqgis-3d3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb b7e4adf1c2ea7739f5392eecb308b0e7b34fdb63 50371212 libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb d07234726569cbbcfdb3f5d7277e9bbc73c76e90 2713592 libqgis-analysis3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb f33ba5503845c6be717590e9a2e51441545e8b42 113141060 libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 5de409a830eda47828ba7b5c18eff6a36e30ec37 4673800 libqgis-app3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 2897848e73b9422605302326d83b6664d036f21b 157516180 libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb d1be5978d7d506fce52df3ac7704ed968d12001d 6279144 libqgis-core3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 414513511a3dfec5501cb53138448f23f1f82a85 4976436 libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 6843ce4e87deb83e473e11622ac82992d8e26bbd 5402664 libqgis-customwidgets_3.4.12+dfsg-1~bpo10+1_amd64.deb 059b2d62763a183b649cd0ee02d7accbf66ff60f 2931340 libqgis-dev_3.4.12+dfsg-1~bpo10+1_amd64.deb f8bdcc5b6f88d73f94cab4048199796e82005f7a 148596140 libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 3f2d367638f6bd5d902a53143b23f12459b0dd7e 4804128 libqgis-gui3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 203cb6c4fb0037ab6dd03e82ecc2b0bc83b764f4 603464 libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 4bcb0d9ffc572d0bbde6230377c9500f03eea417 2002192 libqgis-native3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb cec9115ca3201cb90619c2614d6d5d89390f5838 6224876 libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 3d28ad94a267f13e9a87e9f51c876b8f92dae2ad 2133288 libqgis-server3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 577ddb36bbfb13f0c4d71734578c18f53a8e6148 4912828 libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb ee565727ef8228997ec5a63731ddff59d07cc047 2184200 libqgisgrass7-3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb dd95f7e776ab0cc2416f9732826009db22d5081f 387844 libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb e45402356329d9e7f7686744777dc61dc23ac98a 2003856 libqgispython3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 7fcddc562bc003da6ac391584dd66e632df712ea 4313892 python3-qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb cd2a3528a9d5c45f184bbf3a996cc24448c744a8 41657184 python3-qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 897721d8408f5680060b83b62853586a476d7b89 9105300 python3-qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb 8ef0effc4bf64507c3a1763c7a51cd3e2b64d5ee 996481608 qgis-api-doc_3.4.12+dfsg-1~bpo10+1_all.deb 2c5a5e800354c57ea084c47a602d6a18b83f48f5 11810512 qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb 7730f34276081319bf6e9a28d5aef616b0c4cb50 23615696 qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb a4397817837041fdca9d1b29cba14b51b4d9299c 2463656 qgis-plugin-grass-common_3.4.12+dfsg-1~bpo10+1_all.deb c1e9da3f185f1fe712b62960ff19b84973cae432 11255276 qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 2995e3ef8a442d64f8604229fba9634d9302b0ff 2555488 qgis-plugin-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb 20fe50a132909972ca4af5b417fa570786533cd8 1663664 qgis-provider-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 18e6cd5abdddeda811e5a35b8a9f4230e7fbbaa6 2051932 qgis-provider-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb 
5da94bf6df8d6e80674cbe370aff45c53180f4d7 2935388 qgis-providers-common_3.4.12+dfsg-1~bpo10+1_all.deb be2839be7f0eb9c803bf1e7858ce190fb817392b 67558868 qgis-providers-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 623f664c8341488e04b820afc4e4910553eb3554 3840304 qgis-providers_3.4.12+dfsg-1~bpo10+1_amd64.deb 6047a43fff8db960b1ab586dd699285aac9a77b7 12706840 qgis-server-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 142ff6a614305ed400119a0e233b79440482f7de 2465080 qgis-server_3.4.12+dfsg-1~bpo10+1_amd64.deb 4680e1011a99eb2e5f574e9bbe5c10d1345046f4 35784 qgis_3.4.12+dfsg-1~bpo10+1_amd64.buildinfo 50e9f38cc403c92ee07893270beb24a084606e8e 6804516 qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb Checksums-Sha256: dcd4e7e8109c54bea302de42864188a8b826fd0aa06bfabeffc20d7c0baefc8c 4707 qgis_3.4.12+dfsg-1~bpo10+1.dsc e4209ef6534ee572007e50cefe43562bf2beb1fd0be3b8c0999268a8c5cab462 267104 qgis_3.4.12+dfsg-1~bpo10+1.debian.tar.xz 2680e9fd9a8ac36a810eec63c1b2d886977626e48f9d81c6a8969cf4cede91d5 10435500 libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 7e39bf5be8e841df75bba5e204cbfd3659274e008708b9a4fb29b76bda6936cc 2142116 libqgis-3d3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb fdad7dc7afa4fafd71b953659cb71cb7839e5c60a30e6094dd6730e04291aaa5 50371212 libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 0a0d521cba4f5b2bf1d4d11c9a3d98a47fccec0d777c2586840f63a73a110349 2713592 libqgis-analysis3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb e1e8bdf0b32148e4add966ece1c046623bee5d48d19a53a25357620f83fd8474 113141060 libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 53823f5c320537f20de30b58c1e5a9a07d1b3040e18d0bf4f2c30599640edcd0 4673800 libqgis-app3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb ca5170de264900df287010a86f08212fbfd9ca58063d2c7618c61d5dba02b449 157516180 libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 51e6b97ceab080394fe8fa9706f37e327c64108a99167883389259574a38e10f 6279144 libqgis-core3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb cb0cbe7a53f03569c7a4456da3fb80bbee3203e57b9ba2711e9d7957b84af72c 4976436 libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb a5950982e06e7ecebb09ef21c1c6ee8697a385c8530b107cd819aefeb8601a7c 5402664 libqgis-customwidgets_3.4.12+dfsg-1~bpo10+1_amd64.deb 5cc319ed693b9c94d0ace6f05f47ab52aa4c6bfd57303f6b872eac4d826eb22f 2931340 libqgis-dev_3.4.12+dfsg-1~bpo10+1_amd64.deb c219a8809fc3199a4cf7ef80d8b278cf5bea1e309362c387532ebd7cbf5fe45c 148596140 libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb e6c6b94a8f830923a61844e67257c63c3fb9744bf711b35d1c169175ed9e1538 4804128 libqgis-gui3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 6e5151c2de7dbfb07fdf40b2194f5761a32a52df7ad6f03ea6a8156318ce07c1 603464 libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 77788324f96798a213f724c7f68bdd69a96169cce551949fdadaeb1586c2ffdb 2002192 libqgis-native3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 1173b2411dbf687347773ec1e85ba75f2f2b7a7b140a198ea575084722aaff63 6224876 libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 0e36e755d73fd932f88fe279806ec9f8f0a9ad883367912e2a31efc61fac47d6 2133288 libqgis-server3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb cc7dfea9d201d5921bfa46383bda39f1cc4e1d8df1cab4fa21ccedf470d062fd 4912828 libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 07eba9c981dc8f231ede3d5b96c2c276b6fc741769bcb647de143e092b06f778 2184200 libqgisgrass7-3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 286f5b954d7ed9ba28821066bd2ccc8e0c43d02b773dd9a88b58a5e25a061ae1 387844 libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb d98b9b06637e7f73fb12cee0623449a7f1ac7787b42c4d17334d5411d392bf5e 2003856 
libqgispython3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 7f6aef6f39ba82294adf241c656aa37ec0da7f2359063ed2b441a206c8c27939 4313892 python3-qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb e9ed5a6956b83227e05ee371a77ff63e26fea8456a494d511c007bf0b949191b 41657184 python3-qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb d606efc8d470443604f5cc9fc37312a71a897932f52c0e5cbc69de46a463eb41 9105300 python3-qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb 8cbdd665d02fc6039db9237effa19a644020558d2291a804dd7356ec4f10d119 996481608 qgis-api-doc_3.4.12+dfsg-1~bpo10+1_all.deb 4ec5b8a05f168dbde2365d10f8c78867bd922c5564a65568c064cabfb1935bcc 11810512 qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb 5b22a2c8739f9253a952533d8b3f9315235aaef59847b49b953a4da7ce32209e 23615696 qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb d0849c10fd7c432b1a51bc5cee23177ea8f81fa0aaa9da4ee893463265b9d0cd 2463656 qgis-plugin-grass-common_3.4.12+dfsg-1~bpo10+1_all.deb de8609c79dcde8dde33e95068c6762454c24b161493f411a3a96aebfc87c5397 11255276 qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb c42e58dafe5bc741c09283ac4a159fca5ae78994bda3256c1e4c6014ae38975e 2555488 qgis-plugin-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb de5d012558f575f62999fcfcfe4a13cef6289d56e24d8ad158a7ff5e6af3da69 1663664 qgis-provider-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 983ee147f5025114e1a7010ced015ec967f3b94137c8f48266659450a5e9b9ef 2051932 qgis-provider-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb f9df4d55924debec0f9778badbaafa8e251a060f9584a676bc40d03846acdcc0 2935388 qgis-providers-common_3.4.12+dfsg-1~bpo10+1_all.deb 0382dc692443c594be5d2641c4bb4f747769e6d8bbd568afba6352563dee5d8a 67558868 qgis-providers-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb bfbb35619e410a33670af5d8be2b58902e276d280ec525f36e2f44d05124fc24 3840304 qgis-providers_3.4.12+dfsg-1~bpo10+1_amd64.deb b5e93e4c0d29771693be1ba915186e8354bed0c02afc180e4d7386e656ac1d73 12706840 qgis-server-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 537a8073f416076c01719defea71b255ba110460b459d8da2f1fef41794bc833 2465080 qgis-server_3.4.12+dfsg-1~bpo10+1_amd64.deb 9391b8272626845f5144137a02ed17e4674b0d547fbf1afaaffe8fbc88f1ab18 35784 qgis_3.4.12+dfsg-1~bpo10+1_amd64.buildinfo 12909eba1ac170e6959340fd7dad54e0f6dcf8bc9aeb2499f7f9784f631bc5b4 6804516 qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb Files: 86004f7855f3a61f9d7b43439cb70ad1 4707 science optional qgis_3.4.12+dfsg-1~bpo10+1.dsc 35f96f110ea126c436cdb30564078621 267104 science optional qgis_3.4.12+dfsg-1~bpo10+1.debian.tar.xz 783544182cd5c856d303b93215ea6f71 10435500 debug optional libqgis-3d3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb b5f6f4fc40418672f5a9b05363e6cc77 2142116 libs optional libqgis-3d3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb a6f84447b4865fc512c81448e1a8f71c 50371212 debug optional libqgis-analysis3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 36b613be84820bb0aca5a7c949fa4351 2713592 libs optional libqgis-analysis3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb e662d2110094a8aab622fd82a4b19ca0 113141060 debug optional libqgis-app3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 46243f163e2878a74bb54ce9b59250a9 4673800 libs optional libqgis-app3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb b433e02404f49c271e9397861dd2e3fa 157516180 debug optional libqgis-core3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb a2a73dc56d1873d2af467e8c1bf5711b 6279144 libs optional libqgis-core3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb f8d6c6075304e9c94d463cbfddf8ef93 4976436 debug optional libqgis-customwidgets-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 3b87a25ee331af05c08ef2f5359f3c75 5402664 science optional 
libqgis-customwidgets_3.4.12+dfsg-1~bpo10+1_amd64.deb 9d3198208e4486dcd2966f83954da309 2931340 libdevel optional libqgis-dev_3.4.12+dfsg-1~bpo10+1_amd64.deb 57dc42f94c767cd0a5c4a1c76e9130c9 148596140 debug optional libqgis-gui3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 8a5a65311053d7c9549a558a26939158 4804128 libs optional libqgis-gui3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 344642e282eea3f81b2cd4a948b26a74 603464 debug optional libqgis-native3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 89c6dc49d3a7a69588980d96df138611 2002192 libs optional libqgis-native3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb 8ec7ab934083a77322fc7ec14d636f1b 6224876 debug optional libqgis-server3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 3d2499fe51f0f220d3ee16f516c4f81d 2133288 libs optional libqgis-server3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb a465f79a7fd8432996c5af091ab4af6a 4912828 debug optional libqgisgrass7-3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb c181e8078fd47201eba83bb38c582cab 2184200 libs optional libqgisgrass7-3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb f57134f95defc89b9af94a88ccdfd4f9 387844 debug optional libqgispython3.4.12-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 3d2e64322638c8ba5ea209f4bf94f49d 2003856 libs optional libqgispython3.4.12_3.4.12+dfsg-1~bpo10+1_amd64.deb e084c1fdeac000ec50ef80df4adb4b5c 4313892 python optional python3-qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb 911f66879dedaf5dd8e5049d4d28486b 41657184 debug optional python3-qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb a873ff242da78e9761b9032aaadd31e4 9105300 python optional python3-qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb 780b5be72fcf46c794fea982b05a830e 996481608 doc optional qgis-api-doc_3.4.12+dfsg-1~bpo10+1_all.deb 0f5ba43aeca1d6e7117d75de980dad69 11810512 science optional qgis-common_3.4.12+dfsg-1~bpo10+1_all.deb 5114ce2e4a232ff20a4e12b766c53f5c 23615696 debug optional qgis-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 0303a0361a252ede63f995b73d8108d2 2463656 science optional qgis-plugin-grass-common_3.4.12+dfsg-1~bpo10+1_all.deb aecc0f0ec8cca186ae71ed8140e7e6aa 11255276 debug optional qgis-plugin-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 1a419f2b6dc5aa2d47e61d955b86e52f 2555488 science optional qgis-plugin-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb 4f1a690853cf66566fff1568c07c2276 1663664 debug optional qgis-provider-grass-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb af7912474020340e1e4eefee8ab26d4b 2051932 science optional qgis-provider-grass_3.4.12+dfsg-1~bpo10+1_amd64.deb c3b3b6080ee14c311b0f5da34c0786db 2935388 science optional qgis-providers-common_3.4.12+dfsg-1~bpo10+1_all.deb 16739e200534ca939e32b788b3e9f6ac 67558868 debug optional qgis-providers-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb fd0fc8344a21416e1422a93eca87be36 3840304 science optional qgis-providers_3.4.12+dfsg-1~bpo10+1_amd64.deb 5addba9fa2480c810d99302bb4a67442 12706840 debug optional qgis-server-dbgsym_3.4.12+dfsg-1~bpo10+1_amd64.deb 43f800df78baada9cbbe74417c042277 2465080 science optional qgis-server_3.4.12+dfsg-1~bpo10+1_amd64.deb 1748e79464674c8f8af103efd3965b98 35784 science optional qgis_3.4.12+dfsg-1~bpo10+1_amd64.buildinfo 127ddaa92a29b6f5dcdab56898480807 6804516 science optional qgis_3.4.12+dfsg-1~bpo10+1_amd64.deb -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2HQNUACgkQZ1DxCuiN SvEJYBAAihq9Z7Q4TCMVYSSYH+zJsBvun2JLNsnjdi+MQP4N11oWkmo4GWsMRSoo IKo9Ux0YYO5UBdr2LIWAVQ/f4lFWggNOJwRsS6Z4DuuPr8emzTcMDnpddkSDdY1+ C6AsV45WDsj9JjHcOO6nBUEJgVr00n65af4jYP6NXA/x56aZhC/frcOoQFdYwm+4 fJGinojIgOIwjg+uTiUWbjQ9u1etKYB6eMLBFmvhAEJl9iEZpLcGVHVulDb/tzst 
xN+sSQHeD/7qj+Sb2jqtR//QOdx/L+wDU5FVNg7ceIeVG6eCDVKyZ0Gdgyebnv8w 5FJLahfjzWfQgRXFnxbuBFFPyDWAMSrX84/VnUOYyUUiCfvB6e3fgGpZ9isLpE1G y/C99195C3U1tjJEeNwXam4RzOsOyc3MeHi528B7+KIrNkzTOaTbjS3Hcji+fkDJ 6UVHI88Wn3xZe1Yi8qc6aTH53QRULVLluoPbhpuLe03Dsz6Fa99z6WbxRQto7Ia9 Qu41YobM/RpcqpxnUCwf91hyjq1fZy58+cm2+QdrKVgWvXjnawi+TdiBylHZLN2/ eBgueFh3m4PsR+j56f5PQtfpcbKo0P5CtRR+VFnf9iY9qO1Ad/Sihjc+lTdvAEUM Gv9owK9Q/jcWgVD/9tAooBs7tJRDXTtehIWIABX6x20UOlgM76k= =7Qwt -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From sales at bonesca.nl Mon Sep 23 15:45:31 2019 From: sales at bonesca.nl (Bonesca Sales) Date: Mon, 23 Sep 2019 17:45:31 +0300 Subject: Guanaco / Lama Meat Message-ID: <76PjwXCmEV_6ss1w-newsletter@synergia.data.lt> An HTML attachment was scrubbed... URL: From noreply at release.debian.org Tue Sep 24 05:39:14 2019 From: noreply at release.debian.org (Debian testing watch) Date: Tue, 24 Sep 2019 04:39:14 +0000 Subject: routino 3.3.2-1 MIGRATED to testing Message-ID: FYI: The status of the routino source package in Debian's testing distribution has changed. Previous version: 3.3.1-1 Current version: 3.3.2-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Tue Sep 24 05:39:14 2019 From: noreply at release.debian.org (Debian testing watch) Date: Tue, 24 Sep 2019 04:39:14 +0000 Subject: python-snuggs 1.4.7-1 MIGRATED to testing Message-ID: FYI: The status of the python-snuggs source package in Debian's testing distribution has changed. Previous version: 1.4.6-2 Current version: 1.4.7-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Wed Sep 25 05:39:19 2019 From: noreply at release.debian.org (Debian testing watch) Date: Wed, 25 Sep 2019 04:39:19 +0000 Subject: python-pdal 2.2.2+ds-1 MIGRATED to testing Message-ID: FYI: The status of the python-pdal source package in Debian's testing distribution has changed. Previous version: 2.2.1+ds-1 Current version: 2.2.2+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Thu Sep 26 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 26 Sep 2019 04:39:21 +0000 Subject: pyninjotiff 0.2.0-1 MIGRATED to testing Message-ID: FYI: The status of the pyninjotiff source package in Debian's testing distribution has changed. Previous version: 0.1.0-1 Current version: 0.2.0-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Thu Sep 26 05:39:21 2019 From: noreply at release.debian.org (Debian testing watch) Date: Thu, 26 Sep 2019 04:39:21 +0000 Subject: python-pyproj 2.4.0+ds-1 MIGRATED to testing Message-ID: FYI: The status of the python-pyproj source package in Debian's testing distribution has changed. 
Previous version: 2.3.1+ds-1 Current version: 2.4.0+ds-1 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From noreply at release.debian.org Fri Sep 27 05:39:20 2019 From: noreply at release.debian.org (Debian testing watch) Date: Fri, 27 Sep 2019 04:39:20 +0000 Subject: pywps 4.2.1-4 MIGRATED to testing Message-ID: FYI: The status of the pywps source package in Debian's testing distribution has changed. Previous version: 4.2.1-3 Current version: 4.2.1-4 -- This email is automatically generated once a day. As the installation of new packages into testing happens multiple times a day you will receive later changes on the next day. See https://release.debian.org/testing-watch/ for more information. From gitlab at salsa.debian.org Fri Sep 27 06:35:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:35:32 +0000 Subject: [Git][debian-gis-team/pywps][master] 6 commits: New upstream version 4.2.2 Message-ID: <5d8d9fa4d00b_1efe3f977ec5fc241166c8@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pywps Commits: 1fd0970c by Bas Couwenberg at 2019-09-27T05:11:41Z New upstream version 4.2.2 - - - - - f27faae9 by Bas Couwenberg at 2019-09-27T05:11:45Z Update upstream source from tag 'upstream/4.2.2' Update to upstream version '4.2.2' with Debian dir d8d7c33c116277d080556901f39f9e120a7e5308 - - - - - dd31219e by Bas Couwenberg at 2019-09-27T05:12:18Z New upstream release. - - - - - e4045c80 by Bas Couwenberg at 2019-09-27T05:16:53Z Update URLs to use HTTPS. - - - - - 35f70e8a by Bas Couwenberg at 2019-09-27T05:17:57Z Drop python-3.7.patch, applied upstream. Refresh remaining patches. - - - - - 3bd4eb57 by Bas Couwenberg at 2019-09-27T05:18:11Z Set distribution to unstable. - - - - - 30 changed files: - CODE_OF_CONDUCT.md - CONTRIBUTING.rst - README.md - − RELEASE-howto.md - VERSION.txt - debian/changelog - debian/control - debian/patches/offline-tests.patch - − debian/patches/python-3.7.patch - debian/patches/series - debian/upstream/metadata - default-sample.cfg - docs/api.rst - docs/conf.py - docs/configuration.rst - + docs/contributing.rst - docs/demobuffer.py - docs/deployment.rst - − docs/development.rst - docs/index.rst - docs/install.rst - + docs/metalinkprocess.py - docs/migration.rst - docs/process.rst - docs/pywps.rst - + docs/show_error.py - docs/wps.rst - pywps/__init__.py - pywps/app/Common.py - pywps/app/Process.py The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/14c60e0f2e892f2b7331ef863a36b5b44afa7796...3bd4eb5743089ff4d71ddc5f5b5bd79c730471f9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/compare/14c60e0f2e892f2b7331ef863a36b5b44afa7796...3bd4eb5743089ff4d71ddc5f5b5bd79c730471f9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 06:35:33 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:35:33 +0000 Subject: [Git][debian-gis-team/pywps][pristine-tar] pristine-tar data for pywps_4.2.2.orig.tar.gz Message-ID: <5d8d9fa562549_1efe3f977ec5fc2411682a@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / pywps Commits: d5f60a59 by Bas Couwenberg at 2019-09-27T05:11:44Z pristine-tar data for pywps_4.2.2.orig.tar.gz - - - - - 2 changed files: - + pywps_4.2.2.orig.tar.gz.delta - + pywps_4.2.2.orig.tar.gz.id Changes: ===================================== pywps_4.2.2.orig.tar.gz.delta ===================================== Binary files /dev/null and b/pywps_4.2.2.orig.tar.gz.delta differ ===================================== pywps_4.2.2.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +66c730fbc1a2cc29c48cfb169d197675e6e8eee7 View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/commit/d5f60a59bc9d116de3be562691819988112af5b4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/commit/d5f60a59bc9d116de3be562691819988112af5b4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 06:35:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:35:34 +0000 Subject: [Git][debian-gis-team/pywps][upstream] New upstream version 4.2.2 Message-ID: <5d8d9fa66a076_1efe2b1553ee71481170f0@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / pywps Commits: 1fd0970c by Bas Couwenberg at 2019-09-27T05:11:41Z New upstream version 4.2.2 - - - - - 30 changed files: - CODE_OF_CONDUCT.md - CONTRIBUTING.rst - README.md - − RELEASE-howto.md - VERSION.txt - debian/changelog - debian/control - default-sample.cfg - docs/api.rst - docs/conf.py - docs/configuration.rst - + docs/contributing.rst - docs/demobuffer.py - docs/deployment.rst - − docs/development.rst - docs/index.rst - docs/install.rst - + docs/metalinkprocess.py - docs/migration.rst - docs/process.rst - docs/pywps.rst - + docs/show_error.py - docs/wps.rst - pywps/__init__.py - pywps/app/Common.py - pywps/app/Process.py - pywps/app/Service.py - pywps/app/WPSRequest.py - + pywps/app/exceptions.py - pywps/configuration.py The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/commit/1fd0970cfa518a96baa5aab6256af3e61cdb5984 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/commit/1fd0970cfa518a96baa5aab6256af3e61cdb5984 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 06:35:37 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:35:37 +0000 Subject: [Git][debian-gis-team/pywps] Pushed new tag debian/4.2.2-1 Message-ID: <5d8d9fa9bb0a9_1efe3f977ef7aa1c11723f@godard.mail> Bas Couwenberg pushed new tag debian/4.2.2-1 at Debian GIS Project / pywps -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/tree/debian/4.2.2-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 06:35:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:35:38 +0000 Subject: [Git][debian-gis-team/pywps] Pushed new tag upstream/4.2.2 Message-ID: <5d8d9faab8985_1efe2b155368714c117480@godard.mail> Bas Couwenberg pushed new tag upstream/4.2.2 at Debian GIS Project / pywps -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/tree/upstream/4.2.2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 27 06:40:30 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 05:40:30 +0000 Subject: Processing of pywps_4.2.2-1_source.changes Message-ID: pywps_4.2.2-1_source.changes uploaded successfully to localhost along with the files: pywps_4.2.2-1.dsc pywps_4.2.2.orig.tar.gz pywps_4.2.2-1.debian.tar.xz pywps_4.2.2-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 27 06:49:14 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 05:49:14 +0000 Subject: pywps_4.2.2-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 27 Sep 2019 07:18:00 +0200 Source: pywps Architecture: source Version: 4.2.2-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: pywps (4.2.2-1) unstable; urgency=medium . * New upstream release. * Update URLs to use HTTPS. * Drop python-3.7.patch, applied upstream. Refresh remaining patches. Checksums-Sha1: 4e878efea765bc69cb4846b9adb7e3cf793a9b62 2357 pywps_4.2.2-1.dsc 2bd3b9075b29379080ea40e56ea736272686243e 367007 pywps_4.2.2.orig.tar.gz 8ccfa26734a1e7557df8bd539bf8202938ef553f 11480 pywps_4.2.2-1.debian.tar.xz 03e46cf10f88111667d07c31d6bd986cbfe745f5 11695 pywps_4.2.2-1_amd64.buildinfo Checksums-Sha256: 163d7adc9d4d356b68d027f1703128e8352faab1863cb19a4dd516609513de76 2357 pywps_4.2.2-1.dsc 83098a1fb72f28b6fe775c624e0c2dd3cbc077e3f2c04ae805eedb9e91365098 367007 pywps_4.2.2.orig.tar.gz 394ea821eff901e0b50d1a950914ed32ef50d4c1a0c638edcfdf059d9edfacb9 11480 pywps_4.2.2-1.debian.tar.xz b9a73e7aa230fbdc7e2b3a3bce48707aa8294a72969326d4fc3ac65f4b94f7dc 11695 pywps_4.2.2-1_amd64.buildinfo Files: c5e02fbd4881f2058d842300901810c5 2357 python optional pywps_4.2.2-1.dsc fd9c2660963ba73213e61803d2734bb4 367007 python optional pywps_4.2.2.orig.tar.gz 42ca075dadcd824c358796df25d97615 11480 python optional pywps_4.2.2-1.debian.tar.xz c21c20b60a17782d5092fa5d4b6e7b4b 11695 python optional pywps_4.2.2-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2NnlcACgkQZ1DxCuiN SvEc2w/+Iq1V9ncpMkYXXalTKfRapbNwu3HNRuaatFHZc1koeWT/04PsgWGuB920 ci/b1kuiNxLRiWqrFUSh0wXFHSCvF4Fxgr3UXoGir3LOsosyGoRcO/TyXXFmJo58 11JCn5ETr5C5rcEdKYtzuHTrKtkvawXSWMhCUFEUy4wmPPqfmJKLktREpcWBLP0a CSmG1GxV/afXrpSfqCBcioBWG05LuotgDm29cgv9YJChtPGJHl5Yf/tV8ESY9Rpl mjoCN5mRhwjFQMQhU9aGD7bHognwafJtjPOYorNLpaYoDyya2R4vSbf7j8+QHAxj wnagdvGmeFZOB/AY7WYaTdsk/XOD2wN840sPS/zXtKnAHzzwtUawiTpAMzZjzQRl Yk/3g9KGT1MWuDQyxi6dDG8jU/bviXZ+jgPRId5q2D4Y9u84eydIzEszoShldnPx IBNFU1zdr+OP8qeHmmXqVx1ONB5hjeoYaAuifRdR4N+i2Ypaj8UOwM2i/SO9XMF5 EZ5PAhn+iBliuBigPkNjOQ9qLBiCQyITCeLMgP8YH8qBolUMKvVCz6XOov8w9eHl Tn9e2bjvzauf/tAnL0Mn5nY3WVkPvJRNXVJOfk5oV494QFw+5nU3dHMrze+/i+Dl 
gWoYWfZvVX+wGYcPP8dJu2GEyhEUl8Z0Lw1VTfkLXKN2aKcoEXo= =ID2D -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Fri Sep 27 06:52:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:52:05 +0000 Subject: [Git][debian-gis-team/fiona][master] 5 commits: New upstream version 1.8.7 Message-ID: <5d8da385a3e4_1efe2b155368714c117924@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / fiona Commits: 6f44711a by Bas Couwenberg at 2019-09-27T05:36:18Z New upstream version 1.8.7 - - - - - e627c390 by Bas Couwenberg at 2019-09-27T05:36:26Z New upstream version 1.8.8 - - - - - ee645c26 by Bas Couwenberg at 2019-09-27T05:36:28Z Update upstream source from tag 'upstream/1.8.8' Update to upstream version '1.8.8' with Debian dir 6ad12912ce2b186a51cb966a766ce4203a2eea7e - - - - - 29723457 by Bas Couwenberg at 2019-09-27T05:36:48Z New upstream release. - - - - - e0e0a03d by Bas Couwenberg at 2019-09-27T05:37:57Z Set distribution to unstable. - - - - - 7 changed files: - CHANGES.txt - debian/changelog - fiona/__init__.py - fiona/_geometry.pyx - fiona/path.py - + tests/data/!test.geojson - tests/test_open.py Changes: ===================================== CHANGES.txt ===================================== @@ -3,6 +3,32 @@ Changes All issue numbers are relative to https://github.com/Toblerity/Fiona/issues. +1.8.8 (2019-09-25) +------------------ + +- The schema of geopackage files with a geometry type code of 3000 could not be + reported using Fiona 1.8.7. This bug is fixed. + +1.8.7 (2019-09-24) +------------------ + +Bug fixes: + +- Regression in handling of polygons with M values noted under version 1.8.5 + below was in fact not fixed then (see new report #789), but is fixed in + version 1.8.7. +- Windows filenames containing "!" are now parsed correctly, fixing issue #742. + +Upcoming changes: + +- In version 1.9.0, the objects yielded when a Collection is iterated will be + mutable mappings but will no longer be instances of Python's dict. Version + 1.9 is intended to be backwards compatible with 1.8 except where user code + tests `isinstance(feature, dict)`. In version 2.0 the new Feature, Geometry, + and Properties classes will become immutable mappings. See + https://github.com/Toblerity/fiona-rfc/blob/master/rfc/0001-fiona-2-0-changes.md + for more discussion of the upcoming changes for version 2.0. + 1.8.6 (2019-03-18) ------------------ ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +fiona (1.8.8-1) unstable; urgency=medium + + * Team upload. + * New upstream release. + + -- Bas Couwenberg Fri, 27 Sep 2019 07:37:43 +0200 + fiona (1.8.6-3) unstable; urgency=medium * Team upload. ===================================== fiona/__init__.py ===================================== @@ -101,7 +101,7 @@ import uuid __all__ = ['bounds', 'listlayers', 'open', 'prop_type', 'prop_width'] -__version__ = "1.8.6" +__version__ = "1.8.8" __gdal_version__ = get_gdal_release_name() gdal_version = get_gdal_version_tuple() ===================================== fiona/_geometry.pyx ===================================== @@ -69,8 +69,10 @@ cdef unsigned int geometry_type_code(name) except? 9999: cdef object normalize_geometry_type_code(unsigned int code): """Normalize M geometry type codes.""" # Normalize 'M' types to 2D types. - if 2000 < code < 3000: + if 2000 <= code < 3000: code = code % 1000 + elif code == 3000: + code = 0 # Normalize 'ZM' types to 3D types. 
elif 3000 < code < 4000: code = (code % 1000) | 0x80000000 ===================================== fiona/path.py ===================================== @@ -132,7 +132,7 @@ def parse_path(path): elif path.startswith('/vsi'): return UnparsedPath(path) - else: + elif re.match("^[a-z0-9\\+]*://", path): parts = urlparse(path) # if the scheme is not one of Rasterio's supported schemes, we @@ -143,6 +143,9 @@ def parse_path(path): else: return ParsedPath.from_uri(path) + else: + return UnparsedPath(path) + def vsi_path(path): """Convert a parsed path to a GDAL VSI path ===================================== tests/data/!test.geojson ===================================== @@ -0,0 +1 @@ +{"features":[{"geometry":{"coordinates":[[[[-61.173214300000005,12.516654800000001],[-61.3827217,12.5301363],[-61.665747100000004,12.5966532],[-61.6661847,12.596],[-61.66814250000001,12.593],[-61.6700247,12.59],[-61.6718337,12.587],[-61.673571700000004,12.584],[-61.6752407,12.581],[-61.6768427,12.578],[-61.678379400000004,12.575000000000001],[-61.6803295,12.571],[-61.6830501,12.565000000000001],[-61.68553430000001,12.559000000000001],[-61.687063699999996,12.555000000000001],[-61.6884946,12.551],[-61.6898391,12.546999999999999],[-61.69209600000001,12.540999999999999],[-61.69413360000001,12.535],[-61.69595870000001,12.529],[-61.697577200000005,12.523],[-61.69899410000001,12.517],[-61.700213700000006,12.511],[-61.7012395,12.505],[-61.7020744,12.499],[-61.702626200000005,12.494],[-61.7033841,12.493],[-61.706211800000005,12.491],[-61.7089415,12.489],[-61.7141311,12.485000000000001],[-61.718995500000005,12.481],[-61.72356890000001,12.477],[-61.727879200000004,12.473],[-61.7319495,12.469000000000001],[-61.73579920000001,12.465000000000002],[-61.74032590000001,12.46],[-61.74373590000001,12.456000000000001],[-61.746971,12.452000000000002],[-61.7500412,12.447999999999999],[-61.75295580000001,12.443999999999999],[-61.753784499999995,12.443],[-61.756858300000005,12.44],[-61.7598054,12.437],[-61.762633400000006,12.434],[-61.76534870000001,12.431],[-61.767957200000005,12.427999999999999],[-61.7704641,12.425],[-61.7728741,12.422],[-61.775191500000005,12.419],[-61.7774201,12.416],[-61.7802595,12.412],[-61.782954800000006,12.408],[-61.78551270000001,12.404],[-61.7873446,12.401],[-61.789675900000006,12.397],[-61.7918847,12.393],[-61.79397550000001,12.389000000000001],[-61.794998400000004,12.388],[-61.79830060000001,12.386000000000001],[-61.8030062,12.383000000000001],[-61.8059936,12.381],[-61.810272399999995,12.378],[-61.8130009,12.376000000000001],[-61.815637599999995,12.374],[-61.8181882,12.372000000000002],[-61.82186339999999,12.369000000000002],[-61.8265048,12.365000000000002],[-61.830876599999996,12.361],[-61.8329692,12.359000000000002],[-61.835999,12.356000000000002],[-61.8413082,12.351],[-61.845319800000006,12.347],[-61.8464439,12.346],[-61.8501187,12.343],[-61.853625699999995,12.34],[-61.85697739999999,12.337],[-61.86122339999999,12.333],[-61.864252900000004,12.33],[-61.8671584,12.327],[-61.8699469,12.324],[-61.872645999999996,12.321],[-61.8754727,12.318],[-61.87906749999999,12.314],[-61.8833,12.309000000000001],[-61.88726319999999,12.304],[-61.88952,12.301],[-61.891690399999995,12.297999999999998],[-61.8937778,12.295],[-61.895785200000006,12.292],[-61.89771530000001,12.289],[-61.899570800000006,12.286],[-61.90251490000001,12.280999999999999],[-61.904753,12.277],[-61.9068719,12.273],[-61.908875900000005,12.269],[-61.911674299999994,12.263],[-61.9134062,12.259],[-61.9150578,12.255],[-61.9179797,12.248999999999999],[-61.920656900000
004,12.242999999999999],[-61.92290190000001,12.238999999999999],[-61.925082,12.235],[-61.92666,12.232],[-61.9286637,12.227999999999998],[-61.930556100000004,12.223999999999998],[-61.9332651,12.217999999999998],[-61.936145100000005,12.212],[-61.938782200000006,12.206],[-61.943587599999994,12.193999999999999],[-61.94511500000001,12.19],[-61.9465439,12.186],[-61.9485074,12.18],[-61.95028749999999,12.174],[-61.95186999999999,12.168],[-61.9532519,12.162],[-61.95443739999999,12.156],[-61.954975999999995,12.154],[-61.9570107,12.147999999999998],[-61.9594482,12.139999999999999],[-61.961132600000006,12.133999999999999],[-61.962614,12.127999999999998],[-61.96295200000001,12.126999999999999],[-61.9668105,12.122],[-61.9704259,12.116999999999999],[-61.9738135,12.112],[-61.9769866,12.107],[-61.9799566,12.102],[-61.9827336,12.097],[-61.9853262,12.092],[-61.9882048,12.086],[-61.990875800000005,12.08],[-61.99252880000001,12.076],[-61.994819,12.07],[-61.996888999999996,12.064],[-61.99874590000001,12.058],[-62.000395600000004,12.052000000000001],[-62.0018433,12.046],[-62.0030933,12.04],[-62.003818700000004,12.036],[-62.0047472,12.03],[-62.0052609,12.026],[-62.005875200000006,12.02],[-62.0061812,12.016],[-62.0064861,12.01],[-62.0065868,12.006],[-62.006584499999995,12],[-62.006398100000006,11.994],[-62.0061714,11.99],[-62.0056768,11.984],[-62.0052436,11.98],[-62.004436999999996,11.974],[-62.003794,11.97],[-62.0026693,11.964],[-62.001811399999994,11.96],[-62.0003595,11.954],[-61.999279800000004,11.950000000000001],[-61.9974886,11.943999999999999],[-61.9961776,11.94],[-61.9940313,11.934],[-61.9924772,11.93],[-61.9908218,11.926],[-61.989062399999995,11.922],[-61.9871961,11.918],[-61.984707699999994,11.913],[-61.9825882,11.909],[-61.9803498,11.905000000000001],[-61.9773776,11.9],[-61.9748543,11.896],[-61.972195400000004,11.892000000000001],[-61.9693945,11.888],[-61.9664442,11.884],[-61.9641286,11.881],[-61.9617206,11.878],[-61.959215900000004,11.875000000000002],[-61.9557177,11.871],[-61.9520267,11.867],[-61.9496952,11.864],[-61.94728729999999,11.861],[-61.9430571,11.856000000000002],[-61.93853550000001,11.851],[-61.934690599999996,11.847],[-61.9306255,11.843],[-61.9274208,11.84],[-61.922921800000005,11.836],[-61.9193636,11.833],[-61.9156332,11.83],[-61.911715,11.827],[-61.9075906,11.824],[-61.903238,11.821],[-61.89863020000001,11.818],[-61.8937341,11.815000000000001],[-61.888507499999996,11.812000000000001],[-61.88481339999999,11.81],[-61.8789067,11.807],[-61.87468659999999,11.805000000000001],[-61.870200499999996,11.803],[-61.86540230000001,11.801],[-61.8602301,11.799],[-61.854597299999995,11.796999999999999],[-61.848375600000004,11.795],[-61.84498479999999,11.793999999999999],[-61.8413608,11.793],[-61.8374527,11.792],[-61.8331873,11.790999999999999],[-61.828452500000004,11.79],[-61.8230605,11.789],[-61.81664609999999,11.787999999999998],[-61.808274399999995,11.786999999999999],[-61.790283900000006,11.786],[-61.7840631,11.786],[-61.76607270000001,11.786999999999999],[-61.7573236,11.787999999999998],[-61.73933300000001,11.789],[-61.730961300000004,11.79],[-61.72079310000001,11.790999999999999],[-61.70280230000001,11.792],[-61.6944305,11.793],[-61.688016000000005,11.793999999999999],[-61.6826238,11.795],[-61.6732043,11.796999999999999],[-61.667812100000006,11.797999999999998],[-61.663077200000004,11.799],[-61.6588117,11.8],[-61.654903499999996,11.801],[-61.6512793,11.802000000000001],[-61.64788839999999,11.803],[-61.644693499999995,11.804],[-61.63878580000001,11.806000000000001],[-61.636033600000005,11.807],[-61.63
08613,11.809000000000001],[-61.62841970000001,11.81],[-61.623784,11.812000000000001],[-61.621576700000006,11.813],[-61.6173564,11.815000000000001],[-61.6133668,11.817],[-61.60957990000001,11.819],[-61.6042318,11.822000000000001],[-61.6008621,11.824],[-61.5976324,11.826],[-61.5930244,11.829],[-61.590096,11.831],[-61.5872727,11.833],[-61.585739600000004,11.834],[-61.5816382,11.836],[-61.5758831,11.839],[-61.5705345,11.842],[-61.565532900000015,11.845],[-61.56375870000001,11.846],[-61.55785109999999,11.849],[-61.552374300000004,11.852000000000002],[-61.5472238,11.855000000000002],[-61.543925300000005,11.857000000000001],[-61.5407605,11.859000000000002],[-61.5362404,11.862000000000002],[-61.533365200000006,11.864],[-61.530591300000005,11.866000000000001],[-61.52791200000001,11.868],[-61.5240577,11.871],[-61.5215904,11.873000000000001],[-61.5180317,11.876000000000001],[-61.515748300000006,11.878],[-61.5124482,11.881],[-61.5103269,11.883000000000001],[-61.5072563,11.886000000000001],[-61.49835930000001,11.895000000000001],[-61.494617399999996,11.899000000000001],[-61.4902146,11.904],[-61.48533390000001,11.909],[-61.4816423,11.913],[-61.47729749999999,11.918],[-61.4732301,11.923],[-61.4694197,11.927999999999999],[-61.464353900000006,11.935],[-61.4615887,11.939],[-61.458328800000004,11.943999999999999],[-61.4552762,11.949],[-61.452420399999994,11.954],[-61.450271099999995,11.958],[-61.4482377,11.962000000000002],[-61.4454269,11.967],[-61.4438039,11.97],[-61.4412339,11.975000000000001],[-61.4388714,11.979000000000001],[-61.436091399999995,11.984],[-61.43451230000001,11.987],[-61.4329978,11.99],[-61.4310762,11.994],[-61.428396400000004,12],[-61.42595080000001,12.006],[-61.423730500000005,12.012],[-61.42313910000001,12.013],[-61.4211047,12.016],[-61.41851320000001,12.02],[-61.4166569,12.023],[-61.41487299999999,12.026],[-61.4131591,12.029],[-61.4109797,12.033],[-61.409422,12.036],[-61.40744449999999,12.04],[-61.405577300000004,12.043999999999999],[-61.4038171,12.047999999999998],[-61.402161,12.052000000000001],[-61.399865999999996,12.058],[-61.39845880000001,12.062000000000001],[-61.3971473,12.066],[-61.3959295,12.07],[-61.394275,12.076],[-61.393596300000006,12.078],[-61.3910564,12.081],[-61.38782199999999,12.085],[-61.3855047,12.088],[-61.382552100000005,12.092],[-61.3804362,12.095],[-61.37774039999999,12.099],[-61.3758089,12.102],[-61.3733493,12.106],[-61.37158839999999,12.109],[-61.369348200000005,12.113],[-61.367746499999996,12.116],[-61.36571180000001,12.12],[-61.3637893,12.123999999999999],[-61.3619754,12.127999999999998],[-61.360267099999994,12.132],[-61.35866139999999,12.136],[-61.35643999999999,12.142],[-61.35508039999999,12.145999999999999],[-61.3538123,12.15],[-61.3520543,12.156],[-61.3509963,12.16],[-61.35002769999999,12.164],[-61.3487397,12.17],[-61.34798939999999,12.174],[-61.34732449999999,12.177999999999999],[-61.3464859,12.184],[-61.346031399999994,12.187999999999999],[-61.345660099999996,12.192],[-61.3452579,12.197999999999999],[-61.3450925,12.202],[-61.3450091,12.206],[-61.345007599999995,12.209999999999999],[-61.345087899999996,12.213999999999999],[-61.3452502,12.217999999999998],[-61.345494599999995,12.222],[-61.34582149999999,12.225999999999999],[-61.3462313,12.229999999999999],[-61.347002499999995,12.235999999999999],[-61.34762189999999,12.239999999999998],[-61.3483264,12.243999999999998],[-61.3495448,12.25],[-61.345779300000004,12.253],[-61.3421596,12.256],[-61.339838,12.258],[-61.3364839,12.261],[-61.333273999999996,12.264],[-61.330199,12.267],[-61.3272505,12.27],[-61.32442109
9999995,12.273],[-61.3217043,12.276],[-61.31824699999999,12.28],[-61.3157713,12.283],[-61.3126179,12.286999999999999],[-61.30962449999999,12.290999999999999],[-61.3067826,12.295],[-61.30408469999999,12.299],[-61.301524099999995,12.303],[-61.300435,12.304],[-61.297963700000004,12.306000000000001],[-61.29439910000001,12.309000000000001],[-61.29211200000002,12.311],[-61.2888064,12.314],[-61.286681699999995,12.316],[-61.283606,12.319],[-61.280656900000004,12.322000000000001],[-61.277826999999995,12.325000000000001],[-61.27510960000001,12.328],[-61.272499,12.331],[-61.26999000000001,12.334],[-61.267577800000005,12.337],[-61.26525820000001,12.34],[-61.262302800000015,12.344],[-61.260184900000006,12.347],[-61.25748639999999,12.351],[-61.256502700000006,12.352000000000002],[-61.25251010000001,12.355000000000002],[-61.248711400000005,12.358],[-61.245090100000006,12.361],[-61.2416324,12.364],[-61.23832620000001,12.367],[-61.235161000000005,12.370000000000001],[-61.23212780000001,12.373000000000001],[-61.22827520000001,12.377],[-61.22552040000001,12.38],[-61.22287430000001,12.383000000000001],[-61.220331400000006,12.386000000000001],[-61.2170932,12.39],[-61.2147732,12.393],[-61.2118172,12.397],[-61.209698800000005,12.4],[-61.206999800000006,12.404],[-61.205066,12.407],[-61.20258810000001,12.411],[-61.2008015,12.414],[-61.199085100000005,12.417],[-61.19743690000001,12.42],[-61.19585520000001,12.423],[-61.19433810000001,12.426],[-61.19241330000001,12.43],[-61.1897291,12.436],[-61.1872793,12.442],[-61.18505530000001,12.447999999999999],[-61.1836941,12.452000000000002],[-61.1824277,12.456000000000001],[-61.18125439999999,12.46],[-61.180172500000005,12.464],[-61.1791805,12.468],[-61.178277,12.472000000000001],[-61.1770853,12.478],[-61.17603230000001,12.484],[-61.175387900000004,12.488],[-61.1745797,12.494],[-61.1741456,12.498],[-61.1737945,12.502],[-61.173526,12.506],[-61.17333980000001,12.51],[-61.17323580000001,12.514],[-61.173214300000005,12.516654800000001]]]],"type":"MultiPolygon"},"id":550727,"osm_type":"relation","type":"Feature","name":"Grenada","properties":{"flag":"http://upload.wikimedia.org/wikipedia/commons/b/bc/Flag_of_Grenada.svg","name":"Grenada","name:cs":"Grenada","name:de":"Grenada","name:en":"Grenada","name:eo":"Grenado","name:fr":"Grenade","name:fy":"Grenada","name:hr":"Grenada","name:nl":"Grenada","name:ru":"Гренада","name:sl":"Grenada","name:ta":"கிரெனடா","name:uk":"Гренада","boundary":"administrative","name:tzl":"Grenada","timezone":"America/Grenada","wikidata":"Q769","ISO3166-1":"GD","wikipedia":"en:Grenada","admin_level":"2","is_in:continent":"North America","ISO3166-1:alpha2":"GD","ISO3166-1:alpha3":"GRD","ISO3166-1:numeric":"308"}}],"type":"FeatureCollection","geocoding":{"creation_date":"2016-10-12","generator":{"author":{"name":"Mapzen"},"package":"fences-builder","version":"0.1.2"},"license":"ODbL (see http://www.openstreetmap.org/copyright)"}} ===================================== tests/test_open.py ===================================== @@ -1,8 +1,15 @@ """Tests of file opening""" +import os import fiona def test_open_shp(path_coutwildrnp_shp): """Open a shapefile""" assert fiona.open(path_coutwildrnp_shp) + + +def test_open_filename_with_exclamation(data_dir): + path = os.path.relpath(os.path.join(data_dir, "!test.geojson")) + assert os.path.exists(path), "Missing test data" + assert fiona.open(path), "Failed to open !test.geojson" View it on GitLab: 
https://salsa.debian.org/debian-gis-team/fiona/compare/e285e491122b975197eaf60bc682cfa67c0f72c4...e0e0a03d16305576210ed223f61061901d2fb98c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/e285e491122b975197eaf60bc682cfa67c0f72c4...e0e0a03d16305576210ed223f61061901d2fb98c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 06:52:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:52:06 +0000 Subject: [Git][debian-gis-team/fiona][pristine-tar] 2 commits: pristine-tar data for fiona_1.8.7.orig.tar.gz Message-ID: <5d8da3868497d_1efe3f97848f257c118190@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / fiona Commits: b024559c by Bas Couwenberg at 2019-09-27T05:36:19Z pristine-tar data for fiona_1.8.7.orig.tar.gz - - - - - 418cb681 by Bas Couwenberg at 2019-09-27T05:36:28Z pristine-tar data for fiona_1.8.8.orig.tar.gz - - - - - 4 changed files: - + fiona_1.8.7.orig.tar.gz.delta - + fiona_1.8.7.orig.tar.gz.id - + fiona_1.8.8.orig.tar.gz.delta - + fiona_1.8.8.orig.tar.gz.id Changes: ===================================== fiona_1.8.7.orig.tar.gz.delta ===================================== Binary files /dev/null and b/fiona_1.8.7.orig.tar.gz.delta differ ===================================== fiona_1.8.7.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +15346ff9ce29805054d7b09332d9b31e4ff308c7 ===================================== fiona_1.8.8.orig.tar.gz.delta ===================================== Binary files /dev/null and b/fiona_1.8.8.orig.tar.gz.delta differ ===================================== fiona_1.8.8.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +962c2425bcd7804f9d81e044b470b132b04ec39c View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/0dce200e287c993ed8952887fe7a0a7b4948dd2f...418cb68128d9454e831ec2688e5aefcbde5c4619 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/0dce200e287c993ed8952887fe7a0a7b4948dd2f...418cb68128d9454e831ec2688e5aefcbde5c4619 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 06:52:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:52:08 +0000 Subject: [Git][debian-gis-team/fiona][upstream] 2 commits: New upstream version 1.8.7 Message-ID: <5d8da3884dce9_1efe2b1553ee7148118315@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / fiona Commits: 6f44711a by Bas Couwenberg at 2019-09-27T05:36:18Z New upstream version 1.8.7 - - - - - e627c390 by Bas Couwenberg at 2019-09-27T05:36:26Z New upstream version 1.8.8 - - - - - 6 changed files: - CHANGES.txt - fiona/__init__.py - fiona/_geometry.pyx - fiona/path.py - + tests/data/!test.geojson - tests/test_open.py Changes: ===================================== CHANGES.txt ===================================== @@ -3,6 +3,32 @@ Changes All issue numbers are relative to https://github.com/Toblerity/Fiona/issues. +1.8.8 (2019-09-25) +------------------ + +- The schema of geopackage files with a geometry type code of 3000 could not be + reported using Fiona 1.8.7. This bug is fixed. 
+ +1.8.7 (2019-09-24) +------------------ + +Bug fixes: + +- Regression in handling of polygons with M values noted under version 1.8.5 + below was in fact not fixed then (see new report #789), but is fixed in + version 1.8.7. +- Windows filenames containing "!" are now parsed correctly, fixing issue #742. + +Upcoming changes: + +- In version 1.9.0, the objects yielded when a Collection is iterated will be + mutable mappings but will no longer be instances of Python's dict. Version + 1.9 is intended to be backwards compatible with 1.8 except where user code + tests `isinstance(feature, dict)`. In version 2.0 the new Feature, Geometry, + and Properties classes will become immutable mappings. See + https://github.com/Toblerity/fiona-rfc/blob/master/rfc/0001-fiona-2-0-changes.md + for more discussion of the upcoming changes for version 2.0. + 1.8.6 (2019-03-18) ------------------ ===================================== fiona/__init__.py ===================================== @@ -101,7 +101,7 @@ import uuid __all__ = ['bounds', 'listlayers', 'open', 'prop_type', 'prop_width'] -__version__ = "1.8.6" +__version__ = "1.8.8" __gdal_version__ = get_gdal_release_name() gdal_version = get_gdal_version_tuple() ===================================== fiona/_geometry.pyx ===================================== @@ -69,8 +69,10 @@ cdef unsigned int geometry_type_code(name) except? 9999: cdef object normalize_geometry_type_code(unsigned int code): """Normalize M geometry type codes.""" # Normalize 'M' types to 2D types. - if 2000 < code < 3000: + if 2000 <= code < 3000: code = code % 1000 + elif code == 3000: + code = 0 # Normalize 'ZM' types to 3D types. elif 3000 < code < 4000: code = (code % 1000) | 0x80000000 ===================================== fiona/path.py ===================================== @@ -132,7 +132,7 @@ def parse_path(path): elif path.startswith('/vsi'): return UnparsedPath(path) - else: + elif re.match("^[a-z0-9\\+]*://", path): parts = urlparse(path) # if the scheme is not one of Rasterio's supported schemes, we @@ -143,6 +143,9 @@ def parse_path(path): else: return ParsedPath.from_uri(path) + else: + return UnparsedPath(path) + def vsi_path(path): """Convert a parsed path to a GDAL VSI path ===================================== tests/data/!test.geojson ===================================== @@ -0,0 +1 @@ 
+{"features":[{"geometry":{"coordinates":[[[[-61.173214300000005,12.516654800000001],[-61.3827217,12.5301363],[-61.665747100000004,12.5966532],[-61.6661847,12.596],[-61.66814250000001,12.593],[-61.6700247,12.59],[-61.6718337,12.587],[-61.673571700000004,12.584],[-61.6752407,12.581],[-61.6768427,12.578],[-61.678379400000004,12.575000000000001],[-61.6803295,12.571],[-61.6830501,12.565000000000001],[-61.68553430000001,12.559000000000001],[-61.687063699999996,12.555000000000001],[-61.6884946,12.551],[-61.6898391,12.546999999999999],[-61.69209600000001,12.540999999999999],[-61.69413360000001,12.535],[-61.69595870000001,12.529],[-61.697577200000005,12.523],[-61.69899410000001,12.517],[-61.700213700000006,12.511],[-61.7012395,12.505],[-61.7020744,12.499],[-61.702626200000005,12.494],[-61.7033841,12.493],[-61.706211800000005,12.491],[-61.7089415,12.489],[-61.7141311,12.485000000000001],[-61.718995500000005,12.481],[-61.72356890000001,12.477],[-61.727879200000004,12.473],[-61.7319495,12.469000000000001],[-61.73579920000001,12.465000000000002],[-61.74032590000001,12.46],[-61.74373590000001,12.456000000000001],[-61.746971,12.452000000000002],[-61.7500412,12.447999999999999],[-61.75295580000001,12.443999999999999],[-61.753784499999995,12.443],[-61.756858300000005,12.44],[-61.7598054,12.437],[-61.762633400000006,12.434],[-61.76534870000001,12.431],[-61.767957200000005,12.427999999999999],[-61.7704641,12.425],[-61.7728741,12.422],[-61.775191500000005,12.419],[-61.7774201,12.416],[-61.7802595,12.412],[-61.782954800000006,12.408],[-61.78551270000001,12.404],[-61.7873446,12.401],[-61.789675900000006,12.397],[-61.7918847,12.393],[-61.79397550000001,12.389000000000001],[-61.794998400000004,12.388],[-61.79830060000001,12.386000000000001],[-61.8030062,12.383000000000001],[-61.8059936,12.381],[-61.810272399999995,12.378],[-61.8130009,12.376000000000001],[-61.815637599999995,12.374],[-61.8181882,12.372000000000002],[-61.82186339999999,12.369000000000002],[-61.8265048,12.365000000000002],[-61.830876599999996,12.361],[-61.8329692,12.359000000000002],[-61.835999,12.356000000000002],[-61.8413082,12.351],[-61.845319800000006,12.347],[-61.8464439,12.346],[-61.8501187,12.343],[-61.853625699999995,12.34],[-61.85697739999999,12.337],[-61.86122339999999,12.333],[-61.864252900000004,12.33],[-61.8671584,12.327],[-61.8699469,12.324],[-61.872645999999996,12.321],[-61.8754727,12.318],[-61.87906749999999,12.314],[-61.8833,12.309000000000001],[-61.88726319999999,12.304],[-61.88952,12.301],[-61.891690399999995,12.297999999999998],[-61.8937778,12.295],[-61.895785200000006,12.292],[-61.89771530000001,12.289],[-61.899570800000006,12.286],[-61.90251490000001,12.280999999999999],[-61.904753,12.277],[-61.9068719,12.273],[-61.908875900000005,12.269],[-61.911674299999994,12.263],[-61.9134062,12.259],[-61.9150578,12.255],[-61.9179797,12.248999999999999],[-61.920656900000004,12.242999999999999],[-61.92290190000001,12.238999999999999],[-61.925082,12.235],[-61.92666,12.232],[-61.9286637,12.227999999999998],[-61.930556100000004,12.223999999999998],[-61.9332651,12.217999999999998],[-61.936145100000005,12.212],[-61.938782200000006,12.206],[-61.943587599999994,12.193999999999999],[-61.94511500000001,12.19],[-61.9465439,12.186],[-61.9485074,12.18],[-61.95028749999999,12.174],[-61.95186999999999,12.168],[-61.9532519,12.162],[-61.95443739999999,12.156],[-61.954975999999995,12.154],[-61.9570107,12.147999999999998],[-61.9594482,12.139999999999999],[-61.961132600000006,12.133999999999999],[-61.962614,12.127999999999998],[-61.96295200000001,12.126999999
999999],[-61.9668105,12.122],[-61.9704259,12.116999999999999],[-61.9738135,12.112],[-61.9769866,12.107],[-61.9799566,12.102],[-61.9827336,12.097],[-61.9853262,12.092],[-61.9882048,12.086],[-61.990875800000005,12.08],[-61.99252880000001,12.076],[-61.994819,12.07],[-61.996888999999996,12.064],[-61.99874590000001,12.058],[-62.000395600000004,12.052000000000001],[-62.0018433,12.046],[-62.0030933,12.04],[-62.003818700000004,12.036],[-62.0047472,12.03],[-62.0052609,12.026],[-62.005875200000006,12.02],[-62.0061812,12.016],[-62.0064861,12.01],[-62.0065868,12.006],[-62.006584499999995,12],[-62.006398100000006,11.994],[-62.0061714,11.99],[-62.0056768,11.984],[-62.0052436,11.98],[-62.004436999999996,11.974],[-62.003794,11.97],[-62.0026693,11.964],[-62.001811399999994,11.96],[-62.0003595,11.954],[-61.999279800000004,11.950000000000001],[-61.9974886,11.943999999999999],[-61.9961776,11.94],[-61.9940313,11.934],[-61.9924772,11.93],[-61.9908218,11.926],[-61.989062399999995,11.922],[-61.9871961,11.918],[-61.984707699999994,11.913],[-61.9825882,11.909],[-61.9803498,11.905000000000001],[-61.9773776,11.9],[-61.9748543,11.896],[-61.972195400000004,11.892000000000001],[-61.9693945,11.888],[-61.9664442,11.884],[-61.9641286,11.881],[-61.9617206,11.878],[-61.959215900000004,11.875000000000002],[-61.9557177,11.871],[-61.9520267,11.867],[-61.9496952,11.864],[-61.94728729999999,11.861],[-61.9430571,11.856000000000002],[-61.93853550000001,11.851],[-61.934690599999996,11.847],[-61.9306255,11.843],[-61.9274208,11.84],[-61.922921800000005,11.836],[-61.9193636,11.833],[-61.9156332,11.83],[-61.911715,11.827],[-61.9075906,11.824],[-61.903238,11.821],[-61.89863020000001,11.818],[-61.8937341,11.815000000000001],[-61.888507499999996,11.812000000000001],[-61.88481339999999,11.81],[-61.8789067,11.807],[-61.87468659999999,11.805000000000001],[-61.870200499999996,11.803],[-61.86540230000001,11.801],[-61.8602301,11.799],[-61.854597299999995,11.796999999999999],[-61.848375600000004,11.795],[-61.84498479999999,11.793999999999999],[-61.8413608,11.793],[-61.8374527,11.792],[-61.8331873,11.790999999999999],[-61.828452500000004,11.79],[-61.8230605,11.789],[-61.81664609999999,11.787999999999998],[-61.808274399999995,11.786999999999999],[-61.790283900000006,11.786],[-61.7840631,11.786],[-61.76607270000001,11.786999999999999],[-61.7573236,11.787999999999998],[-61.73933300000001,11.789],[-61.730961300000004,11.79],[-61.72079310000001,11.790999999999999],[-61.70280230000001,11.792],[-61.6944305,11.793],[-61.688016000000005,11.793999999999999],[-61.6826238,11.795],[-61.6732043,11.796999999999999],[-61.667812100000006,11.797999999999998],[-61.663077200000004,11.799],[-61.6588117,11.8],[-61.654903499999996,11.801],[-61.6512793,11.802000000000001],[-61.64788839999999,11.803],[-61.644693499999995,11.804],[-61.63878580000001,11.806000000000001],[-61.636033600000005,11.807],[-61.6308613,11.809000000000001],[-61.62841970000001,11.81],[-61.623784,11.812000000000001],[-61.621576700000006,11.813],[-61.6173564,11.815000000000001],[-61.6133668,11.817],[-61.60957990000001,11.819],[-61.6042318,11.822000000000001],[-61.6008621,11.824],[-61.5976324,11.826],[-61.5930244,11.829],[-61.590096,11.831],[-61.5872727,11.833],[-61.585739600000004,11.834],[-61.5816382,11.836],[-61.5758831,11.839],[-61.5705345,11.842],[-61.565532900000015,11.845],[-61.56375870000001,11.846],[-61.55785109999999,11.849],[-61.552374300000004,11.852000000000002],[-61.5472238,11.855000000000002],[-61.543925300000005,11.857000000000001],[-61.5407605,11.859000000000002],[-61.5362404,11.86200000
0000002],[-61.533365200000006,11.864],[-61.530591300000005,11.866000000000001],[-61.52791200000001,11.868],[-61.5240577,11.871],[-61.5215904,11.873000000000001],[-61.5180317,11.876000000000001],[-61.515748300000006,11.878],[-61.5124482,11.881],[-61.5103269,11.883000000000001],[-61.5072563,11.886000000000001],[-61.49835930000001,11.895000000000001],[-61.494617399999996,11.899000000000001],[-61.4902146,11.904],[-61.48533390000001,11.909],[-61.4816423,11.913],[-61.47729749999999,11.918],[-61.4732301,11.923],[-61.4694197,11.927999999999999],[-61.464353900000006,11.935],[-61.4615887,11.939],[-61.458328800000004,11.943999999999999],[-61.4552762,11.949],[-61.452420399999994,11.954],[-61.450271099999995,11.958],[-61.4482377,11.962000000000002],[-61.4454269,11.967],[-61.4438039,11.97],[-61.4412339,11.975000000000001],[-61.4388714,11.979000000000001],[-61.436091399999995,11.984],[-61.43451230000001,11.987],[-61.4329978,11.99],[-61.4310762,11.994],[-61.428396400000004,12],[-61.42595080000001,12.006],[-61.423730500000005,12.012],[-61.42313910000001,12.013],[-61.4211047,12.016],[-61.41851320000001,12.02],[-61.4166569,12.023],[-61.41487299999999,12.026],[-61.4131591,12.029],[-61.4109797,12.033],[-61.409422,12.036],[-61.40744449999999,12.04],[-61.405577300000004,12.043999999999999],[-61.4038171,12.047999999999998],[-61.402161,12.052000000000001],[-61.399865999999996,12.058],[-61.39845880000001,12.062000000000001],[-61.3971473,12.066],[-61.3959295,12.07],[-61.394275,12.076],[-61.393596300000006,12.078],[-61.3910564,12.081],[-61.38782199999999,12.085],[-61.3855047,12.088],[-61.382552100000005,12.092],[-61.3804362,12.095],[-61.37774039999999,12.099],[-61.3758089,12.102],[-61.3733493,12.106],[-61.37158839999999,12.109],[-61.369348200000005,12.113],[-61.367746499999996,12.116],[-61.36571180000001,12.12],[-61.3637893,12.123999999999999],[-61.3619754,12.127999999999998],[-61.360267099999994,12.132],[-61.35866139999999,12.136],[-61.35643999999999,12.142],[-61.35508039999999,12.145999999999999],[-61.3538123,12.15],[-61.3520543,12.156],[-61.3509963,12.16],[-61.35002769999999,12.164],[-61.3487397,12.17],[-61.34798939999999,12.174],[-61.34732449999999,12.177999999999999],[-61.3464859,12.184],[-61.346031399999994,12.187999999999999],[-61.345660099999996,12.192],[-61.3452579,12.197999999999999],[-61.3450925,12.202],[-61.3450091,12.206],[-61.345007599999995,12.209999999999999],[-61.345087899999996,12.213999999999999],[-61.3452502,12.217999999999998],[-61.345494599999995,12.222],[-61.34582149999999,12.225999999999999],[-61.3462313,12.229999999999999],[-61.347002499999995,12.235999999999999],[-61.34762189999999,12.239999999999998],[-61.3483264,12.243999999999998],[-61.3495448,12.25],[-61.345779300000004,12.253],[-61.3421596,12.256],[-61.339838,12.258],[-61.3364839,12.261],[-61.333273999999996,12.264],[-61.330199,12.267],[-61.3272505,12.27],[-61.324421099999995,12.273],[-61.3217043,12.276],[-61.31824699999999,12.28],[-61.3157713,12.283],[-61.3126179,12.286999999999999],[-61.30962449999999,12.290999999999999],[-61.3067826,12.295],[-61.30408469999999,12.299],[-61.301524099999995,12.303],[-61.300435,12.304],[-61.297963700000004,12.306000000000001],[-61.29439910000001,12.309000000000001],[-61.29211200000002,12.311],[-61.2888064,12.314],[-61.286681699999995,12.316],[-61.283606,12.319],[-61.280656900000004,12.322000000000001],[-61.277826999999995,12.325000000000001],[-61.27510960000001,12.328],[-61.272499,12.331],[-61.26999000000001,12.334],[-61.267577800000005,12.337],[-61.26525820000001,12.34],[-61.262302800000015,12.344],[-61
.260184900000006,12.347],[-61.25748639999999,12.351],[-61.256502700000006,12.352000000000002],[-61.25251010000001,12.355000000000002],[-61.248711400000005,12.358],[-61.245090100000006,12.361],[-61.2416324,12.364],[-61.23832620000001,12.367],[-61.235161000000005,12.370000000000001],[-61.23212780000001,12.373000000000001],[-61.22827520000001,12.377],[-61.22552040000001,12.38],[-61.22287430000001,12.383000000000001],[-61.220331400000006,12.386000000000001],[-61.2170932,12.39],[-61.2147732,12.393],[-61.2118172,12.397],[-61.209698800000005,12.4],[-61.206999800000006,12.404],[-61.205066,12.407],[-61.20258810000001,12.411],[-61.2008015,12.414],[-61.199085100000005,12.417],[-61.19743690000001,12.42],[-61.19585520000001,12.423],[-61.19433810000001,12.426],[-61.19241330000001,12.43],[-61.1897291,12.436],[-61.1872793,12.442],[-61.18505530000001,12.447999999999999],[-61.1836941,12.452000000000002],[-61.1824277,12.456000000000001],[-61.18125439999999,12.46],[-61.180172500000005,12.464],[-61.1791805,12.468],[-61.178277,12.472000000000001],[-61.1770853,12.478],[-61.17603230000001,12.484],[-61.175387900000004,12.488],[-61.1745797,12.494],[-61.1741456,12.498],[-61.1737945,12.502],[-61.173526,12.506],[-61.17333980000001,12.51],[-61.17323580000001,12.514],[-61.173214300000005,12.516654800000001]]]],"type":"MultiPolygon"},"id":550727,"osm_type":"relation","type":"Feature","name":"Grenada","properties":{"flag":"http://upload.wikimedia.org/wikipedia/commons/b/bc/Flag_of_Grenada.svg","name":"Grenada","name:cs":"Grenada","name:de":"Grenada","name:en":"Grenada","name:eo":"Grenado","name:fr":"Grenade","name:fy":"Grenada","name:hr":"Grenada","name:nl":"Grenada","name:ru":"Гренада","name:sl":"Grenada","name:ta":"கிரெனடா","name:uk":"Гренада","boundary":"administrative","name:tzl":"Grenada","timezone":"America/Grenada","wikidata":"Q769","ISO3166-1":"GD","wikipedia":"en:Grenada","admin_level":"2","is_in:continent":"North America","ISO3166-1:alpha2":"GD","ISO3166-1:alpha3":"GRD","ISO3166-1:numeric":"308"}}],"type":"FeatureCollection","geocoding":{"creation_date":"2016-10-12","generator":{"author":{"name":"Mapzen"},"package":"fences-builder","version":"0.1.2"},"license":"ODbL (see http://www.openstreetmap.org/copyright)"}} ===================================== tests/test_open.py ===================================== @@ -1,8 +1,15 @@ """Tests of file opening""" +import os import fiona def test_open_shp(path_coutwildrnp_shp): """Open a shapefile""" assert fiona.open(path_coutwildrnp_shp) + + +def test_open_filename_with_exclamation(data_dir): + path = os.path.relpath(os.path.join(data_dir, "!test.geojson")) + assert os.path.exists(path), "Missing test data" + assert fiona.open(path), "Failed to open !test.geojson" View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/bb7dc8b8919051eb37be7d4492883e25fc146a01...e627c39016e858e6342e2d7ea58a705522790efb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/compare/bb7dc8b8919051eb37be7d4492883e25fc146a01...e627c39016e858e6342e2d7ea58a705522790efb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 06:52:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:52:17 +0000 Subject: [Git][debian-gis-team/fiona] Pushed new tag debian/1.8.8-1 Message-ID: <5d8da39157bc_1efe3f977ef7aa1c118510@godard.mail> Bas Couwenberg pushed new tag debian/1.8.8-1 at Debian GIS Project / fiona -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/tree/debian/1.8.8-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 06:52:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:52:17 +0000 Subject: [Git][debian-gis-team/fiona] Pushed new tag upstream/1.8.7 Message-ID: <5d8da391e0fa0_1efe3f977ef7aa1c1187dc@godard.mail> Bas Couwenberg pushed new tag upstream/1.8.7 at Debian GIS Project / fiona -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/tree/upstream/1.8.7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 06:52:18 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 05:52:18 +0000 Subject: [Git][debian-gis-team/fiona] Pushed new tag upstream/1.8.8 Message-ID: <5d8da392b8f8e_1efe3f97848f257c11890@godard.mail> Bas Couwenberg pushed new tag upstream/1.8.8 at Debian GIS Project / fiona -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/tree/upstream/1.8.8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 27 06:55:30 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 05:55:30 +0000 Subject: Processing of fiona_1.8.8-1_source.changes Message-ID: fiona_1.8.8-1_source.changes uploaded successfully to localhost along with the files: fiona_1.8.8-1.dsc fiona_1.8.8.orig.tar.gz fiona_1.8.8-1.debian.tar.xz fiona_1.8.8-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 27 07:04:10 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 06:04:10 +0000 Subject: fiona_1.8.8-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 27 Sep 2019 07:37:43 +0200 Source: fiona Architecture: source Version: 1.8.8-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: fiona (1.8.8-1) unstable; urgency=medium . * Team upload. * New upstream release. 
Checksums-Sha1: 7eb46087434b89007f9ac8295962ffec9b3b9bf6 2237 fiona_1.8.8-1.dsc 8ab9704fea1ac7af2930dbed97a6bb4acc64339c 244768 fiona_1.8.8.orig.tar.gz 6ae2856f802c15ceeb8015d7556bddd957d41f8c 29680 fiona_1.8.8-1.debian.tar.xz 36e8aebd9302ec8a2cf91ad1c4948be8d7323fb5 14374 fiona_1.8.8-1_amd64.buildinfo Checksums-Sha256: 5a18a569ab6f9c46c42e1285c6e729842f2399811f818cda80a6fee456b1c915 2237 fiona_1.8.8-1.dsc 2072010f72953bd1005a5b92fac84725ce8cf77a2581bac0dfe99613fa0d57c8 244768 fiona_1.8.8.orig.tar.gz 1fad5580b9e96a0efb79c47e983f7c260d119f7acf4090a807dbef145eeb39b2 29680 fiona_1.8.8-1.debian.tar.xz 01440226fe2d545ef817d7501113a37ca67537d5ae4802c08fa42151fe466018 14374 fiona_1.8.8-1_amd64.buildinfo Files: ffd298f7f514a08221fbc4c2fc0d250d 2237 python optional fiona_1.8.8-1.dsc 176e1bcc77e7c04510d00c3adedd18b7 244768 python optional fiona_1.8.8.orig.tar.gz 2ff59b826a47a891128ded9c54562cfb 29680 python optional fiona_1.8.8-1.debian.tar.xz 8cbeda7a8e547a92aa5dc5cc6bd51b1e 14374 python optional fiona_1.8.8-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2Non8ACgkQZ1DxCuiN SvH9RBAAuBNrhOJYyWIudsu03GpVZbTpDZWNgjh4LUxqQ7dQ/DydnDf1wlIKNcbc wbOKvX5ryNAMcrNu3AQxtrnpEvBSCt8yP+q/k1vV8uozz9GyaFQwJoNyAuL4sSl7 bHOK5bjiFq4dKwbpGINWzM7LqALuifg50uR/FCH8/1iEwQ28IC/lyKlFmCjaJ0hP gBEJuMr1gTKcnJf6g+07BRpicuGiHTw+6D4h8+IKFPUagzTXAic1hGDQhttY8h/6 Ex0x/BqI3+4ZCXvpSGRwUFclEG9QE9rJLLP+C3Rb8GzOfmTazxSgB+FBOmBNgKD0 x2ACH+4WyLktl+ipxSd2Mfj/3kfmalrfMlQSoV/Ka8ZUfPaePNfoX7MOBOYmecqA tQkXFu3nPJTDqwX7sfRTHKTCrngMoQ5kl/bs8XKQYxdON6cawqw8TPsF5Li5tnXH BDQBZJA0lUWveBGRz8VfHzc200w5KK9NJAmxGN28k7D26GWgA42HITVRlEEpx6d1 V5zDIwLvaAV+Me/5qryrd2lV8LO9p/nwZOFlB32VSM06/DR06gQ98432RRe6oBea RjcBzTXllhMZaxXAaRnwEufir0l2hGiNN0I6DZrS4n9eI4s8IxhQ/7C6bklCXixy 5IQrKVlEMNFNT1i2Zn/MPjoPhclT1prhAhGAMX0QOuTDLfU5XS0= =5MTo -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Fri Sep 27 07:50:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 06:50:41 +0000 Subject: [Git][debian-gis-team/geographiclib][master] 9 commits: New upstream version 1.50 Message-ID: <5d8db141903f0_1efe2b155368714c122233@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / geographiclib Commits: b8ff1d6d by Bas Couwenberg at 2019-09-27T06:00:53Z New upstream version 1.50 - - - - - df2820d4 by Bas Couwenberg at 2019-09-27T06:01:03Z Update upstream source from tag 'upstream/1.50' Update to upstream version '1.50' with Debian dir aab6eaede1d6cead572f44b8f027ee89de74fafd - - - - - 7bda1600 by Bas Couwenberg at 2019-09-27T06:04:22Z New upstream release. - - - - - 13f90efa by Bas Couwenberg at 2019-09-27T06:08:39Z Update copyright years for Charles Karney. - - - - - 1d9f1162 by Bas Couwenberg at 2019-09-27T06:11:08Z Drop doxygen.patch, applied upstream. Refresh remaining patches. - - - - - d669489a by Bas Couwenberg at 2019-09-27T06:12:59Z Rename library package for SONAME bump. - - - - - a3336e17 by Bas Couwenberg at 2019-09-27T06:32:56Z Install Node.js module in /usr/share/nodejs. - - - - - eb1dcd47 by Bas Couwenberg at 2019-09-27T06:32:56Z Update symbols for amd64. - - - - - f525a4f2 by Bas Couwenberg at 2019-09-27T06:32:56Z Set distribution to experimental. 
- - - - - 25 changed files: - CMakeLists.txt - LICENSE.txt - Makefile.in - NEWS - aclocal.m4 - cmake/CMakeLists.txt - cmake/Makefile.in - cmake/project-config-version.cmake.in - cmake/project-config.cmake.in - compile - config.guess - config.sub - configure - configure.ac - debian/changelog - debian/control - debian/copyright - debian/libgeographic17.install → debian/libgeographic19.install - debian/libgeographic17.symbols → debian/libgeographic19.symbols - debian/node-geographiclib.install - − debian/patches/doxygen.patch - debian/patches/privacy.patch - debian/patches/series - debian/rules - depcomp The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/compare/f69a71a5763ca3300e44668ebfa3954a269fa709...f525a4f237982587a22434a380dbaa4948b677b1 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/compare/f69a71a5763ca3300e44668ebfa3954a269fa709...f525a4f237982587a22434a380dbaa4948b677b1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 07:50:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 06:50:43 +0000 Subject: [Git][debian-gis-team/geographiclib][pristine-tar] pristine-tar data for geographiclib_1.50.orig.tar.gz Message-ID: <5d8db14313af0_1efe2b155368714c122464@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / geographiclib Commits: b922a249 by Bas Couwenberg at 2019-09-27T06:01:03Z pristine-tar data for geographiclib_1.50.orig.tar.gz - - - - - 2 changed files: - + geographiclib_1.50.orig.tar.gz.delta - + geographiclib_1.50.orig.tar.gz.id Changes: ===================================== geographiclib_1.50.orig.tar.gz.delta ===================================== Binary files /dev/null and b/geographiclib_1.50.orig.tar.gz.delta differ ===================================== geographiclib_1.50.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +79e70f7ea57c44ed34e973fa2ec85cbc1398ab05 View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/b922a2499aa0a7ba5caa29e1de43dec368e215c4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/b922a2499aa0a7ba5caa29e1de43dec368e215c4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 07:50:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 06:50:44 +0000 Subject: [Git][debian-gis-team/geographiclib][upstream] New upstream version 1.50 Message-ID: <5d8db144318e3_1efe3f97848f257c1226b5@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / geographiclib Commits: b8ff1d6d by Bas Couwenberg at 2019-09-27T06:00:53Z New upstream version 1.50 - - - - - 15 changed files: - CMakeLists.txt - LICENSE.txt - Makefile.in - NEWS - aclocal.m4 - cmake/CMakeLists.txt - cmake/Makefile.in - cmake/project-config-version.cmake.in - cmake/project-config.cmake.in - compile - config.guess - config.sub - configure - configure.ac - depcomp The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/b8ff1d6da30815f9c9749de2bf908d500700d484 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/b8ff1d6da30815f9c9749de2bf908d500700d484 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 07:50:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 06:50:49 +0000 Subject: [Git][debian-gis-team/geographiclib] Pushed new tag debian/1.50-1_exp1 Message-ID: <5d8db1491b434_1efe3f977ec5fc24122873@godard.mail> Bas Couwenberg pushed new tag debian/1.50-1_exp1 at Debian GIS Project / geographiclib -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/tree/debian/1.50-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 07:50:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 06:50:49 +0000 Subject: [Git][debian-gis-team/geographiclib] Pushed new tag upstream/1.50 Message-ID: <5d8db149a91bc_1efe2b1553ee71481230cf@godard.mail> Bas Couwenberg pushed new tag upstream/1.50 at Debian GIS Project / geographiclib -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/tree/upstream/1.50 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 27 07:55:36 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 06:55:36 +0000 Subject: Processing of geographiclib_1.50-1~exp1_amd64.changes Message-ID: geographiclib_1.50-1~exp1_amd64.changes uploaded successfully to localhost along with the files: geographiclib_1.50-1~exp1.dsc geographiclib_1.50.orig.tar.gz geographiclib_1.50-1~exp1.debian.tar.xz geographiclib-doc_1.50-1~exp1_all.deb geographiclib-tools-dbgsym_1.50-1~exp1_amd64.deb geographiclib-tools_1.50-1~exp1_amd64.deb geographiclib_1.50-1~exp1_amd64.buildinfo libgeographic-dev_1.50-1~exp1_amd64.deb libgeographic19-dbgsym_1.50-1~exp1_amd64.deb libgeographic19_1.50-1~exp1_amd64.deb node-geographiclib_1.50-1~exp1_all.deb python3-geographiclib_1.50-1~exp1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 27 08:04:43 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 07:04:43 +0000 Subject: geographiclib_1.50-1~exp1_amd64.changes is NEW Message-ID: binary:libgeographic19 is NEW. binary:libgeographic19 is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. 
[1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From gitlab at salsa.debian.org Fri Sep 27 12:55:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 11:55:06 +0000 Subject: [Git][blends-team/gis][master] Drop python3-otb from remotesensing task. Message-ID: <5d8df89bda90_1efe2b1553bf2eec158593@godard.mail> Bas Couwenberg pushed to branch master at Debian Blends Team / gis Commits: 8827d963 by Bas Couwenberg at 2019-09-27T11:54:58Z Drop python3-otb from remotesensing task. - - - - - 2 changed files: - debian/changelog - tasks/remotesensing Changes: ===================================== debian/changelog ===================================== @@ -2,6 +2,8 @@ debian-gis (0.0.19) UNRELEASED; urgency=medium * tasks/devel: - Drop python3-mapbox-vector-tile + * tasks/remotesensing: + - Drop python3-otb * tasks/workstation: - Drop libgeo-point-perl ===================================== tasks/remotesensing ===================================== @@ -5,7 +5,7 @@ Description: Remote sensing and earth observation processing (interferometry, polarimetry, data visualization, etc) and earth observation. -Depends: monteverdi, otb-bin, otb-bin-qt, python3-otb, libotb-dev +Depends: monteverdi, otb-bin, otb-bin-qt, libotb-dev Depends: python3-epr, libepr-api-dev View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/8827d963a53622e93fd171ddd0beb90db7e6b982 -- View it on GitLab: https://salsa.debian.org/blends-team/gis/commit/8827d963a53622e93fd171ddd0beb90db7e6b982 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 16:38:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 15:38:41 +0000 Subject: [Git][debian-gis-team/otb][pristine-tar] pristine-tar data for otb_7.0.0~rc1+dfsg.orig.tar.xz Message-ID: <5d8e2d014535d_1efe2b1553bf2eec18386b@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / otb Commits: bc643511 by Bas Couwenberg at 2019-09-27T07:55:13Z pristine-tar data for otb_7.0.0~rc1+dfsg.orig.tar.xz - - - - - 2 changed files: - + otb_7.0.0~rc1+dfsg.orig.tar.xz.delta - + otb_7.0.0~rc1+dfsg.orig.tar.xz.id Changes: ===================================== otb_7.0.0~rc1+dfsg.orig.tar.xz.delta ===================================== Binary files /dev/null and b/otb_7.0.0~rc1+dfsg.orig.tar.xz.delta differ ===================================== otb_7.0.0~rc1+dfsg.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +bed307df60dca828319feec80f52899dc9acc2a5 View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/bc6435112eb5c8a86313190a46e95f3d81c12dfa -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/bc6435112eb5c8a86313190a46e95f3d81c12dfa You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 16:38:42 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 15:38:42 +0000 Subject: [Git][debian-gis-team/otb][master] 12 commits: New upstream version 7.0.0~rc1+dfsg Message-ID: <5d8e2d0277a25_1efe2b155366d594184014@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / otb Commits: f57e2a05 by Bas Couwenberg at 2019-09-27T07:43:34Z New upstream version 7.0.0~rc1+dfsg - - - - - 0c06c3a2 by Bas Couwenberg at 2019-09-27T07:55:14Z Update upstream source from tag 'upstream/7.0.0_rc1+dfsg' Update to upstream version '7.0.0~rc1+dfsg' with Debian dir 033f31377aa5b68f9ca175a3e5209ddf4151367a - - - - - 455f260a by Bas Couwenberg at 2019-09-27T08:13:33Z New upstream release candidate. - - - - - 1af18874 by Bas Couwenberg at 2019-09-27T08:16:24Z Drop patches, applied/fixed upstream. - - - - - 0d2b72fd by Bas Couwenberg at 2019-09-27T09:59:07Z Update copyright years for CNES. - - - - - 04f259ae by Bas Couwenberg at 2019-09-27T11:00:16Z Drop library packages no longer built. - - - - - 12d35a85 by Bas Couwenberg at 2019-09-27T11:00:16Z Drop python package, no longer built. - - - - - 3c85e152 by Bas Couwenberg at 2019-09-27T11:00:16Z Drop unused OTB_USE_MAPNIK option. - - - - - 21014d22 by Bas Couwenberg at 2019-09-27T12:41:26Z Drop unused overrides for file-references-package-build-path. - - - - - 0d5be7c1 by Bas Couwenberg at 2019-09-27T12:41:27Z Add lintian overrides for binary-without-manpage. - - - - - 9495c0ef by Bas Couwenberg at 2019-09-27T13:46:58Z Add patch to fix spelling errors. - - - - - 6922b674 by Bas Couwenberg at 2019-09-27T13:46:58Z Set distribution to experimental. - - - - - 30 changed files: - + .clang-format - + .editorconfig - + .gitattributes - + .gitlab-ci.yml - .gitlab/issue_templates/documentation_changes.md - + .gitlab/issue_templates/release.md - .gitlab/merge_request_templates/request_for_changes.md - + .mailmap - .travis.yml - + CI/README.md - + CI/cdash_handler.py - + CI/check_twin_pipelines.py - + CI/configure_options.cmake - + CI/contributors_check.sh - + CI/ctest2junit.xsl - + CI/debian-unstable-gcc.cmake - + CI/deploy.sh - + CI/dev_env.bat - + CI/headers_check.py - + CI/macos-10.11.6-clang.cmake - + CI/macros.cmake - + CI/main_ci.cmake - + CI/main_packages.cmake - + CI/main_qa.cmake - + CI/main_superbuild.cmake - + CI/otb_coverage.sh - + CI/prepare_superbuild.cmake - + CI/sb_configure_options.cmake - + CI/test/README - + CI/test/Testing/20190320-1706/Configure.xml The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/compare/3d5857bb2ce95d2b55af0d0199935b3e0837820d...6922b674af15d5be0b58f65905b2332a22405db6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/compare/3d5857bb2ce95d2b55af0d0199935b3e0837820d...6922b674af15d5be0b58f65905b2332a22405db6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 16:38:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 15:38:43 +0000 Subject: [Git][debian-gis-team/otb][upstream] New upstream version 7.0.0~rc1+dfsg Message-ID: <5d8e2d037f365_1efe3f9784696ff81842ea@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / otb Commits: f57e2a05 by Bas Couwenberg at 2019-09-27T07:43:34Z New upstream version 7.0.0~rc1+dfsg - - - - - 30 changed files: - + .clang-format - + .editorconfig - + .gitattributes - + .gitlab-ci.yml - .gitlab/issue_templates/documentation_changes.md - + .gitlab/issue_templates/release.md - .gitlab/merge_request_templates/request_for_changes.md - + .mailmap - .travis.yml - + CI/README.md - + CI/cdash_handler.py - + CI/check_twin_pipelines.py - + CI/configure_options.cmake - + CI/contributors_check.sh - + CI/ctest2junit.xsl - + CI/debian-unstable-gcc.cmake - + CI/deploy.sh - + CI/dev_env.bat - + CI/headers_check.py - + CI/macos-10.11.6-clang.cmake - + CI/macros.cmake - + CI/main_ci.cmake - + CI/main_packages.cmake - + CI/main_qa.cmake - + CI/main_superbuild.cmake - + CI/otb_coverage.sh - + CI/prepare_superbuild.cmake - + CI/sb_configure_options.cmake - + CI/test/README - + CI/test/Testing/20190320-1706/Configure.xml The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/f57e2a05538ac6fc1f262ce6015cfc45ac0893a9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/f57e2a05538ac6fc1f262ce6015cfc45ac0893a9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 16:38:52 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 15:38:52 +0000 Subject: [Git][debian-gis-team/otb] Pushed new tag debian/7.0.0_rc1+dfsg-1_exp1 Message-ID: <5d8e2d0c16d12_1efe3f97828c4dcc184479@godard.mail> Bas Couwenberg pushed new tag debian/7.0.0_rc1+dfsg-1_exp1 at Debian GIS Project / otb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/tree/debian/7.0.0_rc1+dfsg-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 16:38:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 15:38:53 +0000 Subject: [Git][debian-gis-team/otb] Pushed new tag upstream/7.0.0_rc1+dfsg Message-ID: <5d8e2d0d9d963_1efe2b155366d594184625@godard.mail> Bas Couwenberg pushed new tag upstream/7.0.0_rc1+dfsg at Debian GIS Project / otb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/tree/upstream/7.0.0_rc1+dfsg You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Fri Sep 27 16:43:39 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 15:43:39 +0000 Subject: Processing of otb_7.0.0~rc1+dfsg-1~exp1_amd64.changes Message-ID: otb_7.0.0~rc1+dfsg-1~exp1_amd64.changes uploaded successfully to localhost along with the files: otb_7.0.0~rc1+dfsg-1~exp1.dsc otb_7.0.0~rc1+dfsg.orig.tar.xz otb_7.0.0~rc1+dfsg-1~exp1.debian.tar.xz libotb-apps-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotb-apps_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotb-dev_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotb_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbapplicationengine-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbapplicationengine-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcarto-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcarto-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcommandline-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcommandline-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcommon-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcommon-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcurladapters-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbcurladapters-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbextendedfilename-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbextendedfilename-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbfuzzy-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbfuzzy-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbgdaladapters-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbgdaladapters-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbice-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbice-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbimagebase-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbimagebase-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbimageio-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbimageio-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbimagemanipulation-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbimagemanipulation-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiobsq-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiobsq-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiogdal-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiogdal-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiokml-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiokml-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiolum-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiolum-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiomstar-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiomstar-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbioonera-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbioonera-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiorad-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbiorad-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotblearningbase-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotblearningbase-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmapla-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmapla-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmathparser-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmathparser-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmathparserx-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmathparserx-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmetadata-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmetadata-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmonteverdi-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmonteverdi-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmonteverdicore-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb 
libotbmonteverdicore-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmonteverdigui-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbmonteverdigui-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbossimadapters-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbossimadapters-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbossimplugins-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbossimplugins-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbpolarimetry-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbpolarimetry-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbprojection-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbprojection-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbqtadapters-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbqtadapters-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbqtwidget-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbqtwidget-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbsampling-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbsampling-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbsiftfast-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbsiftfast-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbstatistics-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbstatistics-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbstreaming-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbstreaming-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbsupervised-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbsupervised-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbtestkernel-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbtestkernel-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbvectordatabase-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbvectordatabase-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbvectordataio-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbvectordataio-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbwavelet-7.0-1-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb libotbwavelet-7.0-1_7.0.0~rc1+dfsg-1~exp1_amd64.deb monteverdi-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb monteverdi_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-bin-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-bin-qt-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-bin-qt_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-bin_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-i18n_7.0.0~rc1+dfsg-1~exp1_all.deb otb-qgis-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-qgis_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-testdriver-dbgsym_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb-testdriver_7.0.0~rc1+dfsg-1~exp1_amd64.deb otb_7.0.0~rc1+dfsg-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 27 16:55:55 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 15:55:55 +0000 Subject: otb_7.0.0~rc1+dfsg-1~exp1_amd64.changes is NEW Message-ID: binary:libotbapplicationengine-7.0-1 is NEW. binary:libotbcarto-7.0-1 is NEW. binary:libotbcommandline-7.0-1 is NEW. binary:libotbcommon-7.0-1 is NEW. binary:libotbcurladapters-7.0-1 is NEW. binary:libotbextendedfilename-7.0-1 is NEW. binary:libotbfuzzy-7.0-1 is NEW. binary:libotbgdaladapters-7.0-1 is NEW. binary:libotbice-7.0-1 is NEW. binary:libotbimagebase-7.0-1 is NEW. binary:libotbimageio-7.0-1 is NEW. binary:libotbimagemanipulation-7.0-1 is NEW. binary:libotbiobsq-7.0-1 is NEW. binary:libotbiogdal-7.0-1 is NEW. binary:libotbiokml-7.0-1 is NEW. binary:libotbiolum-7.0-1 is NEW. binary:libotbiomstar-7.0-1 is NEW. binary:libotbioonera-7.0-1 is NEW. binary:libotbiorad-7.0-1 is NEW. binary:libotblearningbase-7.0-1 is NEW. binary:libotbmapla-7.0-1 is NEW. 
binary:libotbmathparser-7.0-1 is NEW. binary:libotbmathparserx-7.0-1 is NEW. binary:libotbmetadata-7.0-1 is NEW. binary:libotbmonteverdi-7.0-1 is NEW. binary:libotbmonteverdicore-7.0-1 is NEW. binary:libotbmonteverdigui-7.0-1 is NEW. binary:libotbossimadapters-7.0-1 is NEW. binary:libotbossimplugins-7.0-1 is NEW. binary:libotbpolarimetry-7.0-1 is NEW. binary:libotbprojection-7.0-1 is NEW. binary:libotbqtadapters-7.0-1 is NEW. binary:libotbqtwidget-7.0-1 is NEW. binary:libotbsampling-7.0-1 is NEW. binary:libotbsiftfast-7.0-1 is NEW. binary:libotbstatistics-7.0-1 is NEW. binary:libotbstreaming-7.0-1 is NEW. binary:libotbsupervised-7.0-1 is NEW. binary:libotbtestkernel-7.0-1 is NEW. binary:libotbvectordatabase-7.0-1 is NEW. binary:libotbvectordataio-7.0-1 is NEW. binary:libotbwavelet-7.0-1 is NEW. binary:libotbmathparser-7.0-1 is NEW. binary:libotbossimplugins-7.0-1 is NEW. binary:libotbcurladapters-7.0-1 is NEW. binary:libotbwavelet-7.0-1 is NEW. binary:libotbvectordataio-7.0-1 is NEW. binary:libotbextendedfilename-7.0-1 is NEW. binary:libotbmetadata-7.0-1 is NEW. binary:libotbmonteverdicore-7.0-1 is NEW. binary:libotbmonteverdigui-7.0-1 is NEW. binary:libotbstatistics-7.0-1 is NEW. binary:libotbimagemanipulation-7.0-1 is NEW. binary:libotbmathparserx-7.0-1 is NEW. binary:libotbimageio-7.0-1 is NEW. binary:libotbiomstar-7.0-1 is NEW. binary:libotbcommandline-7.0-1 is NEW. binary:libotbcarto-7.0-1 is NEW. binary:libotbmapla-7.0-1 is NEW. binary:libotbpolarimetry-7.0-1 is NEW. binary:libotbgdaladapters-7.0-1 is NEW. binary:libotblearningbase-7.0-1 is NEW. binary:libotbioonera-7.0-1 is NEW. binary:libotbsampling-7.0-1 is NEW. binary:libotbice-7.0-1 is NEW. binary:libotbprojection-7.0-1 is NEW. binary:libotbossimadapters-7.0-1 is NEW. binary:libotbqtwidget-7.0-1 is NEW. binary:libotbtestkernel-7.0-1 is NEW. binary:libotbcommon-7.0-1 is NEW. binary:libotbsupervised-7.0-1 is NEW. binary:libotbimagebase-7.0-1 is NEW. binary:libotbiorad-7.0-1 is NEW. binary:libotbstreaming-7.0-1 is NEW. binary:libotbsiftfast-7.0-1 is NEW. binary:libotbmonteverdi-7.0-1 is NEW. binary:libotbqtadapters-7.0-1 is NEW. binary:libotbfuzzy-7.0-1 is NEW. binary:libotbiokml-7.0-1 is NEW. binary:libotbiobsq-7.0-1 is NEW. binary:libotbiolum-7.0-1 is NEW. binary:libotbvectordatabase-7.0-1 is NEW. binary:libotbapplicationengine-7.0-1 is NEW. binary:libotbiogdal-7.0-1 is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. 
[1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports
URL: From gitlab at salsa.debian.org Fri Sep 27 18:43:03 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 17:43:03 +0000 Subject: [Git][debian-gis-team/python-geopandas][master] 4 commits: New upstream version 0.6.0 Message-ID: <5d8e4a27620ac_1efe2b1553bf2eec195267@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-geopandas Commits: 7c428e66 by Bas Couwenberg at 2019-09-27T17:28:15Z New upstream version 0.6.0 - - - - - f0ee222c by Bas Couwenberg at 2019-09-27T17:28:38Z Update upstream source from tag 'upstream/0.6.0' Update to upstream version '0.6.0' with Debian dir 0b7c5f48695b3d67ea51727251dd8a61704e1ab3 - - - - - db8b227a by Bas Couwenberg at 2019-09-27T17:29:26Z New upstream release. - - - - - b0372a30 by Bas Couwenberg at 2019-09-27T17:32:40Z Set distribution to unstable. - - - - - 30 changed files: - + .pre-commit-config.yaml - .travis.yml - CHANGELOG.md - CONTRIBUTING.md - README.md - appveyor.yml - benchmarks/geom_methods.py - ci/travis/27-latest-conda-forge.yaml - − ci/travis/27-pd020.yaml - ci/travis/27-pd023.yaml - ci/travis/35-minimal.yaml - ci/travis/36-pd020.yaml → ci/travis/36-pd023.yaml - ci/travis/36-pd022.yaml → ci/travis/36-pd024.yaml - ci/travis/37-dev.yaml - ci/travis/37-latest-conda-forge.yaml - ci/travis/37-latest-defaults.yaml - debian/changelog - doc/Makefile - doc/environment.yml - doc/source/contributing.rst - doc/source/data_structures.rst - doc/source/geometric_manipulations.rst - doc/source/index.rst - doc/source/install.rst - doc/source/mapping.rst - doc/source/mergingdata.rst - + doc/source/missing_empty.rst - doc/source/reference.rst - examples/create_geopandas_from_pandas.py - examples/plotting_basemap_background.py The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/compare/d540e32c2c9d13b61ef9ba7cd22ff48718999eac...b0372a30e79fc82d227ae722507aad778eb049f8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/compare/d540e32c2c9d13b61ef9ba7cd22ff48718999eac...b0372a30e79fc82d227ae722507aad778eb049f8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 18:43:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 17:43:04 +0000 Subject: [Git][debian-gis-team/python-geopandas][pristine-tar] pristine-tar data for python-geopandas_0.6.0.orig.tar.gz Message-ID: <5d8e4a28904db_1efe2b155371825019541f@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / python-geopandas Commits: 3f69dea3 by Bas Couwenberg at 2019-09-27T17:28:38Z pristine-tar data for python-geopandas_0.6.0.orig.tar.gz - - - - - 2 changed files: - + python-geopandas_0.6.0.orig.tar.gz.delta - + python-geopandas_0.6.0.orig.tar.gz.id Changes: ===================================== python-geopandas_0.6.0.orig.tar.gz.delta ===================================== Binary files /dev/null and b/python-geopandas_0.6.0.orig.tar.gz.delta differ ===================================== python-geopandas_0.6.0.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +85bf37b04eec436decb51a0003d4564c6ecf0320 View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/commit/3f69dea360c2a99f03e905f1be958babb7ce9e0a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/commit/3f69dea360c2a99f03e905f1be958babb7ce9e0a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 18:43:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 17:43:05 +0000 Subject: [Git][debian-gis-team/python-geopandas][upstream] New upstream version 0.6.0 Message-ID: <5d8e4a2981c4c_1efe2b1553bf2eec195671@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-geopandas Commits: 7c428e66 by Bas Couwenberg at 2019-09-27T17:28:15Z New upstream version 0.6.0 - - - - - 30 changed files: - + .pre-commit-config.yaml - .travis.yml - CHANGELOG.md - CONTRIBUTING.md - README.md - appveyor.yml - benchmarks/geom_methods.py - ci/travis/27-latest-conda-forge.yaml - − ci/travis/27-pd020.yaml - ci/travis/27-pd023.yaml - ci/travis/35-minimal.yaml - ci/travis/36-pd020.yaml → ci/travis/36-pd023.yaml - ci/travis/36-pd022.yaml → ci/travis/36-pd024.yaml - ci/travis/37-dev.yaml - ci/travis/37-latest-conda-forge.yaml - ci/travis/37-latest-defaults.yaml - doc/Makefile - doc/environment.yml - doc/source/contributing.rst - doc/source/data_structures.rst - doc/source/geometric_manipulations.rst - doc/source/index.rst - doc/source/install.rst - doc/source/mapping.rst - doc/source/mergingdata.rst - + doc/source/missing_empty.rst - doc/source/reference.rst - examples/create_geopandas_from_pandas.py - examples/plotting_basemap_background.py - + examples/plotting_with_folium.ipynb The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/commit/7c428e66368e48b3f63b036aa49e543e7e802834 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/commit/7c428e66368e48b3f63b036aa49e543e7e802834 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Fri Sep 27 18:43:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 17:43:08 +0000 Subject: [Git][debian-gis-team/python-geopandas] Pushed new tag debian/0.6.0-1 Message-ID: <5d8e4a2c442d9_1efe2b155371825019585a@godard.mail> Bas Couwenberg pushed new tag debian/0.6.0-1 at Debian GIS Project / python-geopandas -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/tree/debian/0.6.0-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Fri Sep 27 18:43:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Fri, 27 Sep 2019 17:43:09 +0000 Subject: [Git][debian-gis-team/python-geopandas] Pushed new tag upstream/0.6.0 Message-ID: <5d8e4a2d11645_1efe3f97828c4dcc19601e@godard.mail> Bas Couwenberg pushed new tag upstream/0.6.0 at Debian GIS Project / python-geopandas -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/tree/upstream/0.6.0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Fri Sep 27 18:54:10 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 17:54:10 +0000 Subject: Processing of python-geopandas_0.6.0-1_source.changes Message-ID: python-geopandas_0.6.0-1_source.changes uploaded successfully to localhost along with the files: python-geopandas_0.6.0-1.dsc python-geopandas_0.6.0.orig.tar.gz python-geopandas_0.6.0-1.debian.tar.xz python-geopandas_0.6.0-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Fri Sep 27 19:05:17 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Fri, 27 Sep 2019 18:05:17 +0000 Subject: python-geopandas_0.6.0-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Fri, 27 Sep 2019 19:32:02 +0200 Source: python-geopandas Architecture: source Version: 0.6.0-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: python-geopandas (0.6.0-1) unstable; urgency=medium . * Team upload. * New upstream release. 
Checksums-Sha1: a34114a82e83274de901ff93c85ca40ba1e43b53 2408 python-geopandas_0.6.0-1.dsc c3f09faa36594bc5b9af8c88efd0d428c9a1232e 5576777 python-geopandas_0.6.0.orig.tar.gz 15f919042bfc8ef5a8c10b190aa2ada0952e816a 1321556 python-geopandas_0.6.0-1.debian.tar.xz 547df79395efdae430aa07121da6df88d6dd4dfe 12951 python-geopandas_0.6.0-1_amd64.buildinfo Checksums-Sha256: 8d099f90e1029d77ccd855e670f200964f1d26d3938d494513293ceb40471ac4 2408 python-geopandas_0.6.0-1.dsc b40fbb2110f333b2fbc72793541b4fbd0046f1127d8e67c1323107b1108a6a60 5576777 python-geopandas_0.6.0.orig.tar.gz 42cbefbd7f44eba03fcc7eba369788c9e2133381f0da7fb6582bbe3fc2fd2073 1321556 python-geopandas_0.6.0-1.debian.tar.xz d9ae72847192f3202544b4abb36c20bc33f523a79c5d347a592f7f8d4bae9a70 12951 python-geopandas_0.6.0-1_amd64.buildinfo Files: 3b5831ec64432f1d6310a2bb3f3110d1 2408 python optional python-geopandas_0.6.0-1.dsc 0d3898eba0c478c0f0a422d090fb113a 5576777 python optional python-geopandas_0.6.0.orig.tar.gz 8eb575412d62a0a0e98a1051fd744844 1321556 python optional python-geopandas_0.6.0-1.debian.tar.xz d0c1533ecce3afe48810ebb643f9ee1a 12951 python optional python-geopandas_0.6.0-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2OSe0ACgkQZ1DxCuiN SvHPuhAAjDmfCjHLIJb1filIMyY9Rty/OVdykvdh+ZcFjrzHM2/ztEJ1NNsIael9 ms5dR4cLA5ZjSD4/doPnQfGVGZ+O+U6Huj8XINoK2FoGjEtE3RbTfUCZbmJk+SUI u1RoAyi/RK3/MP1XwOdTqOQ6LVRynvFPusOWN7g41sOPpK4lpPFDOJadElpEY6/K SN92EOAH1wriGpnVIA56r+hSO/6Qp0CK7jagnD0JDYJUVNDA1RR/yrrkexVgf94b 4wqEGWiuYU9wB8B82ZA3iWizNCB+agh+CwiZZgVcSMg0D+ukHglvfwAWbjjXYbx8 p3o6SZTZTYnyAHVW7BHCeF5XmQykxIyLg0PyEAwPyuh0JvcJLmCWjORd3pUsTqE3 ZT+F1Q3FC6SQOXGXGWdHGPnhylDAHA98VJbkspmNePIz6Xfa9iO8QuPGqFwHvAIy y9lLAeLVnJK0XWmLxwE2lVw2uIl5RtRWRo/kjDL2wwXpMa6LfZtbvLY210VpbD4e MeZ+aM5SyxX+tJz4ppJnMK+SwRbdjEHpkbZ0LzYc2+EakLcgoVX68J7sOXEO4Zju OGFv6XDYtldHkgofk8zTeDNrdnVoJTmazIXtxSD1FNWE69E83kDQ2eX5wWf+4cot LOuGEw5RDL5pkX8YzmIRQWnxPC8hB/i1nAv4tqPF3mQkWtzIsM8= =qLKQ -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Sat Sep 28 08:34:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 28 Sep 2019 07:34:31 +0000 Subject: [Git][debian-gis-team/geos][pristine-tar] pristine-tar data for geos_3.8.0~beta1.orig.tar.bz2 Message-ID: <5d8f0d071ba9e_1efe3f97828c4dcc242178@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / geos Commits: 16ef8c21 by Bas Couwenberg at 2019-09-28T05:29:48Z pristine-tar data for geos_3.8.0~beta1.orig.tar.bz2 - - - - - 2 changed files: - + geos_3.8.0~beta1.orig.tar.bz2.delta - + geos_3.8.0~beta1.orig.tar.bz2.id Changes: ===================================== geos_3.8.0~beta1.orig.tar.bz2.delta ===================================== Binary files /dev/null and b/geos_3.8.0~beta1.orig.tar.bz2.delta differ ===================================== geos_3.8.0~beta1.orig.tar.bz2.id ===================================== @@ -0,0 +1 @@ +a61cdcc293319040739c53f8399cf55663fb78fd View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/commit/16ef8c21dd755c4832e0af2494c332fd9272f8ce -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/commit/16ef8c21dd755c4832e0af2494c332fd9272f8ce You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 28 08:34:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 28 Sep 2019 07:34:32 +0000 Subject: [Git][debian-gis-team/geos][experimental] 15 commits: Add copyright holders for debian/*. Message-ID: <5d8f0d08286d3_1efe3f97849ba5cc2423f@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / geos Commits: e29bd804 by Bas Couwenberg at 2019-07-23T04:54:09Z Add copyright holders for debian/*. - - - - - a771dc33 by Bas Couwenberg at 2019-07-23T05:09:46Z Revert "Update branch in gbp.conf & Vcs-Git URL." This reverts commit c2bfb54f84071094b769740728c490bb3b4915ec. - - - - - a4b2ea18 by Bas Couwenberg at 2019-07-23T05:12:22Z Update symbols for other architectures. - - - - - 88ca0964 by Bas Couwenberg at 2019-07-23T05:12:41Z Set distribution to unstable. - - - - - 7b55806b by Bas Couwenberg at 2019-09-28T05:29:18Z Update branch in gbp.conf & Vcs-Git URL. - - - - - e210185d by Bas Couwenberg at 2019-09-28T06:44:10Z New upstream version 3.8.0~beta1 - - - - - ea6eb2b5 by Bas Couwenberg at 2019-09-28T06:44:10Z New upstream beta release. - - - - - a4fd7d9a by Bas Couwenberg at 2019-09-28T06:44:10Z Update copyright file: Changes: - Update copyright years for existing copyright holders - Add new copyright holders - Add license & copyright for ttmath sources - Add license & copyright for tinyxml2 sources - Add license & copyright for astyle sources - - - - - 3b95cf7a by Bas Couwenberg at 2019-09-28T06:44:10Z Rename C++ library package for SONAME bump. - - - - - 15791362 by Bas Couwenberg at 2019-09-28T06:44:10Z Refresh patches. - - - - - 1ee06dbd by Bas Couwenberg at 2019-09-28T06:44:10Z Don't install TODO in docs, removed upstream. - - - - - 4bceeb49 by Bas Couwenberg at 2019-09-28T06:59:56Z Remove .asm files. - - - - - a3e72317 by Bas Couwenberg at 2019-09-28T07:18:00Z Add lintian override for hardening-no-fortify-functions. - - - - - 15917fa7 by Bas Couwenberg at 2019-09-28T07:18:00Z Update symbols for amd64. - - - - - 29d40224 by Bas Couwenberg at 2019-09-28T07:18:00Z Set distribution to experimental. - - - - - 30 changed files: - .editorconfig - CMakeLists.txt - ChangeLog - INSTALL - Makefile.am - Makefile.in - NEWS - README.md - − TODO - + Version.txt - − autogen.bat - + benchmarks/CMakeLists.txt - tests/perf/ClassSizes.cpp → benchmarks/ClassSizes.cpp - tests/perf/Makefile.am → benchmarks/Makefile.am - tests/perf/Makefile.in → benchmarks/Makefile.in - + benchmarks/algorithm/CMakeLists.txt - + benchmarks/algorithm/InteriorPointAreaPerfTest.cpp - + benchmarks/algorithm/Makefile.am - + benchmarks/algorithm/Makefile.in - tests/perf/CMakeLists.txt → benchmarks/capi/CMakeLists.txt - tests/perf/capi/Makefile.am → benchmarks/capi/Makefile.am - tests/perf/capi/Makefile.in → benchmarks/capi/Makefile.in - tests/perf/capi/memleak_mp_prep.c → benchmarks/capi/memleak_mp_prep.c - tests/perf/operation/CMakeLists.txt → benchmarks/operation/CMakeLists.txt - tests/perf/operation/Makefile.am → benchmarks/operation/Makefile.am - tests/perf/operation/Makefile.in → benchmarks/operation/Makefile.in - tests/perf/operation/buffer/CMakeLists.txt → benchmarks/operation/buffer/CMakeLists.txt - tests/perf/operation/buffer/IteratedBufferStressTest.cpp → benchmarks/operation/buffer/IteratedBufferStressTest.cpp - tests/perf/operation/buffer/Makefile.am → benchmarks/operation/buffer/Makefile.am - tests/perf/operation/buffer/Makefile.in → benchmarks/operation/buffer/Makefile.in The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/compare/3a7057c64825fd06f14b54ca7d567145ea429d15...29d40224d9ac9fe1defdfe04d727103e54d6fe32 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/compare/3a7057c64825fd06f14b54ca7d567145ea429d15...29d40224d9ac9fe1defdfe04d727103e54d6fe32 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 28 08:34:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 28 Sep 2019 07:34:34 +0000 Subject: [Git][debian-gis-team/geos][upstream] New upstream version 3.8.0~beta1 Message-ID: <5d8f0d0a6d5e9_1efe3f977e856380242594@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / geos Commits: 44ab7383 by Bas Couwenberg at 2019-09-28T05:29:43Z New upstream version 3.8.0~beta1 - - - - - 30 changed files: - .editorconfig - CMakeLists.txt - ChangeLog - INSTALL - Makefile.am - Makefile.in - NEWS - README.md - − TODO - + Version.txt - − autogen.bat - + benchmarks/CMakeLists.txt - tests/perf/ClassSizes.cpp → benchmarks/ClassSizes.cpp - tests/perf/Makefile.am → benchmarks/Makefile.am - tests/perf/Makefile.in → benchmarks/Makefile.in - + benchmarks/algorithm/CMakeLists.txt - + benchmarks/algorithm/InteriorPointAreaPerfTest.cpp - + benchmarks/algorithm/Makefile.am - + benchmarks/algorithm/Makefile.in - tests/perf/CMakeLists.txt → benchmarks/capi/CMakeLists.txt - tests/perf/capi/Makefile.am → benchmarks/capi/Makefile.am - tests/perf/capi/Makefile.in → benchmarks/capi/Makefile.in - tests/perf/capi/memleak_mp_prep.c → benchmarks/capi/memleak_mp_prep.c - tests/perf/operation/CMakeLists.txt → benchmarks/operation/CMakeLists.txt - tests/perf/operation/Makefile.am → benchmarks/operation/Makefile.am - tests/perf/operation/Makefile.in → benchmarks/operation/Makefile.in - tests/perf/operation/buffer/CMakeLists.txt → benchmarks/operation/buffer/CMakeLists.txt - tests/perf/operation/buffer/IteratedBufferStressTest.cpp → benchmarks/operation/buffer/IteratedBufferStressTest.cpp - tests/perf/operation/buffer/Makefile.am → benchmarks/operation/buffer/Makefile.am - tests/perf/operation/buffer/Makefile.in → benchmarks/operation/buffer/Makefile.in The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/commit/44ab738326588f3e61df215c4047701752124e93 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/commit/44ab738326588f3e61df215c4047701752124e93 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sat Sep 28 08:34:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 28 Sep 2019 07:34:35 +0000 Subject: [Git][debian-gis-team/geos] Pushed new tag debian/3.8.0_beta1-1_exp1 Message-ID: <5d8f0d0bf3682_1efe3f977e85638024279b@godard.mail> Bas Couwenberg pushed new tag debian/3.8.0_beta1-1_exp1 at Debian GIS Project / geos -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/tree/debian/3.8.0_beta1-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sat Sep 28 08:34:37 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sat, 28 Sep 2019 07:34:37 +0000 Subject: [Git][debian-gis-team/geos] Pushed new tag upstream/3.8.0_beta1 Message-ID: <5d8f0d0d1f406_1efe2b154f63cba0242930@godard.mail> Bas Couwenberg pushed new tag upstream/3.8.0_beta1 at Debian GIS Project / geos -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geos/tree/upstream/3.8.0_beta1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Sat Sep 28 08:40:19 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 28 Sep 2019 07:40:19 +0000 Subject: Processing of geos_3.8.0~beta1-1~exp1_amd64.changes Message-ID: geos_3.8.0~beta1-1~exp1_amd64.changes uploaded successfully to localhost along with the files: geos_3.8.0~beta1-1~exp1.dsc geos_3.8.0~beta1.orig.tar.bz2 geos_3.8.0~beta1-1~exp1.debian.tar.xz geos_3.8.0~beta1-1~exp1_amd64.buildinfo libgeos++-dev_3.8.0~beta1-1~exp1_amd64.deb libgeos-3.8.0-dbgsym_3.8.0~beta1-1~exp1_amd64.deb libgeos-3.8.0_3.8.0~beta1-1~exp1_amd64.deb libgeos-c1v5-dbgsym_3.8.0~beta1-1~exp1_amd64.deb libgeos-c1v5_3.8.0~beta1-1~exp1_amd64.deb libgeos-dev_3.8.0~beta1-1~exp1_amd64.deb libgeos-doc_3.8.0~beta1-1~exp1_all.deb ruby-geos-dbgsym_3.8.0~beta1-1~exp1_amd64.deb ruby-geos_3.8.0~beta1-1~exp1_amd64.deb Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sat Sep 28 08:49:30 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sat, 28 Sep 2019 07:49:30 +0000 Subject: geos_3.8.0~beta1-1~exp1_amd64.changes is NEW Message-ID: binary:libgeos-3.8.0 is NEW. binary:libgeos-3.8.0 is NEW. Your package has been put into the NEW queue, which requires manual action from the ftpteam to process. The upload was otherwise valid (it had a good OpenPGP signature and file hashes are valid), so please be patient. Packages are routinely processed through to the archive, and do feel free to browse the NEW queue[1]. If there is an issue with the upload, you will receive an email from a member of the ftpteam. If you have any questions, you may reply to this email. 
[1]: https://ftp-master.debian.org/new.html or https://ftp-master.debian.org/backports-new.html for *-backports From gitlab at salsa.debian.org Sun Sep 29 07:21:47 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 29 Sep 2019 06:21:47 +0000 Subject: [Git][debian-gis-team/postgis][experimental] 5 commits: New upstream version 3.0.0~beta1+dfsg Message-ID: <5d904d7b69b4d_46f62ac0fc81bc70851f1@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / postgis Commits: 75ab1d94 by Bas Couwenberg at 2019-09-29T05:53:32Z New upstream version 3.0.0~beta1+dfsg - - - - - 4efdc6e5 by Bas Couwenberg at 2019-09-29T05:54:12Z Update upstream source from tag 'upstream/3.0.0_beta1+dfsg' Update to upstream version '3.0.0~beta1+dfsg' with Debian dir a1cdf97e17e97260713f52c208bc6f99517905bb - - - - - c4dfbfc6 by Bas Couwenberg at 2019-09-29T05:54:53Z New upstream beta release. - - - - - fac750c2 by Bas Couwenberg at 2019-09-29T06:00:02Z Update copyright for fuzzers sources. - - - - - 31de15aa by Bas Couwenberg at 2019-09-29T06:00:45Z Set distribution to experimental.
- - - - - 30 changed files: - .drone-1.0.yml - .gitlab-ci.yml - ChangeLog - GNUmakefile.in - NEWS - README.postgis - Version.config - aclocal.m4 - build-aux/ltmain.sh - ci/bessie32/postgis_regress.sh - + ci/dronie/postgis_regress.sh - ci/travis/run_usan_clang.sh - ci/winnie/build_postgis.sh - ci/winnie/package_postgis.sh - ci/winnie/regress_postgis.sh - configure - configure.ac - debian/changelog - debian/copyright - deps/Makefile - deps/Makefile.in - doc/Makefile.in - doc/extras_topology.xml - doc/po/templates/extras_topology.xml.pot - doc/reference_accessor.xml - doc/reference_editor.xml - doc/reference_output.xml - doc/reference_relationship.xml - doc/reference_version.xml - doc/release_notes.xml The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/compare/77486d249421c96e94949a1f9915f0a5bbdc84fd...31de15aaa1ecf3e6a22361d8f941ac2073e49964 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/compare/77486d249421c96e94949a1f9915f0a5bbdc84fd...31de15aaa1ecf3e6a22361d8f941ac2073e49964 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 29 07:21:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 29 Sep 2019 06:21:49 +0000 Subject: [Git][debian-gis-team/postgis][upstream] New upstream version 3.0.0~beta1+dfsg Message-ID: <5d904d7d5fe96_46f62ac0fc69eb54853bb@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / postgis Commits: 75ab1d94 by Bas Couwenberg at 2019-09-29T05:53:32Z New upstream version 3.0.0~beta1+dfsg - - - - - 30 changed files: - .drone-1.0.yml - .gitlab-ci.yml - ChangeLog - GNUmakefile.in - NEWS - README.postgis - Version.config - aclocal.m4 - build-aux/ltmain.sh - ci/bessie32/postgis_regress.sh - + ci/dronie/postgis_regress.sh - ci/travis/run_usan_clang.sh - ci/winnie/build_postgis.sh - ci/winnie/package_postgis.sh - ci/winnie/regress_postgis.sh - configure - configure.ac - deps/Makefile - deps/Makefile.in - doc/Makefile.in - doc/extras_topology.xml - doc/po/templates/extras_topology.xml.pot - doc/reference_accessor.xml - doc/reference_editor.xml - doc/reference_output.xml - doc/reference_relationship.xml - doc/reference_version.xml - doc/release_notes.xml - extensions/Makefile.in - extensions/address_standardizer/Makefile The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/75ab1d94befb8e1832845ebd3172cba1239b6ea8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/75ab1d94befb8e1832845ebd3172cba1239b6ea8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Sun Sep 29 07:21:47 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 29 Sep 2019 06:21:47 +0000 Subject: [Git][debian-gis-team/postgis][pristine-tar] pristine-tar data for postgis_3.0.0~beta1+dfsg.orig.tar.xz Message-ID: <5d904d7bcb2f_46f62ac0fde9912884954@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / postgis Commits: dc185ef4 by Bas Couwenberg at 2019-09-29T05:54:12Z pristine-tar data for postgis_3.0.0~beta1+dfsg.orig.tar.xz - - - - - 2 changed files: - + postgis_3.0.0~beta1+dfsg.orig.tar.xz.delta - + postgis_3.0.0~beta1+dfsg.orig.tar.xz.id Changes: ===================================== postgis_3.0.0~beta1+dfsg.orig.tar.xz.delta ===================================== Binary files /dev/null and b/postgis_3.0.0~beta1+dfsg.orig.tar.xz.delta differ ===================================== postgis_3.0.0~beta1+dfsg.orig.tar.xz.id ===================================== @@ -0,0 +1 @@ +9c822920ace215125c6b57639962f9024232401d View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/dc185ef44e7e0d7b34bdd408414f547146d40025 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/dc185ef44e7e0d7b34bdd408414f547146d40025 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 29 07:22:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 29 Sep 2019 06:22:20 +0000 Subject: [Git][debian-gis-team/postgis] Pushed new tag debian/3.0.0_beta1+dfsg-1_exp1 Message-ID: <5d904d9c141bd_46f62ac0fde991288557f@godard.mail> Bas Couwenberg pushed new tag debian/3.0.0_beta1+dfsg-1_exp1 at Debian GIS Project / postgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/tree/debian/3.0.0_beta1+dfsg-1_exp1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Sun Sep 29 07:22:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Sun, 29 Sep 2019 06:22:21 +0000 Subject: [Git][debian-gis-team/postgis] Pushed new tag upstream/3.0.0_beta1+dfsg Message-ID: <5d904d9d23d27_46f63fbabaa6d6b4857ee@godard.mail> Bas Couwenberg pushed new tag upstream/3.0.0_beta1+dfsg at Debian GIS Project / postgis -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/tree/upstream/3.0.0_beta1+dfsg You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Sun Sep 29 07:34:46 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 29 Sep 2019 06:34:46 +0000 Subject: Processing of postgis_3.0.0~beta1+dfsg-1~exp1_source.changes Message-ID: postgis_3.0.0~beta1+dfsg-1~exp1_source.changes uploaded successfully to localhost along with the files: postgis_3.0.0~beta1+dfsg-1~exp1.dsc postgis_3.0.0~beta1+dfsg.orig.tar.xz postgis_3.0.0~beta1+dfsg-1~exp1.debian.tar.xz postgis_3.0.0~beta1+dfsg-1~exp1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Sun Sep 29 07:50:44 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Sun, 29 Sep 2019 06:50:44 +0000 Subject: postgis_3.0.0~beta1+dfsg-1~exp1_source.changes ACCEPTED into experimental Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Sun, 29 Sep 2019 08:00:25 +0200 Source: postgis Architecture: source Version: 3.0.0~beta1+dfsg-1~exp1 Distribution: experimental Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: postgis (3.0.0~beta1+dfsg-1~exp1) experimental; urgency=medium . * New upstream beta release. * Update copyright for fuzzers sources. Checksums-Sha1: d1200b8106582232322ed076c845aff271ddcaa6 3031 postgis_3.0.0~beta1+dfsg-1~exp1.dsc 3839b620ae364bbd755183e9c1908dfb593773e1 9578996 postgis_3.0.0~beta1+dfsg.orig.tar.xz 50f586015803aa7f8a6ab3db072715c698c6fadb 37644 postgis_3.0.0~beta1+dfsg-1~exp1.debian.tar.xz 1480293b08759727da5f99a9c5a8f8c3c373ae88 22999 postgis_3.0.0~beta1+dfsg-1~exp1_amd64.buildinfo Checksums-Sha256: a002aee362143eeab6589ae4b0e37c781024272fb502b356c87cc77f2bc2b9ac 3031 postgis_3.0.0~beta1+dfsg-1~exp1.dsc 20ab447e78b11c727817178aefdd3cc4ee29beeb4b6fd41ce10a89f57730e427 9578996 postgis_3.0.0~beta1+dfsg.orig.tar.xz 9f4239fef3ff681bc42214c580cdd031f8c2ce3b92774245545c60659a06471d 37644 postgis_3.0.0~beta1+dfsg-1~exp1.debian.tar.xz 47f255c2aec46b6dae78679f20b0da43ec09a0321cdba397d33638dad23ab85b 22999 postgis_3.0.0~beta1+dfsg-1~exp1_amd64.buildinfo Files: d4ad293f3f347bdec05fda1c3a2233eb 3031 misc optional postgis_3.0.0~beta1+dfsg-1~exp1.dsc bec37bc40246fc0c8bc8f9fa34fd3eae 9578996 misc optional postgis_3.0.0~beta1+dfsg.orig.tar.xz fe3a6f85669a811f61c97e0936a590ab 37644 misc optional postgis_3.0.0~beta1+dfsg-1~exp1.debian.tar.xz 528c61ac0e8df2963f525c0969ee6e68 22999 misc optional postgis_3.0.0~beta1+dfsg-1~exp1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2QTUMACgkQZ1DxCuiN SvENwg//RsThx/zMHc3x44DMi28wGgZd9PtuFE9JYFFwGxXd/8MX60nPDTnpv4v5 2LE6n4svnm1GbeEeGAOfqUgapAFKATpZNbghh2ye/yAFmqcVWoV/r1SW7RAbBOJn vjcg7QULCyCgGZT+RYhAWX0V9r92oN/OdJkJg41/InUJeSbWuXAAF6z7wkOVEdqL SseWp91as8Kj2QbAK8354e+zqAuP7D6Iq9ZGXvCODau+SLPz4GHgee2+hPyusQa3 zhqOnfCyCyBSrDHZWLbEkzQ4WsfVYGHfN81L6B3h3ItzJFl9L/BTL8xuRy6QHJMy 3YDAS0XGd29jHvu94SiEmYMb9BZY7WxMnWjBzYefDARBwGNHy+x4UvinE9Vz4vbw /XSmMLuQArUwM24TIv7p5l/yjcJHPEi/euRiEWg38L4HTi306kALxaQZ1jqIWafV Vve9AQGAK/TpWX3mWUVjUoCK/y+5Oz4seS6vJjcMT5CvOeHyxogdfy32A/HXRWun S+cQwwL0sFjYPhOxdER0y4yUKcpWxm1YpZ5Fv3r7xOj0q6kplJILhNC3LlX3MQvj TUkwL6OM8D28QFyJGAQr/x+bZPPbpDFciUOidibj0OL54f0KhVA0kM7kmeVO49OY E0keLF5adWJmMiM/NTOAwD8kwlbINWv5YT38Grz39X7gof+GM5A= =Jv2P -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
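The ACCEPTED mail above lists every uploaded file with its size and SHA-1/SHA-256 digest in the Checksums-* fields. A hypothetical, minimal way to check a locally downloaded copy against the quoted SHA-256 value (sketch only; it assumes the orig tarball from this upload sits in the current directory):

import hashlib

# Expected digest copied verbatim from the Checksums-Sha256 field above.
expected = "20ab447e78b11c727817178aefdd3cc4ee29beeb4b6fd41ce10a89f57730e427"

# Hypothetical local path; adjust to wherever the tarball was downloaded.
with open("postgis_3.0.0~beta1+dfsg.orig.tar.xz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("match" if actual == expected else "MISMATCH")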
From gitlab at salsa.debian.org Mon Sep 30 06:37:16 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 05:37:16 +0000 Subject: [Git][debian-gis-team/trollimage][upstream] New upstream version 1.10.1 Message-ID: <5d91948c28bd0_46f63fbabacd32c81781ef@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / trollimage Commits: 27483868 by Antonio Valentino at 2019-09-30T05:29:46Z New upstream version 1.10.1 - - - - - 5 changed files: - .travis.yml - CHANGELOG.md - appveyor.yml - trollimage/version.py - trollimage/xrimage.py Changes: ===================================== .travis.yml ===================================== @@ -16,7 +16,7 @@ env: - CONDA_CHANNEL_PRIORITY=True install: - - git clone --depth 1 git://github.com/astropy/ci-helpers.git + - git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git - source ci-helpers/travis/setup_conda.sh script: coverage run --source=trollimage setup.py test after_success: ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,14 @@ +## Version 1.10.1 (2019/09/26) + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 56](https://github.com/pytroll/trollimage/pull/56) - Fix WKT version used to convert CRS to GeoTIFF CRS + +In this release 1 pull request was closed. + + ## Version 1.10.0 (2019/09/20) ### Pull Requests Merged ===================================== appveyor.yml ===================================== @@ -19,10 +19,10 @@ environment: NUMPY_VERSION: "stable" install: - - "git clone --depth 1 git://github.com/astropy/ci-helpers.git" +# - "git clone --depth 1 git://github.com/astropy/ci-helpers.git" + - "git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git" - "powershell ci-helpers/appveyor/install-miniconda.ps1" - - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%" - - "activate test" + - "conda activate test" build: false # Not a C# project, build stuff at the test step instead. ===================================== trollimage/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (HEAD -> master, tag: v1.10.0)" - git_full = "b1fb06cbf6ef8b23e5816c423df1eeaf8e76d606" - git_date = "2019-09-20 09:41:02 +0200" + git_refnames = " (HEAD -> master, tag: v1.10.1)" + git_full = "9130e96fae8e880ccf843298b508712a4d80a481" + git_date = "2019-09-26 15:08:59 -0500" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== trollimage/xrimage.py ===================================== @@ -331,8 +331,12 @@ class XRImage(object): try: area = data.attrs['area'] + if rasterio.__gdal_version__ >= '3': + wkt_version = 'WKT2_2018' + else: + wkt_version = 'WKT1_GDAL' if hasattr(area, 'crs'): - crs = rasterio.crs.CRS.from_wkt(area.crs.to_wkt()) + crs = rasterio.crs.CRS.from_wkt(area.crs.to_wkt(version=wkt_version)) else: crs = rasterio.crs.CRS(data.attrs['area'].proj_dict) west, south, east, north = data.attrs['area'].area_extent View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/2748386843c062af485f7b3833c925e0098fd58f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/2748386843c062af485f7b3833c925e0098fd58f You're receiving this email because of your account on salsa.debian.org. 
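The trollimage/xrimage.py hunk above is the functional part of the 1.10.1 fix (PR 56 in the changelog): it picks a WKT dialect according to the GDAL version rasterio was built against before handing the area's CRS to rasterio. A standalone sketch of that round-trip, assuming rasterio and pyproj are installed; EPSG:4326 is only a placeholder for the area.crs object used in the real code path:

import rasterio
from pyproj import CRS

# Placeholder for the pyproj CRS carried by area.crs in trollimage.
area_crs = CRS.from_epsg(4326)

# GDAL 3 understands WKT2; older GDAL builds want the GDAL flavour of WKT1.
if rasterio.__gdal_version__ >= '3':
    wkt_version = 'WKT2_2018'
else:
    wkt_version = 'WKT1_GDAL'

rio_crs = rasterio.crs.CRS.from_wkt(area_crs.to_wkt(version=wkt_version))
print(rio_crs)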
-------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 06:37:12 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 05:37:12 +0000 Subject: [Git][debian-gis-team/trollimage][master] 4 commits: New upstream version 1.10.1 Message-ID: <5d919488a1799_46f63fbabacd32c817778f@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / trollimage Commits: 27483868 by Antonio Valentino at 2019-09-30T05:29:46Z New upstream version 1.10.1 - - - - - e58c2a3b by Antonio Valentino at 2019-09-30T05:29:49Z Update upstream source from tag 'upstream/1.10.1' Update to upstream version '1.10.1' with Debian dir f49dd1de90c01b15ac8bed7408a7ab536ec13c73 - - - - - 7272dd9c by Antonio Valentino at 2019-09-30T05:30:46Z New upstream release - - - - - 157f9005 by Antonio Valentino at 2019-09-30T05:33:44Z Set distribution to unstable - - - - - 6 changed files: - .travis.yml - CHANGELOG.md - appveyor.yml - debian/changelog - trollimage/version.py - trollimage/xrimage.py Changes: ===================================== .travis.yml ===================================== @@ -16,7 +16,7 @@ env: - CONDA_CHANNEL_PRIORITY=True install: - - git clone --depth 1 git://github.com/astropy/ci-helpers.git + - git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git - source ci-helpers/travis/setup_conda.sh script: coverage run --source=trollimage setup.py test after_success: ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,14 @@ +## Version 1.10.1 (2019/09/26) + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 56](https://github.com/pytroll/trollimage/pull/56) - Fix WKT version used to convert CRS to GeoTIFF CRS + +In this release 1 pull request was closed. + + ## Version 1.10.0 (2019/09/20) ### Pull Requests Merged ===================================== appveyor.yml ===================================== @@ -19,10 +19,10 @@ environment: NUMPY_VERSION: "stable" install: - - "git clone --depth 1 git://github.com/astropy/ci-helpers.git" +# - "git clone --depth 1 git://github.com/astropy/ci-helpers.git" + - "git clone --depth 1 -b all-the-fixes git://github.com/djhoese/ci-helpers.git" - "powershell ci-helpers/appveyor/install-miniconda.ps1" - - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%" - - "activate test" + - "conda activate test" build: false # Not a C# project, build stuff at the test step instead. ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +trollimage (1.10.1-1) unstable; urgency=medium + + * New upstream release. + + -- Antonio Valentino Mon, 30 Sep 2019 05:33:09 +0000 + trollimage (1.10.0-1) unstable; urgency=medium * New upstream release. ===================================== trollimage/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). 
- git_refnames = " (HEAD -> master, tag: v1.10.0)" - git_full = "b1fb06cbf6ef8b23e5816c423df1eeaf8e76d606" - git_date = "2019-09-20 09:41:02 +0200" + git_refnames = " (HEAD -> master, tag: v1.10.1)" + git_full = "9130e96fae8e880ccf843298b508712a4d80a481" + git_date = "2019-09-26 15:08:59 -0500" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords ===================================== trollimage/xrimage.py ===================================== @@ -331,8 +331,12 @@ class XRImage(object): try: area = data.attrs['area'] + if rasterio.__gdal_version__ >= '3': + wkt_version = 'WKT2_2018' + else: + wkt_version = 'WKT1_GDAL' if hasattr(area, 'crs'): - crs = rasterio.crs.CRS.from_wkt(area.crs.to_wkt()) + crs = rasterio.crs.CRS.from_wkt(area.crs.to_wkt(version=wkt_version)) else: crs = rasterio.crs.CRS(data.attrs['area'].proj_dict) west, south, east, north = data.attrs['area'].area_extent View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/compare/e5d9fe50bab42c71a3c76ce1409589efee173fb8...157f90057b6a86ffc3c2db51580656d78ced934b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/compare/e5d9fe50bab42c71a3c76ce1409589efee173fb8...157f90057b6a86ffc3c2db51580656d78ced934b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 06:37:14 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 05:37:14 +0000 Subject: [Git][debian-gis-team/trollimage][pristine-tar] pristine-tar data for trollimage_1.10.1.orig.tar.gz Message-ID: <5d91948a3469_46f63fbab2a34ba017794@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / trollimage Commits: 9b037ec8 by Antonio Valentino at 2019-09-30T05:29:49Z pristine-tar data for trollimage_1.10.1.orig.tar.gz - - - - - 2 changed files: - + trollimage_1.10.1.orig.tar.gz.delta - + trollimage_1.10.1.orig.tar.gz.id Changes: ===================================== trollimage_1.10.1.orig.tar.gz.delta ===================================== Binary files /dev/null and b/trollimage_1.10.1.orig.tar.gz.delta differ ===================================== trollimage_1.10.1.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +61a311d1fa1be3ea910fb126e825b7802199c4f3 View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/9b037ec835e112b75fa49f470d4615f2130d6ada -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/9b037ec835e112b75fa49f470d4615f2130d6ada You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 06:37:23 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 05:37:23 +0000 Subject: [Git][debian-gis-team/trollimage] Pushed new tag upstream/1.10.1 Message-ID: <5d919493ab56a_46f63fbab2a34ba01783fa@godard.mail> Antonio Valentino pushed new tag upstream/1.10.1 at Debian GIS Project / trollimage -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/tree/upstream/1.10.1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 30 07:07:20 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 06:07:20 +0000 Subject: [Git][debian-gis-team/pyresample][pristine-tar] pristine-tar data for pyresample_1.13.1.orig.tar.gz Message-ID: <5d919b98c0c16_46f62ac0ff4afc641791a2@godard.mail> Antonio Valentino pushed to branch pristine-tar at Debian GIS Project / pyresample Commits: 9c36158f by Antonio Valentino at 2019-09-30T05:41:01Z pristine-tar data for pyresample_1.13.1.orig.tar.gz - - - - - 2 changed files: - + pyresample_1.13.1.orig.tar.gz.delta - + pyresample_1.13.1.orig.tar.gz.id Changes: ===================================== pyresample_1.13.1.orig.tar.gz.delta ===================================== Binary files /dev/null and b/pyresample_1.13.1.orig.tar.gz.delta differ ===================================== pyresample_1.13.1.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +70686a8eaa50684ba131eebe792a8b989a95ba17 View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/9c36158f220a4ee85958054a24fade9683266e79 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/9c36158f220a4ee85958054a24fade9683266e79 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 07:07:24 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 06:07:24 +0000 Subject: [Git][debian-gis-team/pyresample][upstream] New upstream version 1.13.1 Message-ID: <5d919b9cc6bdc_46f62ac0fd73f814179375@godard.mail> Antonio Valentino pushed to branch upstream at Debian GIS Project / pyresample Commits: 57ba9507 by Antonio Valentino at 2019-09-30T05:40:47Z New upstream version 1.13.1 - - - - - 6 changed files: - CHANGELOG.md - pyresample/geometry.py - pyresample/test/test_geometry.py - pyresample/test/test_utils.py - pyresample/utils/_proj4.py - pyresample/version.py Changes: ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,15 @@ +## Version 1.13.1 (2019/09/26) + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 218](https://github.com/pytroll/pyresample/pull/218) - Fix proj_str returning invalid PROJ strings when towgs84 was included +* [PR 217](https://github.com/pytroll/pyresample/pull/217) - Fix get_geostationary_angle_extent assuming a/b definitions +* [PR 216](https://github.com/pytroll/pyresample/pull/216) - Fix proj4 radius parameters for spherical cases + +In this release 3 pull requests were closed. 
+ ## Version 1.13.0 (2019/09/13) ### Issues Closed ===================================== pyresample/geometry.py ===================================== @@ -36,7 +36,8 @@ from pyproj import Geod, transform from pyresample import CHUNK_SIZE from pyresample._spatial_mp import Cartesian, Cartesian_MP, Proj, Proj_MP from pyresample.boundary import AreaDefBoundary, Boundary, SimpleBoundary -from pyresample.utils import proj4_str_to_dict, proj4_dict_to_str, convert_proj_floats +from pyresample.utils import (proj4_str_to_dict, proj4_dict_to_str, + convert_proj_floats, proj4_radius_parameters) from pyresample.area_config import create_area_def try: @@ -1346,7 +1347,17 @@ class AreaDefinition(BaseDefinition): @property def proj_str(self): """Return PROJ projection string.""" - return proj4_dict_to_str(self.proj_dict, sort=True) + proj_dict = self.proj_dict.copy() + if 'towgs84' in proj_dict and isinstance(proj_dict['towgs84'], list): + # pyproj 2+ creates a list in the dictionary + # but the string should be comma-separated + if all(x == 0 for x in proj_dict['towgs84']): + # all 0s in towgs84 are technically equal to not having them + # specified, but PROJ considers them different + proj_dict.pop('towgs84') + else: + proj_dict['towgs84'] = ','.join(str(x) for x in proj_dict['towgs84']) + return proj4_dict_to_str(proj_dict, sort=True) def __str__(self): """Return string representation of the AreaDefinition.""" @@ -1904,8 +1915,9 @@ class AreaDefinition(BaseDefinition): def get_geostationary_angle_extent(geos_area): """Get the max earth (vs space) viewing angles in x and y.""" # get some projection parameters - req = geos_area.proj_dict['a'] / 1000.0 - rp = geos_area.proj_dict['b'] / 1000.0 + a, b = proj4_radius_parameters(geos_area.proj_dict) + req = a / 1000.0 + rp = b / 1000.0 h = geos_area.proj_dict['h'] / 1000.0 + req # compute some constants ===================================== pyresample/test/test_geometry.py ===================================== @@ -1133,6 +1133,38 @@ class Test(unittest.TestCase): area_extent=[-40000., -40000., 40000., 40000.]) self.assertEqual(area.proj_str, expected_proj) + if utils.is_pyproj2(): + # CRS with towgs84 in it + # we remove towgs84 if they are all 0s + projection = {'proj': 'laea', 'lat_0': 52, 'lon_0': 10, 'x_0': 4321000, 'y_0': 3210000, + 'ellps': 'GRS80', 'towgs84': '0,0,0,0,0,0,0', 'units': 'm', 'no_defs': True} + area = geometry.AreaDefinition( + area_id='test_towgs84', + description='', + proj_id='', + projection=projection, + width=123, height=123, + area_extent=[-40000., -40000., 40000., 40000.]) + self.assertEqual(area.proj_str, + '+ellps=GRS80 +lat_0=52 +lon_0=10 +no_defs +proj=laea ' + # '+towgs84=0.0,0.0,0.0,0.0,0.0,0.0,0.0 ' + '+type=crs +units=m ' + '+x_0=4321000 +y_0=3210000') + projection = {'proj': 'laea', 'lat_0': 52, 'lon_0': 10, 'x_0': 4321000, 'y_0': 3210000, + 'ellps': 'GRS80', 'towgs84': '0,5,0,0,0,0,0', 'units': 'm', 'no_defs': True} + area = geometry.AreaDefinition( + area_id='test_towgs84', + description='', + proj_id='', + projection=projection, + width=123, height=123, + area_extent=[-40000., -40000., 40000., 40000.]) + self.assertEqual(area.proj_str, + '+ellps=GRS80 +lat_0=52 +lon_0=10 +no_defs +proj=laea ' + '+towgs84=0.0,5.0,0.0,0.0,0.0,0.0,0.0 ' + '+type=crs +units=m ' + '+x_0=4321000 +y_0=3210000') + def test_striding(self): """Test striding AreaDefinitions.""" from pyresample import utils @@ -1867,6 +1899,13 @@ class TestCrop(unittest.TestCase): expected = (0.15185342867090912, 0.15133555510297725) + 
np.testing.assert_allclose(expected, + geometry.get_geostationary_angle_extent(geos_area)) + + geos_area.proj_dict = {'ellps': 'GRS80', + 'h': 35785831.00} + expected = (0.15185277703584374, 0.15133971368991794) + np.testing.assert_allclose(expected, geometry.get_geostationary_angle_extent(geos_area)) ===================================== pyresample/test/test_utils.py ===================================== @@ -330,6 +330,7 @@ class TestMisc(unittest.TestCase): 1000, 1000, (-1000, -1000, 1000, 1000)) def test_proj4_radius_parameters_provided(self): + """Test proj4_radius_parameters with a/b.""" from pyresample import utils a, b = utils._proj4.proj4_radius_parameters( '+proj=stere +a=6378273 +b=6356889.44891', @@ -338,6 +339,7 @@ class TestMisc(unittest.TestCase): np.testing.assert_almost_equal(b, 6356889.44891) def test_proj4_radius_parameters_ellps(self): + """Test proj4_radius_parameters with ellps.""" from pyresample import utils a, b = utils._proj4.proj4_radius_parameters( '+proj=stere +ellps=WGS84', @@ -346,6 +348,7 @@ class TestMisc(unittest.TestCase): np.testing.assert_almost_equal(b, 6356752.314245, decimal=6) def test_proj4_radius_parameters_default(self): + """Test proj4_radius_parameters with default parameters.""" from pyresample import utils a, b = utils._proj4.proj4_radius_parameters( '+proj=lcc', @@ -354,6 +357,15 @@ class TestMisc(unittest.TestCase): np.testing.assert_almost_equal(a, 6378137.) np.testing.assert_almost_equal(b, 6356752.314245, decimal=6) + def test_proj4_radius_parameters_spherical(self): + """Test proj4_radius_parameters in case of a spherical earth.""" + from pyresample import utils + a, b = utils._proj4.proj4_radius_parameters( + '+proj=stere +R=6378273', + ) + np.testing.assert_almost_equal(a, 6378273.) + np.testing.assert_almost_equal(b, 6378273.) + def test_convert_proj_floats(self): from collections import OrderedDict import pyresample.utils as utils ===================================== pyresample/utils/_proj4.py ===================================== @@ -113,6 +113,9 @@ def proj4_radius_parameters(proj4_dict): new_info['b'] = float(new_info['a']) * (1 - float(new_info['f'])) elif 'b' in new_info and 'f' in new_info: new_info['a'] = float(new_info['b']) / (1 - float(new_info['f'])) + elif 'R' in new_info: + new_info['a'] = new_info['R'] + new_info['b'] = new_info['R'] else: geod = Geod(**{'ellps': 'WGS84'}) new_info['a'] = geod.a ===================================== pyresample/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (HEAD -> master, tag: v1.13.0)" - git_full = "790a0bae85cb243c17f0150011e30c834f244e04" - git_date = "2019-09-13 08:32:20 +0200" + git_refnames = " (HEAD -> master, tag: v1.13.1)" + git_full = "d9ff2c9012a4dbfd0f39172fde2e21bb34c04e4a" + git_date = "2019-09-26 20:22:21 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/57ba95072204b4acb2d3d7ada508735c1fb7ce61 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/57ba95072204b4acb2d3d7ada508735c1fb7ce61 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 30 07:07:26 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 06:07:26 +0000 Subject: [Git][debian-gis-team/pyresample] Pushed new tag upstream/1.13.1 Message-ID: <5d919b9e4bc45_46f63fbab63e63a81795c7@godard.mail> Antonio Valentino pushed new tag upstream/1.13.1 at Debian GIS Project / pyresample -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/tree/upstream/1.13.1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 07:07:20 2019 From: gitlab at salsa.debian.org (Antonio Valentino) Date: Mon, 30 Sep 2019 06:07:20 +0000 Subject: [Git][debian-gis-team/pyresample][master] 5 commits: New upstream version 1.13.1 Message-ID: <5d919b9876816_46f63fbabacd32c81789d8@godard.mail> Antonio Valentino pushed to branch master at Debian GIS Project / pyresample Commits: 44db96dd by Antonio Valentino at 2019-09-30T05:59:43Z New upstream version 1.13.1 - - - - - 703badc0 by Antonio Valentino at 2019-09-30T05:59:43Z New upstream release - - - - - 421c1631 by Antonio Valentino at 2019-09-30T05:59:43Z Refresh all patches - - - - - c5a15af1 by Antonio Valentino at 2019-09-30T06:00:59Z Explicitly specify Rules-Requires-Root - - - - - be1169b3 by Antonio Valentino at 2019-09-30T06:01:34Z Set distribution to unstable - - - - - 11 changed files: - CHANGELOG.md - debian/changelog - debian/control - debian/patches/0001-fix-proj4-initialization.patch - debian/patches/0002-Skip-dask-related-tests-if-dask-is-not-available.patch - debian/patches/0003-Make-xarray-optional-for-testing.patch - pyresample/geometry.py - pyresample/test/test_geometry.py - pyresample/test/test_utils.py - pyresample/utils/_proj4.py - pyresample/version.py Changes: ===================================== CHANGELOG.md ===================================== @@ -1,3 +1,15 @@ +## Version 1.13.1 (2019/09/26) + +### Pull Requests Merged + +#### Bugs fixed + +* [PR 218](https://github.com/pytroll/pyresample/pull/218) - Fix proj_str returning invalid PROJ strings when towgs84 was included +* [PR 217](https://github.com/pytroll/pyresample/pull/217) - Fix get_geostationary_angle_extent assuming a/b definitions +* [PR 216](https://github.com/pytroll/pyresample/pull/216) - Fix proj4 radius parameters for spherical cases + +In this release 3 pull requests were closed. + ## Version 1.13.0 (2019/09/13) ### Issues Closed ===================================== debian/changelog ===================================== @@ -1,3 +1,13 @@ +pyresample (1.13.1-1) unstable; urgency=medium + + * New upstream release. + * debian/patches: + - refresh all patches + * debian/control: + - explicitly specify Rules-Requires-Root: no + + -- Antonio Valentino Mon, 30 Sep 2019 08:01:23 +0200 + pyresample (1.13.0-1) unstable; urgency=medium * New upstream release. 
===================================== debian/control ===================================== @@ -3,6 +3,7 @@ Maintainer: Debian GIS Project Uploaders: Antonio Valentino Section: python Priority: optional +Rules-Requires-Root: no Build-Depends: cython3, debhelper-compat (= 12), dh-python, ===================================== debian/patches/0001-fix-proj4-initialization.patch ===================================== @@ -21,7 +21,7 @@ index 063264d..620e665 100644 YSIZE: 480 AREA_EXTENT: (-20037508.342789244, -10018754.171394622, 20037508.342789244, 10018754.171394622) diff --git a/pyresample/test/test_geometry.py b/pyresample/test/test_geometry.py -index ba3341e..7a9630b 100644 +index 387546f..2c6375b 100644 --- a/pyresample/test/test_geometry.py +++ b/pyresample/test/test_geometry.py @@ -560,7 +560,7 @@ class Test(unittest.TestCase): ===================================== debian/patches/0002-Skip-dask-related-tests-if-dask-is-not-available.patch ===================================== @@ -9,7 +9,7 @@ Subject: Skip dask-related tests if dask is not available 3 files changed, 18 insertions(+), 2 deletions(-) diff --git a/pyresample/test/test_geometry.py b/pyresample/test/test_geometry.py -index 7a9630b..0cd6ed2 100644 +index 2c6375b..c3f58dc 100644 --- a/pyresample/test/test_geometry.py +++ b/pyresample/test/test_geometry.py @@ -9,7 +9,8 @@ import numpy as np @@ -42,7 +42,7 @@ index 7a9630b..0cd6ed2 100644 def test_get_proj_coords_dask(self): """Test get_proj_coords usage with dask arrays.""" from pyresample import utils -@@ -1424,6 +1431,8 @@ class TestSwathDefinition(unittest.TestCase): +@@ -1456,6 +1463,8 @@ class TestSwathDefinition(unittest.TestCase): assert_np_dict_allclose(res.proj_dict, proj_dict) self.assertEqual(res.shape, (6, 3)) @@ -51,7 +51,7 @@ index 7a9630b..0cd6ed2 100644 def test_aggregation(self): """Test aggregation on SwathDefinitions.""" if (sys.version_info < (3, 0)): -@@ -1441,6 +1450,7 @@ class TestSwathDefinition(unittest.TestCase): +@@ -1473,6 +1482,7 @@ class TestSwathDefinition(unittest.TestCase): np.testing.assert_allclose(res.lons, [[179, -179]]) np.testing.assert_allclose(res.lats, [[0.5, 0.5]], atol=2e-5) ===================================== debian/patches/0003-Make-xarray-optional-for-testing.patch ===================================== @@ -9,7 +9,7 @@ Subject: Make xarray optional for testing 3 files changed, 17 insertions(+), 1 deletion(-) diff --git a/pyresample/test/test_geometry.py b/pyresample/test/test_geometry.py -index 0cd6ed2..de5fd7e 100644 +index c3f58dc..c63fa9c 100644 --- a/pyresample/test/test_geometry.py +++ b/pyresample/test/test_geometry.py @@ -24,6 +24,11 @@ if sys.version_info < (2, 7): @@ -24,7 +24,7 @@ index 0cd6ed2..de5fd7e 100644 try: import dask except ImportError: -@@ -1330,6 +1335,7 @@ class TestSwathDefinition(unittest.TestCase): +@@ -1362,6 +1367,7 @@ class TestSwathDefinition(unittest.TestCase): self.assertFalse( swath_def == swath_def2, 'swath_defs are not expected to be equal') @@ -32,7 +32,7 @@ index 0cd6ed2..de5fd7e 100644 def test_compute_omerc_params(self): """Test omerc parameters computation.""" lats = np.array([[85.23900604248047, 62.256004333496094, 35.58000183105469], -@@ -1401,6 +1407,7 @@ class TestSwathDefinition(unittest.TestCase): +@@ -1433,6 +1439,7 @@ class TestSwathDefinition(unittest.TestCase): np.testing.assert_allclose(lats, [80., 80., 80., 80., 80., 80., 80., 80., 80., 80., 80., 80.]) @@ -40,7 +40,7 @@ index 0cd6ed2..de5fd7e 100644 def test_compute_optimal_bb(self): """Test computing the bb area.""" from 
pyresample.utils import is_pyproj2 -@@ -1431,6 +1438,7 @@ class TestSwathDefinition(unittest.TestCase): +@@ -1463,6 +1470,7 @@ class TestSwathDefinition(unittest.TestCase): assert_np_dict_allclose(res.proj_dict, proj_dict) self.assertEqual(res.shape, (6, 3)) @@ -48,7 +48,7 @@ index 0cd6ed2..de5fd7e 100644 @unittest.skipIf(not hasattr(DataArray, 'coarsen'), 'DataArray.coarsen not available') @unittest.skipIf(not dask, 'dask not available') def test_aggregation(self): -@@ -1450,6 +1458,7 @@ class TestSwathDefinition(unittest.TestCase): +@@ -1482,6 +1490,7 @@ class TestSwathDefinition(unittest.TestCase): np.testing.assert_allclose(res.lons, [[179, -179]]) np.testing.assert_allclose(res.lats, [[0.5, 0.5]], atol=2e-5) @@ -56,7 +56,7 @@ index 0cd6ed2..de5fd7e 100644 @unittest.skipIf(not dask, 'dask not available') def test_striding(self): """Test striding.""" -@@ -1776,6 +1785,7 @@ class TestDynamicAreaDefinition(unittest.TestCase): +@@ -1808,6 +1817,7 @@ class TestDynamicAreaDefinition(unittest.TestCase): self.assertEqual(result.width, 395) self.assertEqual(result.height, 539) ===================================== pyresample/geometry.py ===================================== @@ -36,7 +36,8 @@ from pyproj import Geod, transform from pyresample import CHUNK_SIZE from pyresample._spatial_mp import Cartesian, Cartesian_MP, Proj, Proj_MP from pyresample.boundary import AreaDefBoundary, Boundary, SimpleBoundary -from pyresample.utils import proj4_str_to_dict, proj4_dict_to_str, convert_proj_floats +from pyresample.utils import (proj4_str_to_dict, proj4_dict_to_str, + convert_proj_floats, proj4_radius_parameters) from pyresample.area_config import create_area_def try: @@ -1346,7 +1347,17 @@ class AreaDefinition(BaseDefinition): @property def proj_str(self): """Return PROJ projection string.""" - return proj4_dict_to_str(self.proj_dict, sort=True) + proj_dict = self.proj_dict.copy() + if 'towgs84' in proj_dict and isinstance(proj_dict['towgs84'], list): + # pyproj 2+ creates a list in the dictionary + # but the string should be comma-separated + if all(x == 0 for x in proj_dict['towgs84']): + # all 0s in towgs84 are technically equal to not having them + # specified, but PROJ considers them different + proj_dict.pop('towgs84') + else: + proj_dict['towgs84'] = ','.join(str(x) for x in proj_dict['towgs84']) + return proj4_dict_to_str(proj_dict, sort=True) def __str__(self): """Return string representation of the AreaDefinition.""" @@ -1904,8 +1915,9 @@ class AreaDefinition(BaseDefinition): def get_geostationary_angle_extent(geos_area): """Get the max earth (vs space) viewing angles in x and y.""" # get some projection parameters - req = geos_area.proj_dict['a'] / 1000.0 - rp = geos_area.proj_dict['b'] / 1000.0 + a, b = proj4_radius_parameters(geos_area.proj_dict) + req = a / 1000.0 + rp = b / 1000.0 h = geos_area.proj_dict['h'] / 1000.0 + req # compute some constants ===================================== pyresample/test/test_geometry.py ===================================== @@ -1133,6 +1133,38 @@ class Test(unittest.TestCase): area_extent=[-40000., -40000., 40000., 40000.]) self.assertEqual(area.proj_str, expected_proj) + if utils.is_pyproj2(): + # CRS with towgs84 in it + # we remove towgs84 if they are all 0s + projection = {'proj': 'laea', 'lat_0': 52, 'lon_0': 10, 'x_0': 4321000, 'y_0': 3210000, + 'ellps': 'GRS80', 'towgs84': '0,0,0,0,0,0,0', 'units': 'm', 'no_defs': True} + area = geometry.AreaDefinition( + area_id='test_towgs84', + description='', + proj_id='', + projection=projection, + width=123, 
height=123, + area_extent=[-40000., -40000., 40000., 40000.]) + self.assertEqual(area.proj_str, + '+ellps=GRS80 +lat_0=52 +lon_0=10 +no_defs +proj=laea ' + # '+towgs84=0.0,0.0,0.0,0.0,0.0,0.0,0.0 ' + '+type=crs +units=m ' + '+x_0=4321000 +y_0=3210000') + projection = {'proj': 'laea', 'lat_0': 52, 'lon_0': 10, 'x_0': 4321000, 'y_0': 3210000, + 'ellps': 'GRS80', 'towgs84': '0,5,0,0,0,0,0', 'units': 'm', 'no_defs': True} + area = geometry.AreaDefinition( + area_id='test_towgs84', + description='', + proj_id='', + projection=projection, + width=123, height=123, + area_extent=[-40000., -40000., 40000., 40000.]) + self.assertEqual(area.proj_str, + '+ellps=GRS80 +lat_0=52 +lon_0=10 +no_defs +proj=laea ' + '+towgs84=0.0,5.0,0.0,0.0,0.0,0.0,0.0 ' + '+type=crs +units=m ' + '+x_0=4321000 +y_0=3210000') + def test_striding(self): """Test striding AreaDefinitions.""" from pyresample import utils @@ -1867,6 +1899,13 @@ class TestCrop(unittest.TestCase): expected = (0.15185342867090912, 0.15133555510297725) + np.testing.assert_allclose(expected, + geometry.get_geostationary_angle_extent(geos_area)) + + geos_area.proj_dict = {'ellps': 'GRS80', + 'h': 35785831.00} + expected = (0.15185277703584374, 0.15133971368991794) + np.testing.assert_allclose(expected, geometry.get_geostationary_angle_extent(geos_area)) ===================================== pyresample/test/test_utils.py ===================================== @@ -330,6 +330,7 @@ class TestMisc(unittest.TestCase): 1000, 1000, (-1000, -1000, 1000, 1000)) def test_proj4_radius_parameters_provided(self): + """Test proj4_radius_parameters with a/b.""" from pyresample import utils a, b = utils._proj4.proj4_radius_parameters( '+proj=stere +a=6378273 +b=6356889.44891', @@ -338,6 +339,7 @@ class TestMisc(unittest.TestCase): np.testing.assert_almost_equal(b, 6356889.44891) def test_proj4_radius_parameters_ellps(self): + """Test proj4_radius_parameters with ellps.""" from pyresample import utils a, b = utils._proj4.proj4_radius_parameters( '+proj=stere +ellps=WGS84', @@ -346,6 +348,7 @@ class TestMisc(unittest.TestCase): np.testing.assert_almost_equal(b, 6356752.314245, decimal=6) def test_proj4_radius_parameters_default(self): + """Test proj4_radius_parameters with default parameters.""" from pyresample import utils a, b = utils._proj4.proj4_radius_parameters( '+proj=lcc', @@ -354,6 +357,15 @@ class TestMisc(unittest.TestCase): np.testing.assert_almost_equal(a, 6378137.) np.testing.assert_almost_equal(b, 6356752.314245, decimal=6) + def test_proj4_radius_parameters_spherical(self): + """Test proj4_radius_parameters in case of a spherical earth.""" + from pyresample import utils + a, b = utils._proj4.proj4_radius_parameters( + '+proj=stere +R=6378273', + ) + np.testing.assert_almost_equal(a, 6378273.) + np.testing.assert_almost_equal(b, 6378273.) 
+ def test_convert_proj_floats(self): from collections import OrderedDict import pyresample.utils as utils ===================================== pyresample/utils/_proj4.py ===================================== @@ -113,6 +113,9 @@ def proj4_radius_parameters(proj4_dict): new_info['b'] = float(new_info['a']) * (1 - float(new_info['f'])) elif 'b' in new_info and 'f' in new_info: new_info['a'] = float(new_info['b']) / (1 - float(new_info['f'])) + elif 'R' in new_info: + new_info['a'] = new_info['R'] + new_info['b'] = new_info['R'] else: geod = Geod(**{'ellps': 'WGS84'}) new_info['a'] = geod.a ===================================== pyresample/version.py ===================================== @@ -23,9 +23,9 @@ def get_keywords(): # setup.py/versioneer.py will grep for the variable names, so they must # each be defined on a line of their own. _version.py will just call # get_keywords(). - git_refnames = " (HEAD -> master, tag: v1.13.0)" - git_full = "790a0bae85cb243c17f0150011e30c834f244e04" - git_date = "2019-09-13 08:32:20 +0200" + git_refnames = " (HEAD -> master, tag: v1.13.1)" + git_full = "d9ff2c9012a4dbfd0f39172fde2e21bb34c04e4a" + git_date = "2019-09-26 20:22:21 +0200" keywords = {"refnames": git_refnames, "full": git_full, "date": git_date} return keywords View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/compare/44d53d73f4a56ca27fed046c8baff194f4b4b3d6...be1169b316646e8e82b084cfd0319efd85fa8bf2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/compare/44d53d73f4a56ca27fed046c8baff194f4b4b3d6...be1169b316646e8e82b084cfd0319efd85fa8bf2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 07:16:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 06:16:19 +0000 Subject: [Git][debian-gis-team/trollimage] Pushed new tag debian/1.10.1-1 Message-ID: <5d919db340d94_46f63fbab63e63a81801f3@godard.mail> Bas Couwenberg pushed new tag debian/1.10.1-1 at Debian GIS Project / trollimage -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/tree/debian/1.10.1-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ftpmaster at ftp-master.debian.org Mon Sep 30 07:29:55 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 30 Sep 2019 06:29:55 +0000 Subject: Processing of trollimage_1.10.1-1_source.changes Message-ID: trollimage_1.10.1-1_source.changes uploaded successfully to localhost along with the files: trollimage_1.10.1-1.dsc trollimage_1.10.1.orig.tar.gz trollimage_1.10.1-1.debian.tar.xz trollimage_1.10.1-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From gitlab at salsa.debian.org Mon Sep 30 07:30:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 06:30:16 +0000 Subject: [Git][debian-gis-team/pyresample] Pushed new tag debian/1.13.1-1 Message-ID: <5d91a0f85fdbc_46f62ac0ff4afc641817e2@godard.mail> Bas Couwenberg pushed new tag debian/1.13.1-1 at Debian GIS Project / pyresample -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/tree/debian/1.13.1-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Mon Sep 30 07:36:11 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 30 Sep 2019 06:36:11 +0000 Subject: trollimage_1.10.1-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 30 Sep 2019 05:33:09 +0000 Source: trollimage Architecture: source Version: 1.10.1-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: trollimage (1.10.1-1) unstable; urgency=medium . * New upstream release. Checksums-Sha1: eab759eab3fecb746d169090a02c01802bc07597 2158 trollimage_1.10.1-1.dsc 611d8ec78cfa3bf534de3a43cfb5de43995d2bab 1418201 trollimage_1.10.1.orig.tar.gz cd7436ceeba7c81d17ca9500d808ce9c3e1db06f 3172 trollimage_1.10.1-1.debian.tar.xz 6f83e532113e2ac4b4841e14f58d571a125e7b3c 10020 trollimage_1.10.1-1_amd64.buildinfo Checksums-Sha256: 4fa0d204977b340d4fa5859fc39dc97774a117310165517c597cd191a0cacabb 2158 trollimage_1.10.1-1.dsc f1b825e0310732c8dfc1bafa4592dc298658fc06a9cb19470b32c3d188b11def 1418201 trollimage_1.10.1.orig.tar.gz 97281a5f1a4ba8654a7fa43593ff54680a2e102cb65d338b86c9468f4c267984 3172 trollimage_1.10.1-1.debian.tar.xz f4c03baf8837484f9b8add5a7384ff38c1f66d48c7f5b88dde4f5564f0e9d0e9 10020 trollimage_1.10.1-1_amd64.buildinfo Files: effa287360a8d006ddd185e91841bf8f 2158 python optional trollimage_1.10.1-1.dsc 10fc0c36d6042129d046c94578c57621 1418201 python optional trollimage_1.10.1.orig.tar.gz 6a2b266604d1b3a478bdf078ecc94d9a 3172 python optional trollimage_1.10.1-1.debian.tar.xz b73f48c2c06a2042d4b0bdc83f86ee41 10020 python optional trollimage_1.10.1-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2RnZ4ACgkQZ1DxCuiN SvGnmw/9EfnKSUpg9QL+78RxHZGGHB/Y4w9u6IJZvyMum52W/UL12bIClHl1BSyh 2A3XJU/fmk6+AOnND2B/Afm4+hOR9uitRQaoald4Yu7REo6Ub4Zlmq+YGY9hYsuI xLDHSjpZ9IEmHa6U70bBSy0j/8X0Cn+QNfgV2YJkpFMWRcLuVUpOAdWYwztt44UX dCxZBHMdkSpc7g325fSDaSHPbZ2ARkR/dBRkaAnknglUGTdlXovhNeQS7eBLzwZf WQXBR/PZsmCR64k8Z3P6XHh9xaDYHU5E1/2BW9CP5LslsfMgSk+EDtMcNMkbIL9i WZtNf4eODPvMhDvb17EaDRvbHsEgDvB7ebCZAl2jaCD7x+7x1RMHWhKF8gySznE9 PqN9YohLnnIJfwFphPGrPCm0u/An6tW6z/VLPUbWz9uhlqvLn1qVBokW+CTlmaC9 Rv8KFDeY0jb0R0sVcu81QUbps2Q1tzxYgQVmyUiwjoTdCk3P6eiDWbZcXGg2BBbK d7nRm1q9FsIR2S4422hFGJjN4+3Eyu5XHkGUQWMyZi+Mk84zyITm60uIr1/Smd08 dLE5lzopePThlKz+Revnf23JugcrUvQb2jXgNM12JCvuIGAMhCISk8ssKaBxF4Sf JL2JZtujbKvHalvkchp8YaG2X/78+sprfCW+89OIp9KRdOsQBJA= =KIy1 -----END PGP SIGNATURE----- Thank you for your contribution to Debian. 
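The pyresample hunks a few messages above extend proj4_radius_parameters in pyresample/utils/_proj4.py with a spherical branch: when a projection definition only carries +R, both semi-axes are taken from that radius; otherwise a/b are used directly, derived from the flattening, or defaulted to WGS84. A condensed sketch of that fallback order, written against the hunks shown in the pyresample mails rather than copied from the full upstream function, with a hypothetical input dict:

from pyproj import Geod

def radius_parameters(proj_dict):
    # Keep only the numeric parameters relevant to the earth radius.
    p = {k: float(v) for k, v in proj_dict.items() if k in ('a', 'b', 'f', 'R')}
    if 'a' in p and 'b' in p:
        return p['a'], p['b']
    if 'a' in p and 'f' in p:
        return p['a'], p['a'] * (1 - p['f'])      # b = a * (1 - f)
    if 'b' in p and 'f' in p:
        return p['b'] / (1 - p['f']), p['b']      # a = b / (1 - f)
    if 'R' in p:
        return p['R'], p['R']                     # spherical earth: a == b == R
    geod = Geod(ellps='WGS84')                    # fall back to WGS84 semi-axes
    return geod.a, geod.b

print(radius_parameters({'proj': 'stere', 'R': 6378273}))  # (6378273.0, 6378273.0)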
From ftpmaster at ftp-master.debian.org Mon Sep 30 07:39:55 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 30 Sep 2019 06:39:55 +0000 Subject: Processing of pyresample_1.13.1-1_source.changes Message-ID: pyresample_1.13.1-1_source.changes uploaded successfully to localhost along with the files: pyresample_1.13.1-1.dsc pyresample_1.13.1.orig.tar.gz pyresample_1.13.1-1.debian.tar.xz pyresample_1.13.1-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 30 07:49:01 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 30 Sep 2019 06:49:01 +0000 Subject: pyresample_1.13.1-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 30 Sep 2019 08:01:23 +0200 Source: pyresample Architecture: source Version: 1.13.1-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Antonio Valentino Changes: pyresample (1.13.1-1) unstable; urgency=medium . * New upstream release. * debian/patches: - refresh all patches * debian/control: - explicitly specify Rules-Requires-Root: no Checksums-Sha1: 14e2dec070503e56ba86feaae4233c3621baf805 2532 pyresample_1.13.1-1.dsc 424f4d8475f6fc33baa6f9623b20ddccda25b337 5783090 pyresample_1.13.1.orig.tar.gz 3da08ba45d6821f66020eb84f3af4fe4ed7fba87 10760 pyresample_1.13.1-1.debian.tar.xz 97c01a083bfc3076592de3e48e9284ca8fced155 12695 pyresample_1.13.1-1_amd64.buildinfo Checksums-Sha256: 80cd0e8fe3e58e3966f2692b4d7d2e36184a3fefa1c0bb713937f0e815897a56 2532 pyresample_1.13.1-1.dsc e2e0cb7a66981a990c6aa5c608c2a8517c04b443e8466d9eaf0d6fd3614b0e2d 5783090 pyresample_1.13.1.orig.tar.gz 3dc68074586a1509ea0b3029b4e2f102deab3903cb8be82bae2a63cd652aab98 10760 pyresample_1.13.1-1.debian.tar.xz 121ea9d7148a2db1f5ffed3a44c3d7d7d680b7c344cd22c3ee98854189c46aff 12695 pyresample_1.13.1-1_amd64.buildinfo Files: fafc1e810272246c44d6ae175dc8e731 2532 python optional pyresample_1.13.1-1.dsc a58a9e7a90dc434dcf08c4adaa30bd93 5783090 python optional pyresample_1.13.1.orig.tar.gz d1e166a8611da31f7824f10fd89c8d94 10760 python optional pyresample_1.13.1-1.debian.tar.xz cff60e590043d805890b615fe146fb1a 12695 python optional pyresample_1.13.1-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2RoNsACgkQZ1DxCuiN SvGy9Q//ZLfvTfIOMlBEug9YT3QRti9c1Deth/5rP+fHWQGjO0mcn/1m/MsHVxrn i+TojdLzzN8HzGtW/Y9WJwZZOuht9+yf/HsD1FhlG63OWHUUeQjgSmvyQq8oIurE +dneaJYHySM1dhZ7MmSxqH93zkHjg29nO7E1stS4Y7x4JZXh/xPUT/G8S8yZlc+y GJuNiibCWZCNxVMetTciPj32TyI8VJtaKC+LyPl8s5fUI+PT46IzsYieKDTBeoPE zeDaN0iKTTOW3aQQpRDuyqrasBQ40c+CRHTh+oYYpFFzC3wGlCCo2Aa5khJ8rmnP cQOytTZLxgBsxNMFSqe7SKa8Tig+kcxcOrhDqTlwHr/X/88qfRd4PuXB83NFIZJk xR13CkGXpltVBTc54EBvGCBYicuFdq0uT8nMkDNsz+g1n2PTYeL3UajDC+8XLwul equVMYY7ifqV/mYKYzq09nCyehZXu1aDGYSngFCtTYl5L4QS2RmzDzaCqwVZxwQD aX4xco00pBMddRXExrYs5bUseHvmVm0Jh47YqX4i00hfgY4WMRk25GP9VgxOSWJE KUngDj9exRUAKh7uuR0B+pbAxks5Y/rF3kJyf3dziDUUwKiNXkvc6v8zX7xOoIVz 8K/7OK/gN1dIjzsEBGJwyZOcUONDk/Dit0FkOuCNYw5hV7ABWtM= =/6iv -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From brenda at acae.co.za Mon Sep 30 09:59:25 2019 From: brenda at acae.co.za (Brenda Mweshi (Workshop Invitation)) Date: Mon, 30 Sep 2019 10:59:25 +0200 Subject: Confirmed Training Workshops and Seminars for Octobert to December Message-ID: <55602359279842952514257@PROD08> Thank you for Receiving this workshop invitation. 
You may please unsubscribe here if you no longer wish to receive our emails --- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 1.png Type: image/png Size: 46038 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 2.gif Type: image/gif Size: 646 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 3.gif Type: image/gif Size: 641 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 4.gif Type: image/gif Size: 650 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 5.gif Type: image/gif Size: 650 bytes Desc: not available URL: From jidanni at jidanni.org Mon Sep 30 15:20:50 2019 From: jidanni at jidanni.org (=?UTF-8?Q?=E7=A9=8D=E4=B8=B9=E5=B0=BC?= Dan Jacobson) Date: Mon, 30 Sep 2019 22:20:50 +0800 Subject: Bug#941434: "You should update!" upon start Message-ID: <87pnjhhnlp.5.fsf@jidanni.org> Package: josm Version: 0.0.svn15322+dfsg-1 Severity: wishlist Start JOSM. The first thing we see is ... You should update! So maybe Debian should update more. From owner at bugs.debian.org Mon Sep 30 15:36:03 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 30 Sep 2019 14:36:03 +0000 Subject: Processed: Re: Bug#941434: "You should update!" upon start References: <8beb63ba-ef12-0f5a-a70b-f783bba0005d@xs4all.nl> Message-ID: Processing commands for control at bugs.debian.org: > tags 941434 wontfix Bug #941434 [josm] "You should update!" upon start Added tag(s) wontfix. > thanks Stopping processing here. Please contact me if you need assistance. -- 941434: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=941434 Debian Bug Tracking System Contact owner at bugs.debian.org with problems From owner at bugs.debian.org Mon Sep 30 15:45:10 2019 From: owner at bugs.debian.org (Debian Bug Tracking System) Date: Mon, 30 Sep 2019 14:45:10 +0000 Subject: Bug#941434: marked as done ("You should update!" upon start) References: <8beb63ba-ef12-0f5a-a70b-f783bba0005d@xs4all.nl> <87pnjhhnlp.5.fsf@jidanni.org> Message-ID: Your message dated Mon, 30 Sep 2019 16:33:54 +0200 with message-id <8beb63ba-ef12-0f5a-a70b-f783bba0005d at xs4all.nl> and subject line Re: Bug#941434: "You should update!" upon start has caused the Debian Bug report #941434, regarding "You should update!" upon start to be marked as done. This means that you claim that the problem has been dealt with. If this is not the case it is now your responsibility to reopen the Bug report if necessary, and/or fix the problem forthwith. (NB: If you are a system administrator and have no idea what this message is talking about, this may indicate a serious mail system misconfiguration somewhere. Please contact owner at bugs.debian.org immediately.) -- 941434: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=941434 Debian Bug Tracking System Contact owner at bugs.debian.org with problems -------------- next part -------------- An embedded message was scrubbed... From: =?utf-8?B?56mN5Li55bC8?= Dan Jacobson Subject: "You should update!" upon start Date: Mon, 30 Sep 2019 22:20:50 +0800 Size: 4341 URL: -------------- next part -------------- An embedded message was scrubbed... 
From: Sebastiaan Couwenberg Subject: Re: Bug#941434: "You should update!" upon start Date: Mon, 30 Sep 2019 16:33:54 +0200 Size: 6767 URL: From gitlab at salsa.debian.org Mon Sep 30 16:09:52 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 15:09:52 +0000 Subject: [Git][debian-gis-team/josm][pristine-tar] pristine-tar data for josm_0.0.svn15390+dfsg.orig.tar.gz Message-ID: <5d921ac0abf80_46f62ac0ff19eb8824447d@godard.mail> Bas Couwenberg pushed to branch pristine-tar at Debian GIS Project / josm Commits: d9eb932b by Bas Couwenberg at 2019-09-30T14:56:04Z pristine-tar data for josm_0.0.svn15390+dfsg.orig.tar.gz - - - - - 2 changed files: - + josm_0.0.svn15390+dfsg.orig.tar.gz.delta - + josm_0.0.svn15390+dfsg.orig.tar.gz.id Changes: ===================================== josm_0.0.svn15390+dfsg.orig.tar.gz.delta ===================================== Binary files /dev/null and b/josm_0.0.svn15390+dfsg.orig.tar.gz.delta differ ===================================== josm_0.0.svn15390+dfsg.orig.tar.gz.id ===================================== @@ -0,0 +1 @@ +7f2cbee8b908e91aa9c5b109c43aa35c8ca6c04d View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/commit/d9eb932b4ba2d22229eb02611ba8a63f33d4dda7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/commit/d9eb932b4ba2d22229eb02611ba8a63f33d4dda7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 16:09:55 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 15:09:55 +0000 Subject: [Git][debian-gis-team/josm][master] 5 commits: New upstream version 0.0.svn15390+dfsg Message-ID: <5d921ac3c4506_46f62ac0fe4cd06c2446ba@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / josm Commits: dd5134f9 by Bas Couwenberg at 2019-09-30T14:52:22Z New upstream version 0.0.svn15390+dfsg - - - - - ad4c2690 by Bas Couwenberg at 2019-09-30T14:56:05Z Update upstream source from tag 'upstream/0.0.svn15390+dfsg' Update to upstream version '0.0.svn15390+dfsg' with Debian dir 8dc3b862da893bd81fe681780c4b7a11f1d38102 - - - - - 1f831a08 by Bas Couwenberg at 2019-09-30T14:56:29Z New tested snapshot. - - - - - d02850ba by Bas Couwenberg at 2019-09-30T14:58:03Z Refresh patches. - - - - - 46628bda by Bas Couwenberg at 2019-09-30T14:59:44Z Set distribution to unstable. - - - - - 30 changed files: - REVISION - build.xml - data/defaultpresets.xml - data/validator/combinations.mapcss - data/validator/deprecated.mapcss - data/validator/ignoretags.cfg - data/validator/numeric.mapcss - data/validator/territories.mapcss - + data_nodist/rtklib_example2.pos - data_nodist/trans/ast.lang - data_nodist/trans/be.lang - data_nodist/trans/bg.lang - data_nodist/trans/ca-valencia.lang - data_nodist/trans/ca.lang - data_nodist/trans/cs.lang - data_nodist/trans/da.lang - data_nodist/trans/de.lang - data_nodist/trans/el.lang - data_nodist/trans/en.lang - data_nodist/trans/en_AU.lang - data_nodist/trans/en_CA.lang - data_nodist/trans/en_GB.lang - data_nodist/trans/es.lang - data_nodist/trans/et.lang - data_nodist/trans/fi.lang - data_nodist/trans/fr.lang - data_nodist/trans/gl.lang - data_nodist/trans/hu.lang - data_nodist/trans/id.lang - data_nodist/trans/it.lang The diff was not included because it is too large. 
View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/compare/42ce829861aa46288a38be3ca9ab92f85fbb00e3...46628bda5cb12be227cb9b8a0b297626f3b3e6e3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/compare/42ce829861aa46288a38be3ca9ab92f85fbb00e3...46628bda5cb12be227cb9b8a0b297626f3b3e6e3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 16:09:56 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 15:09:56 +0000 Subject: [Git][debian-gis-team/josm][upstream] New upstream version 0.0.svn15390+dfsg Message-ID: <5d921ac487f23_46f62ac0f98dfce024489f@godard.mail> Bas Couwenberg pushed to branch upstream at Debian GIS Project / josm Commits: dd5134f9 by Bas Couwenberg at 2019-09-30T14:52:22Z New upstream version 0.0.svn15390+dfsg - - - - - 30 changed files: - REVISION - build.xml - data/defaultpresets.xml - data/validator/combinations.mapcss - data/validator/deprecated.mapcss - data/validator/ignoretags.cfg - data/validator/numeric.mapcss - data/validator/territories.mapcss - + data_nodist/rtklib_example2.pos - data_nodist/trans/ast.lang - data_nodist/trans/be.lang - data_nodist/trans/bg.lang - data_nodist/trans/ca-valencia.lang - data_nodist/trans/ca.lang - data_nodist/trans/cs.lang - data_nodist/trans/da.lang - data_nodist/trans/de.lang - data_nodist/trans/el.lang - data_nodist/trans/en.lang - data_nodist/trans/en_AU.lang - data_nodist/trans/en_CA.lang - data_nodist/trans/en_GB.lang - data_nodist/trans/es.lang - data_nodist/trans/et.lang - data_nodist/trans/fi.lang - data_nodist/trans/fr.lang - data_nodist/trans/gl.lang - data_nodist/trans/hu.lang - data_nodist/trans/id.lang - data_nodist/trans/it.lang The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/commit/dd5134f9e48441e958de6e1aad417023349e8402 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/commit/dd5134f9e48441e958de6e1aad417023349e8402 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 16:09:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 15:09:57 +0000 Subject: [Git][debian-gis-team/josm] Pushed new tag debian/0.0.svn15390+dfsg-1 Message-ID: <5d921ac5e08a1_46f62ac0fbe48aa42450bf@godard.mail> Bas Couwenberg pushed new tag debian/0.0.svn15390+dfsg-1 at Debian GIS Project / josm -- View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/tree/debian/0.0.svn15390+dfsg-1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 16:09:58 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 15:09:58 +0000 Subject: [Git][debian-gis-team/josm] Pushed new tag upstream/0.0.svn15390+dfsg Message-ID: <5d921ac6b6ac4_46f62ac0f98dfce0245250@godard.mail> Bas Couwenberg pushed new tag upstream/0.0.svn15390+dfsg at Debian GIS Project / josm -- View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/tree/upstream/0.0.svn15390+dfsg You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From ftpmaster at ftp-master.debian.org Mon Sep 30 16:19:26 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 30 Sep 2019 15:19:26 +0000 Subject: Processing of josm_0.0.svn15390+dfsg-1_source.changes Message-ID: josm_0.0.svn15390+dfsg-1_source.changes uploaded successfully to localhost along with the files: josm_0.0.svn15390+dfsg-1.dsc josm_0.0.svn15390+dfsg.orig.tar.gz josm_0.0.svn15390+dfsg-1.debian.tar.xz josm_0.0.svn15390+dfsg-1_amd64.buildinfo Greetings, Your Debian queue daemon (running on host usper.debian.org) From ftpmaster at ftp-master.debian.org Mon Sep 30 16:39:25 2019 From: ftpmaster at ftp-master.debian.org (Debian FTP Masters) Date: Mon, 30 Sep 2019 15:39:25 +0000 Subject: josm_0.0.svn15390+dfsg-1_source.changes ACCEPTED into unstable Message-ID: Accepted: -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 1.8 Date: Mon, 30 Sep 2019 16:59:30 +0200 Source: josm Architecture: source Version: 0.0.svn15390+dfsg-1 Distribution: unstable Urgency: medium Maintainer: Debian GIS Project Changed-By: Bas Couwenberg Changes: josm (0.0.svn15390+dfsg-1) unstable; urgency=medium . * New tested snapshot. * Refresh patches. Checksums-Sha1: 20ddd787e63239a7b9b0353b04c966cadb94b7c5 2360 josm_0.0.svn15390+dfsg-1.dsc 9da71a1ba4ca7b7c6cd6304354e2aaa40568e06f 51615823 josm_0.0.svn15390+dfsg.orig.tar.gz 649a3dfab3bfd6f81aa345d246eef25b3634ba05 97352 josm_0.0.svn15390+dfsg-1.debian.tar.xz f5afa97808cb1168f6ac1a074db9ef3e0ef52733 12717 josm_0.0.svn15390+dfsg-1_amd64.buildinfo Checksums-Sha256: d728dbce91e5ef6c5e3540f8bde51af04a12c72d485b47a6c1583d85e409408d 2360 josm_0.0.svn15390+dfsg-1.dsc fdbcb1083f6b18c14278d6b05f72c56354aba630a2636761394c16537d1f93b4 51615823 josm_0.0.svn15390+dfsg.orig.tar.gz e7f20da6478b957737c0b8545af871a65f4778f1e4f3e6be659cf63b941a625c 97352 josm_0.0.svn15390+dfsg-1.debian.tar.xz dbfc46856c51214107acd80db568bb6ab3c678fdf9537b841bcdc49955ccffed 12717 josm_0.0.svn15390+dfsg-1_amd64.buildinfo Files: f2583bbe87b04a19a41489c67da55858 2360 utils optional josm_0.0.svn15390+dfsg-1.dsc b05cf9722759fb5bebbf94daea1dbeda 51615823 utils optional josm_0.0.svn15390+dfsg.orig.tar.gz dca276bd7ed07d7382329218e8d13a1e 97352 utils optional josm_0.0.svn15390+dfsg-1.debian.tar.xz ae261d3950d7f404b76b1f87d8eb2037 12717 utils optional josm_0.0.svn15390+dfsg-1_amd64.buildinfo -----BEGIN PGP SIGNATURE----- iQIzBAEBCgAdFiEEgYLeQXBWQI1hRlDRZ1DxCuiNSvEFAl2SGpgACgkQZ1DxCuiN SvGwnQ//Tnm4cexYHaAdmD07n4vrAdAfbkIje5nsTcK15eoA39ptwdev8rClW6iS MH0t7AL6l2tYLk745aqCEeoHoY3zHfOoDuJR5ULz1zrAEgzSPhtKOCY0TAP77XxL dkZ+Pj3gtPSvicju4oYdDYLIOobGTQKLk4jOtdM7ToX2OQA17jpoC8m39FJqWruT 5Y/x/4YrPihi0/zLFZ62BpcWFE9ooyJVl5YPUCLHurwtr4VtMft9Uft8xXzptO7E cf2RMTjPCr/gWO6OYhF/1ZN01nDBxp2VBE7VRGtLLok6rj6vyig8r0Wo5k5RUlIf d2tvREL0zuyC2Wr7G8efebagThHlD00ODAm7gU8+uMRZ68cyy0ISXYDNiPVi8BIe b7lab+yTNVApBDhQNEpbHj4Riheu06qS/OF2gZdzeXDV1uKd6/Whdavey330b7yP iBNxnDKiY3bTpJQBo3Wdu0Fl4B01IQ4O0egmDPmKrYCdDz4buy4gAe/G/oBPQ1de GLWK2JSm3WWtNZyBMgx10zjdBq/QkPWFU5e892yyaTG/H+S7H99gdImqD7xRhoA2 0r6Niey+Fc89hnmyQ9oZNo0xJkZwsfqak3vOVg4KqEyHHJQHrdT6oWmGs/8wV7mc hVG3EJPvshmak+Wt7tnRWSBgXBviCMunmMoFm0+E1/fSHwURvc8= =ZlXx -----END PGP SIGNATURE----- Thank you for your contribution to Debian. From gitlab at salsa.debian.org Mon Sep 30 17:49:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:49:09 +0000 Subject: [Git][debian-gis-team/aggdraw][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d923205cd3d2_46f62ac0fbe48aa4256669@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / aggdraw Commits: 77c385d8 by Bas Couwenberg at 2019-09-30T16:49:02Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -2,6 +2,7 @@ aggdraw (1.3.11-3) UNRELEASED; urgency=medium * Use debhelper-compat instead of debian/compat. * Remove obsolete field Name from debian/upstream/metadata. + * Bump Standards-Version to 4.4.1, no changes. -- Antonio Valentino Tue, 06 Aug 2019 07:12:02 +0000 ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper-compat (= 12), python3-pil, python3-pkgconfig, python3-pytest -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/aggdraw Vcs-Git: https://salsa.debian.org/debian-gis-team/aggdraw.git Homepage: https://github.com/pytroll/aggdraw View it on GitLab: https://salsa.debian.org/debian-gis-team/aggdraw/commit/77c385d8143735b07b996931d3be4e5d349d0118 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/aggdraw/commit/77c385d8143735b07b996931d3be4e5d349d0118 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:50:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:50:30 +0000 Subject: [Git][debian-gis-team/avce00][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9232569c47b_46f62ac0ff19eb882568cb@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / avce00 Commits: 8d2ad64f by Bas Couwenberg at 2019-09-30T16:50:23Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ avce00 (2.0.0-8) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sat, 04 Aug 2018 11:51:11 +0200 ===================================== debian/control ===================================== @@ -5,7 +5,7 @@ Uploaders: Francesco Paolo Lovergine , Section: science Priority: optional Build-Depends: debhelper (>= 9) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/avce00 Vcs-Git: https://salsa.debian.org/debian-gis-team/avce00.git Homepage: http://avce00.maptools.org/avce00/ View it on GitLab: https://salsa.debian.org/debian-gis-team/avce00/commit/8d2ad64ff3254228d7b2d53918f7a2f9d8465f32 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/avce00/commit/8d2ad64ff3254228d7b2d53918f7a2f9d8465f32 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:50:55 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:50:55 +0000 Subject: [Git][debian-gis-team/cftime][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92326f27de6_46f63fbab65044882570d9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / cftime Commits: 5e0fe041 by Bas Couwenberg at 2019-09-30T16:50:47Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +cftime (1.0.3.4-4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 18:50:41 +0200 + cftime (1.0.3.4-3) unstable; urgency=medium * Drop Python 2 support. ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9), python3-numpy, python3-pytest, cython3 -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/cftime/ Vcs-Git: https://salsa.debian.org/debian-gis-team/cftime.git Homepage: http://unidata.github.io/cftime/ View it on GitLab: https://salsa.debian.org/debian-gis-team/cftime/commit/5e0fe041680871e1851fcbd5c929e468d9202bdc -- View it on GitLab: https://salsa.debian.org/debian-gis-team/cftime/commit/5e0fe041680871e1851fcbd5c929e468d9202bdc You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:51:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:51:19 +0000 Subject: [Git][debian-gis-team/dans-gdal-scripts][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923287a10d4_46f63fbab650448825728e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / dans-gdal-scripts Commits: 21ad8e74 by Bas Couwenberg at 2019-09-30T16:51:09Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ dans-gdal-scripts (0.24-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 9), autotools-dev, libgdal-dev, libboost-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/dans-gdal-scripts Vcs-Git: https://salsa.debian.org/debian-gis-team/dans-gdal-scripts.git Homepage: http://www.gina.alaska.edu/projects/gina-tools View it on GitLab: https://salsa.debian.org/debian-gis-team/dans-gdal-scripts/commit/21ad8e74404f9c7f088d944909a08cd77d68393f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/dans-gdal-scripts/commit/21ad8e74404f9c7f088d944909a08cd77d68393f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:51:56 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:51:56 +0000 Subject: [Git][debian-gis-team/doris][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9232ac60030_46f62ac0fe4cd06c2574cc@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / doris Commits: 20b1cd33 by Bas Couwenberg at 2019-09-30T16:51:41Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +doris (5.0.3~beta+dfsg-14) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 18:51:39 +0200 + doris (5.0.3~beta+dfsg-13) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -20,7 +20,7 @@ Build-Depends: debhelper-compat (= 12), python3-scipy, python3-setuptools, python3-shapely -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/doris Vcs-Git: https://salsa.debian.org/debian-gis-team/doris.git Homepage: http://doris.tudelft.nl View it on GitLab: https://salsa.debian.org/debian-gis-team/doris/commit/20b1cd33c7746bfcaf0dbb153ec4862a9b519f03 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/doris/commit/20b1cd33c7746bfcaf0dbb153ec4862a9b519f03 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:52:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:52:21 +0000 Subject: [Git][debian-gis-team/e00compr][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9232c5c8092_46f62ac0ff19eb88257687@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / e00compr Commits: 55d7d78e by Bas Couwenberg at 2019-09-30T16:52:14Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ e00compr (1.0.1-6) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sat, 04 Aug 2018 11:56:14 +0200 ===================================== debian/control ===================================== @@ -5,7 +5,7 @@ Uploaders: Francesco Paolo Lovergine , Section: science Priority: optional Build-Depends: debhelper (>= 9) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/e00compr Vcs-Git: https://salsa.debian.org/debian-gis-team/e00compr.git Homepage: http://avce00.maptools.org/e00compr/ View it on GitLab: https://salsa.debian.org/debian-gis-team/e00compr/commit/55d7d78e96b9b36d8b31a47d638b9b5cbaa3238f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/e00compr/commit/55d7d78e96b9b36d8b31a47d638b9b5cbaa3238f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:52:51 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:52:51 +0000 Subject: [Git][debian-gis-team/epr-api][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9232e3cf658_46f63fbab650448825787d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / epr-api Commits: 575703eb by Bas Couwenberg at 2019-09-30T16:52:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +epr-api (2.3~dev20150708-10) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 18:52:43 +0200 + epr-api (2.3~dev20150708-9) unstable; urgency=medium [ Antonio Valentino ] ===================================== debian/control ===================================== @@ -6,7 +6,7 @@ Priority: optional Build-Depends: cmake, doxygen, debhelper-compat (= 12) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/epr-api Vcs-Git: https://salsa.debian.org/debian-gis-team/epr-api.git Homepage: https://github.com/bcdev/epr-api View it on GitLab: https://salsa.debian.org/debian-gis-team/epr-api/commit/575703eb54ea3a6f0fc7b0202bde2f740543cdc0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/epr-api/commit/575703eb54ea3a6f0fc7b0202bde2f740543cdc0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:53:55 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:53:55 +0000 Subject: [Git][debian-gis-team/fiona][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923323ba0a1_46f63fbab6504488258049@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / fiona Commits: 64154591 by Bas Couwenberg at 2019-09-30T16:53:38Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +fiona (1.8.8-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 18:53:29 +0200 + fiona (1.8.8-1) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -20,7 +20,7 @@ Build-Depends: debhelper (>= 9), python3-setuptools, python3-six, python3-sphinx -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/fiona Vcs-Git: https://salsa.debian.org/debian-gis-team/fiona.git Homepage: https://github.com/Toblerity/Fiona View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/commit/6415459118b16f49b766b9f11fbc455e4a87b00c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fiona/commit/6415459118b16f49b766b9f11fbc455e4a87b00c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:54:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:54:41 +0000 Subject: [Git][debian-gis-team/freexl][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d923351f35a5_46f62ac0f98dfce0258293@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / freexl Commits: 43268c18 by Bas Couwenberg at 2019-09-30T16:54:34Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ freexl (1.0.5-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add Build-Depends-Package field to symbols file. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 9.20160114), autotools-dev, dh-autoreconf -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/freexl Vcs-Git: https://salsa.debian.org/debian-gis-team/freexl.git Homepage: https://www.gaia-gis.it/fossil/freexl/ View it on GitLab: https://salsa.debian.org/debian-gis-team/freexl/commit/43268c18194bed97dfd4e701948443f7284f2e2b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/freexl/commit/43268c18194bed97dfd4e701948443f7284f2e2b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:57:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:57:29 +0000 Subject: [Git][debian-gis-team/fyba][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9233f93a52c_46f62ac0ff19eb882584ba@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / fyba Commits: 96529b8b by Bas Couwenberg at 2019-09-30T16:57:22Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ fyba (4.1.1-7) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -5,7 +5,7 @@ Section: libs Priority: optional Build-Depends: debhelper (>= 9.20160114), dh-autoreconf -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/fyba Vcs-Git: https://salsa.debian.org/debian-gis-team/fyba.git Homepage: https://github.com/kartverket/fyba View it on GitLab: https://salsa.debian.org/debian-gis-team/fyba/commit/96529b8b39a77bf284cdc75b1548274fa618bc16 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/fyba/commit/96529b8b39a77bf284cdc75b1548274fa618bc16 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:58:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:58:20 +0000 Subject: [Git][debian-gis-team/gdal-grass][experimental] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92342c70545_46f62ac0f98dfce025861d@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / gdal-grass Commits: a2c41a58 by Bas Couwenberg at 2019-09-30T16:58:05Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libgdal-grass (3.0.1-1~exp4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 18:58:03 +0200 + libgdal-grass (3.0.1-1~exp3) experimental; urgency=medium * Disable as-needed linking on Debian too, gcc-9 enables it by default. ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper (>= 9), libpq-dev, lsb-release, pkg-config -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gdal-grass Vcs-Git: https://salsa.debian.org/debian-gis-team/gdal-grass.git -b experimental Homepage: http://www.gdal.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/commit/a2c41a585ec2bc6b5b848acac295b45e522520d5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gdal-grass/commit/a2c41a585ec2bc6b5b848acac295b45e522520d5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:58:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:58:45 +0000 Subject: [Git][debian-gis-team/geographiclib][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923445644ed_46f62ac0ff19eb8825881f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / geographiclib Commits: e2425d2d by Bas Couwenberg at 2019-09-30T16:58:38Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +geographiclib (1.50-1~exp2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 18:58:37 +0200 + geographiclib (1.50-1~exp1) experimental; urgency=medium [ Ross Gammon ] ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: autoconf-archive, python3-all, python3-setuptools, pkg-kde-tools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/geographiclib Vcs-Git: https://salsa.debian.org/debian-gis-team/geographiclib.git Homepage: https://geographiclib.sourceforge.io/ View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/e2425d2d8d65d16502a79fe696fd34d7dd26b9d7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geographiclib/commit/e2425d2d8d65d16502a79fe696fd34d7dd26b9d7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:59:03 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:59:03 +0000 Subject: [Git][debian-gis-team/geolinks][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d923457a7ba4_46f62ac0ff19eb88259061@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / geolinks Commits: 6794633a by Bas Couwenberg at 2019-09-30T16:58:56Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +geolinks (0.2.0-5) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 18:58:55 +0200 + geolinks (0.2.0-4) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Build-Depends: debhelper (>= 9), dh-python, python3-setuptools, python3-all, -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/geolinks Vcs-Git: https://salsa.debian.org/debian-gis-team/geolinks.git Homepage: https://github.com/geopython/geolinks View it on GitLab: https://salsa.debian.org/debian-gis-team/geolinks/commit/6794633a2bcfff489568f6bf3fdc3089b79a2cfd -- View it on GitLab: https://salsa.debian.org/debian-gis-team/geolinks/commit/6794633a2bcfff489568f6bf3fdc3089b79a2cfd You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 17:59:54 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 16:59:54 +0000 Subject: [Git][debian-gis-team/glymur][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92348a991e7_46f62ac0fbe48aa42592a@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / glymur Commits: 27170c34 by Bas Couwenberg at 2019-09-30T16:59:47Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +glymur (0.8.18+ds-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 18:59:46 +0200 + glymur (0.8.18+ds-1) unstable; urgency=medium [ Bas Couwenberg ] ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper-compat (= 12), python3-pkg-resources, python3-setuptools, python3-skimage -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/glymur Vcs-Git: https://salsa.debian.org/debian-gis-team/glymur.git Homepage: https://github.com/quintusdias/glymur View it on GitLab: https://salsa.debian.org/debian-gis-team/glymur/commit/27170c34c3f31304061c3e699e0fd3c3e7780540 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/glymur/commit/27170c34c3f31304061c3e699e0fd3c3e7780540 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:00:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:00:15 +0000 Subject: [Git][debian-gis-team/gmt-dcw][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92349f232ae_46f62ac0fbe48aa4259410@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / gmt-dcw Commits: fbbc4153 by Bas Couwenberg at 2019-09-30T17:00:06Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ gmt-dcw (1.1.4-3) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -4,7 +4,7 @@ Uploaders: Bas Couwenberg Section: science Priority: optional Build-Depends: debhelper (>= 9), -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gmt-dcw Vcs-Git: https://salsa.debian.org/debian-gis-team/gmt-dcw.git Homepage: http://www.soest.hawaii.edu/pwessel/dcw/index.html View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt-dcw/commit/fbbc415319bdae478cb4a6a718fd3c5effce930b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt-dcw/commit/fbbc415319bdae478cb4a6a718fd3c5effce930b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:00:36 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:00:36 +0000 Subject: [Git][debian-gis-team/gmt-gshhg][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9234b43abf4_46f62ac0fe4cd06c2600dd@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / gmt-gshhg Commits: b57c7d00 by Bas Couwenberg at 2019-09-30T17:00:26Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ gmt-gshhg (2.3.7-5) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -4,7 +4,7 @@ Uploaders: Bas Couwenberg Section: science Priority: optional Build-Depends: debhelper (>= 9), -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gmt-gshhg Vcs-Git: https://salsa.debian.org/debian-gis-team/gmt-gshhg.git Homepage: http://www.soest.hawaii.edu/pwessel/gshhg/index.html View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt-gshhg/commit/b57c7d00051113adcf7d6974cd5c4858f261e76d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt-gshhg/commit/b57c7d00051113adcf7d6974cd5c4858f261e76d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:01:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:01:43 +0000 Subject: [Git][debian-gis-team/gmt][experimental] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9234f711ba3_46f62ac0ff19eb882602f6@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / gmt Commits: 8fe2191c by Bas Couwenberg at 2019-09-30T17:01:22Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +gmt (6.0.0~rc4+dfsg-1~exp2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:01:03 +0200 + gmt (6.0.0~rc4+dfsg-1~exp1) experimental; urgency=medium * New upstream release candidate. ===================================== debian/control ===================================== @@ -30,7 +30,7 @@ Build-Depends: debhelper (>= 9.20160114), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gmt Vcs-Git: https://salsa.debian.org/debian-gis-team/gmt.git -b experimental Homepage: http://gmt.soest.hawaii.edu/ View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/commit/8fe2191c9ccf06d4e5e4fe2000ff021813e332e8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gmt/commit/8fe2191c9ccf06d4e5e4fe2000ff021813e332e8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:02:01 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:02:01 +0000 Subject: [Git][debian-gis-team/gpsprune][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923509bba75_46f63fbab65044882604ea@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / gpsprune Commits: 86dc1f6f by Bas Couwenberg at 2019-09-30T17:01:54Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ gpsprune (19.2-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Tue, 25 Dec 2018 22:21:42 +0100 ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: ant, fastjar, libjava3d-java, libvecmath-java -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gpsprune Vcs-Git: https://salsa.debian.org/debian-gis-team/gpsprune.git Homepage: https://activityworkshop.net/software/gpsprune/index.html View it on GitLab: https://salsa.debian.org/debian-gis-team/gpsprune/commit/86dc1f6ff02cc68b027ee1567154e58eb29e6d36 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gpsprune/commit/86dc1f6ff02cc68b027ee1567154e58eb29e6d36 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:02:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:02:27 +0000 Subject: [Git][debian-gis-team/gpxsee][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9235235012e_46f62ac0ff19eb88260677@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / gpxsee Commits: 55e59f50 by Bas Couwenberg at 2019-09-30T17:02:20Z Bump Standards-Version to 4.4.1, no changes. - - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 11), qtbase5-dev, qtbase5-dev-tools, qttools5-dev-tools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/gpxsee Vcs-Git: https://salsa.debian.org/debian-gis-team/gpxsee.git Homepage: https://www.gpxsee.org View it on GitLab: https://salsa.debian.org/debian-gis-team/gpxsee/commit/55e59f504949884bfea5d844877983450ad90ae6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/gpxsee/commit/55e59f504949884bfea5d844877983450ad90ae6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:03:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:03:08 +0000 Subject: [Git][debian-gis-team/grass][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92354cc7ecc_46f62ac0fbe48aa42608cf@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / grass Commits: 14d98f4f by Bas Couwenberg at 2019-09-30T17:02:55Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +grass (7.8.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:02:39 +0200 + grass (7.8.0-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -52,7 +52,7 @@ Build-Depends: autoconf2.13, python3-wxgtk4.0, unixodbc-dev, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/grass Vcs-Git: https://salsa.debian.org/debian-gis-team/grass.git Homepage: https://grass.osgeo.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/14d98f4f3d6359f2440496c0193a2070f55c32fb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grass/commit/14d98f4f3d6359f2440496c0193a2070f55c32fb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:03:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:03:53 +0000 Subject: [Git][debian-gis-team/grits][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923579835e9_46f63fbab65044882610fc@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / grits Commits: 40f69949 by Bas Couwenberg at 2019-09-30T17:03:31Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ libgrits (0.8.1-6) UNRELEASED; urgency=medium * Drop .gitignore file. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. 
* Add Build-Depends-Package field to symbols file. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), libsoup2.4-dev, mesa-common-dev, libglu1-mesa-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/grits Vcs-Git: https://salsa.debian.org/debian-gis-team/grits.git Homepage: http://pileus.org/grits View it on GitLab: https://salsa.debian.org/debian-gis-team/grits/commit/40f6994980c0ab740ea5497c8256bb959fef3c33 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/grits/commit/40f6994980c0ab740ea5497c8256bb959fef3c33 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:04:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:04:32 +0000 Subject: [Git][debian-gis-team/h5utils][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9235a01461f_46f62ac0ff19eb882612b1@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / h5utils Commits: e02b4c4d by Bas Couwenberg at 2019-09-30T17:04:11Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ h5utils (1.13.1-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sat, 04 Aug 2018 14:40:13 +0200 ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), libmatheval-dev, libpng-dev, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/h5utils Vcs-Git: https://salsa.debian.org/debian-gis-team/h5utils.git Homepage: https://github.com/stevengj/h5utils View it on GitLab: https://salsa.debian.org/debian-gis-team/h5utils/commit/e02b4c4dac547a172c85f3b8258046876a5d6095 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/h5utils/commit/e02b4c4dac547a172c85f3b8258046876a5d6095 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:04:50 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:04:50 +0000 Subject: [Git][debian-gis-team/hdf4][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9235b2276c9_46f62ac0f98dfce02614e3@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / hdf4 Commits: 63f9ae1f by Bas Couwenberg at 2019-09-30T17:04:42Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ libhdf4 (4.2.14-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Wed, 10 Jul 2019 18:04:29 +0200 ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9), libjpeg-dev, sharutils, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/hdf4 Vcs-Git: https://salsa.debian.org/debian-gis-team/hdf4.git Homepage: http://www.hdfgroup.com/ ===================================== debian/control.in ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9), libjpeg-dev, sharutils, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/hdf4 Vcs-Git: https://salsa.debian.org/debian-gis-team/hdf4.git Homepage: http://www.hdfgroup.com/ View it on GitLab: https://salsa.debian.org/debian-gis-team/hdf4/commit/63f9ae1f1c2ce485c5e263b4e26841391ed775b4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/hdf4/commit/63f9ae1f1c2ce485c5e263b4e26841391ed775b4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:05:56 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:05:56 +0000 Subject: [Git][debian-gis-team/hdf5][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9235f4269c1_46f62ac0f98dfce02616d2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / hdf5 Commits: 0c915bfc by Bas Couwenberg at 2019-09-30T17:05:49Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ hdf5 (1.10.5+repack-1~exp9) UNRELEASED; urgency=medium * Fix pkg-config file for libhdf5-openmpi-dev (closes: #934875) + * Bump Standards-Version to 4.4.1, no changes. -- Gilles Filippini Tue, 20 Aug 2019 14:11:28 +0200 ===================================== debian/control ===================================== @@ -18,7 +18,7 @@ Build-Depends: debhelper (>= 10~), javahelper [!hppa !hurd-i386] Build-Depends-Indep: doxygen, php-cli -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/hdf5 Vcs-Git: https://salsa.debian.org/debian-gis-team/hdf5.git Homepage: http://hdfgroup.org/HDF5/ ===================================== debian/control.in ===================================== @@ -18,7 +18,7 @@ Build-Depends: debhelper (>= 10~), javahelper [!hppa !hurd-i386] Build-Depends-Indep: doxygen, php-cli -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/hdf5 Vcs-Git: https://salsa.debian.org/debian-gis-team/hdf5.git Homepage: http://hdfgroup.org/HDF5/ View it on GitLab: https://salsa.debian.org/debian-gis-team/hdf5/commit/0c915bfcf3cadc6a2b3983a8141cf8588238be40 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/hdf5/commit/0c915bfcf3cadc6a2b3983a8141cf8588238be40 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:06:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:06:14 +0000 Subject: [Git][debian-gis-team/jmapviewer][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d923606b35ac_46f62ac0fbe48aa42619da@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / jmapviewer Commits: 4959bfa5 by Bas Couwenberg at 2019-09-30T17:06:06Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ jmapviewer (2.11+dfsg-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Wed, 10 Jul 2019 18:18:05 +0200 ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9~), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/jmapviewer Vcs-Git: https://salsa.debian.org/debian-gis-team/jmapviewer.git Homepage: https://wiki.openstreetmap.org/wiki/JMapViewer View it on GitLab: https://salsa.debian.org/debian-gis-team/jmapviewer/commit/4959bfa5d2d5c85743a927da01c008a1a4694bae -- View it on GitLab: https://salsa.debian.org/debian-gis-team/jmapviewer/commit/4959bfa5d2d5c85743a927da01c008a1a4694bae You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:06:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:06:35 +0000 Subject: [Git][debian-gis-team/josm][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92361b81738_46f62ac0f98dfce02621b2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / josm Commits: 1baf2ce9 by Bas Couwenberg at 2019-09-30T17:06:27Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +josm (0.0.svn15390+dfsg-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:06:26 +0200 + josm (0.0.svn15390+dfsg-1) unstable; urgency=medium * New tested snapshot. ===================================== debian/control ===================================== @@ -19,7 +19,7 @@ Build-Depends: debhelper (>= 9~), liboauth-signpost-java (>= 1.2), libterm-readkey-perl, openjfx -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/josm Vcs-Git: https://salsa.debian.org/debian-gis-team/josm.git Homepage: https://josm.openstreetmap.de View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/commit/1baf2ce97a397c94d417229bbbaf8de46d2854ec -- View it on GitLab: https://salsa.debian.org/debian-gis-team/josm/commit/1baf2ce97a397c94d417229bbbaf8de46d2854ec You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:07:01 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:07:01 +0000 Subject: [Git][debian-gis-team/jts][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d923635ac693_46f62ac0f98dfce02623ca@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / jts Commits: 0fb18a3b by Bas Couwenberg at 2019-09-30T17:06:52Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +jts (1.16.1+ds-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:06:50 +0200 + jts (1.16.1+ds-1) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -16,7 +16,7 @@ Build-Depends-Indep: default-jdk-doc, libmaven-javadoc-plugin-java, libmaven-source-plugin-java, junit -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/jts Vcs-Git: https://salsa.debian.org/debian-gis-team/jts.git Homepage: https://locationtech.github.io/jts/ View it on GitLab: https://salsa.debian.org/debian-gis-team/jts/commit/0fb18a3b63a56f624dd08898ed92bb4871693a1b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/jts/commit/0fb18a3b63a56f624dd08898ed92bb4871693a1b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:08:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:08:04 +0000 Subject: [Git][debian-gis-team/laszip][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9236742267f_46f62ac0fbe48aa42625ed@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / laszip Commits: fcb547f1 by Bas Couwenberg at 2019-09-30T17:07:53Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ laszip (3.4.1-3) UNRELEASED; urgency=medium * Update Homepage URL to use HTTPS. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Mon, 29 Jul 2019 06:56:51 +0200 ===================================== debian/control ===================================== @@ -6,7 +6,7 @@ Priority: optional Build-Depends: chrpath, cmake (>= 2.8.11), debhelper (>= 9) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/laszip Vcs-Git: https://salsa.debian.org/debian-gis-team/laszip.git Homepage: https://laszip.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/laszip/commit/fcb547f13c2c47462ea35266b21e1415ff4a19e5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/laszip/commit/fcb547f13c2c47462ea35266b21e1415ff4a19e5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:08:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:08:25 +0000 Subject: [Git][debian-gis-team/libapache2-mod-tile][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923689bacd7_46f62ac0fe4cd06c2627b0@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libapache2-mod-tile Commits: ba9d3b48 by Bas Couwenberg at 2019-09-30T17:08:18Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -3,7 +3,7 @@ libapache2-mod-tile (0.4+git20170108-e25bfdb-1) UNRELEASED; urgency=medium * New upstream git snapshot. * Update packaging to incorporate Debian GIS changes. * Update Vcs-* URLs for Salsa. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Update gbp.conf to use --source-only-changes by default. * Add patch by Boris Shtrasman to use libiniparser package. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), libiniparser-dev, libjs-openlayers, libmapnik-dev | libmapnik2-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libapache2-mod-tile Vcs-Git: https://salsa.debian.org/debian-gis-team/libapache2-mod-tile.git Homepage: https://wiki.openstreetmap.org/wiki/Mod_tile View it on GitLab: https://salsa.debian.org/debian-gis-team/libapache2-mod-tile/commit/ba9d3b48260086dcd26024604fe45c7cc5ca7a41 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libapache2-mod-tile/commit/ba9d3b48260086dcd26024604fe45c7cc5ca7a41 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:08:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:08:49 +0000 Subject: [Git][debian-gis-team/libcitygml][master] 2 commits: Update symbols for amd64. Message-ID: <5d9236a128eb9_46f62ac0ff19eb8826293e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libcitygml Commits: dabf31be by Bas Couwenberg at 2019-09-09T18:14:29Z Update symbols for amd64. - - - - - 3156957a by Bas Couwenberg at 2019-09-30T17:08:41Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/libcitygml2.symbols Changes: ===================================== debian/changelog ===================================== @@ -1,10 +1,11 @@ libcitygml (2.0.9-3) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Add Build-Depends-Package field to symbols file. * Add lintian overrides for file-references-package-build-path. * Update gbp.conf to use --source-only-changes by default. + * Update symbols for amd64. 
-- Bas Couwenberg Sat, 04 Aug 2018 14:44:04 +0200 ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9), libgl1-mesa-dev | libgl-dev, libglu-dev, pkg-kde-tools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libcitygml Vcs-Git: https://salsa.debian.org/debian-gis-team/libcitygml.git Homepage: https://github.com/jklimke/libcitygml ===================================== debian/libcitygml2.symbols ===================================== @@ -131,7 +131,9 @@ libcitygml.so.2 #PACKAGE# #MINVER# (optional=templinst)_ZN7citygml10parseValueIbEET_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERSt10shared_ptrINS_13CityGMLLoggerEERKNS_16DocumentLocationE at Base 2.0.9 _ZN7citygml10setCountryEPNS_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 2.0.4 _ZN7citygml11setLocalityEPNS_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 2.0.4 + (optional=templinst)_ZN7citygml12parseVecListI5TVec2IfEEESt6vectorIT_SaIS4_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERSt10shared_ptrINS_13CityGMLLoggerEERKNS_16DocumentLocationE at Base 2.0.9 (optional=templinst)_ZN7citygml12parseVecListI5TVec3IdEEESt6vectorIT_SaIS4_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERSt10shared_ptrINS_13CityGMLLoggerEERKNS_16DocumentLocationE at Base 1.4.3 + (optional=templinst)_ZN7citygml12parseVecListIfEESt6vectorIT_SaIS2_EERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEERSt10shared_ptrINS_13CityGMLLoggerEERKNS_16DocumentLocationE at Base 2.0.9 _ZN7citygml13AddressParser18parseElementEndTagERKNS_8NodeType7XMLNodeERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 1.4.3 _ZN7citygml13AddressParser20parseElementStartTagERKNS_8NodeType7XMLNodeERNS_10AttributesE at Base 1.4.3 _ZN7citygml13AddressParser23parseChildElementEndTagERKNS_8NodeType7XMLNodeERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 1.4.3 @@ -1082,14 +1084,11 @@ libcitygml.so.2 #PACKAGE# #MINVER# (optional=templinst)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeES2_SaIS2_ENSt8__detail9_IdentityESt8equal_toIS2_ESt4hashIS2_ENS4_18_Mod_range_hashingENS4_20_Default_ranged_hashENS4_20_Prime_rehash_policyENS4_17_Hashtable_traitsILb1ELb1ELb1EEEE5clearEv at Base 1.4.3 (optional=templinst)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeES2_SaIS2_ENSt8__detail9_IdentityESt8equal_toIS2_ESt4hashIS2_ENS4_18_Mod_range_hashingENS4_20_Default_ranged_hashENS4_20_Prime_rehash_policyENS4_17_Hashtable_traitsILb1ELb1ELb1EEEE9_M_insertIRKS2_NS4_17_ReuseOrAllocNodeISaINS4_10_Hash_nodeIS2_Lb1EEEEEEEESt4pairINS4_14_Node_iteratorIS2_Lb1ELb1EEEbEOT_RKT0_St17integral_constantIbLb1EEm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeES2_SaIS2_ENSt8__detail9_IdentityESt8equal_toIS2_ESt4hashIS2_ENS4_18_Mod_range_hashingENS4_20_Default_ranged_hashENS4_20_Prime_rehash_policyENS4_17_Hashtable_traitsILb1ELb1ELb1EEEE9_M_rehashEjRKj at Base 1.4.3 - (optional=templinst|arch=alpha amd64 arm64 kfreebsd-amd64 mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeES2_SaIS2_ENSt8__detail9_IdentityESt8equal_toIS2_ESt4hashIS2_ENS4_18_Mod_range_hashingENS4_20_Default_ranged_hashENS4_20_Prime_rehash_policyENS4_17_Hashtable_traitsILb1ELb1ELb1EEEE9_M_rehashEmRKm at Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !hppa !kfreebsd-amd64 !mips64el 
!ppc64 !ppc64el !s390x !sh4 !sparc64 !x32)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeESt4pairIKS2_St8functionIFvPNS0_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEEESaISI_ENSt8__detail10_Select1stESt8equal_toIS2_ESt4hashIS2_ENSK_18_Mod_range_hashingENSK_20_Default_ranged_hashENSK_20_Prime_rehash_policyENSK_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEjjPNSK_10_Hash_nodeISI_Lb1EEE at Base 1.4.3 (optional=templinst|arch=alpha mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeESt4pairIKS2_St8functionIFvPNS0_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEEESaISI_ENSt8__detail10_Select1stESt8equal_toIS2_ESt4hashIS2_ENSK_18_Mod_range_hashingENSK_20_Default_ranged_hashENSK_20_Prime_rehash_policyENSK_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSK_10_Hash_nodeISI_Lb1EEE at Base 2.0.4 (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeESt4pairIKS2_St8functionIFvPNS0_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEEESaISI_ENSt8__detail10_Select1stESt8equal_toIS2_ESt4hashIS2_ENSK_18_Mod_range_hashingENSK_20_Default_ranged_hashENSK_20_Prime_rehash_policyENSK_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 1.4.3 (optional=templinst|arch=alpha amd64 arm64 kfreebsd-amd64 mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableIN7citygml8NodeType7XMLNodeESt4pairIKS2_St8functionIFvPNS0_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEEESaISI_ENSt8__detail10_Select1stESt8equal_toIS2_ESt4hashIS2_ENSK_18_Mod_range_hashingENSK_20_Default_ranged_hashENSK_20_Prime_rehash_policyENSK_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEmRKm at Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES5_SaIS5_ENSt8__detail9_IdentityESt8equal_toIS5_ESt4hashIS5_ENS7_18_Mod_range_hashingENS7_20_Default_ranged_hashENS7_20_Prime_rehash_policyENS7_17_Hashtable_traitsILb1ELb1ELb1EEEE9_M_rehashEjRKj at Base 1.4.3 - (optional=templinst|arch=alpha amd64 arm64 kfreebsd-amd64 mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES5_SaIS5_ENSt8__detail9_IdentityESt8equal_toIS5_ESt4hashIS5_ENS7_18_Mod_range_hashingENS7_20_Default_ranged_hashENS7_20_Prime_rehash_policyENS7_17_Hashtable_traitsILb1ELb1ELb1EEEE9_M_rehashEmRKm at Base 1.4.3 - (optional=templinst|arch=amd64 arm64 kfreebsd-amd64 m68k mips64el ppc64el sparc64 x32|subst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N7citygml10CityObject15CityObjectsTypeEESaISB_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSD_18_Mod_range_hashingENSD_20_Default_ranged_hashENSD_20_Prime_rehash_policyENSD_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashE{size_t}RK{size_t}@Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N7citygml10CityObject15CityObjectsTypeEESaISB_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSD_18_Mod_range_hashingENSD_20_Default_ranged_hashENSD_20_Prime_rehash_policyENSD_17_Hashtable_traitsILb1ELb0ELb1EEEEC1IPKSB_EET_SS_jRKSI_RKSJ_RKSK_RKSG_RKSE_RKSC_ at Base 2.0.4 (optional=templinst|arch=alpha amd64 ppc64 s390x 
sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N7citygml10CityObject15CityObjectsTypeEESaISB_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSD_18_Mod_range_hashingENSD_20_Default_ranged_hashENSD_20_Prime_rehash_policyENSD_17_Hashtable_traitsILb1ELb0ELb1EEEEC1IPKSB_EET_SS_mRKSI_RKSJ_RKSK_RKSG_RKSE_RKSC_ at Base 2.0.4 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N7citygml10CityObject15CityObjectsTypeEESaISB_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSD_18_Mod_range_hashingENSD_20_Default_ranged_hashENSD_20_Prime_rehash_policyENSD_17_Hashtable_traitsILb1ELb0ELb1EEEEC2IPKSB_EET_SS_jRKSI_RKSJ_RKSK_RKSG_RKSE_RKSC_ at Base 2.0.4 @@ -1102,6 +1101,7 @@ libcitygml.so.2 #PACKAGE# #MINVER# (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_PN7citygml8NodeType7XMLNodeEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEEm at Base 2.0.9 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_PN7citygml8NodeType7XMLNodeEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE4findERS7_ at Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_PN7citygml8NodeType7XMLNodeEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 2.0.4 + (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_PN7citygml8NodeType7XMLNodeEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEmRKm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !hppa !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sh4 !sparc64 !x32)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml10AppearanceEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEjjPNSE_10_Hash_nodeISC_Lb1EEE at Base 1.4.3 (optional=templinst|arch=alpha mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml10AppearanceEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEE at Base 2.0.4 
(optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml10AppearanceEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE5clearEv at Base 1.4.3 @@ -1111,6 +1111,7 @@ libcitygml.so.2 #PACKAGE# #MINVER# (optional=templinst|arch=alpha mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml18TextureCoordinatesEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEE at Base 2.0.4 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml18TextureCoordinatesEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEEm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml18TextureCoordinatesEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 2.0.4 + (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml18TextureCoordinatesEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEmRKm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml23TextureTargetDefinitionEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEjjPNSE_10_Hash_nodeISC_Lb1EEE at Base 1.4.3 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml23TextureTargetDefinitionEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEEm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml23TextureTargetDefinitionEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 1.4.3 @@ -1128,15 +1129,18 @@ libcitygml.so.2 #PACKAGE# #MINVER# 
(optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml7PolygonEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEEm at Base 2.0.9 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml7PolygonEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE5clearEv at Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml7PolygonEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 2.0.4 + (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml7PolygonEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEmRKm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !hppa !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sh4 !sparc64 !x32)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml8GeometryEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEjjPNSE_10_Hash_nodeISC_Lb1EEE at Base 1.4.3 (optional=templinst|arch=alpha mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml8GeometryEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEE at Base 2.0.4 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml8GeometryEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSE_10_Hash_nodeISC_Lb1EEEm at Base 2.0.9 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml8GeometryEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE5clearEv at Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml8GeometryEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 2.0.4 + 
(optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St10shared_ptrIN7citygml8GeometryEEESaISC_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSE_18_Mod_range_hashingENSE_20_Default_ranged_hashENSE_20_Prime_rehash_policyENSE_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEmRKm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !hppa !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sh4 !sparc64 !x32)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEjjPNSF_10_Hash_nodeISD_Lb1EEE at Base 1.4.3 (optional=templinst|arch=alpha mips64el ppc64 ppc64el s390x sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSF_10_Hash_nodeISD_Lb1EEE at Base 2.0.4 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEE21_M_insert_unique_nodeEmmPNSF_10_Hash_nodeISD_Lb1EEEm at Base 2.0.9 (optional=templinst|arch=!alpha !amd64 !arm64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEjRKj at Base 2.0.4 + (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEE9_M_rehashEmRKm at Base 2.0.9 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEED1Ev at Base 1.4.3 (optional=templinst)_ZNSt10_HashtableINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_St6vectorI5TVec2IfESaISA_EEESaISD_ENSt8__detail10_Select1stESt8equal_toIS5_ESt4hashIS5_ENSF_18_Mod_range_hashingENSF_20_Default_ranged_hashENSF_20_Prime_rehash_policyENSF_17_Hashtable_traitsILb1ELb0ELb1EEEED2Ev at Base 1.4.3 (optional=templinst|arch=!alpha !amd64 !arm64 !hurd-i386 !i386 !kfreebsd-amd64 !kfreebsd-i386 !mips !mips64el !mipsel !powerpc !powerpcspe !ppc64 !ppc64el !s390x !sparc64)_ZNSt10_HashtableIPN7citygml10LineStringESt4pairIKS2_NSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEESaISB_ENSt8__detail10_Select1stESt8equal_toIS2_ESt4hashIS2_ENSD_18_Mod_range_hashingENSD_20_Default_ranged_hashENSD_20_Prime_rehash_policyENSD_17_Hashtable_traitsILb0ELb0ELb1EEEE21_M_insert_unique_nodeEjjPNSD_10_Hash_nodeISB_Lb0EEE at Base 2.0 @@ -1168,8 
+1172,6 @@ libcitygml.so.2 #PACKAGE# #MINVER# (optional=templinst|arch=!armel)_ZNSt12__shared_ptrIN7citygml13CityGMLLoggerELN9__gnu_cxx12_Lock_policyE2EEC2ERKS4_ at Base 1.4.3 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN7citygml18TextureCoordinatesELN9__gnu_cxx12_Lock_policyE1EEC1ISaIS1_EJNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESC_EEESt19_Sp_make_shared_tagRKT_DpOT0_ at Base 2.0.4 (optional=templinst|arch=armel)_ZNSt12__shared_ptrIN7citygml18TextureCoordinatesELN9__gnu_cxx12_Lock_policyE1EEC2ISaIS1_EJNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESC_EEESt19_Sp_make_shared_tagRKT_DpOT0_ at Base 2.0.4 - (optional=templinst|arch=!armel)_ZNSt12__shared_ptrIN7citygml18TextureCoordinatesELN9__gnu_cxx12_Lock_policyE2EEC1ISaIS1_EJNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESC_EEESt19_Sp_make_shared_tagRKT_DpOT0_ at Base 2.0.4 - (optional=templinst|arch=!armel)_ZNSt12__shared_ptrIN7citygml18TextureCoordinatesELN9__gnu_cxx12_Lock_policyE2EEC2ISaIS1_EJNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESC_EEESt19_Sp_make_shared_tagRKT_DpOT0_ at Base 2.0.4 (optional=templinst)_ZNSt13unordered_mapIN7citygml8NodeType7XMLNodeESt8functionIFvPNS0_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEESt4hashIS2_ESt8equal_toIS2_ESaISt4pairIKS2_SF_EEED1Ev at Base 1.4.3 (optional=templinst)_ZNSt13unordered_mapIN7citygml8NodeType7XMLNodeESt8functionIFvPNS0_7AddressERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEEESt4hashIS2_ESt8equal_toIS2_ESaISt4pairIKS2_SF_EEED2Ev at Base 1.4.3 (optional=templinst)_ZNSt13unordered_mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN7citygml10CityObject15CityObjectsTypeESt4hashIS5_ESt8equal_toIS5_ESaISt4pairIKS5_S8_EEED1Ev at Base 1.4.3 @@ -1192,9 +1194,11 @@ libcitygml.so.2 #PACKAGE# #MINVER# (optional=templinst|arch=armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE1EEC2IN7citygml22CityModelElementParserESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 (optional=templinst|arch=armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE1EEC2IN7citygml8GeometryESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 (optional=templinst|arch=armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE1EEC2IN7citygml9CityModelESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 + (optional=templinst)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC1IN7citygml18TextureCoordinatesESaIS5_EJNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESC_EEERPT_St20_Sp_alloc_shared_tagIT0_EDpOT1_ at Base 2.0.9 (optional=templinst|arch=!armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC1IN7citygml22CityModelElementParserESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 (optional=templinst|arch=!armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC1IN7citygml8GeometryESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 (optional=templinst|arch=!armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC1IN7citygml9CityModelESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 + (optional=templinst)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC2IN7citygml18TextureCoordinatesESaIS5_EJNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESC_EEERPT_St20_Sp_alloc_shared_tagIT0_EDpOT1_ at Base 2.0.9 (optional=templinst|arch=!armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC2IN7citygml22CityModelElementParserESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 
(optional=templinst|arch=!armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC2IN7citygml8GeometryESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 (optional=templinst|arch=!armel)_ZNSt14__shared_countILN9__gnu_cxx12_Lock_policyE2EEC2IN7citygml9CityModelESt14default_deleteIS5_EEEOSt10unique_ptrIT_T0_E at Base 2.0.4 @@ -1593,7 +1597,6 @@ libcitygml.so.2 #PACKAGE# #MINVER# (arch=!armel)_ZTISt19_Sp_counted_deleterIPN7citygml8GeometryESt14default_deleteIS1_ESaIvELN9__gnu_cxx12_Lock_policyE2EE at Base 2.0 (arch=armel)_ZTISt19_Sp_counted_deleterIPN7citygml9CityModelESt14default_deleteIS1_ESaIvELN9__gnu_cxx12_Lock_policyE1EE at Base 1.4.3 (arch=!armel)_ZTISt19_Sp_counted_deleterIPN7citygml9CityModelESt14default_deleteIS1_ESaIvELN9__gnu_cxx12_Lock_policyE2EE at Base 2.0 - _ZTISt19_Sp_make_shared_tag at Base 1.4.3 (arch=armel)_ZTISt23_Sp_counted_ptr_inplaceIN7citygml18TextureCoordinatesESaIS1_ELN9__gnu_cxx12_Lock_policyE1EE at Base 2.0 (arch=!armel)_ZTISt23_Sp_counted_ptr_inplaceIN7citygml18TextureCoordinatesESaIS1_ELN9__gnu_cxx12_Lock_policyE2EE at Base 2.0.4 (arch=armel)_ZTISt23_Sp_counted_ptr_inplaceIN7citygml6ObjectESaIS1_ELN9__gnu_cxx12_Lock_policyE1EE at Base 1.4.3 @@ -1805,6 +1808,7 @@ libcitygml.so.2 #PACKAGE# #MINVER# (arch=!armel)_ZTVSt23_Sp_counted_ptr_inplaceIN7citygml6ObjectESaIS1_ELN9__gnu_cxx12_Lock_policyE2EE at Base 2.0 (arch=armel)_ZTVSt23_Sp_counted_ptr_inplaceIN7citygml9StdLoggerESaIS1_ELN9__gnu_cxx12_Lock_policyE1EE at Base 1.4.3 (arch=!armel)_ZTVSt23_Sp_counted_ptr_inplaceIN7citygml9StdLoggerESaIS1_ELN9__gnu_cxx12_Lock_policyE2EE at Base 2.0 + _ZZNSt19_Sp_make_shared_tag5_S_tiEvE5__tag at Base 2.0.9 _ZanN7citygml10CityObject15CityObjectsTypeES1_ at Base 2.0.4 _ZcoN7citygml10CityObject15CityObjectsTypeE at Base 2.0.4 _ZeoN7citygml10CityObject15CityObjectsTypeES1_ at Base 2.0.4 View it on GitLab: https://salsa.debian.org/debian-gis-team/libcitygml/compare/92836217d6f42a089c6eebbfc645392fb142fa31...3156957ae6a4d0d6b593e646d405c533e979394e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libcitygml/compare/92836217d6f42a089c6eebbfc645392fb142fa31...3156957ae6a4d0d6b593e646d405c533e979394e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:09:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:09:16 +0000 Subject: [Git][debian-gis-team/libepsilon][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9236bc6b343_46f63fbab65044882631af@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libepsilon Commits: 909bbb63 by Bas Couwenberg at 2019-09-30T17:09:09Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ libepsilon (0.9.2+dfsg-5) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add Build-Depends-Package field to symbols file. * Update gbp.conf to use --source-only-changes by default. 
===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libepsilon Vcs-Git: https://salsa.debian.org/debian-gis-team/libepsilon.git Homepage: https://sourceforge.net/projects/epsilon-project View it on GitLab: https://salsa.debian.org/debian-gis-team/libepsilon/commit/909bbb63a9c9d30304b2f68a49ec6d80c5c5f238 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libepsilon/commit/909bbb63a9c9d30304b2f68a49ec6d80c5c5f238 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:10:44 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:10:44 +0000 Subject: [Git][debian-gis-team/libgeo-shapelib-perl][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9237143b5f0_46f63fbab6504488263330@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libgeo-shapelib-perl Commits: 3b14ac47 by Bas Couwenberg at 2019-09-30T17:10:27Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libgeo-shapelib-perl (0.22-5) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:10:25 +0200 + libgeo-shapelib-perl (0.22-4) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), libshp-dev, libtree-r-perl, perl -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libgeo-shapelib-perl Vcs-Git: https://salsa.debian.org/debian-gis-team/libgeo-shapelib-perl.git Homepage: https://github.com/ajolma/Geo-Shapelib View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeo-shapelib-perl/commit/3b14ac47eee1bea6cb9fdc88382e3b3cca652ddf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeo-shapelib-perl/commit/3b14ac47eee1bea6cb9fdc88382e3b3cca652ddf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:11:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:11:38 +0000 Subject: [Git][debian-gis-team/libgeotiff][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92374a585f8_46f62ac0f98dfce02635f0@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libgeotiff Commits: 258ad738 by Bas Couwenberg at 2019-09-30T17:11:22Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libgeotiff (1.5.1-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:11:15 +0200 + libgeotiff (1.5.1-2) unstable; urgency=medium * Add patch to fix FTBFS with PROJ 6.2.0.
===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9.20160114), libproj-dev (>= 6.0.0), libtiff-dev, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libgeotiff Vcs-Git: https://salsa.debian.org/debian-gis-team/libgeotiff.git Homepage: https://geotiff.osgeo.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeotiff/commit/258ad738d33d723924dc2a2d7ca736a065e45a47 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libgeotiff/commit/258ad738d33d723924dc2a2d7ca736a065e45a47 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:12:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:12:16 +0000 Subject: [Git][debian-gis-team/libkml][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9237703ec2a_46f62ac0fbe48aa4263736@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libkml Commits: aef47d46 by Bas Couwenberg at 2019-09-30T17:12:08Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libkml (1.3.0-9) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:12:00 +0200 + libkml (1.3.0-8) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper (>= 9), liburiparser-dev (>= 0.7.1), pkg-kde-tools, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libkml Vcs-Git: https://salsa.debian.org/debian-gis-team/libkml.git Homepage: https://github.com/libkml/libkml View it on GitLab: https://salsa.debian.org/debian-gis-team/libkml/commit/aef47d4693bd486f48f7ce59337fb961f99365eb -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libkml/commit/aef47d4693bd486f48f7ce59337fb961f99365eb You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:12:37 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:12:37 +0000 Subject: [Git][debian-gis-team/libosmium][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92378580b20_46f62ac0ff19eb88263972@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libosmium Commits: c53f857b by Bas Couwenberg at 2019-09-30T17:12:29Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +libosmium (2.15.3-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes.
+ + -- Bas Couwenberg Mon, 30 Sep 2019 19:12:28 +0200 + libosmium (2.15.3-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9), libprotozero-dev (>= 1.6.3), libsparsehash-dev, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libosmium/ Vcs-Git: https://salsa.debian.org/debian-gis-team/libosmium.git Homepage: https://osmcode.org/libosmium/ View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/commit/c53f857b1929df654d72d3590460911c14521281 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libosmium/commit/c53f857b1929df654d72d3590460911c14521281 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:20:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:20:34 +0000 Subject: [Git][debian-gis-team/librasterlite2][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9239622cbff_46f62ac0ff19eb88264314@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / librasterlite2 Commits: baa02ca0 by Bas Couwenberg at 2019-09-30T17:20:26Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +librasterlite2 (1.1.0~beta0+really1.1.0~beta0-1~exp4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:20:24 +0200 + librasterlite2 (1.1.0~beta0+really1.1.0~beta0-1~exp3) experimental; urgency=medium * Remove package name from lintian overrides. ===================================== debian/control ===================================== @@ -26,7 +26,7 @@ Build-Depends: debhelper (>= 9.20160114), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/librasterlite2 Vcs-Git: https://salsa.debian.org/debian-gis-team/librasterlite2.git -b experimental Homepage: https://www.gaia-gis.it/fossil/librasterlite2/ View it on GitLab: https://salsa.debian.org/debian-gis-team/librasterlite2/commit/baa02ca0b155bca51a961d5a185e22e793de5ec0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/librasterlite2/commit/baa02ca0b155bca51a961d5a185e22e793de5ec0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:20:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:20:49 +0000 Subject: [Git][debian-gis-team/librewms][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923971ab412_46f62ac0ff19eb88264599@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / librewms Commits: 40b596ea by Bas Couwenberg at 2019-09-30T17:20:42Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), libsqlite3-dev, libwxgtk3.0-dev, wx-common -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/librewms/ Vcs-Git: https://salsa.debian.org/debian-gis-team/librewms.git Homepage: https://www.gaia-gis.it/fossil/librewms/ View it on GitLab: https://salsa.debian.org/debian-gis-team/librewms/commit/40b596eaa3ddc94bb5bf624e3f0d1f077ab48786 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/librewms/commit/40b596eaa3ddc94bb5bf624e3f0d1f077ab48786 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:21:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:21:09 +0000 Subject: [Git][debian-gis-team/librttopo][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92398595d66_46f63fbab65044882647d7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / librttopo Commits: 05002fc3 by Bas Couwenberg at 2019-09-30T17:21:01Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +librttopo (1.1.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:20:59 +0200 + librttopo (1.1.0-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 9), libjson-c-dev, libpcre3-dev, libxml2-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/librttopo/ Vcs-Git: https://salsa.debian.org/debian-gis-team/librttopo.git Homepage: https://git.osgeo.org/gitea/rttopo/librttopo View it on GitLab: https://salsa.debian.org/debian-gis-team/librttopo/commit/05002fc3d27d17c592905503b4eb09d516fb89bf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/librttopo/commit/05002fc3d27d17c592905503b4eb09d516fb89bf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:21:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:21:30 +0000 Subject: [Git][debian-gis-team/libspatialindex][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92399a679f2_46f62ac0f98dfce0264986@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / libspatialindex Commits: b75254db by Bas Couwenberg at 2019-09-30T17:21:20Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ spatialindex (1.9.0-2) UNRELEASED; urgency=medium * Update gbp.conf to use --source-only-changes by default. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Sun, 07 Jul 2019 08:32:59 +0200 ===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Build-Depends: debhelper (>= 9.20160114), autotools-dev, dh-autoreconf, pkg-kde-tools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/libspatialindex Vcs-Git: https://salsa.debian.org/debian-gis-team/libspatialindex.git Homepage: https://libspatialindex.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/libspatialindex/commit/b75254db20fb0258dee6ab72b6eba104551f72a0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/libspatialindex/commit/b75254db20fb0258dee6ab72b6eba104551f72a0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:21:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:21:53 +0000 Subject: [Git][debian-gis-team/mapbox-geometry][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9239b111135_46f62ac0f98dfce02653e7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapbox-geometry Commits: 7f3d0680 by Bas Couwenberg at 2019-09-30T17:21:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ mapbox-geometry (1.0.0-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Tue, 25 Dec 2018 22:33:34 +0100 ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), catch, cmake (>= 3.8), libmapbox-variant-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapbox-geometry/ Vcs-Git: https://salsa.debian.org/debian-gis-team/mapbox-geometry.git Homepage: https://github.com/mapbox/geometry.hpp View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-geometry/commit/7f3d06806c28303865ba46861f4abfff19548ca5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-geometry/commit/7f3d06806c28303865ba46861f4abfff19548ca5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:22:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:22:27 +0000 Subject: [Git][debian-gis-team/mapbox-variant][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9239d3752aa_46f63fbab65044882655af@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapbox-variant Commits: 130d0d70 by Bas Couwenberg at 2019-09-30T17:22:16Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ mapbox-variant (1.1.6-2) UNRELEASED; urgency=medium [ Bas Couwenberg ] - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. [ Helmut Grohne ] * Mark libmapbox-variant-dev Multi-Arch: foreign. 
(Closes: #940618) ===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Build-Depends: debhelper (>= 9), libboost-system-dev, libboost-timer-dev, libboost-chrono-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapbox-variant/ Vcs-Git: https://salsa.debian.org/debian-gis-team/mapbox-variant.git Homepage: https://github.com/mapbox/variant View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-variant/commit/130d0d703c24a112d1706c274003fb0fe56a88b6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-variant/commit/130d0d703c24a112d1706c274003fb0fe56a88b6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:23:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:23:02 +0000 Subject: [Git][debian-gis-team/mapbox-wagyu][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9239f61355c_46f62ac0fbe48aa4265725@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapbox-wagyu Commits: c29fbfee by Bas Couwenberg at 2019-09-30T17:22:54Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ mapbox-wagyu (0.4.3-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), libboost-dev, libmapbox-geometry-dev, rapidjson-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapbox-wagyu/ Vcs-Git: https://salsa.debian.org/debian-gis-team/mapbox-wagyu.git Homepage: https://github.com/mapbox/wagyu View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-wagyu/commit/c29fbfeeb99c046feb62b32803012eb7def1a59f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapbox-wagyu/commit/c29fbfeeb99c046feb62b32803012eb7def1a59f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:23:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:23:23 +0000 Subject: [Git][debian-gis-team/mapcache][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a0bb43f2_46f62ac0fbe48aa4265914@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapcache Commits: 7e838ebb by Bas Couwenberg at 2019-09-30T17:23:16Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mapcache (1.8.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:23:12 +0200 + mapcache (1.8.0-1) unstable; urgency=medium * New upstream release. 
===================================== debian/control ===================================== @@ -35,7 +35,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapcache Vcs-Git: https://salsa.debian.org/debian-gis-team/mapcache.git Homepage: http://www.mapserver.org/mapcache/ View it on GitLab: https://salsa.debian.org/debian-gis-team/mapcache/commit/7e838ebb3616c2ebcef4de3b14b94a3a6ff7c9df -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapcache/commit/7e838ebb3616c2ebcef4de3b14b94a3a6ff7c9df You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:23:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:23:45 +0000 Subject: [Git][debian-gis-team/mapcode][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a21e01ea_46f62ac0ff19eb882661c3@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapcode Commits: f59233b8 by Bas Couwenberg at 2019-09-30T17:23:37Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ mapcode (2.5.5-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -5,7 +5,7 @@ Section: misc Priority: optional Build-Depends: debhelper (>= 9), cmake (>= 3.3) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapcode/ Vcs-Git: https://salsa.debian.org/debian-gis-team/mapcode.git Homepage: http://www.mapcode.com/ View it on GitLab: https://salsa.debian.org/debian-gis-team/mapcode/commit/f59233b8a5a5153af0776118dbc233ea22163089 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapcode/commit/f59233b8a5a5153af0776118dbc233ea22163089 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:24:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:24:15 +0000 Subject: [Git][debian-gis-team/mapnik][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a3fd0895_46f62ac0f98dfce0266327@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapnik Commits: 245a835b by Bas Couwenberg at 2019-09-30T17:24:06Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mapnik (3.0.22+ds1-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:24:02 +0200 + mapnik (3.0.22+ds1-1) unstable; urgency=medium * New repacked upstream release. 
===================================== debian/control ===================================== @@ -31,7 +31,7 @@ Build-Depends: debhelper (>= 9~), pkg-config, python3, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapnik Vcs-Git: https://salsa.debian.org/debian-gis-team/mapnik.git Homepage: http://www.mapnik.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/mapnik/commit/245a835bf4cc31623f3eefd103ade5d6ff18812e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapnik/commit/245a835bf4cc31623f3eefd103ade5d6ff18812e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:24:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:24:38 +0000 Subject: [Git][debian-gis-team/mapnik-vector-tile][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a56bf96a_46f62ac0fe4cd06c266545@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapnik-vector-tile Commits: 2fb8c765 by Bas Couwenberg at 2019-09-30T17:24:30Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mapnik-vector-tile (1.6.1+dfsg-9) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:24:29 +0200 + mapnik-vector-tile (1.6.1+dfsg-8) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper (>= 9), protobuf-compiler, python3-all, python3-gdal -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapnik-vector-tile Vcs-Git: https://salsa.debian.org/debian-gis-team/mapnik-vector-tile.git Homepage: https://github.com/mapbox/mapnik-vector-tile View it on GitLab: https://salsa.debian.org/debian-gis-team/mapnik-vector-tile/commit/2fb8c765a0312f18cbb4318f7b32290867c55f33 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapnik-vector-tile/commit/2fb8c765a0312f18cbb4318f7b32290867c55f33 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:25:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:25:09 +0000 Subject: [Git][debian-gis-team/mapproxy][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a7554bdd_46f62ac0fe4cd06c26676f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapproxy Commits: ea8ef3e0 by Bas Couwenberg at 2019-09-30T17:25:00Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ mapproxy (1.12.0-2) UNRELEASED; urgency=medium * Update override for embedded-javascript-library. + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Mon, 09 Sep 2019 20:54:10 +0200 ===================================== debian/control ===================================== @@ -23,7 +23,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapproxy Vcs-Git: https://salsa.debian.org/debian-gis-team/mapproxy.git Homepage: http://mapproxy.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/commit/ea8ef3e0abe75ef9da994452a10b39b7d46e3548 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapproxy/commit/ea8ef3e0abe75ef9da994452a10b39b7d46e3548 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:25:31 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:25:31 +0000 Subject: [Git][debian-gis-team/mapserver][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a8b63771_46f62ac0ff19eb88266958@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mapserver Commits: e0a9f447 by Bas Couwenberg at 2019-09-30T17:25:21Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mapserver (7.4.2-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:25:20 +0200 + mapserver (7.4.2-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -45,7 +45,7 @@ Build-Depends: debhelper (>= 9.20160114), docbook-xml, xsltproc Build-Conflicts: libcurl3-openssl-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mapserver Vcs-Git: https://salsa.debian.org/debian-gis-team/mapserver.git Homepage: http://www.mapserver.org View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/commit/e0a9f44759c9114a341271179549a38ffd050a22 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mapserver/commit/e0a9f44759c9114a341271179549a38ffd050a22 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:25:51 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:25:51 +0000 Subject: [Git][debian-gis-team/mgrs][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923a9f65a50_46f62ac0f98dfce026719f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mgrs Commits: dd095b89 by Bas Couwenberg at 2019-09-30T17:25:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ mgrs (1.0.0-1~exp2) UNRELEASED; urgency=medium * Update Vcs-* URLs for Salsa. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Strip trailing whitespace from rules file. * Update watch file to limit matches to archive path. * Remove package name from lintian overrides. 
===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), dh-buildinfo, nodejs, node-uglify -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mgrs/ Vcs-Git: https://salsa.debian.org/debian-gis-team/mgrs.git Homepage: https://github.com/proj4js/mgrs View it on GitLab: https://salsa.debian.org/debian-gis-team/mgrs/commit/dd095b891760914798a0b478d97d3b27db5b87d8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mgrs/commit/dd095b891760914798a0b478d97d3b27db5b87d8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:26:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:26:09 +0000 Subject: [Git][debian-gis-team/mkgmap][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ab11fe_46f63fbab6504488267362@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mkgmap Commits: 1d7141a8 by Bas Couwenberg at 2019-09-30T17:26:01Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +mkgmap (0.0.0+svn4289-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:26:00 +0200 + mkgmap (0.0.0+svn4289-1) unstable; urgency=medium * New upstream SVN snapshot. ===================================== debian/control ===================================== @@ -14,7 +14,7 @@ Build-Depends: ant, libprotobuf-java, libosmpbf-java, time -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mkgmap Vcs-Git: https://salsa.debian.org/debian-gis-team/mkgmap.git Homepage: http://www.mkgmap.org.uk View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/commit/1d7141a8e4aa1b8d2ce80403c3c29e29e83654e9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap/commit/1d7141a8e4aa1b8d2ce80403c3c29e29e83654e9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:26:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:26:23 +0000 Subject: [Git][debian-gis-team/mkgmapgui][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923abf85d1a_46f62ac0ff19eb88267541@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mkgmapgui Commits: d1eb21d5 by Bas Couwenberg at 2019-09-30T17:26:16Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ mkgmapgui (1.1.ds-11) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. 
-- Bas Couwenberg Tue, 25 Dec 2018 22:39:43 +0100 ===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Priority: optional Build-Depends: debhelper (>= 9), javahelper Build-Depends-Indep: default-jdk -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mkgmapgui Vcs-Git: https://salsa.debian.org/debian-gis-team/mkgmapgui.git Homepage: http://activityworkshop.net/software/mkgmapgui View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmapgui/commit/d1eb21d5d3fd029055ddc422fb45eff2779df317 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmapgui/commit/d1eb21d5d3fd029055ddc422fb45eff2779df317 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:26:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:26:38 +0000 Subject: [Git][debian-gis-team/mkgmap-splitter][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923aceccd42_46f63fbab650448826777c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / mkgmap-splitter Commits: dd4f120f by Bas Couwenberg at 2019-09-30T17:26:31Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ mkgmap-splitter (0.0.0+svn592-2) UNRELEASED; urgency=medium * Update gbp.conf to use --source-only-changes by default. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Sun, 07 Jul 2019 08:44:50 +0200 ===================================== debian/control ===================================== @@ -20,7 +20,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/mkgmap-splitter Vcs-Git: https://salsa.debian.org/debian-gis-team/mkgmap-splitter.git Homepage: http://www.mkgmap.org.uk/doc/splitter.html View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap-splitter/commit/dd4f120f0a48edfa72bbd262b98733a00dbab364 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/mkgmap-splitter/commit/dd4f120f0a48edfa72bbd262b98733a00dbab364 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:27:26 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:27:26 +0000 Subject: [Git][debian-gis-team/narray-miss][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923afe4a76e_46f62ac0ff19eb882679d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / narray-miss Commits: e378d1d0 by Bas Couwenberg at 2019-09-30T17:27:18Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ ruby-narray-miss (1.4.0-3) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add gbp.conf to use pristine-tar & --source-only-changes by default. 
-- Bas Couwenberg Sun, 05 Aug 2018 20:31:58 +0200 ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9.0~), gem2deb (>= 0.3.0~), ruby-narray, rdtool -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/narray-miss Vcs-Git: https://salsa.debian.org/debian-gis-team/narray-miss.git Homepage: http://ruby.gfd-dennou.org/products/narray_miss/index.html View it on GitLab: https://salsa.debian.org/debian-gis-team/narray-miss/commit/e378d1d0da84a8878052294e91633b512a321a2d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/narray-miss/commit/e378d1d0da84a8878052294e91633b512a321a2d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:27:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:27:48 +0000 Subject: [Git][debian-gis-team/nco][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b1474efc_46f62ac0ff19eb8826814d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / nco Commits: ffe80e8f by Bas Couwenberg at 2019-09-30T17:27:38Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ nco (4.8.1-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Wed, 10 Jul 2019 18:33:34 +0200 ===================================== debian/control ===================================== @@ -18,7 +18,7 @@ Build-Depends: debhelper (>= 9), libudunits2-dev, libdap-dev, texinfo -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/nco Vcs-Git: https://salsa.debian.org/debian-gis-team/nco.git Homepage: http://nco.sourceforge.net/ View it on GitLab: https://salsa.debian.org/debian-gis-team/nco/commit/ffe80e8fdb04b5164bdd34ff10fb3c8c825234f5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/nco/commit/ffe80e8fdb04b5164bdd34ff10fb3c8c825234f5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:28:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:28:06 +0000 Subject: [Git][debian-gis-team/ncview][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b269dad7_46f63fbab6504488268354@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ncview Commits: 2055319c by Bas Couwenberg at 2019-09-30T17:27:59Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ ncview (2.1.8+ds-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. 
-- Bas Couwenberg Sun, 05 Aug 2018 20:32:30 +0200 ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9), libudunits2-dev, libxaw7-dev, netcdf-bin -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/ncview Vcs-Git: https://salsa.debian.org/debian-gis-team/ncview.git Homepage: http://meteora.ucsd.edu/~pierce/ncview_home_page.html View it on GitLab: https://salsa.debian.org/debian-gis-team/ncview/commit/2055319c7c4875a12345a54d6fa697b965aed186 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ncview/commit/2055319c7c4875a12345a54d6fa697b965aed186 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:28:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:28:25 +0000 Subject: [Git][debian-gis-team/netcdf4-python][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b39c9add_46f62ac0ff19eb88268827@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / netcdf4-python Commits: e221a69e by Bas Couwenberg at 2019-09-30T17:28:17Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +netcdf4-python (1.5.2-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:28:16 +0200 + netcdf4-python (1.5.2-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9), libnetcdf-dev (>= 1:4.4.0), netcdf-bin, chrpath -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/netcdf4-python/ Vcs-Git: https://salsa.debian.org/debian-gis-team/netcdf4-python.git Homepage: http://unidata.github.io/netcdf4-python/ View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/commit/e221a69e98e47afd8af715a929ea448eafb9f6a8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf4-python/commit/e221a69e98e47afd8af715a929ea448eafb9f6a8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:28:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:28:46 +0000 Subject: [Git][debian-gis-team/netcdf-cxx][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b4e5f8b4_46f62ac0f98dfce02690fe@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / netcdf-cxx Commits: 3f024db4 by Bas Couwenberg at 2019-09-30T17:28:37Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ netcdf-cxx (4.3.1-1) UNRELEASED; urgency=medium * New upstream release. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Add Build-Depends-Package field to symbols file. 
* Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -14,7 +14,7 @@ Build-Depends: debhelper (>= 9.20160114), libnetcdf-dev (>= 1:4.6.0), pkg-config, pkg-kde-tools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/netcdf-cxx Vcs-Git: https://salsa.debian.org/debian-gis-team/netcdf-cxx.git -b experimental Homepage: http://www.unidata.ucar.edu/software/netcdf/ View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/commit/3f024db4aa1f4981dab0a2bf1ed47734edb6f967 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx/commit/3f024db4aa1f4981dab0a2bf1ed47734edb6f967 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:29:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:29:04 +0000 Subject: [Git][debian-gis-team/netcdf-cxx-legacy][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b605c0d5_46f62ac0fbe48aa4269289@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / netcdf-cxx-legacy Commits: fe47195d by Bas Couwenberg at 2019-09-30T17:28:56Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ netcdf-cxx-legacy (4.2-12) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add Build-Depends-Package field to symbols file. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9.20160114), libnetcdf-dev (>= 1:4.4.0), pkg-kde-tools, texinfo -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/netcdf-cxx-legacy Vcs-Git: https://salsa.debian.org/debian-gis-team/netcdf-cxx-legacy.git Homepage: http://www.unidata.ucar.edu/software/netcdf/ View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx-legacy/commit/fe47195d6501bc7ef3c7cbd4d81a8c9646e802c7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-cxx-legacy/commit/fe47195d6501bc7ef3c7cbd4d81a8c9646e802c7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:29:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:29:30 +0000 Subject: [Git][debian-gis-team/netcdf-fortran][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b7ab1fa5_46f62ac0fe4cd06c2694c1@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / netcdf-fortran Commits: 4993d041 by Bas Couwenberg at 2019-09-30T17:29:18Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +netcdf-fortran (4.5.2+ds-1~exp2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. 
+ + -- Bas Couwenberg Mon, 30 Sep 2019 19:29:17 +0200 + netcdf-fortran (4.5.2+ds-1~exp1) experimental; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper (>= 9.20160114), graphviz, libnetcdf-dev (>= 1:4.6.2), pkg-config -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/netcdf-fortran Vcs-Git: https://salsa.debian.org/debian-gis-team/netcdf-fortran.git -b experimental Homepage: http://www.unidata.ucar.edu/software/netcdf/ View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/4993d041fa372400f6e110174dc200726e50c64a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf-fortran/commit/4993d041fa372400f6e110174dc200726e50c64a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:29:51 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:29:51 +0000 Subject: [Git][debian-gis-team/netcdf][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923b8f7e05f_46f62ac0fe4cd06c26968c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / netcdf Commits: 586962da by Bas Couwenberg at 2019-09-30T17:29:43Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +netcdf (1:4.7.1-1~exp2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:29:41 +0200 + netcdf (1:4.7.1-1~exp1) experimental; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper (>= 9.20160114), graphviz, libhdf5-dev (>= 1.8.6-1~), libcurl4-gnutls-dev | libcurl-ssl-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/netcdf Vcs-Git: https://salsa.debian.org/debian-gis-team/netcdf.git Homepage: http://www.unidata.ucar.edu/software/netcdf/ View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf/commit/586962da474876dfabfaddffc56646b3a2104258 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/netcdf/commit/586962da474876dfabfaddffc56646b3a2104258 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:30:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:30:11 +0000 Subject: [Git][debian-gis-team/nik4][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ba370b09_46f62ac0f98dfce0270075@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / nik4 Commits: 9b558b71 by Bas Couwenberg at 2019-09-30T17:30:03Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +nik4 (1.6-7) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. 
+ + -- Bas Couwenberg Mon, 30 Sep 2019 19:30:02 +0200 + nik4 (1.6-6) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/nik4/ Vcs-Git: https://salsa.debian.org/debian-gis-team/nik4.git Homepage: https://github.com/Zverik/Nik4 View it on GitLab: https://salsa.debian.org/debian-gis-team/nik4/commit/9b558b71f15f446e6a0c76b278609b05d020bf5a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/nik4/commit/9b558b71f15f446e6a0c76b278609b05d020bf5a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:30:33 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:30:33 +0000 Subject: [Git][debian-gis-team/node-kosmtik][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923bb9335e2_46f62ac0f98dfce02704ab@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / node-kosmtik Commits: 296feecd by Bas Couwenberg at 2019-09-30T17:30:25Z Bump Standards-Version to 4.4.1, no changes. - - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -16,7 +16,7 @@ Build-Depends: debhelper (>= 9), node-generic-pool (>= 2.2.0), node-lodash, mapnik-reference -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Homepage: https://github.com/kosmtik/kosmtik#readme Vcs-Browser: https://salsa.debian.org/debian-gis-team/node-kosmtik Vcs-Git: https://salsa.debian.org/debian-gis-team/node-kosmtik.git View it on GitLab: https://salsa.debian.org/debian-gis-team/node-kosmtik/commit/296feecd3436f29a71159c3de32f6ebfc5418c24 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/node-kosmtik/commit/296feecd3436f29a71159c3de32f6ebfc5418c24 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:30:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:30:53 +0000 Subject: [Git][debian-gis-team/node-leaflet-formbuilder][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923bcdd9ef4_46f63fbab6504488270647@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / node-leaflet-formbuilder Commits: 42f1db2f by Bas Couwenberg at 2019-09-30T17:30:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ node-leaflet-formbuilder (0.2.1-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. 
-- Bas Couwenberg Tue, 28 Aug 2018 14:18:44 +0200 ===================================== debian/control ===================================== @@ -6,7 +6,7 @@ Uploaders: Ross Gammon Build-Depends: debhelper (>= 9), dh-buildinfo, nodejs -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Homepage: https://github.com/yohanboniface/Leaflet.FormBuilder#readme Vcs-Git: https://salsa.debian.org/debian-gis-team/node-leaflet-formbuilder.git Vcs-Browser: https://salsa.debian.org/debian-gis-team/node-leaflet-formbuilder View it on GitLab: https://salsa.debian.org/debian-gis-team/node-leaflet-formbuilder/commit/42f1db2fc394c19486f2edcd1531629fb70916a0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/node-leaflet-formbuilder/commit/42f1db2fc394c19486f2edcd1531629fb70916a0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:31:15 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:31:15 +0000 Subject: [Git][debian-gis-team/node-leaflet-hash][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923be3d92c0_46f62ac0f98dfce02708ae@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / node-leaflet-hash Commits: 7ad293b8 by Bas Couwenberg at 2019-09-30T17:31:08Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ node-leaflet-hash (0.2.1-3) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sun, 05 Aug 2018 20:35:21 +0200 ===================================== debian/control ===================================== @@ -6,7 +6,7 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-buildinfo, nodejs -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/node-leaflet-hash Vcs-Git: https://salsa.debian.org/debian-gis-team/node-leaflet-hash.git Homepage: https://github.com/mlevans/leaflet-hash View it on GitLab: https://salsa.debian.org/debian-gis-team/node-leaflet-hash/commit/7ad293b8ea6e82c0d247735fdaa622c4bcf2c82c -- View it on GitLab: https://salsa.debian.org/debian-gis-team/node-leaflet-hash/commit/7ad293b8ea6e82c0d247735fdaa622c4bcf2c82c You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:31:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:31:32 +0000 Subject: [Git][debian-gis-team/node-osmium][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923bf477731_46f62ac0f98dfce0271041@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / node-osmium Commits: d78e770c by Bas Couwenberg at 2019-09-30T17:31:24Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -4,7 +4,7 @@ Uploaders: Bas Couwenberg Section: science Priority: optional Build-Depends: debhelper (>= 9) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/node-osmium/ Vcs-Git: https://salsa.debian.org/debian-gis-team/node-osmium.git Homepage: https://osmcode.org/node-osmium/ View it on GitLab: https://salsa.debian.org/debian-gis-team/node-osmium/commit/d78e770c6b34158736fdbcfe67dc0c8667bac398 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/node-osmium/commit/d78e770c6b34158736fdbcfe67dc0c8667bac398 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:31:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:31:49 +0000 Subject: [Git][debian-gis-team/node-quickselect][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923c05f380_46f62ac0f98dfce02712eb@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / node-quickselect Commits: 059bea80 by Bas Couwenberg at 2019-09-30T17:31:41Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ node-quickselect (1.0.1-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Tue, 28 Aug 2018 14:19:34 +0200 ===================================== debian/control ===================================== @@ -6,7 +6,7 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-buildinfo, nodejs -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/node-quickselect Vcs-Git: https://salsa.debian.org/debian-gis-team/node-quickselect.git Homepage: https://github.com/mourner/quickselect View it on GitLab: https://salsa.debian.org/debian-gis-team/node-quickselect/commit/059bea803a938ccd1583dbc69f99935669aff601 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/node-quickselect/commit/059bea803a938ccd1583dbc69f99935669aff601 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:32:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:32:25 +0000 Subject: [Git][debian-gis-team/node-rbush][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923c297b509_46f62ac0f98dfce02719ac@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / node-rbush Commits: cef8f2be by Bas Couwenberg at 2019-09-30T17:32:09Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ node-rbush (2.0.2-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. 
-- Bas Couwenberg Tue, 28 Aug 2018 14:19:53 +0200 ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-buildinfo, nodejs -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/node-rbush Vcs-Git: https://salsa.debian.org/debian-gis-team/node-rbush.git Homepage: https://github.com/mourner/rbush View it on GitLab: https://salsa.debian.org/debian-gis-team/node-rbush/commit/cef8f2bedb292e512b4e2ef08867a639a581253b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/node-rbush/commit/cef8f2bedb292e512b4e2ef08867a639a581253b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:33:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:33:20 +0000 Subject: [Git][debian-gis-team/ogdi-dfsg][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923c6039521_46f62ac0fe4cd06c27235c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ogdi-dfsg Commits: 089022ed by Bas Couwenberg at 2019-09-30T17:33:07Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ ogdi-dfsg (4.1.0+ds-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Wed, 10 Jul 2019 18:38:59 +0200 ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9), pkg-config, tcl-dev (>= 8.4), zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/ogdi-dfsg Vcs-Git: https://salsa.debian.org/debian-gis-team/ogdi-dfsg.git Homepage: http://ogdi.sourceforge.net/ View it on GitLab: https://salsa.debian.org/debian-gis-team/ogdi-dfsg/commit/089022ed288e7e91e71770ac810751e70c75bb72 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ogdi-dfsg/commit/089022ed288e7e91e71770ac810751e70c75bb72 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:33:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:33:49 +0000 Subject: [Git][debian-gis-team/opencpn][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923c7dacf85_46f62ac0f98dfce027254b@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / opencpn Commits: fe090788 by Bas Couwenberg at 2019-09-30T17:33:41Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +opencpn (4.8.8+dfsg.2-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. 
+ + -- Alec Leamas Mon, 30 Sep 2019 19:33:40 +0200 + opencpn (4.8.8+dfsg.2-1) unstable; urgency=medium [ Bas Couwenberg ] ===================================== debian/control ===================================== @@ -21,7 +21,7 @@ Build-Depends: debhelper (>= 11), libwxgtk3.0-dev, libwxsvg-dev, portaudio19-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Homepage: https://opencpn.org Vcs-Browser: https://salsa.debian.org/debian-gis-team/opencpn Vcs-Git: https://salsa.debian.org/debian-gis-team/opencpn.git View it on GitLab: https://salsa.debian.org/debian-gis-team/opencpn/commit/fe090788a7a6eaea64bc656f0344c51f1f6a1bb3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/opencpn/commit/fe090788a7a6eaea64bc656f0344c51f1f6a1bb3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:34:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:34:11 +0000 Subject: [Git][debian-gis-team/openlayers][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923c93551c4_46f63fbab650448827279b@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / openlayers Commits: 263d6ce0 by Bas Couwenberg at 2019-09-30T17:34:02Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +openlayers (2.13.1+ds2-8) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:34:00 +0200 + openlayers (2.13.1+ds2-7) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 9), python3 Build-Depends-Indep: node-uglify -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/openlayers Vcs-Git: https://salsa.debian.org/debian-gis-team/openlayers.git Homepage: http://openlayers.org/two/ View it on GitLab: https://salsa.debian.org/debian-gis-team/openlayers/commit/263d6ce0318cf4919f045cac529478f007e820e8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/openlayers/commit/263d6ce0318cf4919f045cac529478f007e820e8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:34:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:34:30 +0000 Subject: [Git][debian-gis-team/openstreetmap-carto][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ca6998ff_46f62ac0fbe48aa427295f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / openstreetmap-carto Commits: 67afdc37 by Bas Couwenberg at 2019-09-30T17:34:22Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -8,7 +8,7 @@ openstreetmap-carto (4.0.0-1) UNRELEASED; urgency=medium (closes: #855384, #925493) * Update copyright-format URL to use HTTPS. * Update Vcs-* URLs for Salsa. - * Bump Standards-Version to 4.4.0, no changes. 
+ * Bump Standards-Version to 4.4.1, no changes. * Update watch file to use releases instead of tags. * Update watch file to limit matches to archive path. * Remove package name from lintian overrides. ===================================== debian/control ===================================== @@ -6,7 +6,7 @@ Priority: optional Build-Depends: debhelper (>= 9.0.0), node-carto (>= 0.18.0), po-debconf -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/openstreetmap-carto Vcs-Git: https://salsa.debian.org/debian-gis-team/openstreetmap-carto.git Homepage: https://github.com/gravitystorm/openstreetmap-carto View it on GitLab: https://salsa.debian.org/debian-gis-team/openstreetmap-carto/commit/67afdc378ac37324e77da584088d321b8780d600 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/openstreetmap-carto/commit/67afdc378ac37324e77da584088d321b8780d600 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:35:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:35:07 +0000 Subject: [Git][debian-gis-team/osgearth][master] 2 commits: Update symbols for other architectures. Message-ID: <5d923ccb4b419_46f62ac0f98dfce02731d1@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osgearth Commits: 20ab2379 by Bas Couwenberg at 2019-09-10T13:41:40Z Update symbols for other architectures. - - - - - 7350f4ee by Bas Couwenberg at 2019-09-30T17:34:59Z Bump Standards-Version to 4.4.1, no changes. - - - - - 4 changed files: - debian/changelog - debian/control - debian/libosgearth5.symbols - debian/libosgearthannotation5.symbols The diff was not included because it is too large. View it on GitLab: https://salsa.debian.org/debian-gis-team/osgearth/compare/ef6f937589be9c6fa8e3813c8530adde9123bed5...7350f4ee38fa6dee498147441a2c79ac44196529 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osgearth/compare/ef6f937589be9c6fa8e3813c8530adde9123bed5...7350f4ee38fa6dee498147441a2c79ac44196529 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:35:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:35:29 +0000 Subject: [Git][debian-gis-team/osm2pgrouting][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ce1e70de_46f62ac0f98dfce027339b@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osm2pgrouting Commits: 0364688b by Bas Couwenberg at 2019-09-30T17:35:23Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ osm2pgrouting (2.3.6-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Append -DNDEBUG to CXXFLAGS to remove buildpath from binaries. * Update gbp.conf to use --source-only-changes by default. 
===================================== debian/control ===================================== @@ -16,7 +16,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osm2pgrouting Vcs-Git: https://salsa.debian.org/debian-gis-team/osm2pgrouting.git Homepage: https://github.com/pgRouting/osm2pgrouting View it on GitLab: https://salsa.debian.org/debian-gis-team/osm2pgrouting/commit/0364688b15ffce93491aa39a560ea7139a8b5f6f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osm2pgrouting/commit/0364688b15ffce93491aa39a560ea7139a8b5f6f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:35:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:35:53 +0000 Subject: [Git][debian-gis-team/osm2pgsql][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923cf9f1b6_46f63fbab65044882735cd@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osm2pgsql Commits: 52012ad3 by Bas Couwenberg at 2019-09-30T17:35:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ osm2pgsql (1.0.0+ds-2) UNRELEASED; urgency=medium * Don't define ACCEPT_USE_OF_DEPRECATED_PROJ_API_H, fixed in libosmium. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Fri, 30 Aug 2019 15:10:29 +0200 ===================================== debian/control ===================================== @@ -21,7 +21,7 @@ Build-Depends: debhelper (>= 9), lua5.2, python3, python3-psycopg2 -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osm2pgsql Vcs-Git: https://salsa.debian.org/debian-gis-team/osm2pgsql.git Homepage: https://wiki.openstreetmap.org/wiki/Osm2pgsql View it on GitLab: https://salsa.debian.org/debian-gis-team/osm2pgsql/commit/52012ad33cbf74c240b1bf0f71aa0335ab20b761 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osm2pgsql/commit/52012ad33cbf74c240b1bf0f71aa0335ab20b761 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:36:10 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:36:10 +0000 Subject: [Git][debian-gis-team/osmcoastline][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d0ad8a9b_46f63fbab6504488273737@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmcoastline Commits: b24b0a53 by Bas Couwenberg at 2019-09-30T17:36:03Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ osmcoastline (2.2.4-2) UNRELEASED; urgency=medium * Update gbp.conf to use --source-only-changes by default. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Sun, 07 Jul 2019 09:13:41 +0200 ===================================== debian/control ===================================== @@ -16,7 +16,7 @@ Build-Depends: debhelper (>= 9), spatialite-bin, sqlite3, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osmcoastline Vcs-Git: https://salsa.debian.org/debian-gis-team/osmcoastline.git Homepage: https://osmcode.org/osmcoastline/ View it on GitLab: https://salsa.debian.org/debian-gis-team/osmcoastline/commit/b24b0a5325480dc1c7a7b4601abf7d3f86fe06e3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmcoastline/commit/b24b0a5325480dc1c7a7b4601abf7d3f86fe06e3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:36:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:36:30 +0000 Subject: [Git][debian-gis-team/osmctools][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d1ea30c8_46f63fbab650448827399e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmctools Commits: fd60505c by Bas Couwenberg at 2019-09-30T17:36:22Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ osmctools (0.9-3) UNRELEASED; urgency=medium * Remove .pc from .gitignore. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-autoreconf, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osmctools Vcs-Git: https://salsa.debian.org/debian-gis-team/osmctools.git Homepage: https://gitlab.com/osm-c-tools/osmctools View it on GitLab: https://salsa.debian.org/debian-gis-team/osmctools/commit/fd60505ce1b7bc5433601f4f9f24fa0d3988cb56 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmctools/commit/fd60505ce1b7bc5433601f4f9f24fa0d3988cb56 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:36:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:36:53 +0000 Subject: [Git][debian-gis-team/osm-gps-map][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d35889d_46f62ac0fbe48aa4274170@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osm-gps-map Commits: 6ca76519 by Bas Couwenberg at 2019-09-30T17:36:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +osm-gps-map (1.1.0-7) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:36:43 +0200 + osm-gps-map (1.1.0-6) unstable; urgency=medium * Team upload. 
===================================== debian/control ===================================== @@ -18,7 +18,7 @@ Build-Depends: debhelper (>= 9.20160114), gnome-common, gtk-doc-tools, gobject-introspection -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osm-gps-map Vcs-Git: https://salsa.debian.org/debian-gis-team/osm-gps-map.git Homepage: https://nzjrs.github.com/osm-gps-map/ View it on GitLab: https://salsa.debian.org/debian-gis-team/osm-gps-map/commit/6ca765192a708ec638e0723b778574f8e2d5a452 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osm-gps-map/commit/6ca765192a708ec638e0723b778574f8e2d5a452 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:37:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:37:16 +0000 Subject: [Git][debian-gis-team/osmium-tool][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d4ca5e30_46f63fbab65044882743e0@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmium-tool Commits: 5902533f by Bas Couwenberg at 2019-09-30T17:37:07Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +osmium-tool (1.11.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:37:06 +0200 + osmium-tool (1.11.0-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), libosmium2-dev (>= 2.15.2), pandoc, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osmium-tool/ Vcs-Git: https://salsa.debian.org/debian-gis-team/osmium-tool.git Homepage: https://osmcode.org/osmium-tool/ View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/5902533f9119e50530b66cf43a06b1e04dd3e712 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmium-tool/commit/5902533f9119e50530b66cf43a06b1e04dd3e712 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:37:36 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:37:36 +0000 Subject: [Git][debian-gis-team/osmosis][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d60b1fb9_46f62ac0fe4cd06c2745b9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmosis Commits: 605d5c64 by Bas Couwenberg at 2019-09-30T17:37:28Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,7 +1,7 @@ osmosis (0.47-5) UNRELEASED; urgency=medium * Update gbp.conf to use --source-only-changes by default. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Sun, 07 Jul 2019 09:23:39 +0200 ===================================== debian/control ===================================== @@ -33,7 +33,7 @@ Build-Depends: debhelper (>= 9), libxerces2-java, libxz-java, maven-repo-helper -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osmosis Vcs-Git: https://salsa.debian.org/debian-gis-team/osmosis.git Homepage: https://wiki.openstreetmap.org/wiki/Osmosis View it on GitLab: https://salsa.debian.org/debian-gis-team/osmosis/commit/605d5c6410047444dde52b69f905687dd07cd625 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmosis/commit/605d5c6410047444dde52b69f905687dd07cd625 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:38:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:38:02 +0000 Subject: [Git][debian-gis-team/osmpbf][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d7a48c88_46f63fbab65044882747d9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / osmpbf Commits: de0b08ce by Bas Couwenberg at 2019-09-30T17:37:54Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ osmpbf (1.3.3-13) UNRELEASED; urgency=medium * Update gbp.conf for renamed branches. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Thu, 29 Aug 2019 11:00:31 +0200 ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), libprotobuf-java (>= 3.0.0), protobuf-compiler (>= 3.0.0), maven-repo-helper -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/osmpbf Vcs-Git: https://salsa.debian.org/debian-gis-team/osmpbf.git Homepage: https://github.com/scrosby/OSM-binary View it on GitLab: https://salsa.debian.org/debian-gis-team/osmpbf/commit/de0b08ce4d3ecb99a33f6cfad2f3a00deda8e774 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/osmpbf/commit/de0b08ce4d3ecb99a33f6cfad2f3a00deda8e774 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:38:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:38:21 +0000 Subject: [Git][debian-gis-team/ossim][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923d8d9f9b2_46f62ac0fbe48aa4274944@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ossim Commits: 5d736f96 by Bas Couwenberg at 2019-09-30T17:38:12Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +ossim (2.9.1-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:38:11 +0200 + ossim (2.9.1-1) unstable; urgency=medium * Team upload. 
===================================== debian/control ===================================== @@ -14,7 +14,7 @@ Build-Depends: cmake (>= 2.8), libpng-dev, libtiff-dev, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/ossim Vcs-Git: https://salsa.debian.org/debian-gis-team/ossim.git Homepage: https://trac.osgeo.org/ossim/ View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/5d736f96da3594b42b8ead929f8ae20b2c28d34d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ossim/commit/5d736f96da3594b42b8ead929f8ae20b2c28d34d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:38:51 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:38:51 +0000 Subject: [Git][debian-gis-team/otb][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923dab51e61_46f62ac0ff19eb88275151@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / otb Commits: 7e349f7b by Bas Couwenberg at 2019-09-30T17:38:30Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +otb (7.0.0~rc1+dfsg-1~exp2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:38:29 +0200 + otb (7.0.0~rc1+dfsg-1~exp1) experimental; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -40,7 +40,7 @@ Build-Depends: debhelper (>= 9), qtbase5-dev, qttools5-dev, swig -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/otb Vcs-Git: https://salsa.debian.org/debian-gis-team/otb.git Homepage: http://www.orfeo-toolbox.org/ ===================================== debian/control.in ===================================== @@ -40,7 +40,7 @@ Build-Depends: debhelper (>= 9), qtbase5-dev, qttools5-dev, swig -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/otb Vcs-Git: https://salsa.debian.org/debian-gis-team/otb.git Homepage: http://www.orfeo-toolbox.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/7e349f7bcadf86d08b6ac6ee1c542c539abdc4c1 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/otb/commit/7e349f7bcadf86d08b6ac6ee1c542c539abdc4c1 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:39:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:39:09 +0000 Subject: [Git][debian-gis-team/owslib][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923dbd19d7a_46f62ac0ff19eb8827536d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / owslib Commits: 03da59ad by Bas Couwenberg at 2019-09-30T17:39:01Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +owslib (0.18.0-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:39:00 +0200 + owslib (0.18.0-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -13,7 +13,7 @@ Build-Depends: debhelper (>= 9), python3-setuptools, python3-sphinx, python3-tz -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/owslib Vcs-Git: https://salsa.debian.org/debian-gis-team/owslib.git Homepage: https://geopython.github.com/OWSLib/ View it on GitLab: https://salsa.debian.org/debian-gis-team/owslib/commit/03da59ada7935bf8e4b096b217843b978ef762bf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/owslib/commit/03da59ada7935bf8e4b096b217843b978ef762bf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:39:23 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:39:23 +0000 Subject: [Git][debian-gis-team/package_template][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923dcbf196b_46f63fbab65044882755a9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / package_template Commits: 825da030 by Bas Couwenberg at 2019-09-30T17:39:16Z Bump Standards-Version to 4.4.1, no changes. - - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -4,7 +4,7 @@ Uploaders: #USERNAME# <#EMAIL#> Section: science Priority: optional Build-Depends: debhelper (>= 9) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/#PACKAGE# Vcs-Git: https://salsa.debian.org/debian-gis-team/#PACKAGE#.git Homepage: View it on GitLab: https://salsa.debian.org/debian-gis-team/package_template/commit/825da0304bf2509c3d798dc49a411d0d20a56c53 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/package_template/commit/825da0304bf2509c3d798dc49a411d0d20a56c53 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:39:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:39:43 +0000 Subject: [Git][debian-gis-team/pdal][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ddf6e4f7_46f62ac0ff19eb8827601f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pdal Commits: 0850a97e by Bas Couwenberg at 2019-09-30T17:39:34Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pdal (2.0.1+ds-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:39:32 +0200 + pdal (2.0.1+ds-1) unstable; urgency=medium * Update symbols for other architectures. 
===================================== debian/control ===================================== @@ -38,7 +38,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pdal Vcs-Git: https://salsa.debian.org/debian-gis-team/pdal.git Homepage: http://pdal.io/ View it on GitLab: https://salsa.debian.org/debian-gis-team/pdal/commit/0850a97efdf0568c7bcbf9287fa20529e01a9f3f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pdal/commit/0850a97efdf0568c7bcbf9287fa20529e01a9f3f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:40:01 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:40:01 +0000 Subject: [Git][debian-gis-team/pg_comparator][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923df1d4750_46f62ac0fe4cd06c276238@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pg_comparator Commits: 3568bfc0 by Bas Couwenberg at 2019-09-30T17:39:53Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ pg-comparator (2.3.1-5) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Tue, 25 Dec 2018 22:55:36 +0100 ===================================== debian/control ===================================== @@ -5,7 +5,7 @@ Section: database Priority: optional Build-Depends: debhelper (>= 9), postgresql-server-dev-all -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pg_comparator Vcs-Git: https://salsa.debian.org/debian-gis-team/pg_comparator.git Homepage: http://www.coelho.net/pg_comparator/ View it on GitLab: https://salsa.debian.org/debian-gis-team/pg_comparator/commit/3568bfc0912997b728ceaf04e3f0bde62b8858a8 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pg_comparator/commit/3568bfc0912997b728ceaf04e3f0bde62b8858a8 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:40:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:40:21 +0000 Subject: [Git][debian-gis-team/pgrouting][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e054db0c_46f62ac0ff19eb882764c2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pgrouting Commits: 6b71e26d by Bas Couwenberg at 2019-09-30T17:40:11Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pgrouting (2.6.3-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:40:10 +0200 + pgrouting (2.6.3-1) unstable; urgency=medium * Team upload. 
===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: cmake (>= 3.2), postgresql-server-dev-all, python3-sphinx (>= 1.0.7+dfsg), rdfind, -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pgrouting Vcs-Git: https://salsa.debian.org/debian-gis-team/pgrouting.git Homepage: https://www.pgrouting.org ===================================== debian/control.in ===================================== @@ -12,7 +12,7 @@ Build-Depends: cmake (>= 3.2), postgresql-server-dev-all, python3-sphinx (>= 1.0.7+dfsg), rdfind, -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pgrouting Vcs-Git: https://salsa.debian.org/debian-gis-team/pgrouting.git Homepage: https://www.pgrouting.org View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/commit/6b71e26d1960b6db2f8e72fa86c9787e62f58278 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgrouting/commit/6b71e26d1960b6db2f8e72fa86c9787e62f58278 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:40:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:40:41 +0000 Subject: [Git][debian-gis-team/pgsql-ogr-fdw][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e196c745_46f62ac0ff19eb8827666d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pgsql-ogr-fdw Commits: 280b1f25 by Bas Couwenberg at 2019-09-30T17:40:33Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ pgsql-ogr-fdw (1.0.8-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Wed, 10 Jul 2019 18:50:25 +0200 ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 10), libgdal-dev, postgresql-server-dev-all -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pgsql-ogr-fdw Vcs-Git: https://salsa.debian.org/debian-gis-team/pgsql-ogr-fdw.git Homepage: https://github.com/pramsey/pgsql-ogr-fdw ===================================== debian/control.in ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 10), libgdal-dev, postgresql-server-dev-all -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pgsql-ogr-fdw Vcs-Git: https://salsa.debian.org/debian-gis-team/pgsql-ogr-fdw.git Homepage: https://github.com/pramsey/pgsql-ogr-fdw View it on GitLab: https://salsa.debian.org/debian-gis-team/pgsql-ogr-fdw/commit/280b1f25de807de90711a245da9e9698558760c3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pgsql-ogr-fdw/commit/280b1f25de807de90711a245da9e9698558760c3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 30 18:41:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:41:02 +0000 Subject: [Git][debian-gis-team/php-geos][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e2ebbee3_46f62ac0ff19eb88276843@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / php-geos Commits: f8edae30 by Bas Couwenberg at 2019-09-30T17:40:53Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ php-geos (1.0.0-5) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 9), libgeos-dev, php-dev, phpunit -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/php-geos Vcs-Git: https://salsa.debian.org/debian-gis-team/php-geos.git Homepage: https://git.osgeo.org/gitea/geos/php-geos/ View it on GitLab: https://salsa.debian.org/debian-gis-team/php-geos/commit/f8edae301022b6df7e5098facc34db804a17b06d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/php-geos/commit/f8edae301022b6df7e5098facc34db804a17b06d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:41:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:41:25 +0000 Subject: [Git][debian-gis-team/pktools][master] 2 commits: Update symbols for other architectures. Message-ID: <5d923e45677ae_46f62ac0fe4cd06c27706e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pktools Commits: da2184ae by Bas Couwenberg at 2019-09-10T06:26:21Z Update symbols for other architectures. - - - - - 19bd3d48 by Bas Couwenberg at 2019-09-30T17:41:16Z Bump Standards-Version to 4.4.1, no changes. - - - - - 5 changed files: - debian/changelog - debian/control - debian/libalgorithms1.symbols - debian/libfileclasses1.symbols - debian/libimageclasses1.symbols Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,10 @@ +pktools (2.6.7.6+ds-3) UNRELEASED; urgency=medium + + * Update symbols for other architectures. + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Tue, 10 Sep 2019 08:26:14 +0200 + pktools (2.6.7.6+ds-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. 
===================================== debian/control ===================================== @@ -17,7 +17,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pktools Vcs-Git: https://salsa.debian.org/debian-gis-team/pktools.git Homepage: http://pktools.nongnu.org/ ===================================== debian/libalgorithms1.symbols ===================================== @@ -1,7 +1,7 @@ -# SymbolsHelper-Confirmed: 2.6.7.6 amd64 +# SymbolsHelper-Confirmed: 2.6.7.6 alpha amd64 armel armhf i386 ia64 m68k mips64el mipsel powerpc ppc64 ppc64el riscv64 s390x sparc64 x32 libalgorithms.so.1 #PACKAGE# #MINVER# * Build-Depends-Package: pktools-dev - (optional=templinst|arch=alpha amd64 arm64 kfreebsd-amd64 mips64el ppc64 ppc64el s390x sparc64)_Z11string2typeIiET_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 2.6.4 + (optional=templinst|arch=alpha amd64 arm64 ia64 kfreebsd-amd64 mips64el ppc64 ppc64el riscv64 s390x sparc64)_Z11string2typeIiET_RKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 2.6.4 _Z12compareClassRKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEES6_ at Base 2.6.4 (optional=templinst)_Z15getGDALDataTypeIdE12GDALDataTypev at Base 2.6.7 (optional=templinst)_Z15getGDALDataTypeIfE12GDALDataTypev at Base 2.6.7 @@ -240,100 +240,100 @@ libalgorithms.so.1 #PACKAGE# #MINVER# _ZNK6Kernel18kernel_precomputedEii at Base 2.5.2 (optional=templinst)_ZNK8Vector2dIdE3sumEv at Base 2.5.2 (optional=templinst)_ZNKSt5ctypeIcE8do_widenEc at Base 2.5.2 - (optional=templinst|arch=hppa x32)_ZNKSt6vectorI8Vector2dIfESaIS1_EE12_M_check_lenEjPKc at Base 2.6.7 - (optional=templinst)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN11statfactory11StatFactory18INTERPOLATION_TYPEESt4lessIS5_ESaISt4pairIKS5_S8_EEEixEOS5_ at Base 2.6.7.3 + (optional=templinst|arch=!armel !armhf !m68k)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN11statfactory11StatFactory18INTERPOLATION_TYPEESt4lessIS5_ESaISt4pairIKS5_S8_EEEixEOS5_ at Base 2.6.7.3 (optional=templinst)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN6filter11FILTER_TYPEESt4lessIS5_ESaISt4pairIKS5_S7_EEEixEOS5_ at Base 2.6.7.3 (optional=templinst)_ZNSt3mapINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEEN8filter2d11FILTER_TYPEESt4lessIS5_ESaISt4pairIKS5_S7_EEEixEOS5_ at Base 2.6.7.3 - (optional=templinst)_ZNSt3mapIliSt4lessIlESaISt4pairIKliEEEixEOl at Base 2.6.7.3 - (optional=templinst)_ZNSt6vectorI8Vector2dIfESaIS1_EE17_M_default_appendEm at Base 2.6.7.3 - (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|arch=amd64 ia64 mips64el ppc64el riscv64 sparc64 x32)_ZNSt3mapIliSt4lessIlESaISt4pairIKliEEEixEOl at Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorI8Vector2dIfESaIS1_EE17_M_default_appendE{size_t}@Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EE17_M_realloc_insertIJRKS5_EEEvN9__gnu_cxx17__normal_iteratorIPS5_S7_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EED1Ev at Base 2.6.4 (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EED2Ev at Base 2.6.4 
(optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EEaSERKS7_ at Base 2.6.4 - (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt6vectorIS_IdSaIdEESaIS1_EE14_M_fill_insertEN9__gnu_cxx17__normal_iteratorIPS1_S3_EEjRKS1_ at Base 2.6.4 - (optional=templinst)_ZNSt6vectorIS_IdSaIdEESaIS1_EE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|arch=armel armhf i386 m68k mipsel powerpc x32)_ZNSt6vectorIS_IdSaIdEESaIS1_EE17_M_default_appendEj at Base 2.6.7.6 + (optional=templinst|arch=!armel !armhf !i386 !m68k !mipsel !powerpc !x32)_ZNSt6vectorIS_IdSaIdEESaIS1_EE17_M_default_appendEm at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IdSaIdEESaIS1_EE17_M_realloc_insertIJRKS1_EEEvN9__gnu_cxx17__normal_iteratorIPS1_S3_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IdSaIdEESaIS1_EE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPS1_S3_EE at Base 2.5.2 (optional=templinst)_ZNSt6vectorIS_IdSaIdEESaIS1_EED1Ev at Base 2.5.2 (optional=templinst)_ZNSt6vectorIS_IdSaIdEESaIS1_EED2Ev at Base 2.5.2 - (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt6vectorIS_IfSaIfEESaIS1_EE14_M_fill_insertEN9__gnu_cxx17__normal_iteratorIPS1_S3_EEjRKS1_ at Base 2.6.4 - (optional=templinst)_ZNSt6vectorIS_IfSaIfEESaIS1_EE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|arch=armel armhf i386 m68k mipsel powerpc x32)_ZNSt6vectorIS_IfSaIfEESaIS1_EE17_M_default_appendEj at Base 2.6.7.6 + (optional=templinst|arch=!armel !armhf !i386 !m68k !mipsel !powerpc !x32)_ZNSt6vectorIS_IfSaIfEESaIS1_EE17_M_default_appendEm at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IfSaIfEESaIS1_EE17_M_realloc_insertIJRKS1_EEEvN9__gnu_cxx17__normal_iteratorIPS1_S3_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IfSaIfEESaIS1_EE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPS1_S3_EES7_ at Base 2.5.2 (optional=templinst)_ZNSt6vectorIS_IfSaIfEESaIS1_EED1Ev at Base 2.5.2 (optional=templinst)_ZNSt6vectorIS_IfSaIfEESaIS1_EED2Ev at Base 2.5.2 - (optional=templinst)_ZNSt6vectorIS_IsSaIsEESaIS1_EE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorIS_IsSaIsEESaIS1_EE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IsSaIsEESaIS1_EE17_M_realloc_insertIJRKS1_EEEvN9__gnu_cxx17__normal_iteratorIPS1_S3_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IsSaIsEESaIS1_EED1Ev at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IsSaIsEESaIS1_EED2Ev at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEE12emplace_backIJdEEEvDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEE13_M_assign_auxIN9__gnu_cxx17__normal_iteratorIPKdS1_EEEEvT_S8_St20forward_iterator_tag at Base 2.6.7 - (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt6vectorIdSaIdEE14_M_fill_insertEN9__gnu_cxx17__normal_iteratorIPdS1_EEjRKd at Base 2.6.4 (optional=templinst)_ZNSt6vectorIdSaIdEE15_M_range_insertIN9__gnu_cxx17__normal_iteratorIPdS1_EEEEvS6_T_S7_St20forward_iterator_tag at Base 2.6.7 - (optional=templinst)_ZNSt6vectorIdSaIdEE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|arch=armel armhf i386 m68k mipsel powerpc x32)_ZNSt6vectorIdSaIdEE17_M_default_appendEj at Base 2.6.7.6 + (optional=templinst|arch=!armel !armhf !i386 !m68k !mipsel !powerpc !x32)_ZNSt6vectorIdSaIdEE17_M_default_appendEm at Base 2.6.7.3 
(optional=templinst)_ZNSt6vectorIdSaIdEE17_M_realloc_insertIJRKdEEEvN9__gnu_cxx17__normal_iteratorIPdS1_EEDpOT_ at Base 2.6.7.3 + (optional=templinst|arch=armel armhf m68k x32)_ZNSt6vectorIdSaIdEE6resizeEj at Base 2.6.7.6 (optional=templinst)_ZNSt6vectorIdSaIdEE6resizeEm at Base 2.6.7.6 - (optional=templinst)_ZNSt6vectorIdSaIdEE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPdS1_EE at Base 2.6.2 + (optional=templinst|arch=!alpha !i386 !mipsel !powerpc !ppc64 !s390x)_ZNSt6vectorIdSaIdEE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPdS1_EE at Base 2.6.2 (optional=templinst)_ZNSt6vectorIdSaIdEE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPdS1_EES5_ at Base 2.5.2 (optional=templinst)_ZNSt6vectorIdSaIdEEC1ERKS1_ at Base 2.6.4 - (optional=templinst)_ZNSt6vectorIdSaIdEEC1EmRKS0_ at Base 2.6.7.3 - (optional=templinst|subst|arch=!amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64el !s390x !sparc64)_ZNSt6vectorIdSaIdEEC1E{size_t}RKdRKS0_ at Base 2.6.4 + (optional=templinst|subst)_ZNSt6vectorIdSaIdEEC1E{size_t}RKS0_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEEC2ERKS1_ at Base 2.6.4 - (optional=templinst)_ZNSt6vectorIdSaIdEEC2EmRKS0_ at Base 2.6.7.3 - (optional=templinst|subst|arch=!amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64el !s390x !sparc64)_ZNSt6vectorIdSaIdEEC2E{size_t}RKdRKS0_ at Base 2.6.4 + (optional=templinst|subst)_ZNSt6vectorIdSaIdEEC2E{size_t}RKS0_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEEaSERKS1_ at Base 2.5.2 - (optional=templinst|arch=!alpha !amd64 !arm64 !kfreebsd-amd64 !mips64el !ppc64 !ppc64el !s390x !sparc64)_ZNSt6vectorIfSaIfEE14_M_fill_insertEN9__gnu_cxx17__normal_iteratorIPfS1_EEjRKf at Base 2.6.4 (optional=templinst)_ZNSt6vectorIfSaIfEE15_M_range_insertIN9__gnu_cxx17__normal_iteratorIPfS1_EEEEvS6_T_S7_St20forward_iterator_tag at Base 2.6.7 - (optional=templinst)_ZNSt6vectorIfSaIfEE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|arch=armel armhf i386 m68k mipsel powerpc x32)_ZNSt6vectorIfSaIfEE17_M_default_appendEj at Base 2.6.7.6 + (optional=templinst|arch=!armel !armhf !i386 !m68k !mipsel !powerpc !x32)_ZNSt6vectorIfSaIfEE17_M_default_appendEm at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIfSaIfEE17_M_realloc_insertIJRKfEEEvN9__gnu_cxx17__normal_iteratorIPfS1_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIfSaIfEE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPfS1_EES5_ at Base 2.5.2 (optional=templinst)_ZNSt6vectorIfSaIfEEaSERKS1_ at Base 2.6.7.3 - (optional=templinst)_ZNSt6vectorIsSaIsEE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorIsSaIsEE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE12_M_constructIPKcEEvT_S8_St20forward_iterator_tag at Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE12_M_constructIPcEEvT_S7_St20forward_iterator_tag at Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEED0Ev at Base 2.6.4 (optional=templinst)_ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEED1Ev at Base 2.6.4 (optional=templinst)_ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEED2Ev at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N11statfactory11StatFactory18INTERPOLATION_TYPEEESt10_Select1stISB_ESt4lessIS5_ESaISB_EE14_M_insert_nodeEPSt18_Rb_tree_node_baseSJ_PSt13_Rb_tree_nodeISB_E at Base 2.6.7.3 + (optional=templinst|arch=armel armhf ia64 m68k mips64el ppc64el riscv64 
sparc64)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N11statfactory11StatFactory18INTERPOLATION_TYPEEESt10_Select1stISB_ESt4lessIS5_ESaISB_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESM_IJEEEEESt17_Rb_tree_iteratorISB_ESt23_Rb_tree_const_iteratorISB_EDpOT_ at Base 2.6.7.6 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N11statfactory11StatFactory18INTERPOLATION_TYPEEESt10_Select1stISB_ESt4lessIS5_ESaISB_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N11statfactory11StatFactory18INTERPOLATION_TYPEEESt10_Select1stISB_ESt4lessIS5_ESaISB_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISB_ERS7_ at Base 2.6.4 + (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N11statfactory11StatFactory18INTERPOLATION_TYPEEESt10_Select1stISB_ESt4lessIS5_ESaISB_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISB_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N11statfactory11StatFactory18INTERPOLATION_TYPEEESt10_Select1stISB_ESt4lessIS5_ESaISB_EE8_M_eraseEPSt13_Rb_tree_nodeISB_E at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm11KERNEL_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE14_M_insert_nodeEPSt18_Rb_tree_node_baseSI_PSt13_Rb_tree_nodeISA_E at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm11KERNEL_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESL_IJEEEEESt17_Rb_tree_iteratorISA_ESt23_Rb_tree_const_iteratorISA_EDpOT_ at Base 2.6.7.6 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm11KERNEL_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm11KERNEL_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 + (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm11KERNEL_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm11KERNEL_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm8SVM_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE14_M_insert_nodeEPSt18_Rb_tree_node_baseSI_PSt13_Rb_tree_nodeISA_E at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm8SVM_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESL_IJEEEEESt17_Rb_tree_iteratorISA_ESt23_Rb_tree_const_iteratorISA_EDpOT_ at Base 2.6.7.6 
(optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm8SVM_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm8SVM_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 + (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm8SVM_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N3svm8SVM_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE14_M_insert_nodeEPSt18_Rb_tree_node_baseSI_PSt13_Rb_tree_nodeISA_E at Base 2.6.7.3 + (optional=templinst|arch=armel armhf ia64 m68k mips64el ppc64el riscv64 sparc64)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESL_IJEEEEESt17_Rb_tree_iteratorISA_ESt23_Rb_tree_const_iteratorISA_EDpOT_ at Base 2.6.7.6 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 + (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter7PADDINGEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE14_M_insert_nodeEPSt18_Rb_tree_node_baseSI_PSt13_Rb_tree_nodeISA_E at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter7PADDINGEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESL_IJEEEEESt17_Rb_tree_iteratorISA_ESt23_Rb_tree_const_iteratorISA_EDpOT_ at Base 2.6.7.6 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter7PADDINGEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 
x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter7PADDINGEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 + (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter7PADDINGEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N6filter7PADDINGEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N8filter2d11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE14_M_insert_nodeEPSt18_Rb_tree_node_baseSI_PSt13_Rb_tree_nodeISA_E at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N8filter2d11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESL_IJEEEEESt17_Rb_tree_iteratorISA_ESt23_Rb_tree_const_iteratorISA_EDpOT_ at Base 2.6.7.6 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N8filter2d11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N8filter2d11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 + (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N8filter2d11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_N8filter2d11FILTER_TYPEEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_sESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESJ_IJEEEEESt17_Rb_tree_iteratorIS8_ESt23_Rb_tree_const_iteratorIS8_EDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_sESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_sESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS8_ERS7_ at Base 2.6.4 + (optional=templinst|arch=amd64 arm64 hppa ia64 kfreebsd-amd64 m68k mips64el ppc64el riscv64 sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_sESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS8_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_sESt10_Select1stIS8_ESt4lessIS5_ESaIS8_EE8_M_eraseEPSt13_Rb_tree_nodeIS8_E at Base 2.6.4 
(optional=templinst)_ZNSt8_Rb_treeIiSt4pairIKiiESt10_Select1stIS2_ESt4lessIiESaIS2_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJOiEESD_IJEEEEESt17_Rb_tree_iteratorIS2_ESt23_Rb_tree_const_iteratorIS2_EDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeIiSt4pairIKiiESt10_Select1stIS2_ESt4lessIiESaIS2_EE24_M_get_insert_unique_posERS1_ at Base 2.5.2 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeIiSt4pairIKiiESt10_Select1stIS2_ESt4lessIiESaIS2_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS2_ERS1_ at Base 2.5.3 + (optional=templinst|arch=amd64 arm64 hppa ia64 kfreebsd-amd64 m68k mips64el ppc64el riscv64 sparc64 x32)_ZNSt8_Rb_treeIiSt4pairIKiiESt10_Select1stIS2_ESt4lessIiESaIS2_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS2_ERS1_ at Base 2.5.3 (optional=templinst)_ZNSt8_Rb_treeIiSt4pairIKiiESt10_Select1stIS2_ESt4lessIiESaIS2_EE8_M_eraseEPSt13_Rb_tree_nodeIS2_E at Base 2.5.2 (optional=templinst)_ZNSt8_Rb_treeIlSt4pairIKliESt10_Select1stIS2_ESt4lessIlESaIS2_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJOlEESD_IJEEEEESt17_Rb_tree_iteratorIS2_ESt23_Rb_tree_const_iteratorIS2_EDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeIlSt4pairIKliESt10_Select1stIS2_ESt4lessIlESaIS2_EE24_M_get_insert_unique_posERS1_ at Base 2.5.3 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeIlSt4pairIKliESt10_Select1stIS2_ESt4lessIlESaIS2_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS2_ERS1_ at Base 2.5.3 + (optional=templinst|arch=amd64 arm64 hppa ia64 kfreebsd-amd64 m68k mips64el ppc64el riscv64 sparc64 x32)_ZNSt8_Rb_treeIlSt4pairIKliESt10_Select1stIS2_ESt4lessIlESaIS2_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorIS2_ERS1_ at Base 2.5.3 (optional=templinst)_ZNSt8_Rb_treeIlSt4pairIKliESt10_Select1stIS2_ESt4lessIlESaIS2_EE8_M_eraseEPSt13_Rb_tree_nodeIS2_E at Base 2.5.3 (optional=templinst)_ZSt11__make_heapIN9__gnu_cxx17__normal_iteratorIPNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEENS0_5__ops15_Iter_comp_iterIPFbRKS7_SG_EEEEvT_SK_RT0_ at Base 2.6.7.3 (optional=templinst|subst)_ZSt13__adjust_heapIN9__gnu_cxx17__normal_iteratorIPNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEE{ssize_t}S7_NS0_5__ops15_Iter_comp_iterIPFbRKS7_SG_EEEEvT_T0_SL_T1_T2_ at Base 2.6.4 @@ -341,11 +341,11 @@ libalgorithms.so.1 #PACKAGE# #MINVER# (optional=templinst)_ZSt16__insertion_sortIN9__gnu_cxx17__normal_iteratorIPNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEENS0_5__ops15_Iter_comp_iterIPFbRKS7_SG_EEEEvT_SK_T0_ at Base 2.6.4 (optional=templinst)_ZSt16__insertion_sortIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEENS0_5__ops15_Iter_less_iterEEvT_S9_T0_ at Base 2.5.2 (optional=templinst|subst)_ZSt16__introsort_loopIN9__gnu_cxx17__normal_iteratorIPNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEE{ssize_t}NS0_5__ops15_Iter_comp_iterIPFbRKS7_SG_EEEEvT_SK_T0_T1_ at Base 2.6.4 - (optional=templinst)_ZSt16__introsort_loopIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEElNS0_5__ops15_Iter_less_iterEEvT_S9_T0_T1_ at Base 2.6.7.3 + (optional=templinst|arch=!armel !armhf !i386 !m68k !mipsel !powerpc|subst)_ZSt16__introsort_loopIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEE{ssize_t}NS0_5__ops15_Iter_less_iterEEvT_S9_T0_T1_ at Base 2.6.7.3 
(optional=templinst)_ZSt22__final_insertion_sortIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEENS0_5__ops15_Iter_less_iterEEvT_S9_T0_ at Base 2.6.7.3 (optional=templinst)_ZSt25__unguarded_linear_insertIN9__gnu_cxx17__normal_iteratorIPNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEENS0_5__ops14_Val_comp_iterIPFbRKS7_SG_EEEEvT_T0_ at Base 2.6.4 (optional=templinst)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPKdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIS2_EEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.5.2 - (optional=templinst|arch=armel armhf)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIKdEEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.6.4 + (optional=templinst|arch=armel armhf m68k)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIKdEEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.6.4 _ZTI11CostFactory at Base 2.5.4 _ZTI11ONE_CLASS_Q at Base 2.5.2 _ZTI13BadConversion at Base 2.5.2 ===================================== debian/libfileclasses1.symbols ===================================== @@ -1,8 +1,8 @@ -# SymbolsHelper-Confirmed: 2.6.7.3 amd64 +# SymbolsHelper-Confirmed: 2.6.7.6 armel armhf i386 m68k mipsel powerpc x32 libfileClasses.so.1 #PACKAGE# #MINVER# * Build-Depends-Package: pktools-dev _ZGVN4arma5DatumIdE3nanE at Base 2.6.7.3 - _ZGVN4arma5DatumIxE3nanE at Base 2.6.7.3 + (subst)_ZGVN4arma5DatumI{qptrdiff}E3nanE at Base 2.6.7.3 _ZN15FileReaderAscii4openERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE at Base 2.6.4 _ZN15FileReaderAscii5closeEv at Base 2.5.2 _ZN15FileReaderAscii7nrOfColEbb at Base 2.5.2 @@ -16,7 +16,7 @@ libfileClasses.so.1 #PACKAGE# #MINVER# _ZN15FileReaderAsciiD1Ev at Base 2.5.2 _ZN15FileReaderAsciiD2Ev at Base 2.5.2 _ZN4arma5DatumIdE3nanE at Base 2.6.7.3 - _ZN4arma5DatumIxE3nanE at Base 2.6.7.3 + (subst)_ZN4arma5DatumI{qptrdiff}E3nanE at Base 2.6.7.3 (optional=templinst)_ZNKSt5ctypeIcE8do_widenEc at Base 2.5.2 (optional=templinst)_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE12_M_constructIPKcEEvT_S8_St20forward_iterator_tag at Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEED0Ev at Base 2.6.4 ===================================== debian/libimageclasses1.symbols ===================================== @@ -1,4 +1,4 @@ -# SymbolsHelper-Confirmed: 2.6.7.6 amd64 +# SymbolsHelper-Confirmed: 2.6.7.6 amd64 armel armhf i386 ia64 m68k mipsel powerpc riscv64 x32 libimageClasses.so.1 #PACKAGE# #MINVER# * Build-Depends-Package: pktools-dev (optional=templinst)_Z15getGDALDataTypeIdE12GDALDataTypev at Base 2.6.7 @@ -131,20 +131,20 @@ libimageClasses.so.1 #PACKAGE# #MINVER# _ZNK13ImgRasterGdal9geo2imageEddRdS0_ at Base 2.6.6 _ZNK13ImgRasterGdal9image2geoEddRdS0_ at Base 2.6.6 (optional=templinst)_ZNKSt5ctypeIcE8do_widenEc at Base 2.5.2 - (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EE8_M_eraseEN9__gnu_cxx17__normal_iteratorIPS5_S7_EE at Base 2.6.4 (optional=templinst)_ZNSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS5_EEaSERKS7_ at Base 2.6.4 - (optional=templinst)_ZNSt6vectorIP12OGRFieldDefnSaIS1_EE17_M_default_appendEm at Base 2.6.7.3 + 
(optional=templinst|subst)_ZNSt6vectorIP12OGRFieldDefnSaIS1_EE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIPvSaIS0_EE17_M_realloc_insertIJS0_EEEvN9__gnu_cxx17__normal_iteratorIPS0_S2_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIS_IfSaIfEESaIS1_EE17_M_realloc_insertIJRKS1_EEEvN9__gnu_cxx17__normal_iteratorIPS1_S3_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEE15_M_range_insertIN9__gnu_cxx17__normal_iteratorIPdS1_EEEEvS6_T_S7_St20forward_iterator_tag at Base 2.6.4 - (optional=templinst)_ZNSt6vectorIdSaIdEE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorIdSaIdEE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEE17_M_realloc_insertIJRKdEEEvN9__gnu_cxx17__normal_iteratorIPdS1_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIdSaIdEEaSERKS1_ at Base 2.5.2 (optional=templinst)_ZNSt6vectorIfSaIfEE12emplace_backIJfEEEvDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIfSaIfEE17_M_realloc_insertIJfEEEvN9__gnu_cxx17__normal_iteratorIPfS1_EEDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt6vectorIiSaIiEE17_M_realloc_insertIJiEEEvN9__gnu_cxx17__normal_iteratorIPiS1_EEDpOT_ at Base 2.6.7.3 - (optional=templinst)_ZNSt6vectorIsSaIsEE17_M_default_appendEm at Base 2.6.7.3 + (optional=templinst|subst)_ZNSt6vectorIsSaIsEE17_M_default_appendE{size_t}@Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE12_M_constructIPKcEEvT_S8_St20forward_iterator_tag at Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEE12_M_constructIPcEEvT_S7_St20forward_iterator_tag at Base 2.6.7.3 (optional=templinst)_ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEED0Ev at Base 2.6.4 @@ -152,15 +152,15 @@ libimageClasses.so.1 #PACKAGE# #MINVER# (optional=templinst)_ZNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEED2Ev at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_8Vector2dIfEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE22_M_emplace_hint_uniqueIJRKSt21piecewise_construct_tSt5tupleIJRS7_EESL_IJEEEEESt17_Rb_tree_iteratorISA_ESt23_Rb_tree_const_iteratorISA_EDpOT_ at Base 2.6.7.3 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_8Vector2dIfEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE24_M_get_insert_unique_posERS7_ at Base 2.6.4 - (optional=templinst|arch=amd64 arm64 hppa kfreebsd-amd64 mips64el ppc64el sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_8Vector2dIfEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 + (optional=templinst|arch=amd64 arm64 hppa ia64 kfreebsd-amd64 m68k mips64el ppc64el riscv64 sparc64 x32)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_8Vector2dIfEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE29_M_get_insert_hint_unique_posESt23_Rb_tree_const_iteratorISA_ERS7_ at Base 2.6.4 (optional=templinst)_ZNSt8_Rb_treeINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt4pairIKS5_8Vector2dIfEESt10_Select1stISA_ESt4lessIS5_ESaISA_EE8_M_eraseEPSt13_Rb_tree_nodeISA_E at Base 2.6.4 (optional=templinst|subst)_ZSt13__adjust_heapIN9__gnu_cxx17__normal_iteratorIPsSt6vectorIsSaIsEEEE{ssize_t}sNS0_5__ops15_Iter_less_iterEEvT_T0_SA_T1_T2_ at Base 2.5.2 (optional=templinst)_ZSt16__insertion_sortIN9__gnu_cxx17__normal_iteratorIPsSt6vectorIsSaIsEEEENS0_5__ops15_Iter_less_iterEEvT_S9_T0_ at 
Base 2.5.2 - (optional=templinst)_ZSt16__introsort_loopIN9__gnu_cxx17__normal_iteratorIPsSt6vectorIsSaIsEEEElNS0_5__ops15_Iter_less_iterEEvT_S9_T0_T1_ at Base 2.6.7.3 + (optional=templinst|subst)_ZSt16__introsort_loopIN9__gnu_cxx17__normal_iteratorIPsSt6vectorIsSaIsEEEE{ssize_t}NS0_5__ops15_Iter_less_iterEEvT_S9_T0_T1_ at Base 2.6.7.3 (optional=templinst)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEENS0_5__ops16_Iter_equals_valIS8_EEET_SH_SH_T0_St26random_access_iterator_tag at Base 2.6.4 - (optional=templinst|arch=alpha armel armhf hurd-i386 i386 kfreebsd-i386 mips mipsel powerpc ppc64 s390x)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPKdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIS2_EEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.5.3 + (optional=templinst|arch=alpha armel armhf hurd-i386 i386 kfreebsd-i386 m68k mips mipsel powerpc ppc64 s390x)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPKdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIS2_EEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.5.3 (optional=templinst)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESt6vectorIS7_SaIS7_EEEENS0_5__ops16_Iter_equals_valIKS7_EEET_SH_SH_T0_St26random_access_iterator_tag at Base 2.6.4 - (optional=templinst|arch=armel armhf)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIKdEEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.6.4 + (optional=templinst|arch=armel armhf m68k)_ZSt9__find_ifIN9__gnu_cxx17__normal_iteratorIPdSt6vectorIdSaIdEEEENS0_5__ops16_Iter_equals_valIKdEEET_SB_SB_T0_St26random_access_iterator_tag at Base 2.6.4 _ZTI13ImgRasterGdal at Base 2.6.6 _ZTI13ImgReaderGdal at Base 2.6.6 _ZTI13ImgWriterGdal at Base 2.6.6 View it on GitLab: https://salsa.debian.org/debian-gis-team/pktools/compare/40e916e364d11273eec57669bc2a14074302781a...19bd3d4810b81c501d19f8dbde34f875dc965948 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pktools/compare/40e916e364d11273eec57669bc2a14074302781a...19bd3d4810b81c501d19f8dbde34f875dc965948 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:41:45 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:41:45 +0000 Subject: [Git][debian-gis-team/postgis][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e5977023_46f62ac0fbe48aa427723c@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / postgis Commits: e9a25ad6 by Bas Couwenberg at 2019-09-30T17:41:35Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -2,6 +2,7 @@ postgis (3.0.0~beta1+dfsg-1~exp1) experimental; urgency=medium * New upstream beta release. * Update copyright for fuzzers sources. + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Sun, 29 Sep 2019 08:00:25 +0200 ===================================== debian/control ===================================== @@ -35,7 +35,7 @@ Build-Depends: autoconf2.13, protobuf-c-compiler, rdfind, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/postgis Vcs-Git: https://salsa.debian.org/debian-gis-team/postgis.git -b experimental Homepage: http://postgis.net/ ===================================== debian/control.in ===================================== @@ -35,7 +35,7 @@ Build-Depends: autoconf2.13, protobuf-c-compiler, rdfind, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/postgis Vcs-Git: https://salsa.debian.org/debian-gis-team/postgis.git -b experimental Homepage: http://postgis.net/ View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/e9a25ad63461b7905aa9ed6d6caaef7888f5bb69 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis/commit/e9a25ad63461b7905aa9ed6d6caaef7888f5bb69 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:42:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:42:08 +0000 Subject: [Git][debian-gis-team/postgis-java][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e7027661_46f63fbab6504488277490@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / postgis-java Commits: f0fe49f8 by Bas Couwenberg at 2019-09-30T17:41:56Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ postgis-java (1:2.3.0-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Tue, 25 Dec 2018 22:57:43 +0100 ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends-Indep: default-jdk-headless (>= 2:1.7) | java7-sdk-headless, maven-debian-helper, maven-repo-helper, maven -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/postgis-java Vcs-Git: https://salsa.debian.org/debian-gis-team/postgis-java.git Homepage: https://github.com/postgis/postgis-java View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis-java/commit/f0fe49f851830c6c1247a6a01a2ed21959beac6e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/postgis-java/commit/f0fe49f851830c6c1247a6a01a2ed21959beac6e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:42:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:42:30 +0000 Subject: [Git][debian-gis-team/pprepair][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e8645383_46f62ac0ff19eb882776c9@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pprepair Commits: cf927047 by Bas Couwenberg at 2019-09-30T17:42:23Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ pprepair (0.0~20170614-dd91a21-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pprepair/ Vcs-Git: https://salsa.debian.org/debian-gis-team/pprepair.git Homepage: https://github.com/tudelft3d/pprepair View it on GitLab: https://salsa.debian.org/debian-gis-team/pprepair/commit/cf927047ad8f74002a9c31c2ed5e71bbd1b6391e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pprepair/commit/cf927047ad8f74002a9c31c2ed5e71bbd1b6391e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:42:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:42:43 +0000 Subject: [Git][debian-gis-team/prepair][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923e93ec0eb_46f63fbab65044882778f2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / prepair Commits: 3b2a698d by Bas Couwenberg at 2019-09-30T17:42:37Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ prepair (0.7.1-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/prepair/ Vcs-Git: https://salsa.debian.org/debian-gis-team/prepair.git Homepage: https://github.com/tudelft3d/prepair View it on GitLab: https://salsa.debian.org/debian-gis-team/prepair/commit/3b2a698d378f8420612fd9ef28116dfea2f4803b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/prepair/commit/3b2a698d378f8420612fd9ef28116dfea2f4803b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:43:01 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:43:01 +0000 Subject: [Git][debian-gis-team/proj4js][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ea5afe1d_46f62ac0ff19eb882780cc@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / proj4js Commits: c04b5fe6 by Bas Couwenberg at 2019-09-30T17:42:55Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -5,7 +5,7 @@ proj4js (2.4.0+ds-1~exp2) UNRELEASED; urgency=medium * Fix deprecated source override location. * Rename PROJ.4 to PROJ. * Update Vcs-* URLs for Salsa. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), dh-buildinfo, nodejs, node-uglify -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/proj4js/ Vcs-Git: https://salsa.debian.org/debian-gis-team/proj4js.git Homepage: http://proj4js.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/proj4js/commit/c04b5fe61930cb78773f2378b19017636bd1a759 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj4js/commit/c04b5fe61930cb78773f2378b19017636bd1a759 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:43:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:43:19 +0000 Subject: [Git][debian-gis-team/proj][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923eb7197f2_46f63fbab650448827826e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / proj Commits: d361c3ad by Bas Couwenberg at 2019-09-30T17:43:10Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +proj (6.2.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:43:09 +0200 + proj (6.2.0-1) unstable; urgency=medium * Update symbols for other architectures. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), sharutils, sqlite3, xz-utils -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/proj Vcs-Git: https://salsa.debian.org/debian-gis-team/proj.git Homepage: https://proj.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/commit/d361c3ad0eee404979273ca74b027153675fd3c3 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj/commit/d361c3ad0eee404979273ca74b027153675fd3c3 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:44:06 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:44:06 +0000 Subject: [Git][debian-gis-team/proj-rdnap][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923ee6d527c_46f62ac0f98dfce027847c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / proj-rdnap Commits: 226e8728 by Bas Couwenberg at 2019-09-30T17:43:55Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +proj-rdnap (2008-10) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Thu, 29 Aug 2019 13:06:57 +0200 + proj-rdnap (2008-9) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), libipc-run-perl, libjson-perl, libjson-xs-perl -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/proj-rdnap Vcs-Git: https://salsa.debian.org/debian-gis-team/proj-rdnap.git Homepage: https://www.kadaster.nl/transformatie-van-coordinaten View it on GitLab: https://salsa.debian.org/debian-gis-team/proj-rdnap/commit/226e8728a7aaa55327ae24ad51b4eccdfb8fd4dd -- View it on GitLab: https://salsa.debian.org/debian-gis-team/proj-rdnap/commit/226e8728a7aaa55327ae24ad51b4eccdfb8fd4dd You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:44:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:44:29 +0000 Subject: [Git][debian-gis-team/protozero][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923efd10535_46f62ac0f98dfce02786c7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / protozero Commits: 57732368 by Bas Couwenberg at 2019-09-30T17:44:21Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +protozero (1.6.8-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:44:20 +0200 + protozero (1.6.8-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9), libprotobuf-dev, protobuf-compiler, pkg-config -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/protozero/ Vcs-Git: https://salsa.debian.org/debian-gis-team/protozero.git Homepage: https://github.com/mapbox/protozero View it on GitLab: https://salsa.debian.org/debian-gis-team/protozero/commit/57732368ab418df084bded0d3cf3da91c144c4ac -- View it on GitLab: https://salsa.debian.org/debian-gis-team/protozero/commit/57732368ab418df084bded0d3cf3da91c144c4ac You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:44:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:44:46 +0000 Subject: [Git][debian-gis-team/pycoast][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f0e8710d_46f62ac0fbe48aa42788e6@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pycoast Commits: 9b37e0e8 by Bas Couwenberg at 2019-09-30T17:44:39Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pycoast (1.2.3+dfsg-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:44:38 +0200 + pycoast (1.2.3+dfsg-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -14,7 +14,7 @@ Build-Depends: debhelper (>= 11), python3-setuptools, python3-six, python3-sphinx -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pycoast Vcs-Git: https://salsa.debian.org/debian-gis-team/pycoast.git Homepage: https://github.com/pytroll/pycoast View it on GitLab: https://salsa.debian.org/debian-gis-team/pycoast/commit/9b37e0e8d5f24ec68f99406777776cfbe6d64e2f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycoast/commit/9b37e0e8d5f24ec68f99406777776cfbe6d64e2f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:45:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:45:04 +0000 Subject: [Git][debian-gis-team/pycsw][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f20a510c_46f62ac0fe4cd06c279048@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pycsw Commits: 190e64f6 by Bas Couwenberg at 2019-09-30T17:44:56Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pycsw (2.4.1+dfsg-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:44:54 +0200 + pycsw (2.4.1+dfsg-1) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -28,7 +28,7 @@ Build-Depends: apache2-dev, python3-tz, python3-xmltodict, ronn -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pycsw Vcs-Git: https://salsa.debian.org/debian-gis-team/pycsw.git Homepage: http://pycsw.org View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/commit/190e64f6697514ff8d6c4fad5e6b1602249f7d9a -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pycsw/commit/190e64f6697514ff8d6c4fad5e6b1602249f7d9a You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:45:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:45:24 +0000 Subject: [Git][debian-gis-team/pydecorate][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f34967b_46f62ac0fbe48aa4279216@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pydecorate Commits: 67059453 by Bas Couwenberg at 2019-09-30T17:45:15Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -2,7 +2,7 @@ pydecorate (0.2.1-2) UNRELEASED; urgency=medium * Team upload. 
* Update gbp.conf to use --source-only-changes by default. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Sun, 07 Jul 2019 09:38:02 +0200 ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 11), python3-pytest, python3-setuptools, python3-trollimage -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pydecorate Vcs-Git: https://salsa.debian.org/debian-gis-team/pydecorate.git Homepage: https://github.com/pytroll/pydecorate View it on GitLab: https://salsa.debian.org/debian-gis-team/pydecorate/commit/670594533c2f377d7d2a435924846563317b854b -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pydecorate/commit/670594533c2f377d7d2a435924846563317b854b You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:45:39 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:45:39 +0000 Subject: [Git][debian-gis-team/pyepr][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f43b329f_46f62ac0fbe48aa42794b4@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyepr Commits: e556bbb5 by Bas Couwenberg at 2019-09-30T17:45:31Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyepr (1.0.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:45:30 +0200 + pyepr (1.0.0-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -17,7 +17,7 @@ Build-Depends: debhelper (>= 12), python3-numpy-dbg, python3-sphinx, texlive-latex-extra -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyepr Vcs-Git: https://salsa.debian.org/debian-gis-team/pyepr.git Homepage: https://avalentino.github.com/pyepr View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/e556bbb5b2dc0b4282802cf8be6fc83899d1901d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyepr/commit/e556bbb5b2dc0b4282802cf8be6fc83899d1901d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:46:05 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:46:05 +0000 Subject: [Git][debian-gis-team/pygac][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f5d41ae0_46f62ac0fbe48aa42797ba@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pygac Commits: 50e30023 by Bas Couwenberg at 2019-09-30T17:45:56Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pygac (1.1.0-4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. 
+ + -- Antonio Valentino Mon, 30 Sep 2019 19:45:55 +0200 + pygac (1.1.0-3) unstable; urgency=medium * Correctly install the configuration file (Closes: #933536). ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper-compat (= 12), python3-pyorbital, python3-scipy, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pygac Vcs-Git: https://salsa.debian.org/debian-gis-team/pygac.git Homepage: https://github.com/pytroll/pygac View it on GitLab: https://salsa.debian.org/debian-gis-team/pygac/commit/50e30023d7244d2ce60db4eaff7eb7cb0781f7c6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pygac/commit/50e30023d7244d2ce60db4eaff7eb7cb0781f7c6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:46:37 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:46:37 +0000 Subject: [Git][debian-gis-team/pykdtree][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f7d3c8cf_46f62ac0ff19eb8827993c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pykdtree Commits: a8e428a6 by Bas Couwenberg at 2019-09-30T17:46:23Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pykdtree (1.3.1-5) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:46:21 +0200 + pykdtree (1.3.1-4) unstable; urgency=medium * Update autopkgtest in debian/tests to use Python 3 instead of Python 2. ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 12), python3-nose, python3-numpy, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pykdtree Vcs-Git: https://salsa.debian.org/debian-gis-team/pykdtree.git Homepage: https://github.com/pytroll/pykdtree View it on GitLab: https://salsa.debian.org/debian-gis-team/pykdtree/commit/a8e428a663bec91a3240ae63b82f9f4fb5837fa4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pykdtree/commit/a8e428a663bec91a3240ae63b82f9f4fb5837fa4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:46:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:46:57 +0000 Subject: [Git][debian-gis-team/pyninjotiff][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923f916af8e_46f63fbab65044882801f3@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyninjotiff Commits: 5d2fba1f by Bas Couwenberg at 2019-09-30T17:46:47Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyninjotiff (0.2.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. 
+ + -- Antonio Valentino Mon, 30 Sep 2019 19:46:45 +0200 + pyninjotiff (0.2.0-1) unstable; urgency=medium [ Bas Couwenberg ] ===================================== debian/control ===================================== @@ -17,7 +17,7 @@ Build-Depends: debhelper-compat (= 12), python3-six, python3-trollimage, python3-xarray -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyninjotiff Vcs-Git: https://salsa.debian.org/debian-gis-team/pyninjotiff.git Homepage: https://github.com/pytroll/pyninjotiff View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/commit/5d2fba1f8ad8b308946808d14e5c0b4f3ca626f0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyninjotiff/commit/5d2fba1f8ad8b308946808d14e5c0b4f3ca626f0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:47:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:47:17 +0000 Subject: [Git][debian-gis-team/pylibtiff][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923fa5186f3_46f63fbab650448828036e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pylibtiff Commits: d5d53add by Bas Couwenberg at 2019-09-30T17:47:09Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -2,6 +2,7 @@ pylibtiff (0.4.2-7) UNRELEASED; urgency=medium * Drop obsolete get-orig-source script. * Set compat to 12. + * Bump Standards-Version to 4.4.1, no changes. -- Antonio Valentino Sun, 21 Jul 2019 23:55:32 +0200 ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 12), python3-bitarray, python3-numpy, python3-pytest -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pylibtiff Vcs-Git: https://salsa.debian.org/debian-gis-team/pylibtiff.git Homepage: https://github.com/pearu/pylibtiff View it on GitLab: https://salsa.debian.org/debian-gis-team/pylibtiff/commit/d5d53add7e706fd51cbccead7068a3b2d3037884 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pylibtiff/commit/d5d53add7e706fd51cbccead7068a3b2d3037884 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:47:37 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:47:37 +0000 Subject: [Git][debian-gis-team/pyorbital][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923fb93a439_46f62ac0fbe48aa4280562@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyorbital Commits: 0ca62386 by Bas Couwenberg at 2019-09-30T17:47:29Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ pyorbital (1.5.0-5) UNRELEASED; urgency=medium * Drop debian/compat file, and depend on debelper-compat. + * Bump Standards-Version to 4.4.1, no changes. 
-- Antonio Valentino Tue, 20 Aug 2019 19:40:32 +0000 ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper-compat (= 12), python3-setuptools, python3-sphinx, python3-xarray -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyorbital Vcs-Git: https://salsa.debian.org/debian-gis-team/pyorbital.git Homepage: https://github.com/pytroll/pyorbital View it on GitLab: https://salsa.debian.org/debian-gis-team/pyorbital/commit/0ca62386c12bc063a7a0545f9b303026644f0de4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyorbital/commit/0ca62386c12bc063a7a0545f9b303026644f0de4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:47:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:47:59 +0000 Subject: [Git][debian-gis-team/pyosmium][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923fcf853a4_46f62ac0f98dfce02807b7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyosmium Commits: f544c343 by Bas Couwenberg at 2019-09-30T17:47:51Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyosmium (2.15.3-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:47:49 +0200 + pyosmium (2.15.3-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -21,7 +21,7 @@ Build-Depends: cmake (>= 2.8.12), python3-shapely, python3-sphinx, python3-sphinxcontrib.autoprogram -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyosmium/ Vcs-Git: https://salsa.debian.org/debian-gis-team/pyosmium.git Homepage: https://osmcode.org/pyosmium/ View it on GitLab: https://salsa.debian.org/debian-gis-team/pyosmium/commit/f544c3435b9cd9bcf01195e82c6790cec3a07374 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyosmium/commit/f544c3435b9cd9bcf01195e82c6790cec3a07374 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:48:22 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:48:22 +0000 Subject: [Git][debian-gis-team/pyresample][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d923fe6d3684_46f62ac0ff19eb882809d1@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyresample Commits: 81e67c13 by Bas Couwenberg at 2019-09-30T17:48:13Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyresample (1.13.1-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:48:12 +0200 + pyresample (1.13.1-1) unstable; urgency=medium * New upstream release. 
===================================== debian/control ===================================== @@ -24,7 +24,7 @@ Build-Depends: cython3, python3-sphinx, python3-xarray, python3-yaml -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyresample Vcs-Git: https://salsa.debian.org/debian-gis-team/pyresample.git Homepage: https://github.com/pytroll/pyresample View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/81e67c1396b6a2c9f8428a3c33a04fe34c00d0cf -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyresample/commit/81e67c1396b6a2c9f8428a3c33a04fe34c00d0cf You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:48:49 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:48:49 +0000 Subject: [Git][debian-gis-team/pysal][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92400153afa_46f62ac0fe4cd06c2811da@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / pysal Commits: 21b17328 by Bas Couwenberg at 2019-09-30T17:48:39Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -2,7 +2,7 @@ pysal (2.1.0-1) UNRELEASED; urgency=medium * Team upload. * New upstream release. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add python3-{deprecated,descartes,tqdm} to (build) dependencies. -- Bas Couwenberg Mon, 29 Jul 2019 19:59:17 +0200 ===================================== debian/control ===================================== @@ -23,7 +23,7 @@ Build-Depends: debhelper (>= 9), python3-sklearn (>= 0.21.0), python3-statsmodels, python3-tqdm -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pysal Vcs-Git: https://salsa.debian.org/debian-gis-team/pysal.git -b experimental Homepage: https://pysal.readthedocs.org/en/latest/ View it on GitLab: https://salsa.debian.org/debian-gis-team/pysal/commit/21b17328f1f44ae945938075a3596809b655f397 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pysal/commit/21b17328f1f44ae945938075a3596809b655f397 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:49:11 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:49:11 +0000 Subject: [Git][debian-gis-team/pyshp][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9240171ddc0_46f62ac0f98dfce02813cf@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyshp Commits: 5bef89fe by Bas Couwenberg at 2019-09-30T17:49:01Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyshp (2.1.0+ds-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:49:00 +0200 + pyshp (2.1.0+ds-2) unstable; urgency=medium * Team upload. 
===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Build-Depends: debhelper (>= 9), dh-python, python3-all, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyshp Vcs-Git: https://salsa.debian.org/debian-gis-team/pyshp.git Homepage: https://github.com/GeospatialPython/pyshp View it on GitLab: https://salsa.debian.org/debian-gis-team/pyshp/commit/5bef89fefab70a7667b85ed35b4afc09925d3476 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyshp/commit/5bef89fefab70a7667b85ed35b4afc09925d3476 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:49:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:49:29 +0000 Subject: [Git][debian-gis-team/pyspectral][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92402978a43_46f62ac0fbe48aa428154b@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pyspectral Commits: 456016b1 by Bas Couwenberg at 2019-09-30T17:49:21Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pyspectral (0.9.0+ds-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:49:20 +0200 + pyspectral (0.9.0+ds-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -23,7 +23,7 @@ Build-Depends: debhelper-compat (= 12), python3-tqdm, python3-xlrd, python3-yaml -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pyspectral Vcs-Git: https://salsa.debian.org/debian-gis-team/pyspectral.git Homepage: https://github.com/pytroll/pyspectral View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/commit/456016b18c6790fd177ed8aae56f95683a335ea7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pyspectral/commit/456016b18c6790fd177ed8aae56f95683a335ea7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:49:50 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:49:50 +0000 Subject: [Git][debian-gis-team/python-affine][master] 2 commits: Remove myself from Uploaders. Message-ID: <5d92403e165cf_46f62ac0fbe48aa428178@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-affine Commits: 36be9e2d by Bas Couwenberg at 2019-09-11T14:54:47Z Remove myself from Uploaders. - - - - - facbae75 by Bas Couwenberg at 2019-09-30T17:49:41Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,11 @@ +python-affine (2.3.0-2) UNRELEASED; urgency=medium + + * Team upload. + * Remove myself from Uploaders. + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Wed, 11 Sep 2019 16:54:19 +0200 + python-affine (2.3.0-1) unstable; urgency=medium * New upstream release. 
===================================== debian/control ===================================== @@ -1,7 +1,6 @@ Source: python-affine Maintainer: Debian GIS Project -Uploaders: Johan Van de Wauw , - Bas Couwenberg +Uploaders: Johan Van de Wauw Section: python Priority: optional Build-Depends: debhelper (>= 9), @@ -9,7 +8,7 @@ Build-Depends: debhelper (>= 9), python3-all, python3-pytest, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-affine Vcs-Git: https://salsa.debian.org/debian-gis-team/python-affine.git Homepage: https://github.com/sgillies/affine View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/compare/542a040e41296624866b749ab94d442479caea73...facbae75748e16dedaf3160c358b4a61b8b686d0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-affine/compare/542a040e41296624866b749ab94d442479caea73...facbae75748e16dedaf3160c358b4a61b8b686d0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:50:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:50:07 +0000 Subject: [Git][debian-gis-team/python-cartopy][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92404f7614a_46f62ac0ff19eb882819eb@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-cartopy Commits: df70e408 by Bas Couwenberg at 2019-09-30T17:50:00Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-cartopy (0.17.0+dfsg-7) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:49:59 +0200 + python-cartopy (0.17.0+dfsg-6) unstable; urgency=medium * Remove Python 2 from debian/tests. ===================================== debian/control ===================================== @@ -26,7 +26,7 @@ Build-Depends: cython3, python3-tk, xauth, xvfb -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-cartopy Vcs-Git: https://salsa.debian.org/debian-gis-team/python-cartopy.git Homepage: https://scitools.org.uk/cartopy/ View it on GitLab: https://salsa.debian.org/debian-gis-team/python-cartopy/commit/df70e4082bdbcc951d8c194e176acb200cdb1047 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-cartopy/commit/df70e4082bdbcc951d8c194e176acb200cdb1047 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:50:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:50:29 +0000 Subject: [Git][debian-gis-team/python-click-plugins][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924065f95f_46f62ac0ff19eb8828217e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-click-plugins Commits: 8701e678 by Bas Couwenberg at 2019-09-30T17:50:21Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-click-plugins (1.1.1-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:50:20 +0200 + python-click-plugins (1.1.1-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 9), python3-click, python3-pytest, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-click-plugins/ Vcs-Git: https://salsa.debian.org/debian-gis-team/python-click-plugins.git Homepage: https://github.com/click-contrib/click-plugins View it on GitLab: https://salsa.debian.org/debian-gis-team/python-click-plugins/commit/8701e67891d15df2c08b5b725b8854d3258de96d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-click-plugins/commit/8701e67891d15df2c08b5b725b8854d3258de96d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:50:52 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:50:52 +0000 Subject: [Git][debian-gis-team/python-cligj][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92407c106e9_46f62ac0ff19eb882823c5@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-cligj Commits: dd39ccdf by Bas Couwenberg at 2019-09-30T17:50:45Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-cligj (0.5.0-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:50:44 +0200 + python-cligj (0.5.0-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Build-Depends: debhelper (>= 9), python3-setuptools, python3-click, python3-all, -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-cligj Vcs-Git: https://salsa.debian.org/debian-gis-team/python-cligj.git Homepage: https://github.com/mapbox/cligj View it on GitLab: https://salsa.debian.org/debian-gis-team/python-cligj/commit/dd39ccdf5797c690aa1a16ea07c3b92ae1c1d2e7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-cligj/commit/dd39ccdf5797c690aa1a16ea07c3b92ae1c1d2e7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:51:21 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:51:21 +0000 Subject: [Git][debian-gis-team/python-deprecated][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924099ca162_46f62ac0fbe48aa42827b0@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-deprecated Commits: 1e9579b2 by Bas Couwenberg at 2019-09-30T17:51:12Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-deprecated (1.2.6-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:51:11 +0200 + python-deprecated (1.2.6-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9), python3-sphinx, python3-wrapt, tox -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-deprecated Vcs-Git: https://salsa.debian.org/debian-gis-team/python-deprecated.git Homepage: https://github.com/tantale/deprecated View it on GitLab: https://salsa.debian.org/debian-gis-team/python-deprecated/commit/1e9579b2275a8a4ff925a8b70551761d6fcc8650 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-deprecated/commit/1e9579b2275a8a4ff925a8b70551761d6fcc8650 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:51:53 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:51:53 +0000 Subject: [Git][debian-gis-team/python-descartes][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9240b91236c_46f62ac0fe4cd06c28296a@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-descartes Commits: a405591d by Bas Couwenberg at 2019-09-30T17:51:44Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-descartes (1.1.0-4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:51:43 +0200 + python-descartes (1.1.0-3) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9), python3-matplotlib, python3-setuptools, python3-shapely -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-descartes/ Vcs-Git: https://salsa.debian.org/debian-gis-team/python-descartes.git Homepage: https://bitbucket.org/sgillies/descartes View it on GitLab: https://salsa.debian.org/debian-gis-team/python-descartes/commit/a405591ded1f830d76f92b3d41ce3393a0def739 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-descartes/commit/a405591ded1f830d76f92b3d41ce3393a0def739 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:52:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:52:19 +0000 Subject: [Git][debian-gis-team/python-geojson][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9240d379fec_46f62ac0fbe48aa42835e7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-geojson Commits: d9535c75 by Bas Couwenberg at 2019-09-30T17:52:06Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-geojson (2.5.0-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:52:04 +0200 + python-geojson (2.5.0-2) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), dh-python, python3-all, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-geojson Vcs-Git: https://salsa.debian.org/debian-gis-team/python-geojson.git Homepage: https://github.com/frewsxcv/python-geojson View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geojson/commit/d9535c75de8a30699eef23638992d89567b0a0dd -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geojson/commit/d9535c75de8a30699eef23638992d89567b0a0dd You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:52:58 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:52:58 +0000 Subject: [Git][debian-gis-team/python-geopandas][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9240fab6da8_46f62ac0f98dfce028371f@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-geopandas Commits: 819cc1d0 by Bas Couwenberg at 2019-09-30T17:52:50Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-geopandas (0.6.0-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:52:49 +0200 + python-geopandas (0.6.0-1) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -29,7 +29,7 @@ Build-Depends: debhelper (>= 9), python3-six, python3-shapely, python3-sqlalchemy -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-geopandas/ Vcs-Git: https://salsa.debian.org/debian-gis-team/python-geopandas.git Homepage: http://www.geopandas.org View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/commit/819cc1d0cb394d89b13aa85473e6ec80f507e0a0 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geopandas/commit/819cc1d0cb394d89b13aa85473e6ec80f507e0a0 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:53:19 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:53:19 +0000 Subject: [Git][debian-gis-team/python-geotiepoints][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92410fc2003_46f62ac0fe4cd06c284259@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-geotiepoints Commits: b1a28879 by Bas Couwenberg at 2019-09-30T17:53:12Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-geotiepoints (1.1.8-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:53:11 +0200 + python-geotiepoints (1.1.8-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -14,7 +14,7 @@ Build-Depends: debhelper-compat (= 12), python3-scipy, python3-setuptools, python3-xarray -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-geotiepoints Vcs-Git: https://salsa.debian.org/debian-gis-team/python-geotiepoints.git Homepage: https://github.com/pytroll/python-geotiepoints View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geotiepoints/commit/b1a2887979114d05014ed77ec4961974669aec59 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-geotiepoints/commit/b1a2887979114d05014ed77ec4961974669aec59 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:53:38 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:53:38 +0000 Subject: [Git][debian-gis-team/python-hdf4][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9241224d2fb_46f62ac0ff19eb882846a4@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-hdf4 Commits: 3ee42a11 by Bas Couwenberg at 2019-09-30T17:53:30Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-hdf4 (0.10.1-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:53:29 +0200 + python-hdf4 (0.10.1-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper-compat (= 12), python3-numpy, python3-setuptools, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-hdf4 Vcs-Git: https://salsa.debian.org/debian-gis-team/python-hdf4.git Homepage: https://github.com/fhs/python-hdf4 View it on GitLab: https://salsa.debian.org/debian-gis-team/python-hdf4/commit/3ee42a11889fbe7c64ea62aba3d2b61dc3dbc4f2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-hdf4/commit/3ee42a11889fbe7c64ea62aba3d2b61dc3dbc4f2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:53:57 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:53:57 +0000 Subject: [Git][debian-gis-team/python-mapnik][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924135c0e3e_46f62ac0fe4cd06c284838@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-mapnik Commits: 71c97978 by Bas Couwenberg at 2019-09-30T17:53:49Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-mapnik (1:0.0~20180723-588fc9062-4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:53:48 +0200 + python-mapnik (1:0.0~20180723-588fc9062-3) unstable; urgency=medium * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), python3-cairo, python3-cairo-dev, python3-nose -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-mapnik Vcs-Git: https://salsa.debian.org/debian-gis-team/python-mapnik.git Homepage: https://github.com/mapnik/python-mapnik View it on GitLab: https://salsa.debian.org/debian-gis-team/python-mapnik/commit/71c979787db6889695ef46a3a4057493e12d6f91 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-mapnik/commit/71c979787db6889695ef46a3a4057493e12d6f91 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:54:16 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:54:16 +0000 Subject: [Git][debian-gis-team/python-osmapi][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924148dc33c_46f62ac0fbe48aa428547c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-osmapi Commits: aaa0bc51 by Bas Couwenberg at 2019-09-30T17:54:09Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-osmapi (1.2.2-4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:54:08 +0200 + python-osmapi (1.2.2-3) unstable; urgency=medium * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -14,7 +14,7 @@ Build-Depends: debhelper (>= 9), python3-requests, python3-xmltodict, tox -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-osmapi Vcs-Git: https://salsa.debian.org/debian-gis-team/python-osmapi.git Homepage: https://wiki.openstreetmap.org/wiki/Osmapi View it on GitLab: https://salsa.debian.org/debian-gis-team/python-osmapi/commit/aaa0bc51b316bce2fa2ea54030be50e4f5ff1f65 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-osmapi/commit/aaa0bc51b316bce2fa2ea54030be50e4f5ff1f65 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:54:35 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:54:35 +0000 Subject: [Git][debian-gis-team/python-pdal][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92415b1560e_46f62ac0ff19eb882866aa@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pdal Commits: e17f6b2c by Bas Couwenberg at 2019-09-30T17:54:27Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-pdal (2.2.2+ds-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:54:26 +0200 + python-pdal (2.2.2+ds-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), python3-numpy, python3-packaging, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-pdal Vcs-Git: https://salsa.debian.org/debian-gis-team/python-pdal.git Homepage: https://github.com/PDAL/python View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/e17f6b2ca2bf4ab8d600ecd6cb368593e4167012 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/e17f6b2ca2bf4ab8d600ecd6cb368593e4167012 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:54:56 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:54:56 +0000 Subject: [Git][debian-gis-team/python-pyproj][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924170d8af4_46f62ac0ff19eb882868b6@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-pyproj Commits: b41e8309 by Bas Couwenberg at 2019-09-30T17:54:46Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-pyproj (2.4.0+ds-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:54:44 +0200 + python-pyproj (2.4.0+ds-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -16,7 +16,7 @@ Build-Depends: debhelper (>= 9), python3-pytest, python3-pytest-cov, python3-shapely -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-pyproj Vcs-Git: https://salsa.debian.org/debian-gis-team/python-pyproj.git Homepage: https://github.com/pyproj4/pyproj View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/b41e83092a448a6707e19a58316c6df25f113820 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pyproj/commit/b41e83092a448a6707e19a58316c6df25f113820 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:55:17 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:55:17 +0000 Subject: [Git][debian-gis-team/python-rtree][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d924185bf118_46f62ac0fbe48aa42870a2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-rtree Commits: 6c9dcbad by Bas Couwenberg at 2019-09-30T17:55:08Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-rtree (0.8.3+ds-5) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:55:07 +0200 + python-rtree (0.8.3+ds-4) unstable; urgency=medium * Rename doc-base document to python3-rtee to avoid conflict. ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9), python3-numpy, python3-setuptools, python3-sphinx -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-rtree Vcs-Git: https://salsa.debian.org/debian-gis-team/python-rtree.git Homepage: https://github.com/Toblerity/rtree View it on GitLab: https://salsa.debian.org/debian-gis-team/python-rtree/commit/6c9dcbadededf79206793cbfed3d3f0ef467b960 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-rtree/commit/6c9dcbadededf79206793cbfed3d3f0ef467b960 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:55:40 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:55:40 +0000 Subject: [Git][debian-gis-team/python-shapely][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92419c7c418_46f63fbab6504488287255@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-shapely Commits: 588f4b2c by Bas Couwenberg at 2019-09-30T17:55:31Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ python-shapely (1.6.4-4) UNRELEASED; urgency=medium * Update Homepage URL. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Tue, 30 Jul 2019 10:55:24 +0200 ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9), python3-pytest, python3-setuptools, cython3 -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-shapely Vcs-Git: https://salsa.debian.org/debian-gis-team/python-shapely.git Homepage: https://github.com/Toblerity/Shapely View it on GitLab: https://salsa.debian.org/debian-gis-team/python-shapely/commit/588f4b2c06357d4a1abdca9af77119508a3185f6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-shapely/commit/588f4b2c06357d4a1abdca9af77119508a3185f6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:56:08 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:56:08 +0000 Subject: [Git][debian-gis-team/python-snuggs][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9241b8c9e32_46f62ac0ff19eb8828845@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-snuggs Commits: d13f3982 by Bas Couwenberg at 2019-09-30T17:55:57Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +python-snuggs (1.4.7-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:55:53 +0200 + python-snuggs (1.4.7-1) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), python3-pytest, python3-click, python3-pyparsing, -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-snuggs Vcs-Git: https://salsa.debian.org/debian-gis-team/python-snuggs.git Homepage: https://github.com/mapbox/snuggs View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/commit/d13f3982dba59c5531bc027771bfcfc9bb674f9f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-snuggs/commit/d13f3982dba59c5531bc027771bfcfc9bb674f9f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:56:33 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:56:33 +0000 Subject: [Git][debian-gis-team/python-stetl][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9241d145dda_46f63fbab6504488288632@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / python-stetl Commits: 4615e970 by Bas Couwenberg at 2019-09-30T17:56:23Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ python-stetl (2.0+ds-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Wed, 10 Jul 2019 19:12:25 +0200 ===================================== debian/control ===================================== @@ -23,7 +23,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/python-stetl Vcs-Git: https://salsa.debian.org/debian-gis-team/python-stetl.git Homepage: http://stetl.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/python-stetl/commit/4615e970add47a64bf3e1cfed5a5f938043c7d70 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/python-stetl/commit/4615e970add47a64bf3e1cfed5a5f938043c7d70 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:57:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:57:00 +0000 Subject: [Git][debian-gis-team/pytroll-schedule][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9241ecbd628_46f63fbab650448828907c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pytroll-schedule Commits: 8cc7b1e0 by Bas Couwenberg at 2019-09-30T17:56:53Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pytroll-schedule (0.5.2-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 19:56:52 +0200 + pytroll-schedule (0.5.2-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper-compat (= 12), python3-pyorbital, python3-pyresample, python3-setuptools -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pytroll-schedule Vcs-Git: https://salsa.debian.org/debian-gis-team/pytroll-schedule.git Homepage: https://github.com/pytroll/pytroll-schedule View it on GitLab: https://salsa.debian.org/debian-gis-team/pytroll-schedule/commit/8cc7b1e0dbb7d215a25f007c820f388f638e75ab -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pytroll-schedule/commit/8cc7b1e0dbb7d215a25f007c820f388f638e75ab You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:57:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:57:25 +0000 Subject: [Git][debian-gis-team/pywps][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924205d496e_46f62ac0fbe48aa428988e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / pywps Commits: 670503b0 by Bas Couwenberg at 2019-09-30T17:57:17Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +pywps (4.2.2-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:57:16 +0200 + pywps (4.2.2-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -23,7 +23,7 @@ Build-Depends: debhelper (>= 9), python3-sphinx, python3-sqlalchemy, python3-werkzeug -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/pywps Vcs-Git: https://salsa.debian.org/debian-gis-team/pywps.git Homepage: https://pywps.org View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/commit/670503b0f629f1480c803281e0684a476c42af26 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/pywps/commit/670503b0f629f1480c803281e0684a476c42af26 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:58:02 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:58:02 +0000 Subject: [Git][debian-gis-team/qmapshack][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92422a609c3_46f62ac0ff19eb882900f6@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / qmapshack Commits: 9016a322 by Bas Couwenberg at 2019-09-30T17:57:53Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,7 @@ qmapshack (1.13.2-2) UNRELEASED; urgency=medium * Update URLs for move to GitHub. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Thu, 19 Sep 2019 06:19:19 +0200 ===================================== debian/control ===================================== @@ -17,7 +17,7 @@ Build-Depends: cmake, qttools5-dev, qttools5-dev-tools, qtwebengine5-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/qmapshack Vcs-Git: https://salsa.debian.org/debian-gis-team/qmapshack.git Homepage: https://github.com/Maproom/qmapshack/wiki View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/9016a3222c66545d5dc088e492db144c017a4086 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/qmapshack/commit/9016a3222c66545d5dc088e492db144c017a4086 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:58:30 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:58:30 +0000 Subject: [Git][debian-gis-team/rasterio][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92424659514_46f62ac0fe4cd06c290270@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / rasterio Commits: a74a3707 by Bas Couwenberg at 2019-09-30T17:58:22Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +rasterio (1.0.28-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:58:20 +0200 + rasterio (1.0.28-1) unstable; urgency=medium * Team upload. ===================================== debian/control ===================================== @@ -21,7 +21,7 @@ Build-Depends: debhelper (>= 9), python3-pytest, python3-setuptools, python3-snuggs (>= 1.4.1) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/rasterio Vcs-Git: https://salsa.debian.org/debian-gis-team/rasterio.git Homepage: https://github.com/mapbox/rasterio View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/a74a3707a8946f64a79da9d046f9135ac31f92f6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rasterio/commit/a74a3707a8946f64a79da9d046f9135ac31f92f6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:59:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:59:00 +0000 Subject: [Git][debian-gis-team/readosm][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92426457462_46f62ac0ff19eb8829041c@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / readosm Commits: 12f78e04 by Bas Couwenberg at 2019-09-30T17:58:52Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ readosm (1.1.0+dfsg-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add Build-Depends-Package field to symbols file. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -11,7 +11,7 @@ Build-Depends: debhelper (>= 9.20160114), zlib1g-dev, doxygen, graphviz -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/readosm Vcs-Git: https://salsa.debian.org/debian-gis-team/readosm.git Homepage: https://www.gaia-gis.it/fossil/readosm/index View it on GitLab: https://salsa.debian.org/debian-gis-team/readosm/commit/12f78e04e2406eac88c2a8cb3317d5bb8bb93064 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/readosm/commit/12f78e04e2406eac88c2a8cb3317d5bb8bb93064 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:59:27 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:59:27 +0000 Subject: [Git][debian-gis-team/routino][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92427f1b8db_46f63fbab6504488290665@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / routino Commits: ba2d91d9 by Bas Couwenberg at 2019-09-30T17:59:18Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +routino (3.3.2-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 19:59:17 +0200 + routino (3.3.2-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper (>= 9.20160114), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/routino Vcs-Git: https://salsa.debian.org/debian-gis-team/routino.git Homepage: http://www.routino.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/commit/ba2d91d9b34471a0791d77f257fc00f5320985d2 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/routino/commit/ba2d91d9b34471a0791d77f257fc00f5320985d2 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 18:59:46 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 17:59:46 +0000 Subject: [Git][debian-gis-team/rtklib][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9242921ac14_46f62ac0fbe48aa429083e@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / rtklib Commits: 4e5e43c1 by Bas Couwenberg at 2019-09-30T17:59:37Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -8,7 +8,7 @@ rtklib (2.4.3+dfsg1-2) UNRELEASED; urgency=medium * Use standalone license paragraphs. * Reduce debhelper dependency version to 9, to match compat level. * Update Vcs-* URLs for Salsa. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sun, 28 Jan 2018 09:57:05 +0100 ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), qtbase5-dev-tools, libqt5serialport5-dev, libpng-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/rtklib Vcs-Git: https://salsa.debian.org/debian-gis-team/rtklib.git Homepage: http://gpspp.sakura.ne.jp/rtklib/rtklib.htm View it on GitLab: https://salsa.debian.org/debian-gis-team/rtklib/commit/4e5e43c190be413352cf54c8b084654a1c576204 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/rtklib/commit/4e5e43c190be413352cf54c8b084654a1c576204 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:00:14 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:00:14 +0000 Subject: [Git][debian-gis-team/ruby-hdfeos5][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9242ae241fd_46f63fbab65044882910ec@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ruby-hdfeos5 Commits: db23262c by Bas Couwenberg at 2019-09-30T18:00:03Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ ruby-hdfeos5 (1.2-10) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9~), libhe5-hdfeos-dev, ruby-narray, ruby-narray-miss -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/ruby-hdfeos5 Vcs-Git: https://salsa.debian.org/debian-gis-team/ruby-hdfeos5.git Homepage: http://ruby.gfd-dennou.org/products/ruby-hdfeos5/ View it on GitLab: https://salsa.debian.org/debian-gis-team/ruby-hdfeos5/commit/db23262c9b50eb6c4e229bd3b65ec71b333a3969 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ruby-hdfeos5/commit/db23262c9b50eb6c4e229bd3b65ec71b333a3969 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From gitlab at salsa.debian.org Mon Sep 30 19:00:34 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:00:34 +0000 Subject: [Git][debian-gis-team/ruby-netcdf][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9242c236698_46f63fbab6504488291282@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / ruby-netcdf Commits: c422aa39 by Bas Couwenberg at 2019-09-30T18:00:25Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ ruby-netcdf (0.7.2-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sun, 05 Aug 2018 20:58:00 +0200 ===================================== debian/control ===================================== @@ -10,7 +10,7 @@ Build-Depends: debhelper (>= 9~), ruby-narray-miss, libnetcdf-dev, netcdf-bin -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/ruby-netcdf Vcs-Git: https://salsa.debian.org/debian-gis-team/ruby-netcdf.git Homepage: http://ruby.gfd-dennou.org/products/ruby-netcdf View it on GitLab: https://salsa.debian.org/debian-gis-team/ruby-netcdf/commit/c422aa39c80c0a094b7ae89afd975660c0f84dd7 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/ruby-netcdf/commit/c422aa39c80c0a094b7ae89afd975660c0f84dd7 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:01:00 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:01:00 +0000 Subject: [Git][debian-gis-team/saga][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9242dc278e1_46f62ac0fe4cd06c2914a7@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / saga Commits: f5a62d73 by Bas Couwenberg at 2019-09-30T18:00:52Z Bump Standards-Version to 4.4.1, no changes. - - - - - 3 changed files: - debian/changelog - debian/control - debian/control.in Changes: ===================================== debian/changelog ===================================== @@ -2,6 +2,7 @@ saga (7.3.0+dfsg-3) UNRELEASED; urgency=medium * Drop obsolete libsaga transitional package. (closes: #940781) + * Bump Standards-Version to 4.4.1, no changes. 
-- Bas Couwenberg Thu, 19 Sep 2019 18:14:05 +0200 ===================================== debian/control ===================================== @@ -24,7 +24,7 @@ Build-Depends: debhelper (>= 9), libwxgtk3.0-gtk3-dev, python3-dev, swig -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/saga Vcs-Git: https://salsa.debian.org/debian-gis-team/saga.git Homepage: http://www.saga-gis.org/ ===================================== debian/control.in ===================================== @@ -24,7 +24,7 @@ Build-Depends: debhelper (>= 9), libwxgtk3.0-gtk3-dev, python3-dev, swig -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/saga Vcs-Git: https://salsa.debian.org/debian-gis-team/saga.git Homepage: http://www.saga-gis.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/f5a62d737332003e2522569a3fcff93a591dce16 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/saga/commit/f5a62d737332003e2522569a3fcff93a591dce16 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:01:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:01:25 +0000 Subject: [Git][debian-gis-team/satpy][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9242f5ee8bc_46f63fbab6504488291661@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / satpy Commits: 7fb29cd2 by Bas Couwenberg at 2019-09-30T18:01:18Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +satpy (0.16.1-4) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 20:01:16 +0200 + satpy (0.16.1-3) unstable; urgency=medium * debian/patches: ===================================== debian/control ===================================== @@ -44,7 +44,7 @@ Build-Depends: debhelper-compat (= 12), python3-trollsift, python3-xarray, python3-yaml -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/satpy Vcs-Git: https://salsa.debian.org/debian-gis-team/satpy.git Homepage: https://github.com/pytroll/satpy View it on GitLab: https://salsa.debian.org/debian-gis-team/satpy/commit/7fb29cd24446afb56d9f47c9ecbc1830392a3460 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/satpy/commit/7fb29cd24446afb56d9f47c9ecbc1830392a3460 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:01:43 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:01:43 +0000 Subject: [Git][debian-gis-team/savi][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9243075ab89_46f62ac0ff19eb882918d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / savi Commits: ef9a4d10 by Bas Couwenberg at 2019-09-30T18:01:36Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ savi (1.5.1-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update gbp.conf to use --source-only-changes by default. -- Bas Couwenberg Sun, 05 Aug 2018 20:59:08 +0200 ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), autotools-dev, tcl-dev, tk-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/savi Vcs-Git: https://salsa.debian.org/debian-gis-team/savi.git Homepage: http://savi.sourceforge.net/ View it on GitLab: https://salsa.debian.org/debian-gis-team/savi/commit/ef9a4d10917a7f7adf0decf9725475e18f67d493 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/savi/commit/ef9a4d10917a7f7adf0decf9725475e18f67d493 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:02:07 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:02:07 +0000 Subject: [Git][debian-gis-team/sfcgal][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92431fd2461_46f62ac0f98dfce0292033@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / sfcgal Commits: 7f088423 by Bas Couwenberg at 2019-09-30T18:02:00Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +sfcgal (1.3.7-3) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 20:01:59 +0200 + sfcgal (1.3.7-2) unstable; urgency=medium * Bump Standards-Version to 4.4.0, no changes. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9.20160114), libgmp-dev, pkg-kde-tools, chrpath -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/sfcgal Vcs-Git: https://salsa.debian.org/debian-gis-team/sfcgal.git Homepage: http://www.sfcgal.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/sfcgal/commit/7f088423f84f8b3c986a606bf105289cb1df569d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/sfcgal/commit/7f088423f84f8b3c986a606bf105289cb1df569d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:02:24 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:02:24 +0000 Subject: [Git][debian-gis-team/shapelib][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924330a9940_46f63fbab6504488292239@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / shapelib Commits: 8193eb44 by Bas Couwenberg at 2019-09-30T18:02:17Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ shapelib (1.5.0-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Wed, 10 Jul 2019 19:19:41 +0200 ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Priority: optional Build-Depends: debhelper (>= 9), dh-autoreconf, ronn -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/shapelib Vcs-Git: https://salsa.debian.org/debian-gis-team/shapelib.git Homepage: http://shapelib.maptools.org/ View it on GitLab: https://salsa.debian.org/debian-gis-team/shapelib/commit/8193eb448620872bcdf97f6be562dd3e89d7b8c9 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/shapelib/commit/8193eb448620872bcdf97f6be562dd3e89d7b8c9 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:02:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:02:41 +0000 Subject: [Git][debian-gis-team/snaphu][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9243416b57f_46f62ac0ff19eb882924db@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / snaphu Commits: a123e7b8 by Bas Couwenberg at 2019-09-30T18:02:34Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +snaphu (2.0.1-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 20:02:33 +0200 + snaphu (2.0.1-1) unstable; urgency=medium [ Bas Couwenberg ] ===================================== debian/control ===================================== @@ -5,7 +5,7 @@ Section: non-free/science XS-Autobuild: no Priority: optional Build-Depends: debhelper-compat (= 12) -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/snaphu Vcs-Git: https://salsa.debian.org/debian-gis-team/snaphu.git Homepage: https://web.stanford.edu/group/radar/softwareandlinks/sw/snaphu/ View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/commit/a123e7b8429a217cfd61074a83c164e74b37d646 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/snaphu/commit/a123e7b8429a217cfd61074a83c164e74b37d646 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:02:59 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:02:59 +0000 Subject: [Git][debian-gis-team/sosi2osm][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9243536c1fc_46f62ac0fbe48aa42928ed@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / sosi2osm Commits: 9e111182 by Bas Couwenberg at 2019-09-30T18:02:52Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -3,6 +3,7 @@ sosi2osm (1.0.0-7) UNRELEASED; urgency=medium * Team upload. * Add patch by Helmut Grohne to fix FTCBFS. (closes: #932065) + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Sun, 14 Jul 2019 18:20:34 +0200 ===================================== debian/control ===================================== @@ -8,7 +8,7 @@ Build-Depends: debhelper (>= 11), libfyba-dev, pkg-config, libproj-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/sosi2osm Vcs-Git: https://salsa.debian.org/debian-gis-team/sosi2osm.git Homepage: https://github.com/Gnonthgol/sosi2osm View it on GitLab: https://salsa.debian.org/debian-gis-team/sosi2osm/commit/9e111182ee2731909a8e3df495b94a72b41d5cb6 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/sosi2osm/commit/9e111182ee2731909a8e3df495b94a72b41d5cb6 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:03:20 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:03:20 +0000 Subject: [Git][debian-gis-team/spatialite][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9243686adda_46f63fbab650448829305a@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / spatialite Commits: e6175f17 by Bas Couwenberg at 2019-09-30T18:03:12Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +spatialite (5.0.0~beta0-1~exp5) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 20:03:11 +0200 + spatialite (5.0.0~beta0-1~exp4) experimental; urgency=medium * Require at least librttopo-dev 1.1.0. ===================================== debian/control ===================================== @@ -18,7 +18,7 @@ Build-Depends: debhelper (>= 9.20160114), libxml2-dev, pkg-config, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/spatialite Vcs-Git: https://salsa.debian.org/debian-gis-team/spatialite.git -b experimental Homepage: https://www.gaia-gis.it/fossil/libspatialite View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite/commit/e6175f1753f04b930bb08647e4a6113219ccecae -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite/commit/e6175f1753f04b930bb08647e4a6113219ccecae You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:03:47 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:03:47 +0000 Subject: [Git][debian-gis-team/spatialite-gui][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924383a05dd_46f62ac0fbe48aa4293273@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / spatialite-gui Commits: b3ddafe9 by Bas Couwenberg at 2019-09-30T18:03:36Z Bump Standards-Version to 4.4.1, no changes. 
- - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +spatialite-gui (2.1.0~beta0+really2.1.0~beta0-1~exp6) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 20:03:35 +0200 + spatialite-gui (2.1.0~beta0+really2.1.0~beta0-1~exp5) experimental; urgency=medium * Switch to wxWidgets GTK 3 implementation. ===================================== debian/control ===================================== @@ -22,7 +22,7 @@ Build-Depends: debhelper (>= 9.20160114), pkg-config, wx-common, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/spatialite-gui Vcs-Git: https://salsa.debian.org/debian-gis-team/spatialite-gui.git -b experimental Homepage: https://www.gaia-gis.it/fossil/spatialite_gui/ View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/commit/b3ddafe9f3a503533a0649210154b177f9eb152f -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-gui/commit/b3ddafe9f3a503533a0649210154b177f9eb152f You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:04:09 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:04:09 +0000 Subject: [Git][debian-gis-team/spatialite-tools][experimental] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d92439957c1b_46f62ac0fbe48aa4293448@godard.mail> Bas Couwenberg pushed to branch experimental at Debian GIS Project / spatialite-tools Commits: ec5776d6 by Bas Couwenberg at 2019-09-30T18:04:00Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +spatialite-tools (4.4.0~rc1-1~exp6) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Bas Couwenberg Mon, 30 Sep 2019 20:03:57 +0200 + spatialite-tools (4.4.0~rc1-1~exp5) experimental; urgency=medium * No change rebuild with PROJ 6. ===================================== debian/control ===================================== @@ -23,7 +23,7 @@ Build-Depends: debhelper (>= 9), docbook-xsl, docbook-xml, xsltproc -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/spatialite-tools Vcs-Git: https://salsa.debian.org/debian-gis-team/spatialite-tools.git -b experimental Homepage: https://www.gaia-gis.it/fossil/spatialite-tools/ View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-tools/commit/ec5776d64a69a1b42d4611aa68ce78f8fd10e2b5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/spatialite-tools/commit/ec5776d64a69a1b42d4611aa68ce78f8fd10e2b5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:04:29 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:04:29 +0000 Subject: [Git][debian-gis-team/tinyows][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9243adeb26d_46f63fbab65044882936d8@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / tinyows Commits: 462a445d by Bas Couwenberg at 2019-09-30T18:04:20Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ tinyows (1.1.1-7) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Update watch file to limit matches to archive path. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -12,7 +12,7 @@ Build-Depends: debhelper (>= 9), libpq-dev, libxml2-dev (>= 2.8.0), postgis -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/tinyows Vcs-Git: https://salsa.debian.org/debian-gis-team/tinyows.git Homepage: http://www.mapserver.org/tinyows/ View it on GitLab: https://salsa.debian.org/debian-gis-team/tinyows/commit/462a445dc1088e1cd57ed06da7b41ebb5ddcbb2e -- View it on GitLab: https://salsa.debian.org/debian-gis-team/tinyows/commit/462a445dc1088e1cd57ed06da7b41ebb5ddcbb2e You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:04:48 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:04:48 +0000 Subject: [Git][debian-gis-team/totalopenstation][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9243c0cf20d_46f62ac0f98dfce02938a2@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / totalopenstation Commits: 395c7b0e by Bas Couwenberg at 2019-09-30T18:04:41Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ totalopenstation (0.3.3-4) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add gbp.conf to use pristine-tar & --source-only-changes by default. -- Matteo F. Vescovi Thu, 05 Jul 2018 11:21:10 +0200 ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: python, python-setuptools, python-tk -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Homepage: http://tops.iosa.it/ Vcs-Git: https://salsa.debian.org/debian-gis-team/totalopenstation.git Vcs-Browser: https://salsa.debian.org/debian-gis-team/totalopenstation View it on GitLab: https://salsa.debian.org/debian-gis-team/totalopenstation/commit/395c7b0eda3f0b6798609f51fccd9c79d0cb010d -- View it on GitLab: https://salsa.debian.org/debian-gis-team/totalopenstation/commit/395c7b0eda3f0b6798609f51fccd9c79d0cb010d You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:05:12 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:05:12 +0000 Subject: [Git][debian-gis-team/trollimage][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d9243d81a70_46f62ac0fe4cd06c2940b3@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / trollimage Commits: 175e587e by Bas Couwenberg at 2019-09-30T18:05:05Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,3 +1,9 @@ +trollimage (1.10.1-2) UNRELEASED; urgency=medium + + * Bump Standards-Version to 4.4.1, no changes. + + -- Antonio Valentino Mon, 30 Sep 2019 20:05:04 +0200 + trollimage (1.10.1-1) unstable; urgency=medium * New upstream release. ===================================== debian/control ===================================== @@ -15,7 +15,7 @@ Build-Depends: debhelper-compat (= 12), python3-setuptools, python3-six, python3-xarray -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/trollimage Vcs-Git: https://salsa.debian.org/debian-gis-team/trollimage.git Homepage: https://github.com/pytroll/trollimage View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/175e587e4a1b6a9ee418a8e56d7e6adc8d2e69ce -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollimage/commit/175e587e4a1b6a9ee418a8e56d7e6adc8d2e69ce You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:05:32 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:05:32 +0000 Subject: [Git][debian-gis-team/trollsift][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9243ec2a52d_46f62ac0ff19eb88294274@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / trollsift Commits: f2a69281 by Bas Couwenberg at 2019-09-30T18:05:23Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -2,7 +2,7 @@ trollsift (0.3.2-2) UNRELEASED; urgency=medium * Team upload. * Update gbp.conf to use --source-only-changes by default. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. -- Bas Couwenberg Sun, 07 Jul 2019 10:22:36 +0200 ===================================== debian/control ===================================== @@ -9,7 +9,7 @@ Build-Depends: debhelper (>= 11), python3-all, python3-setuptools, python3-six -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/trollsift Vcs-Git: https://salsa.debian.org/debian-gis-team/trollsift.git Homepage: https://github.com/pytroll/trollsift View it on GitLab: https://salsa.debian.org/debian-gis-team/trollsift/commit/f2a69281f25fc12f7fcaa240ab8ec295bf5bccc5 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/trollsift/commit/f2a69281f25fc12f7fcaa240ab8ec295bf5bccc5 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:06:04 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:06:04 +0000 Subject: [Git][debian-gis-team/unarr][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92440c12d72_46f62ac0fbe48aa429441d@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / unarr Commits: 63a993f2 by Bas Couwenberg at 2019-09-30T18:05:57Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -2,7 +2,7 @@ unarr (1.0.1-2) UNRELEASED; urgency=medium * Team upload. * Add Build-Depends-Package field to symbols file. - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Remove package name from lintian overrides. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), cmake, libbz2-dev, zlib1g-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/unarr Vcs-Git: https://salsa.debian.org/debian-gis-team/unarr.git Homepage: https://github.com/selmf/unarr View it on GitLab: https://salsa.debian.org/debian-gis-team/unarr/commit/63a993f22bd0bffc67bed8b1f13f67887a134e01 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/unarr/commit/63a993f22bd0bffc67bed8b1f13f67887a134e01 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:06:25 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:06:25 +0000 Subject: [Git][debian-gis-team/virtualpg][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d924421ccfe8_46f62ac0fe4cd06c29463@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / virtualpg Commits: f9066d3a by Bas Couwenberg at 2019-09-30T18:06:18Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - debian/changelog - debian/control Changes: ===================================== debian/changelog ===================================== @@ -1,6 +1,6 @@ virtualpg (2.0.0~rc0-2) UNRELEASED; urgency=medium - * Bump Standards-Version to 4.4.0, no changes. + * Bump Standards-Version to 4.4.1, no changes. * Add Build-Depends-Package field to symbols file. * Update gbp.conf to use --source-only-changes by default. ===================================== debian/control ===================================== @@ -7,7 +7,7 @@ Build-Depends: debhelper (>= 9), dh-autoreconf, libpq-dev, libsqlite3-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/virtualpg Vcs-Git: https://salsa.debian.org/debian-gis-team/virtualpg.git Homepage: https://www.gaia-gis.it/fossil/virtualpg View it on GitLab: https://salsa.debian.org/debian-gis-team/virtualpg/commit/f9066d3a9d670e6a842ac9f80fcc2ca32a7c0d69 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/virtualpg/commit/f9066d3a9d670e6a842ac9f80fcc2ca32a7c0d69 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:06:41 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:06:41 +0000 Subject: [Git][debian-gis-team/zoo-project][master] Bump Standards-Version to 4.4.1, no changes. 
Message-ID: <5d92443128551_46f62ac0f98dfce0294860@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / zoo-project Commits: 098f3eb8 by Bas Couwenberg at 2019-09-30T18:06:34Z Bump Standards-Version to 4.4.1, no changes. - - - - - 1 changed file: - debian/control Changes: ===================================== debian/control ===================================== @@ -32,7 +32,7 @@ Build-Depends: debhelper (>= 9), python-all, python-dev Build-Conflicts: libcurl3-openssl-dev -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 Vcs-Browser: https://salsa.debian.org/debian-gis-team/zoo-project Vcs-Git: https://salsa.debian.org/debian-gis-team/zoo-project.git Homepage: http://zoo-project.org View it on GitLab: https://salsa.debian.org/debian-gis-team/zoo-project/commit/098f3eb850fa8c212f064d5ef664355ce2111efe -- View it on GitLab: https://salsa.debian.org/debian-gis-team/zoo-project/commit/098f3eb850fa8c212f064d5ef664355ce2111efe You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gitlab at salsa.debian.org Mon Sep 30 19:09:54 2019 From: gitlab at salsa.debian.org (Bas Couwenberg) Date: Mon, 30 Sep 2019 18:09:54 +0000 Subject: [Git][debian-gis-team/debian-gis-team.pages.debian.net][master] Bump Standards-Version to 4.4.1, no changes. Message-ID: <5d9244f253f2_46f62ac0ff19eb88296489@godard.mail> Bas Couwenberg pushed to branch master at Debian GIS Project / debian-gis-team.pages.debian.net Commits: c4986d72 by Bas Couwenberg at 2019-09-30T18:09:40Z Bump Standards-Version to 4.4.1, no changes. - - - - - 2 changed files: - policy.xml - public/policy/policy.html Changes: ===================================== policy.xml ===================================== @@ -2560,7 +2560,7 @@ Uploaders: John Doe <johndoe at example.com>, If no changes are needed, please indicate this fact in the changelog, and increment the value of the field. -Standards-Version: 4.4.0 +Standards-Version: 4.4.1 ===================================== public/policy/policy.html ===================================== @@ -43,7 +43,7 @@ Uploaders: John Doe <johndoe at example.com>
-Standards-Version: 4.4.0
+Standards-Version: 4.4.1
 

  • Homepage.  Should be documented whenever possible. View it on GitLab: https://salsa.debian.org/debian-gis-team/debian-gis-team.pages.debian.net/commit/c4986d7282b6311381b6a0bc9b3dd3bc445531e4 -- View it on GitLab: https://salsa.debian.org/debian-gis-team/debian-gis-team.pages.debian.net/commit/c4986d7282b6311381b6a0bc9b3dd3bc445531e4 You're receiving this email because of your account on salsa.debian.org. -------------- next part -------------- An HTML attachment was scrubbed... URL: