[pyresample] 01/07: New upstream version 1.8.0

Antonio Valentino a_valentino-guest at moszumanska.debian.org
Mon Feb 5 07:26:58 UTC 2018


This is an automated email from the git hooks/post-receive script.

a_valentino-guest pushed a commit to branch master
in repository pyresample.

commit d34730461cbcaf39471f55e5e408457949e51d6d
Author: Antonio Valentino <antonio.valentino at tiscali.it>
Date:   Mon Feb 5 06:45:11 2018 +0000

    New upstream version 1.8.0
---
 .bumpversion.cfg                 |   2 +-
 changelog.rst                    |  91 +++++++++-
 docs/source/plot.rst             |  50 +++---
 docs/source/swath.rst            |   9 +
 pyresample/__init__.py           |  14 +-
 pyresample/geometry.py           | 222 ++++++++++-------------
 pyresample/kd_tree.py            | 371 ++++++++++++++++++---------------------
 pyresample/test/test_geometry.py |  94 +++-------
 pyresample/test/test_utils.py    |  18 ++
 pyresample/utils.py              |  83 ++++++---
 pyresample/version.py            |   2 +-
 setup.py                         |   5 +-
 12 files changed, 494 insertions(+), 467 deletions(-)

diff --git a/.bumpversion.cfg b/.bumpversion.cfg
index e1d2934..e10021a 100644
--- a/.bumpversion.cfg
+++ b/.bumpversion.cfg
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 1.7.1
+current_version = 1.8.0
 commit = True
 tag = True
 
diff --git a/changelog.rst b/changelog.rst
index 2560491..4906438 100644
--- a/changelog.rst
+++ b/changelog.rst
@@ -2,6 +2,86 @@ Changelog
 =========
 
 
+v1.8.0 (2018-02-02)
+-------------------
+- update changelog. [Martin Raspaud]
+- Bump version: 1.7.1 → 1.8.0. [Martin Raspaud]
+- Merge branch 'develop' into new_release. [Martin Raspaud]
+- Merge pull request #95 from pytroll/bugfix-pyproj-version. [Martin
+  Raspaud]
+
+  Provide the minimum version of pyproj needed
+- Provide the minimum version of pyproj needed. [Martin Raspaud]
+- Merge pull request #94 from pytroll/optimize-xarray. [Martin Raspaud]
+
+  Optimize xarray
+- Add test for new wrap_and_check function. [davidh-ssec]
+- Rename chunk size environment variable to PYTROLL_CHUNK_SIZE. [davidh-
+  ssec]
+- Fix circular import between geometry and init's CHUNK_SIZE. [davidh-
+  ssec]
+- Revert import removal in init and add easy access imports. [davidh-
+  ssec]
+
+  Includes attempt to remove circular dependency between utils and
+  geometry module.
+
+- Use central CHUNK_SIZE constant for dask based operations. [davidh-
+  ssec]
+- Add `check_and_wrap` utility function and fix various docstring
+  issues. [davidh-ssec]
+- Remove tests for removed features. [davidh-ssec]
+- Remove longitude/latitude validity checks in BaseDefinition. [davidh-
+  ssec]
+
+  This was causing issues with dask based inputs and was a performance
+  penalty for all use cases even when the arrays were valid. Removing
+  this check should not affect 99% of users.
+
+- Combine dask operations to improve resampling performance. [davidh-
+  ssec]
+
+  Still a lot that could be done probably.
+
+- Fix dask minimum version number for meshgrid support. [davidh-ssec]
+- Add dask extra to setup.py to specify minimum dask version. [davidh-
+  ssec]
+
+  pyresample uses dask meshgrid which came in version 1.9
+
+- Merge pull request #86 from pytroll/feature-multiple-dims. [Martin
+  Raspaud]
+
+  [WIP] Feature multiple dims
+- Remove explicit chunksize. [Martin Raspaud]
+- Clean up with pep8. [Martin Raspaud]
+- Take care of coordinates when resampling. [Martin Raspaud]
+- Define default blocksizes for dask arrays. [Martin Raspaud]
+- Merge branch 'feature-optimize-dask' into feature-multiple-dims.
+  [Martin Raspaud]
+- Style cleanup. [Martin Raspaud]
+- Fix get_hashable_array for variations of np arrays. [Martin Raspaud]
+- Print warning when wrapping is needed independently of type. [Martin
+  Raspaud]
+- Change default blocksize to 5000. [Martin Raspaud]
+- Make use of dask's map_blocks. [Martin Raspaud]
+
+  Instead of writing our own array definitions
+- Revert "Make resampling lazy" [Martin Raspaud]
+
+  This reverts commit 5a4f9c342f9c8262c06c28986163fc682242ce75.
+
+- Make resampling lazy. [Martin Raspaud]
+- Revert yapf change. [Martin Raspaud]
+- Clean up code (pycodestyle, pydocstyle) [Martin Raspaud]
+- Make XR resampling work with more dimensions. [Martin Raspaud]
+- Merge pull request #91 from avalentino/issues/gh-090. [David Hoese]
+
+  Fix test_get_array_hashable on big-endian machines (closes #90)
+- Fix test_get_array_hashable on big-endian machines. [Antonio
+  Valentino]
+
+
 v1.7.1 (2017-12-21)
 -------------------
 - update changelog. [davidh-ssec]
@@ -612,7 +692,6 @@ v1.2.2 (2016-06-21)
   Without this, the compilation of the ewa extension crashes.
 
 
-
 v1.2.1 (2016-06-21)
 -------------------
 - update changelog. [Martin Raspaud]
@@ -768,11 +847,9 @@ v1.2.0 (2016-06-17)
 - Make kd_tree test work on older numpy version. [Martin Raspaud]
 
   VisibleDeprecationWarning is not available in numpy <1.9.
-
 - Adapt to newest pykdtree version. [Martin Raspaud]
 
   The kdtree object's attribute `data_pts` has been renamed to `data`.
-
 - Run tests on python 3.5 in travis also. [Martin Raspaud]
 
 
@@ -784,7 +861,6 @@ v1.1.6 (2016-02-25)
 
   A previous commit was looking for a 'data_pts' attribute in the kdtree
   object, which is available in pykdtree, but not scipy.
-
 - Merge pull request #32 from mitkin/master. [Martin Raspaud]
 
   [tests] Skip deprecation warnings in test_gauss_multi_uncert
@@ -796,7 +872,6 @@ v1.1.6 (2016-02-25)
   The latest matplotlib (1.5) doesn't support python 2.6 and 3.3. This patch
   chooses the right matplotlib version to install depending on the python
   version at hand.
-
 - Skip deprecation warnings. [Mikhail Itkin]
 
   Catch the rest of the warnings. Check if there is only one, and
@@ -838,7 +913,6 @@ Other
 - Bugfix to address a numpy DeprecationWarning. [Martin Raspaud]
 
   Numpy won't take non-integer indices soon, so make index an int.
-
 - Merge branch 'release-1.1.3' [Martin Raspaud]
 - Merge branch 'licence-lgpl' into pre-master. [Martin Raspaud]
 - Switch to lgplv3, and bump up version number. [Martin Raspaud]
@@ -1060,7 +1134,7 @@ Other
 - Set svn:mime-type. [StorPipfugl]
 - Corrected doc errors. [StorPipfugl]
 - Removed dist dir. [StorPipfugl]
--  [StorPipfugl]
+- No commit message. [StorPipfugl]
 - Updated documentation. New release. [StorPipfugl]
 - Started updating docstrings. [StorPipfugl]
 - Restructured API. [StorPipfugl]
@@ -1073,8 +1147,9 @@ Other
 - Removed unneeded function. [StorPipfugl]
 - Mime types set. [StorPipfugl]
 - Mime types set. [StorPipfugl]
--  [StorPipfugl]
+- No commit message. [StorPipfugl]
 - Moved to Google Code under GPLv3 license. [StorPipfugl]
 - moved to Google Code. [StorPipfugl]
 
 
+
diff --git a/docs/source/plot.rst b/docs/source/plot.rst
index d8efb36..7a8b223 100644
--- a/docs/source/plot.rst
+++ b/docs/source/plot.rst
@@ -14,16 +14,17 @@ The function **plot.save_quicklook** saves the Basemap image directly to file.
 
 .. doctest::
 
- >>> import numpy as np	
- >>> import pyresample as pr
+ >>> import numpy as np
+ >>> from pyresample import load_area, save_quicklook, SwathDefinition
+ >>> from pyresample.kd_tree import resample_nearest
  >>> lons = np.zeros(1000)
  >>> lats = np.arange(-80, -90, -0.01)
  >>> tb37v = np.arange(1000)
- >>> area_def = pr.utils.load_area('areas.cfg', 'ease_sh')
- >>> swath_def = pr.geometry.SwathDefinition(lons, lats)
- >>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def,
- ...                                      radius_of_influence=20000, fill_value=None)
- >>> pr.plot.save_quicklook('tb37v_quick.png', area_def, result, label='Tb 37v (K)')
+ >>> area_def = load_area('areas.cfg', 'ease_sh')
+ >>> swath_def = SwathDefinition(lons, lats)
+ >>> result = resample_nearest(swath_def, tb37v, area_def,
+ ...                           radius_of_influence=20000, fill_value=None)
+ >>> save_quicklook('tb37v_quick.png', area_def, result, label='Tb 37v (K)')
 
 Assuming **lons**, **lats** and **tb37v** are initialized with real data, the result might look something like this:
   .. image:: _static/images/tb37v_quick.png
@@ -49,14 +50,15 @@ Assuming the file **areas.cfg** has the following area definition:
 **Example usage:**
 
  >>> import numpy as np 
- >>> import pyresample as pr
+ >>> from pyresample import load_area, save_quicklook, SwathDefinition
+ >>> from pyresample.kd_tree import resample_nearest
  >>> lons = np.zeros(1000)
  >>> lats = np.arange(-80, -90, -0.01)
  >>> tb37v = np.arange(1000)
- >>> area_def = pr.utils.load_area('areas.cfg', 'pc_world')
- >>> swath_def = pr.geometry.SwathDefinition(lons, lats)
- >>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
- >>> pr.plot.save_quicklook('tb37v_pc.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
+ >>> area_def = load_area('areas.cfg', 'pc_world')
+ >>> swath_def = SwathDefinition(lons, lats)
+ >>> result = resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
+ >>> save_quicklook('tb37v_pc.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
 
 Assuming **lons**, **lats** and **tb37v** are initialized with real data, the result might look something like this:
   .. image:: _static/images/tb37v_pc.png
@@ -81,14 +83,15 @@ Assuming the file **areas.cfg** has the following area definition for an ortho p
 **Example usage:**
 
  >>> import numpy as np 
- >>> import pyresample as pr
+ >>> from pyresample import load_area, save_quicklook, SwathDefinition
+ >>> from pyresample.kd_tree import resample_nearest
  >>> lons = np.zeros(1000)
  >>> lats = np.arange(-80, -90, -0.01)
  >>> tb37v = np.arange(1000)
- >>> area_def = pr.utils.load_area('areas.cfg', 'ortho')
- >>> swath_def = pr.geometry.SwathDefinition(lons, lats)
- >>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
- >>> pr.plot.save_quicklook('tb37v_ortho.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
+ >>> area_def = load_area('areas.cfg', 'ortho')
+ >>> swath_def = SwathDefinition(lons, lats)
+ >>> result = resample_nearest(swath_def, tb37v, area_def, radius_of_influence=20000, fill_value=None)
+ >>> save_quicklook('tb37v_ortho.png', area_def, result, num_meridians=0, num_parallels=0, label='Tb 37v (K)')
 
 Assuming **lons**, **lats** and **tb37v** are initialized with real data, the result might look something like this:
   .. image:: _static/images/tb37v_ortho.png
@@ -105,15 +108,16 @@ AreaDefinition using the **plot.area_def2basemap(area_def, **kwargs)** function.
 
  >>> import numpy as np	
  >>> import matplotlib.pyplot as plt
- >>> import pyresample as pr
+ >>> from pyresample import load_area, save_quicklook, area_def2basemap, SwathDefinition
+ >>> from pyresample.kd_tree import resample_nearest
  >>> lons = np.zeros(1000)
  >>> lats = np.arange(-80, -90, -0.01)
  >>> tb37v = np.arange(1000)
- >>> area_def = pr.utils.load_area('areas.cfg', 'ease_sh')
- >>> swath_def = pr.geometry.SwathDefinition(lons, lats)
- >>> result = pr.kd_tree.resample_nearest(swath_def, tb37v, area_def,
- ...                                      radius_of_influence=20000, fill_value=None)
- >>> bmap = pr.plot.area_def2basemap(area_def)
+ >>> area_def = load_area('areas.cfg', 'ease_sh')
+ >>> swath_def = SwathDefinition(lons, lats)
+ >>> result = resample_nearest(swath_def, tb37v, area_def,
+ ...                           radius_of_influence=20000, fill_value=None)
+ >>> bmap = area_def2basemap(area_def)
  >>> bmng = bmap.bluemarble()
  >>> col = bmap.imshow(result, origin='upper')
  >>> plt.savefig('tb37v_bmng.png', bbox_inches='tight')
diff --git a/docs/source/swath.rst b/docs/source/swath.rst
index 0e06930..90361fb 100644
--- a/docs/source/swath.rst
+++ b/docs/source/swath.rst
@@ -6,6 +6,15 @@ Resampling of swath data
 Pyresample can be used to resample a swath dataset to a grid, a grid to a swath or a swath to another swath. 
 Resampling can be done using the nearest neighbour method, Gaussian weighting, or weighting with an arbitrary radial function.
 
+.. versionchanged:: 1.8.0
+
+    `SwathDefinition` no longer checks the validity of the provided longitude
+    and latitude coordinates, in order to improve performance. Longitude arrays
+    are expected to be between -180 and 180 degrees and latitude arrays between
+    -90 and 90 degrees. The same applies to every geometry definition that is
+    given longitude and latitude arrays on initialization. Use
+    `pyresample.utils.check_and_wrap` to preprocess your arrays.
+
 pyresample.image
 ----------------
 The ImageContainerNearest and ImageContainerBilinear classes can be used for resampling of swaths as well as grids.  Below is an example using nearest neighbour resampling.
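The versionchanged note in this file points users at `pyresample.utils.check_and_wrap`. A minimal numpy-only sketch of what that preprocessing presumably does (the helper names here are ours, not the library's; the real function lives in `pyresample.utils`):

```python
import numpy as np

def wrap_longitudes(lons):
    """Wrap arbitrary longitudes into the [-180, 180) range."""
    return (lons + 180) % 360 - 180

def check_and_wrap_sketch(lons, lats):
    # Approximation of the preprocessing step: wrap longitudes and
    # reject latitudes outside the valid range.
    lons = wrap_longitudes(np.asarray(lons, dtype=float))
    lats = np.asarray(lats, dtype=float)
    if lats.min() < -90 or lats.max() > 90:
        raise ValueError('Some latitudes are outside the [-90, 90] range')
    return lons, lats

lons, lats = check_and_wrap_sketch([0., 190., 360.], [45., -45., 90.])
```

Running this once before constructing a `SwathDefinition` replaces the per-instantiation check that the patch removes.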
diff --git a/pyresample/__init__.py b/pyresample/__init__.py
index 7a79e45..b36aceb 100644
--- a/pyresample/__init__.py
+++ b/pyresample/__init__.py
@@ -15,18 +15,28 @@
 # You should have received a copy of the GNU Lesser General Public License along
 # with this program.  If not, see <http://www.gnu.org/licenses/>.
 
-from __future__ import absolute_import
+import os
+
+CHUNK_SIZE = os.getenv('PYTROLL_CHUNK_SIZE', 4096)
 
 from pyresample.version import __version__
+# Backwards compatibility
 from pyresample import geometry
 from pyresample import grid
 from pyresample import image
 from pyresample import kd_tree
 from pyresample import utils
 from pyresample import plot
+# Easy access
+from pyresample.geometry import (SwathDefinition,
+                                 AreaDefinition,
+                                 DynamicAreaDefinition)
+from pyresample.utils import load_area
+from pyresample.kd_tree import XArrayResamplerNN
+from pyresample.plot import save_quicklook, area_def2basemap
 
 __all__ = ['grid', 'image', 'kd_tree',
-           'utils', 'plot', 'geo_filter', 'geometry']
+           'utils', 'plot', 'geo_filter', 'geometry', 'CHUNK_SIZE']
 
 
 def get_capabilities():
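The new module-level `CHUNK_SIZE` above is read from the `PYTROLL_CHUNK_SIZE` environment variable. A stand-alone sketch of that lookup; note that `os.getenv` returns a string when the variable is set, so this sketch adds an explicit `int` cast (our addition, not in the patch):

```python
import os

def get_chunk_size(default=4096):
    # os.getenv returns the raw string when the variable is set,
    # so cast to int; the default is used when it is unset.
    return int(os.getenv('PYTROLL_CHUNK_SIZE', default))

os.environ['PYTROLL_CHUNK_SIZE'] = '2048'
overridden = get_chunk_size()
del os.environ['PYTROLL_CHUNK_SIZE']
fallback = get_chunk_size()
```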
diff --git a/pyresample/geometry.py b/pyresample/geometry.py
index 9044f0f..2e7f8be 100644
--- a/pyresample/geometry.py
+++ b/pyresample/geometry.py
@@ -31,7 +31,7 @@ import numpy as np
 import yaml
 from pyproj import Geod, Proj
 
-from pyresample import _spatial_mp, utils
+from pyresample import _spatial_mp, utils, CHUNK_SIZE
 
 try:
     from xarray import DataArray
@@ -41,11 +41,11 @@ except ImportError:
 logger = getLogger(__name__)
 
 
-class DimensionError(Exception):
+class DimensionError(ValueError):
     pass
 
 
-class IncompatibleAreas(Exception):
+class IncompatibleAreas(ValueError):
     """Error when the areas to combine are not compatible."""
 
 
@@ -63,7 +63,17 @@ class Boundary(object):
 
 class BaseDefinition(object):
 
-    """Base class for geometry definitions"""
+    """Base class for geometry definitions
+
+    .. versionchanged:: 1.8.0
+
+        `BaseDefinition` no longer checks the validity of the provided
+        longitude and latitude coordinates to improve performance. Longitude
+        arrays are expected to be between -180 and 180 degrees, latitude -90
+        to 90 degrees. Use `pyresample.utils.check_and_wrap` to preprocess
+        your arrays.
+
+    """
 
     def __init__(self, lons=None, lats=None, nprocs=1):
         if type(lons) != type(lats):
@@ -76,31 +86,8 @@ class BaseDefinition(object):
                 raise ValueError('lons and lats must have same shape')
 
         self.nprocs = nprocs
-
-        # check the latitutes
-        if lats is not None:
-            if isinstance(lats, np.ndarray) and (lats.min() < -90. or lats.max() > 90.):
-                raise ValueError(
-                    'Some latitudes are outside the [-90.;+90] validity range')
-            elif not isinstance(lats, np.ndarray):
-                # assume we have to mask an xarray
-                lats = lats.where((lats >= -90.) & (lats <= 90.))
         self.lats = lats
-
-        # check the longitudes
-        if lons is not None:
-            if isinstance(lons, np.ndarray) and (lons.min() < -180. or lons.max() >= +180.):
-                # issue warning
-                warnings.warn('All geometry objects expect longitudes in the [-180:+180[ range. ' +
-                              'We will now automatically wrap your longitudes into [-180:+180[, and continue. ' +
-                              'To avoid this warning next time, use routine utils.wrap_longitudes().')
-                # assume we have to mask an xarray
-                # wrap longitudes to [-180;+180[
-                lons = utils.wrap_longitudes(lons)
-            elif not isinstance(lons, np.ndarray):
-                lons = utils.wrap_longitudes(lons)
         self.lons = lons
-
         self.ndim = None
         self.cartesian_coords = None
 
@@ -133,26 +120,24 @@ class BaseDefinition(object):
         return not self.__eq__(other)
 
     def get_area_extent_for_subset(self, row_LR, col_LR, row_UL, col_UL):
-        """Retrieves area_extent for a subdomain
-        rows    are counted from upper left to lower left
-        columns are counted from upper left to upper right
-
-        :Parameters:
-        row_LR : int
-            row of the lower right pixel
-        col_LR : int
-            col of the lower right pixel
-        row_UL : int
-            row of the upper left pixel
-        col_UL : int
-            col of the upper left pixel
+        """Calculate extent for a subdomain of this area
 
-        :Returns:
-        area_extent : list
-            Area extent as a list (LL_x, LL_y, UR_x, UR_y) of the subset
+        Rows are counted from upper left to lower left and columns are
+        counted from upper left to upper right.
+
+        Args:
+            row_LR (int): row of the lower right pixel
+            col_LR (int): col of the lower right pixel
+            row_UL (int): row of the upper left pixel
+            col_UL (int): col of the upper left pixel
+
+        Returns:
+            area_extent (tuple):
+                Area extent (LL_x, LL_y, UR_x, UR_y) of the subset
+
+        Author:
+            Ulrich Hamann
 
-        :Author:
-        Ulrich Hamann
         """
 
         (a, b) = self.get_proj_coords(data_slice=(row_LR, col_LR))
@@ -162,7 +147,7 @@ class BaseDefinition(object):
         c = c + 0.5 * self.pixel_size_x
         d = d + 0.5 * self.pixel_size_y
 
-        return (a, b, c, d)
+        return a, b, c, d
 
     def get_lonlat(self, row, col):
         """Retrieve lon and lat of single pixel
@@ -454,7 +439,7 @@ def get_array_hashable(arr):
         try:
             return arr.name.encode('utf-8')  # dask array
         except AttributeError:
-            return arr.view(np.uint8)  # np array
+            return np.asarray(arr).view(np.uint8)  # np array
 
 
 class SwathDefinition(CoordinateDefinition):
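The `get_array_hashable` fix above wraps the input in `np.asarray` before taking the `uint8` view, so ndarray subclasses such as masked arrays work too. A sketch of why the byte view is useful for hashing (the function name with `_sketch` is ours):

```python
import hashlib
import numpy as np

def get_array_hashable_sketch(arr):
    # View the array's buffer as raw bytes; np.asarray first, since
    # subclasses such as masked arrays may not view cleanly.
    return np.asarray(arr).view(np.uint8)

arr = np.ma.masked_array([1.0, 2.0])
as_bytes = get_array_hashable_sketch(arr)    # 16 bytes for 2 float64s
digest = hashlib.sha1(as_bytes).hexdigest()  # stable hash of the buffer
```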
@@ -511,7 +496,7 @@ class SwathDefinition(CoordinateDefinition):
 
         return self.hash
 
-    def get_lonlats_dask(self, blocksize=1000, dtype=None):
+    def get_lonlats_dask(self, chunks=CHUNK_SIZE):
         """Get the lon lats as a single dask array."""
         import dask.array as da
         lons, lats = self.get_lonlats()
@@ -520,9 +505,9 @@ class SwathDefinition(CoordinateDefinition):
             return lons.data, lats.data
         else:
             lons = da.from_array(np.asanyarray(lons),
-                                 chunks=blocksize * lons.ndim)
+                                 chunks=chunks)
             lats = da.from_array(np.asanyarray(lats),
-                                 chunks=blocksize * lats.ndim)
+                                 chunks=chunks)
         return lons, lats
 
     def _compute_omerc_parameters(self, ellipsoid):
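The dask calls above now take a single `chunks=CHUNK_SIZE` value instead of `blocksize * ndim`, i.e. square blocks of at most `CHUNK_SIZE` elements per dimension. A numpy-only sketch of that partitioning (dask performs it internally; `iter_chunks` is our illustration):

```python
import numpy as np

def iter_chunks(arr, chunk):
    """Yield blocks of at most `chunk` rows/cols, mirroring the
    partitioning dask applies when given chunks=CHUNK_SIZE."""
    for r in range(0, arr.shape[0], chunk):
        for c in range(0, arr.shape[1], chunk):
            yield arr[r:r + chunk, c:c + chunk]

lons = np.zeros((10, 7))
blocks = list(iter_chunks(lons, 4))  # 3 row bands x 2 col bands
```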
@@ -967,37 +952,42 @@ class AreaDefinition(BaseDefinition):
 
         return self.get_xy_from_proj_coords(xm_, ym_)
 
-    def get_xy_from_proj_coords(self, xm_, ym_):
-        """Retrieve closest x and y coordinates (column, row indices) for a
-        location specified with projection coordinates (xm_,ym_) in meters.
-        A ValueError is raised, if the return point is outside the area domain. If
-        xm_,ym_ is a tuple of sequences of projection coordinates, a tuple of
-        masked arrays are returned.
+    def get_xy_from_proj_coords(self, xm, ym):
+        """Find closest grid cell index for a specified projection coordinate.
 
-        :Input:
-        xm_ : point or sequence (list or array) of x-coordinates in m (map projection)
-        ym_ : point or sequence (list or array) of y-coordinates in m (map projection)
+        If xm, ym is a tuple of sequences of projection coordinates, a tuple
+        of masked arrays are returned.
+
+        Args:
+            xm (list or array): point or sequence of x-coordinates in
+                                 meters (map projection)
+            ym (list or array): point or sequence of y-coordinates in
+                                 meters (map projection)
+
+        Returns:
+            x, y : column and row grid cell indexes as 2 scalars or arrays
+
+        Raises:
+            ValueError: if the return point is outside the area domain
 
-        :Returns:
-        (x, y) : tuple of integer points/arrays
         """
 
-        if isinstance(xm_, list):
-            xm_ = np.array(xm_)
-        if isinstance(ym_, list):
-            ym_ = np.array(ym_)
+        if isinstance(xm, list):
+            xm = np.array(xm)
+        if isinstance(ym, list):
+            ym = np.array(ym)
 
-        if ((isinstance(xm_, np.ndarray) and
-             not isinstance(ym_, np.ndarray)) or
-            (not isinstance(xm_, np.ndarray) and
-             isinstance(ym_, np.ndarray))):
-            raise ValueError("Both projection coordinates xm_ and ym_ needs to be of " +
+        if ((isinstance(xm, np.ndarray) and
+             not isinstance(ym, np.ndarray)) or
+            (not isinstance(xm, np.ndarray) and
+             isinstance(ym, np.ndarray))):
+            raise ValueError("Both projection coordinates xm and ym needs to be of " +
                              "the same type and have the same dimensions!")
 
-        if isinstance(xm_, np.ndarray) and isinstance(ym_, np.ndarray):
-            if xm_.shape != ym_.shape:
+        if isinstance(xm, np.ndarray) and isinstance(ym, np.ndarray):
+            if xm.shape != ym.shape:
                 raise ValueError(
-                    "projection coordinates xm_ and ym_ is not of the same shape!")
+                    "projection coordinates xm and ym is not of the same shape!")
 
         upl_x = self.area_extent[0]
         upl_y = self.area_extent[3]
@@ -1007,8 +997,8 @@ class AreaDefinition(BaseDefinition):
         yscale = (self.area_extent[1] -
                   self.area_extent[3]) / float(self.y_size)
 
-        x__ = (xm_ - upl_x) / xscale
-        y__ = (ym_ - upl_y) / yscale
+        x__ = (xm - upl_x) / xscale
+        y__ = (ym - upl_y) / yscale
 
         if isinstance(x__, np.ndarray) and isinstance(y__, np.ndarray):
             mask = (((x__ < 0) | (x__ > self.x_size)) |
@@ -1038,36 +1028,25 @@ class AreaDefinition(BaseDefinition):
 
         return self.get_lonlats(nprocs=None, data_slice=(row, col))
 
-    def get_proj_coords_dask(self, blocksize, dtype=None):
-        from dask.base import tokenize
-        from dask.array import Array
+    def get_proj_vectors_dask(self, chunks=CHUNK_SIZE, dtype=None):
+        import dask.array as da
         if dtype is None:
             dtype = self.dtype
 
-        vchunks = range(0, self.y_size, blocksize)
-        hchunks = range(0, self.x_size, blocksize)
-
-        token = tokenize(blocksize)
-        name = 'get_proj_coords-' + token
-
-        dskx = {(name, i, j, 0): (self.get_proj_coords_array,
-                                  (slice(vcs, min(vcs + blocksize, self.y_size)),
-                                   slice(hcs, min(hcs + blocksize, self.x_size))),
-                                  False, dtype)
-                for i, vcs in enumerate(vchunks)
-                for j, hcs in enumerate(hchunks)
-                }
-
-        res = Array(dskx, name, shape=list(self.shape) + [2],
-                    chunks=(blocksize, blocksize, 2),
-                    dtype=dtype)
-        return res
+        target_x = da.arange(self.x_size, chunks=chunks, dtype=dtype) * \
+            self.pixel_size_x + self.pixel_upper_left[0]
+        target_y = da.arange(self.y_size, chunks=chunks, dtype=dtype) * - \
+            self.pixel_size_y + self.pixel_upper_left[1]
+        return target_x, target_y
 
-    def get_proj_coords_array(self, data_slice=None, cache=False, dtype=None):
-        return np.dstack(self.get_proj_coords(data_slice, cache, dtype))
+    def get_proj_coords_dask(self, chunks=CHUNK_SIZE, dtype=None):
+        # TODO: Add rotation
+        import dask.array as da
+        target_x, target_y = self.get_proj_vectors_dask(chunks, dtype)
+        return da.meshgrid(target_x, target_y)
 
     def get_proj_coords(self, data_slice=None, cache=False, dtype=None):
-        """Get projection coordinates of grid
+        """Get projection coordinates of grid.
 
         Parameters
         ----------
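`get_proj_vectors_dask` above builds the x/y coordinate vectors directly from the pixel size and the upper-left pixel centre. The same arithmetic with plain numpy and made-up grid parameters (all values here are illustrative):

```python
import numpy as np

x_size, y_size = 4, 3
pixel_size_x = pixel_size_y = 1000.0
pixel_upper_left = (500.0, 2500.0)  # centre of the upper-left pixel

# x increases to the right, y decreases downward, as in the patch.
target_x = np.arange(x_size) * pixel_size_x + pixel_upper_left[0]
target_y = np.arange(y_size) * -pixel_size_y + pixel_upper_left[1]
```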
@@ -1080,12 +1059,12 @@ class AreaDefinition(BaseDefinition):
         -------
         (target_x, target_y) : tuple of numpy arrays
             Grids of area x- and y-coordinates in projection units
-        """
 
+        """
         def do_rotation(xspan, yspan, rot_deg=0):
             rot_rad = np.radians(rot_deg)
             rot_mat = np.array([[np.cos(rot_rad),  np.sin(rot_rad)],
-                          [-np.sin(rot_rad), np.cos(rot_rad)]])
+                                [-np.sin(rot_rad), np.cos(rot_rad)]])
             x, y = np.meshgrid(xspan, yspan)
             return np.einsum('ji, mni -> jmn', rot_mat, np.dstack([x, y]))
 
@@ -1226,37 +1205,22 @@ class AreaDefinition(BaseDefinition):
                 Coordinate(corner_lons[2], corner_lats[2]),
                 Coordinate(corner_lons[3], corner_lats[3])]
 
-    def get_lonlats_dask(self, blocksize=1000, dtype=None):
-        from dask.base import tokenize
-        from dask.array import Array
-        import pyproj
+    def get_lonlats_dask(self, chunks=CHUNK_SIZE, dtype=None):
+        from dask.array import map_blocks
 
         dtype = dtype or self.dtype
-        proj_coords = self.get_proj_coords_dask(blocksize, dtype)
-        target_x, target_y = proj_coords[:, :, 0], proj_coords[:, :, 1]
+        target_x, target_y = self.get_proj_coords_dask(chunks, dtype)
 
-        target_proj = pyproj.Proj(**self.proj_dict)
+        target_proj = Proj(**self.proj_dict)
 
         def invproj(data1, data2):
-            return np.dstack(target_proj(data1.compute(), data2.compute(), inverse=True))
-        token = tokenize(str(self), blocksize, dtype)
-        name = 'get_lonlats-' + token
-
-        vchunks = range(0, self.y_size, blocksize)
-        hchunks = range(0, self.x_size, blocksize)
-
-        dsk = {(name, i, j, 0): (invproj,
-                                 target_x[slice(vcs, min(vcs + blocksize, self.y_size)),
-                                          slice(hcs, min(hcs + blocksize, self.x_size))],
-                                 target_y[slice(vcs, min(vcs + blocksize, self.y_size)),
-                                          slice(hcs, min(hcs + blocksize, self.x_size))])
-               for i, vcs in enumerate(vchunks)
-               for j, hcs in enumerate(hchunks)
-               }
-
-        res = Array(dsk, name, shape=list(self.shape) + [2],
-                    chunks=(blocksize, blocksize, 2),
-                    dtype=dtype)
+            return np.dstack(target_proj(data1, data2, inverse=True))
+
+        res = map_blocks(invproj, target_x, target_y, chunks=(target_x.chunks[0],
+                                                              target_x.chunks[1],
+                                                              2),
+                         new_axis=[2])
+
         return res[:, :, 0], res[:, :, 1]
 
     def get_lonlats(self, nprocs=None, data_slice=None, cache=False, dtype=None):
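`map_blocks` applies `invproj` chunk by chunk, and the `np.dstack` supplies the third axis that `new_axis=[2]` declares, after which the lon and lat planes are sliced back out. A numpy-only sketch of that shape dance, with a toy linear "projection" standing in for the pyproj inverse transform:

```python
import numpy as np

def invproj(x, y):
    # Stand-in for target_proj(x, y, inverse=True): a plain linear
    # scaling instead of a real map projection.
    return np.dstack((x / 100.0, y / 100.0))

x, y = np.meshgrid(np.arange(4), np.arange(3))
res = invproj(x * 100.0, y * 100.0)       # shape (3, 4, 2)
lons, lats = res[:, :, 0], res[:, :, 1]   # split the stacked planes
```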
@@ -1448,13 +1412,13 @@ class StackedAreaDefinition(BaseDefinition):
 
         return self.lons, self.lats
 
-    def get_lonlats_dask(self, blocksize=1000, dtype=None):
+    def get_lonlats_dask(self, chunks=CHUNK_SIZE, dtype=None):
         """"Return lon and lat dask arrays of the area."""
         import dask.array as da
         llons = []
         llats = []
         for definition in self.defs:
-            lons, lats = definition.get_lonlats_dask(blocksize=blocksize,
+            lons, lats = definition.get_lonlats_dask(chunks=chunks,
                                                      dtype=dtype)
 
             llons.append(lons)
diff --git a/pyresample/kd_tree.py b/pyresample/kd_tree.py
index 0f9f54c..8c89d79 100644
--- a/pyresample/kd_tree.py
+++ b/pyresample/kd_tree.py
@@ -6,7 +6,7 @@
 # This program is free software: you can redistribute it and/or modify it under
 # the terms of the GNU Lesser General Public License as published by the Free
 # Software Foundation, either version 3 of the License, or
-#(at your option) any later version.
+# (at your option) any later version.
 #
 # This program is distributed in the hope that it will be useful, but WITHOUT
 # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
@@ -15,7 +15,6 @@
 #
 # You should have received a copy of the GNU Lesser General Public License along
 # with this program.  If not, see <http://www.gnu.org/licenses/>.
-
 """Handles reprojection of geolocated data. Several types of resampling are
 supported"""
 
@@ -28,7 +27,7 @@ from logging import getLogger
 
 import numpy as np
 
-from pyresample import _spatial_mp, data_reduce, geometry
+from pyresample import _spatial_mp, data_reduce, geometry, CHUNK_SIZE
 
 logger = getLogger(__name__)
 
@@ -56,7 +55,7 @@ except ImportError:
         raise ImportError('Either pykdtree or scipy must be available')
 
 
-class EmptyResult(Exception):
+class EmptyResult(ValueError):
     pass
 
 
@@ -67,9 +66,15 @@ def which_kdtree():
     return kd_tree_name
 
 
-def resample_nearest(source_geo_def, data, target_geo_def,
-                     radius_of_influence, epsilon=0,
-                     fill_value=0, reduce_data=True, nprocs=1, segments=None):
+def resample_nearest(source_geo_def,
+                     data,
+                     target_geo_def,
+                     radius_of_influence,
+                     epsilon=0,
+                     fill_value=0,
+                     reduce_data=True,
+                     nprocs=1,
+                     segments=None):
     """Resamples data using kd-tree nearest neighbour approach
 
     Parameters
@@ -113,8 +118,9 @@ def resample_nearest(source_geo_def, data, target_geo_def,
 
 def resample_gauss(source_geo_def, data, target_geo_def,
                    radius_of_influence, sigmas, neighbours=8, epsilon=0,
-                   fill_value=0, reduce_data=True, nprocs=1, segments=None, with_uncert=False):
-    """Resamples data using kd-tree gaussian weighting neighbour approach
+                   fill_value=0, reduce_data=True, nprocs=1, segments=None,
+                   with_uncert=False):
+    """Resamples data using kd-tree gaussian weighting neighbour approach.
 
     Parameters
     ----------
@@ -159,8 +165,8 @@ def resample_gauss(source_geo_def, data, target_geo_def,
         Source data resampled to target geometry.
         Weighted standard deviation for all pixels having more than one source value
         Counts of number of source values used in weighting per pixel
-    """
 
+    """
     def gauss(sigma):
         # Return gauss function object
         return lambda r: np.exp(-r ** 2 / float(sigma) ** 2)
@@ -395,8 +401,11 @@ def get_neighbour_info(source_geo_def, target_geo_def, radius_of_influence,
     return valid_input_index, valid_output_index, index_array, distance_array
 
 
-def _get_valid_input_index(source_geo_def, target_geo_def, reduce_data,
-                           radius_of_influence, nprocs=1):
+def _get_valid_input_index(source_geo_def,
+                           target_geo_def,
+                           reduce_data,
+                           radius_of_influence,
+                           nprocs=1):
     """Find indices of reduced inputput data"""
 
     source_lons, source_lats = source_geo_def.get_lonlats(nprocs=nprocs)
@@ -433,7 +442,7 @@ def _get_valid_input_index(source_geo_def, target_geo_def, reduce_data,
                     source_lons, source_lats,
                     radius_of_influence)
 
-    if(isinstance(valid_input_index, np.ma.core.MaskedArray)):
+    if (isinstance(valid_input_index, np.ma.core.MaskedArray)):
         # Make sure valid_input_index is not a masked array
         valid_input_index = valid_input_index.filled(False)
 
@@ -473,9 +482,11 @@ def _get_valid_output_index(source_geo_def, target_geo_def, target_lons,
     return valid_output_index
 
 
-def _create_resample_kdtree(source_lons, source_lats, valid_input_index, nprocs=1):
+def _create_resample_kdtree(source_lons,
+                            source_lats,
+                            valid_input_index,
+                            nprocs=1):
     """Set up kd tree on input"""
-
     """
     if not isinstance(source_geo_def, geometry.BaseDefinition):
         raise TypeError('source_geo_def must be of geometry type')
@@ -494,8 +505,8 @@ def _create_resample_kdtree(source_lons, source_lats, valid_input_index, nprocs=
     else:
         cartesian = _spatial_mp.Cartesian()
 
-    input_coords = cartesian.transform_lonlats(
-        source_lons_valid, source_lats_valid)
+    input_coords = cartesian.transform_lonlats(source_lons_valid,
+                                               source_lats_valid)
 
     if input_coords.size == 0:
         raise EmptyResult('No valid data points in input data')
@@ -504,17 +515,22 @@ def _create_resample_kdtree(source_lons, source_lats, valid_input_index, nprocs=
     if kd_tree_name == 'pykdtree':
         resample_kdtree = KDTree(input_coords)
     elif nprocs > 1:
-        resample_kdtree = _spatial_mp.cKDTree_MP(input_coords,
-                                                 nprocs=nprocs)
+        resample_kdtree = _spatial_mp.cKDTree_MP(input_coords, nprocs=nprocs)
     else:
         resample_kdtree = sp.cKDTree(input_coords)
 
     return resample_kdtree
 
 
-def _query_resample_kdtree(resample_kdtree, source_geo_def, target_geo_def,
-                           radius_of_influence, data_slice,
-                           neighbours=8, epsilon=0, reduce_data=True, nprocs=1):
+def _query_resample_kdtree(resample_kdtree,
+                           source_geo_def,
+                           target_geo_def,
+                           radius_of_influence,
+                           data_slice,
+                           neighbours=8,
+                           epsilon=0,
+                           reduce_data=True,
+                           nprocs=1):
     """Query kd-tree on slice of target coordinates"""
 
     # Check validity of input
@@ -548,8 +564,8 @@ def _query_resample_kdtree(resample_kdtree, source_geo_def, target_geo_def,
     target_lons_valid = target_lons.ravel()[valid_output_index]
     target_lats_valid = target_lats.ravel()[valid_output_index]
 
-    output_coords = cartesian.transform_lonlats(
-        target_lons_valid, target_lats_valid)
+    output_coords = cartesian.transform_lonlats(target_lons_valid,
+                                                target_lats_valid)
 
     # pykdtree requires query points have same data type as kdtree.
     try:
@@ -890,10 +906,15 @@ def get_sample_from_neighbour_info(resample_type, output_shape, data,
 
 
 class XArrayResamplerNN(object):
-
-    def __init__(self, source_geo_def, target_geo_def, radius_of_influence,
-                 neighbours=8, epsilon=0, reduce_data=True,
-                 nprocs=1, segments=None):
+    def __init__(self,
+                 source_geo_def,
+                 target_geo_def,
+                 radius_of_influence,
+                 neighbours=8,
+                 epsilon=0,
+                 reduce_data=True,
+                 nprocs=1,
+                 segments=None):
         """
         Parameters
         ----------
@@ -939,229 +960,175 @@ class XArrayResamplerNN(object):
         y_coords = R * da.cos(da.deg2rad(lats)) * da.sin(da.deg2rad(lons))
         z_coords = R * da.sin(da.deg2rad(lats))
 
-        return da.stack((x_coords, y_coords, z_coords), axis=-1)
+        return da.stack(
+            (x_coords.ravel(), y_coords.ravel(), z_coords.ravel()), axis=-1)
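The `x`/`y`/`z` formulas above place each lon/lat pair on a sphere of radius `R` and flatten the result into an `(n, 3)` array for the kd-tree. A plain-numpy sketch of the same math (dask's `da.*` calls mirror numpy here; the `R` value is an assumed mean Earth radius in metres):

```python
import numpy as np

R = 6370997.0  # assumed mean Earth radius in metres

def transform_lonlats(lons, lats):
    # Same math as the dask version: degrees -> cartesian, then flatten
    # each component and stack into an (n, 3) coordinate array.
    x = R * np.cos(np.deg2rad(lats)) * np.cos(np.deg2rad(lons))
    y = R * np.cos(np.deg2rad(lats)) * np.sin(np.deg2rad(lons))
    z = R * np.sin(np.deg2rad(lats))
    return np.stack((x.ravel(), y.ravel(), z.ravel()), axis=-1)

# (0E, 0N) lands on the x axis; (90E, 0N) on the y axis.
coords = transform_lonlats(np.array([0.0, 90.0]), np.array([0.0, 0.0]))
```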
 
-    def _create_resample_kdtree(self, source_lons, source_lats, valid_input_index):
+    def _create_resample_kdtree(self):
         """Set up kd tree on input"""
+        source_lons, source_lats = self.source_geo_def.get_lonlats_dask()
+        valid_input_idx = ((source_lons >= -180) & (source_lons <= 180) &
+                           (source_lats <= 90) & (source_lats >= -90))
 
-        """
-        if not isinstance(source_geo_def, geometry.BaseDefinition):
-            raise TypeError('source_geo_def must be of geometry type')
-
-        #Get reduced cartesian coordinates and flatten them
-        source_cartesian_coords = source_geo_def.get_cartesian_coords(nprocs=nprocs)
-        input_coords = geometry._flatten_cartesian_coords(source_cartesian_coords)
-        input_coords = input_coords[valid_input_index]
-        """
-
-        vii = valid_input_index.compute().ravel()
-        source_lons_valid = source_lons.ravel()[vii]
-        source_lats_valid = source_lats.ravel()[vii]
-
-        input_coords = self.transform_lonlats(source_lons_valid,
-                                              source_lats_valid)
-
-        if input_coords.size == 0:
-            raise EmptyResult('No valid data points in input data')
+        # FIXME: Is dask smart enough to only compute the pixels we end up
+        #        using even with this complicated indexing?
+        input_coords = self.transform_lonlats(source_lons,
+                                              source_lats)
+        input_coords = input_coords[valid_input_idx.ravel(), :]
 
         # Build kd-tree on input
         input_coords = input_coords.astype(np.float)
+        valid_input_idx, input_coords = da.compute(valid_input_idx,
+                                                   input_coords)
         if kd_tree_name == 'pykdtree':
-            resample_kdtree = KDTree(input_coords.compute())
+            resample_kdtree = KDTree(input_coords)
         else:
-            resample_kdtree = sp.cKDTree(input_coords.compute())
+            resample_kdtree = sp.cKDTree(input_coords)
 
-        return resample_kdtree
+        return valid_input_idx, resample_kdtree
 
-    def _query_resample_kdtree(self, resample_kdtree, target_lons,
-                               target_lats, valid_output_index,
+    def _query_resample_kdtree(self,
+                               resample_kdtree,
+                               target_lons,
+                               target_lats,
+                               valid_output_index,
                                reduce_data=True):
-        """Query kd-tree on slice of target coordinates"""
-        from dask.base import tokenize
-        from dask.array import Array
-
-        def query(target_lons, target_lats, valid_output_index, c_slice):
-            voi = valid_output_index[c_slice].compute()
+        """Query kd-tree on slice of target coordinates."""
+        def query_no_distance(target_lons, target_lats, valid_output_index):
+            voi = valid_output_index
             shape = voi.shape
             voir = voi.ravel()
-            target_lons_valid = target_lons[c_slice].ravel()[voir]
-            target_lats_valid = target_lats[c_slice].ravel()[voir]
+            target_lons_valid = target_lons.ravel()[voir]
+            target_lats_valid = target_lats.ravel()[voir]
 
             coords = self.transform_lonlats(target_lons_valid,
                                             target_lats_valid)
-            distance_array, index_array = np.stack(
-                resample_kdtree.query(coords.compute(),
-                                      k=self.neighbours,
-                                      eps=self.epsilon,
-                                      distance_upper_bound=self.radius_of_influence))
-
-            res_ia = np.full(shape, fill_value=np.nan, dtype=np.float)
-            res_da = np.full(shape, fill_value=np.nan, dtype=np.float)
-            res_ia[voi] = index_array
-            res_da[voi] = distance_array
-            return np.stack([res_ia, res_da], axis=-1)
-
-        token = tokenize(1000)
-        name = 'query-' + token
-
-        dsk = {}
-        vstart = 0
-
-        for i, vck in enumerate(valid_output_index.chunks[0]):
-            hstart = 0
-            for j, hck in enumerate(valid_output_index.chunks[1]):
-                c_slice = (slice(vstart, vstart + vck),
-                           slice(hstart, hstart + hck))
-                dsk[(name, i, j, 0)] = (query, target_lons,
-                                        target_lats, valid_output_index,
-                                        c_slice)
-                hstart += hck
-            vstart += vck
-
-        res = Array(dsk, name,
-                    shape=list(valid_output_index.shape) + [2],
-                    chunks=list(valid_output_index.chunks) + [2],
-                    dtype=target_lons.dtype)
-
-        index_array = res[:, :, 0].astype(np.uint)
-        distance_array = res[:, :, 1]
-        return index_array, distance_array
+            distance_array, index_array = resample_kdtree.query(
+                coords.compute(),
+                k=self.neighbours,
+                eps=self.epsilon,
+                distance_upper_bound=self.radius_of_influence)
+
+            # KDTree query returns out-of-bounds neighbors as `len(arr)`
+            # which is an invalid index, we mask those out so -1 represents
+            # invalid values
+            # voi is 2D, index_array is 1D
+            bad_mask = index_array >= resample_kdtree.n
+            voi[bad_mask.reshape(shape)] = False
+            res_ia = np.full(shape, fill_value=-1, dtype=np.int)
+            res_ia[voi] = index_array[~bad_mask]
+            return res_ia
+
+        res = da.map_blocks(query_no_distance, target_lons, target_lats,
+                            valid_output_index, dtype=np.int)
+        return res, None
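The masking above exists because a kd-tree query reports "no neighbour within `distance_upper_bound`" by returning the index `n` (the number of points in the tree), which is out of bounds; the new code remaps those hits to `-1`. A small scipy sketch of that behaviour (assuming scipy is available; pykdtree behaves the same way):

```python
import numpy as np
from scipy.spatial import cKDTree

data = np.array([[0.0, 0.0], [1.0, 1.0]])
tree = cKDTree(data)

# One query point near the data, one far outside the cutoff radius.
dist, idx = tree.query(np.array([[0.1, 0.0], [50.0, 50.0]]),
                       k=1, distance_upper_bound=1.0)

# Misses come back with index == tree.n (an invalid position) and
# dist == inf; remap them to -1 as the resampler does.
idx = np.where(idx >= tree.n, -1, idx)
```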
 
     def get_neighbour_info(self):
-        """Returns neighbour info
+        """Return neighbour info.
 
         Returns
         -------
         (valid_input_index, valid_output_index,
         index_array, distance_array) : tuple of numpy arrays
             Neighbour resampling info
-        """
 
+        """
         if self.source_geo_def.size < self.neighbours:
             warnings.warn('Searching for %s neighbours in %s data points' %
                           (self.neighbours, self.source_geo_def.size))
 
-        source_lons, source_lats = self.source_geo_def.get_lonlats_dask()
-        valid_input_index = ((source_lons >= -180) & (source_lons <= 180) &
-                             (source_lats <= 90) & (source_lats >= -90))
-
         # Create kd-tree
-        try:
-            resample_kdtree = self._create_resample_kdtree(source_lons,
-                                                           source_lats,
-                                                           valid_input_index)
-
-        except EmptyResult:
+        valid_input_idx, resample_kdtree = self._create_resample_kdtree()
+        if resample_kdtree.n == 0:
             # Handle if all input data is reduced away
-            valid_output_index, index_array, distance_array = \
+            valid_output_idx, index_arr, distance_arr = \
                 _create_empty_info(self.source_geo_def,
                                    self.target_geo_def, self.neighbours)
-            self.valid_input_index = valid_input_index
-            self.valid_output_index = valid_output_index
-            self.index_array = index_array
-            self.distance_array = distance_array
-            return (valid_input_index, valid_output_index, index_array,
-                    distance_array)
+            self.valid_input_index = valid_input_idx
+            self.valid_output_index = valid_output_idx
+            self.index_array = index_arr
+            self.distance_array = distance_arr
+            return valid_input_idx, valid_output_idx, index_arr, distance_arr
 
         target_lons, target_lats = self.target_geo_def.get_lonlats_dask()
-        valid_output_index = ((target_lons >= -180) & (target_lons <= 180) &
-                              (target_lats <= 90) & (target_lats >= -90))
+        valid_output_idx = ((target_lons >= -180) & (target_lons <= 180) &
+                            (target_lats <= 90) & (target_lats >= -90))
 
-        index_array, distance_array = self._query_resample_kdtree(resample_kdtree,
-                                                                  target_lons,
-                                                                  target_lats,
-                                                                  valid_output_index)
+        index_arr, distance_arr = self._query_resample_kdtree(
+            resample_kdtree, target_lons, target_lats, valid_output_idx)
 
-        self.valid_input_index = valid_input_index
-        self.valid_output_index = valid_output_index
-        self.index_array = index_array
-        self.distance_array = distance_array
+        self.valid_output_index, self.index_array = \
+            da.compute(valid_output_idx, index_arr)
+        self.valid_input_index = valid_input_idx
+        self.distance_array = distance_arr
 
-        return valid_input_index, valid_output_index, index_array, distance_array
+        return (self.valid_input_index,
+                self.valid_output_index,
+                self.index_array,
+                self.distance_array)
 
     def get_sample_from_neighbour_info(self, data, fill_value=np.nan):
+        # FIXME: can this be made into a dask construct?
+        cols, lines = np.meshgrid(
+            np.arange(data['x'].size), np.arange(data['y'].size))
+        try:
+            self.valid_input_index = self.valid_input_index.compute()
+        except AttributeError:
+            pass
+        vii = self.valid_input_index.squeeze()
+        try:
+            self.index_array = self.index_array.compute()
+        except AttributeError:
+            pass
+
+        # ia contains reduced (valid) indices of the source array, and has the
+        # shape of the destination array
+        ia = self.index_array
+        rlines = lines[vii][ia]
+        rcols = cols[vii][ia]
+
+        slices = []
+        mask_slices = []
+        mask_2d_added = False
+        coords = {}
+        try:
+            coord_x, coord_y = self.target_geo_def.get_proj_vectors_dask()
+        except AttributeError:
+            coord_x, coord_y = None, None
 
-        # flatten x and y in the source array
-
-        output_shape = []
-        chunks = []
-        source_dims = data.dims
-        for dim in source_dims:
+        for i, dim in enumerate(data.dims):
             if dim == 'y':
-                output_shape += [self.target_geo_def.y_size]
-                chunks += [1000]
+                slices.append(rlines)
+                if not mask_2d_added:
+                    mask_slices.append(ia == -1)
+                    mask_2d_added = True
+                if coord_y is not None:
+                    coords[dim] = coord_y
             elif dim == 'x':
-                output_shape += [self.target_geo_def.x_size]
-                chunks += [1000]
-            else:
-                output_shape += [data[dim].size]
-                chunks += [10]
-
-        new_dims = []
-        xy_dims = []
-        source_shape = [1, 1]
-        chunks = [1, 1]
-        for i, dim in enumerate(data.dims):
-            if dim not in ['x', 'y']:
-                new_dims.append(dim)
-                source_shape[1] *= data.shape[i]
-                chunks[1] *= 10
-            else:
-                xy_dims.append(dim)
-                source_shape[0] *= data.shape[i]
-                chunks[0] *= 1000
-
-        new_dims = xy_dims + new_dims
-
-        target_shape = [np.prod(self.target_geo_def.shape), source_shape[1]]
-        source_data = data.transpose(*new_dims).data.reshape(source_shape)
-
-        input_size = self.valid_input_index.sum()
-        index_mask = (self.index_array == input_size)
-        new_index_array = da.where(
-            index_mask, 0, self.index_array).ravel().astype(int).compute()
-        valid_targets = self.valid_output_index.ravel()
-
-        target_lines = []
-
-        for line in range(target_shape[1]):
-            #target_data_line = target_data[:, line]
-            new_data = source_data[:, line][self.valid_input_index.ravel()]
-            # could this be a bug in dask ? we have to compute to avoid errors
-            result = new_data.compute()[new_index_array]
-            result[index_mask.ravel()] = fill_value
-            #target_data_line = da.full(target_shape[0], np.nan, chunks=1000000)
-            target_data_line = np.full(target_shape[0], fill_value)
-            target_data_line[valid_targets] = result
-            target_lines.append(target_data_line[:, np.newaxis])
-
-        target_data = np.hstack(target_lines)
-
-        new_shape = []
-        for dim in new_dims:
-            if dim == 'x':
-                new_shape.append(self.target_geo_def.x_size)
-            elif dim == 'y':
-                new_shape.append(self.target_geo_def.y_size)
+                slices.append(rcols)
+                if not mask_2d_added:
+                    mask_slices.append(ia == -1)
+                    mask_2d_added = True
+                if coord_x is not None:
+                    coords[dim] = coord_x
             else:
-                new_shape.append(data[dim].size)
-
-        output_arr = DataArray(da.from_array(target_data.reshape(new_shape), chunks=[1000] * len(new_shape)),
-                               dims=new_dims)
-        for dim in source_dims:
-            if dim == 'x':
-                output_arr['x'] = self.target_geo_def.proj_x_coords
-            elif dim == 'y':
-                output_arr['y'] = self.target_geo_def.proj_y_coords
-            else:
-                output_arr[dim] = data[dim]
+                slices.append(slice(None))
+                mask_slices.append(slice(None))
+                try:
+                    coords[dim] = data.coords[dim]
+                except KeyError:
+                    pass
 
-        return output_arr.transpose(*source_dims)
+        res = data.values[slices]
+        res[mask_slices] = fill_value
+        res = DataArray(da.from_array(res, chunks=CHUNK_SIZE), dims=data.dims, coords=coords)
+        return res
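The rewritten `get_sample_from_neighbour_info` gathers source pixels through the `rlines`/`rcols` index arrays (shaped like the target grid) and writes `fill_value` wherever the neighbour index was `-1`. A toy numpy sketch of that gather-and-fill step (all arrays here are made-up examples):

```python
import numpy as np

src = np.array([[10., 20.],
                [30., 40.]])
# Hypothetical neighbour indices into the source, one per target pixel;
# mask marks target pixels with no neighbour inside the radius.
rlines = np.array([[0, 1], [1, 0]])
rcols = np.array([[0, 1], [0, 1]])
mask = np.array([[False, False], [False, True]])

res = src[rlines, rcols]  # gather one source value per target pixel
res[mask] = np.nan        # fill unmatched target pixels
```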
 
 
 def _get_fill_mask_value(data_dtype):
     """Returns the maximum value of dtype"""
-
     if issubclass(data_dtype.type, np.floating):
         fill_value = np.finfo(data_dtype.type).max
     elif issubclass(data_dtype.type, np.integer):
diff --git a/pyresample/test/test_geometry.py b/pyresample/test/test_geometry.py
index 4aec478..ae8990b 100644
--- a/pyresample/test/test_geometry.py
+++ b/pyresample/test/test_geometry.py
@@ -91,43 +91,6 @@ class Test(unittest.TestCase):
         self.assertTrue((cart_coords.sum() - exp) < 1e-7 * exp,
                         msg='Calculation of cartesian coordinates failed')
 
-    def test_base_lat_invalid(self):
-
-        lons = np.arange(-135., +135, 20.)
-        lats = np.ones_like(lons) * 70.
-        lats[0] = -95
-        lats[1] = +95
-        self.assertRaises(
-            ValueError, geometry.BaseDefinition, lons=lons, lats=lats)
-
-    def test_base_lon_wrapping(self):
-
-        lons1 = np.arange(-135., +135, 50.)
-        lats = np.ones_like(lons1) * 70.
-
-        with catch_warnings() as w1:
-            base_def1 = geometry.BaseDefinition(lons1, lats)
-            self.assertFalse(
-                len(w1) != 0, 'Got warning <%s>, but was not expecting one' % w1)
-
-        lons2 = np.where(lons1 < 0, lons1 + 360, lons1)
-        with catch_warnings() as w2:
-            base_def2 = geometry.BaseDefinition(lons2, lats)
-            self.assertFalse(
-                len(w2) != 1, 'Failed to trigger a warning on longitude wrapping')
-            self.assertFalse(('-180:+180' not in str(w2[0].message)),
-                             'Failed to trigger correct warning about longitude wrapping')
-
-        self.assertFalse(
-            base_def1 != base_def2, 'longitude wrapping to [-180:+180] did not work')
-
-        with catch_warnings() as w3:
-            base_def3 = geometry.BaseDefinition(None, None)
-            self.assertFalse(
-                len(w3) != 0, 'Got warning <%s>, but was not expecting one' % w3)
-
-        self.assert_raises(ValueError, base_def3.get_lonlats)
-
     def test_base_type(self):
         lons1 = np.arange(-135., +135, 50.)
         lats = np.ones_like(lons1) * 70.
@@ -216,13 +179,24 @@ class Test(unittest.TestCase):
 
     def test_get_array_hashable(self):
         arr = np.array([1.2, 1.3, 1.4, 1.5])
-
-        np.testing.assert_allclose(np.array([ 51,  51,  51,  51,  51,  51, 243,
-                                           63, 205, 204, 204, 204, 204,
-                                           204, 244,  63, 102, 102, 102, 102,
-                                           102, 102, 246,  63,   0,   0,
-                                           0,   0,   0,   0, 248,  63],
-                                           dtype=np.uint8),
+        if sys.byteorder == 'little':
+            # arr.view(np.uint8)
+            reference = np.array([ 51,  51,  51,  51,  51,  51, 243,
+                                   63, 205, 204, 204, 204, 204,
+                                   204, 244,  63, 102, 102, 102, 102,
+                                   102, 102, 246,  63,   0,   0,
+                                   0,   0,   0,   0, 248,  63],
+                                   dtype=np.uint8)
+        else:
+            # on big-endian machines use arr.byteswap().view(np.uint8)
+            reference = np.array([ 63, 243,  51,  51,  51,  51,  51,
+                                   51,  63, 244, 204, 204, 204,
+                                   204, 204, 205,  63, 246, 102, 102,
+                                   102, 102, 102, 102,  63, 248,
+                                   0,   0,   0,   0,   0,   0],
+                                   dtype=np.uint8)
+
+        np.testing.assert_allclose(reference,
                                     geometry.get_array_hashable(arr))
 
         try:
@@ -231,13 +205,8 @@ class Test(unittest.TestCase):
             pass
         else:
             xrarr = xr.DataArray(arr)
-            np.testing.assert_allclose(np.array([ 51,  51,  51,  51,  51,  51, 243,
-                                           63, 205, 204, 204, 204, 204,
-                                           204, 244,  63, 102, 102, 102, 102,
-                                           102, 102, 246,  63,   0,   0,
-                                           0,   0,   0,   0, 248,  63],
-                                           dtype=np.uint8),
-                                    geometry.get_array_hashable(arr))
+            np.testing.assert_allclose(reference,
+                                       geometry.get_array_hashable(arr))
 
             xrarr.attrs['hash'] = 42
             self.assertEqual(geometry.get_array_hashable(xrarr),
@@ -755,27 +724,6 @@ class TestSwathDefinition(unittest.TestCase):
         self.assertFalse(id(lons1) != id(lons2) or id(lats1) != id(lats2),
                          msg='Caching of swath coordinates failed')
 
-    def test_swath_wrap(self):
-        lons1 = np.fromfunction(lambda y, x: 3 + (10.0 / 100) * x, (5000, 100))
-        lats1 = np.fromfunction(
-            lambda y, x: 75 - (50.0 / 5000) * y, (5000, 100))
-
-        lons1 += 180.
-        with catch_warnings() as w1:
-            swath_def = geometry.BaseDefinition(lons1, lats1)
-            self.assertFalse(
-                len(w1) != 1, 'Failed to trigger a warning on longitude wrapping')
-            self.assertFalse(('-180:+180' not in str(w1[0].message)),
-                             'Failed to trigger correct warning about longitude wrapping')
-
-        lons2, lats2 = swath_def.get_lonlats()
-
-        self.assertTrue(id(lons1) != id(lons2),
-                        msg='Caching of swath coordinates failed with longitude wrapping')
-
-        self.assertTrue(lons2.min() > -180 and lons2.max() < 180,
-                        'Wrapping of longitudes failed for SwathDefinition')
-
     def test_concat_1d(self):
         lons1 = np.array([1, 2, 3])
         lats1 = np.array([1, 2, 3])
@@ -894,7 +842,7 @@ class TestSwathDefinition(unittest.TestCase):
 
         lons = np.array([[-45., 0., 45.],
                          [-90, 0., 90.],
-                         [-135., 180., 135.]]).T
+                         [-135., -180., 135.]]).T
 
         area = geometry.SwathDefinition(lons, lats)
         lons, lats = area.get_edge_lonlats()
diff --git a/pyresample/test/test_utils.py b/pyresample/test/test_utils.py
index b43655c..0272bf9 100644
--- a/pyresample/test/test_utils.py
+++ b/pyresample/test/test_utils.py
@@ -188,6 +188,24 @@ class TestMisc(unittest.TestCase):
         self.assertFalse(
             (wlons.min() < -180) or (wlons.max() >= 180) or (+180 in wlons))
 
+    def test_wrap_and_check(self):
+        from pyresample import utils
+
+        lons1 = np.arange(-135., +135, 50.)
+        lats = np.ones_like(lons1) * 70.
+        new_lons, new_lats = utils.check_and_wrap(lons1, lats)
+        self.assertIs(lats, new_lats)
+        self.assertTrue(np.isclose(lons1, new_lons).all())
+
+        lons2 = np.where(lons1 < 0, lons1 + 360, lons1)
+        new_lons, new_lats = utils.check_and_wrap(lons2, lats)
+        self.assertIs(lats, new_lats)
+        # after wrapping lons2 should look like lons1
+        self.assertTrue(np.isclose(lons1, new_lons).all())
+
+        lats2 = lats + 25.
+        self.assertRaises(ValueError, utils.check_and_wrap, lons1, lats2)
+
     def test_unicode_proj4_string(self):
         """Test that unicode is accepted for area creation.
         """
diff --git a/pyresample/utils.py b/pyresample/utils.py
index c97ae5a..c1570f1 100644
--- a/pyresample/utils.py
+++ b/pyresample/utils.py
@@ -30,10 +30,8 @@ import yaml
 from configobj import ConfigObj
 from collections import Mapping
 
-import pyresample as pr
 
-
-class AreaNotFound(Exception):
+class AreaNotFound(KeyError):
 
     """Exception raised when specified are is no found in file"""
     pass
@@ -126,6 +124,7 @@ def _parse_yaml_area_file(area_file_name, *regions):
     the files, using the first file as the "base", replacing things after
     that.
     """
+    from pyresample.geometry import DynamicAreaDefinition
     area_dict = _read_yaml_area_file_content(area_file_name)
     area_list = regions or area_dict.keys()
 
@@ -154,11 +153,11 @@ def _parse_yaml_area_file(area_file_name, *regions):
             rotation = params['rotation']
         except KeyError:
             rotation = 0
-        area = pr.geometry.DynamicAreaDefinition(area_name, description,
-                                                 projection, xsize, ysize,
-                                                 area_extent,
-                                                 optimize_projection,
-                                                 rotation)
+        area = DynamicAreaDefinition(area_name, description,
+                                     projection, xsize, ysize,
+                                     area_extent,
+                                     optimize_projection,
+                                     rotation)
         try:
             area = area.freeze()
         except (TypeError, AttributeError):
@@ -230,6 +229,7 @@ def _parse_legacy_area_file(area_file_name, *regions):
 
 def _create_area(area_id, area_content):
     """Parse area configuration"""
+    from pyresample.geometry import AreaDefinition
 
     config_obj = area_content.replace('{', '').replace('};', '')
     config_obj = ConfigObj([line.replace(':', '=', 1)
@@ -257,10 +257,10 @@ def _create_area(area_id, area_content):
         config['AREA_EXTENT'][i] = float(val)
 
     config['PCS_DEF'] = _get_proj4_args(config['PCS_DEF'])
-    return pr.geometry.AreaDefinition(config['REGION'], config['NAME'],
-                                      config['PCS_ID'], config['PCS_DEF'],
-                                      config['XSIZE'], config['YSIZE'],
-                                      config['AREA_EXTENT'], config['ROTATION'])
+    return AreaDefinition(config['REGION'], config['NAME'],
+                          config['PCS_ID'], config['PCS_DEF'],
+                          config['XSIZE'], config['YSIZE'],
+                          config['AREA_EXTENT'], config['ROTATION'])
 
 
 def get_area_def(area_id, area_name, proj_id, proj4_args, x_size, y_size,
@@ -292,9 +292,10 @@ def get_area_def(area_id, area_name, proj_id, proj4_args, x_size, y_size,
         AreaDefinition object
     """
 
+    from pyresample.geometry import AreaDefinition
     proj_dict = _get_proj4_args(proj4_args)
-    return pr.geometry.AreaDefinition(area_id, area_name, proj_id, proj_dict,
-                                      x_size, y_size, area_extent)
+    return AreaDefinition(area_id, area_name, proj_id, proj_dict,
+                          x_size, y_size, area_extent)
 
 
 def generate_quick_linesample_arrays(source_area_def, target_area_def,
@@ -314,11 +315,12 @@ def generate_quick_linesample_arrays(source_area_def, target_area_def,
     -------
     (row_indices, col_indices) : tuple of numpy arrays
     """
+    from pyresample.grid import get_linesample
     lons, lats = target_area_def.get_lonlats(nprocs)
 
-    source_pixel_y, source_pixel_x = pr.grid.get_linesample(lons, lats,
-                                                            source_area_def,
-                                                            nprocs=nprocs)
+    source_pixel_y, source_pixel_x = get_linesample(lons, lats,
+                                                    source_area_def,
+                                                    nprocs=nprocs)
 
     source_pixel_x = _downcast_index_array(source_pixel_x,
                                            source_area_def.shape[1])
@@ -350,12 +352,13 @@ def generate_nearest_neighbour_linesample_arrays(source_area_def,
     (row_indices, col_indices) : tuple of numpy arrays
     """
 
+    from pyresample.kd_tree import get_neighbour_info
     valid_input_index, valid_output_index, index_array, distance_array = \
-        pr.kd_tree.get_neighbour_info(source_area_def,
-                                      target_area_def,
-                                      radius_of_influence,
-                                      neighbours=1,
-                                      nprocs=nprocs)
+        get_neighbour_info(source_area_def,
+                           target_area_def,
+                           radius_of_influence,
+                           neighbours=1,
+                           nprocs=nprocs)
     # Enumerate rows and cols
     rows = np.fromfunction(lambda i, j: i, source_area_def.shape,
                            dtype=np.int32).ravel()
@@ -459,11 +462,11 @@ def proj4_radius_parameters(proj4_dict):
         new_info = proj4_str_to_dict(proj4_dict)
     else:
         new_info = proj4_dict.copy()
-      
+
     # load information from PROJ.4 about the ellipsoid if possible
-    
+
     from pyproj import Geod
-    
+
     if 'ellps' in new_info:
         geod = Geod(**new_info)
         new_info['a'] = geod.a
@@ -481,7 +484,7 @@ def proj4_radius_parameters(proj4_dict):
             geod = Geod(**{'ellps': 'WGS84'})
             new_info['a'] = geod.a
             new_info['b'] = geod.b
-	     
+
     return float(new_info['a']), float(new_info['b'])
 
 
@@ -513,6 +516,34 @@ def wrap_longitudes(lons):
     return (lons + 180) % 360 - 180
 
 
+def check_and_wrap(lons, lats):
+    """Wrap longitude to [-180:+180[ and check latitude for validity.
+
+    Args:
+        lons (ndarray): Longitude degrees
+        lats (ndarray): Latitude degrees
+
+    Returns:
+        lons, lats: Longitude degrees in the range [-180:180[ and the original
+                    latitude array
+
+    Raises:
+        ValueError: If latitude array is not between -90 and 90
+
+    """
+    # check the latitudes
+    if lats.min() < -90. or lats.max() > 90.:
+        raise ValueError(
+            'Some latitudes are outside the [-90.:+90] validity range')
+
+    # check the longitudes
+    if lons.min() < -180. or lons.max() >= 180.:
+        # wrap longitudes to [-180;+180[
+        lons = wrap_longitudes(lons)
+
+    return lons, lats
+
+
 def recursive_dict_update(d, u):
     """Recursive dictionary update using
 
diff --git a/pyresample/version.py b/pyresample/version.py
index 38c6ea6..a77942a 100644
--- a/pyresample/version.py
+++ b/pyresample/version.py
@@ -15,4 +15,4 @@
 # You should have received a copy of the GNU Lesser General Public License along
 # with this program.  If not, see <http://www.gnu.org/licenses/>.
 
-__version__ = '1.7.1'
+__version__ = '1.8.0'
diff --git a/setup.py b/setup.py
index 7b666f7..45a49d2 100644
--- a/setup.py
+++ b/setup.py
@@ -26,11 +26,12 @@ from setuptools.command.build_ext import build_ext as _build_ext
 
 version = imp.load_source('pyresample.version', 'pyresample/version.py')
 
-requirements = ['setuptools>=3.2', 'pyproj', 'numpy', 'configobj',
+requirements = ['setuptools>=3.2', 'pyproj>=1.9.5.1', 'numpy', 'configobj',
                 'pykdtree>=1.1.1', 'pyyaml', 'six']
 extras_require = {'pykdtree': ['pykdtree>=1.1.1'],
                   'numexpr': ['numexpr'],
-                  'quicklook': ['matplotlib', 'basemap', 'pillow']}
+                  'quicklook': ['matplotlib', 'basemap', 'pillow'],
+                  'dask': ['dask>=0.16.1']}
 
 test_requires = []
 if sys.version_info < (3, 3):

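For readers skimming the diff: the new `check_and_wrap` helper added to `pyresample/utils.py` validates latitudes and folds longitudes into the half-open interval [-180:+180[ using the same arithmetic as `wrap_longitudes`. The standalone function below is a hypothetical re-statement of that logic for illustration, not the packaged implementation:

```python
import numpy as np

def wrap_and_check_sketch(lons, lats):
    """Sketch of the check_and_wrap logic from this commit: reject
    out-of-range latitudes, then wrap longitudes to [-180, 180)."""
    lons = np.asarray(lons)
    lats = np.asarray(lats)
    if lats.min() < -90.0 or lats.max() > 90.0:
        raise ValueError(
            'Some latitudes are outside the [-90.:+90] validity range')
    if lons.min() < -180.0 or lons.max() >= 180.0:
        # (x + 180) % 360 - 180 maps e.g. 190 -> -170 and 180 -> -180
        lons = (lons + 180) % 360 - 180
    return lons, lats
```

Note that the wrap only runs when some longitude is outside the target range, so already-wrapped arrays are returned untouched.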
-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/pkg-grass/pyresample.git



More information about the Pkg-grass-devel mailing list