[Git][debian-gis-team/python-pdal][upstream] New upstream version 2.0.0+ds

Bas Couwenberg gitlab at salsa.debian.org
Sun Apr 8 17:25:16 UTC 2018


Bas Couwenberg pushed to branch upstream at Debian GIS Project / python-pdal


Commits:
ee79bb2d by Bas Couwenberg at 2018-04-08T19:12:32+02:00
New upstream version 2.0.0+ds
- - - - -


11 changed files:

- PKG-INFO
- README.rst
- VERSION.txt
- + pdal/PyArray.hpp
- pdal/PyPipeline.cpp
- pdal/PyPipeline.hpp
- pdal/__init__.py
- pdal/libpdalpython.cpp
- pdal/libpdalpython.pyx
- setup.py
- test/test_pipeline.py


Changes:

=====================================
PKG-INFO
=====================================
--- a/PKG-INFO
+++ b/PKG-INFO
@@ -1,11 +1,12 @@
 Metadata-Version: 1.1
 Name: PDAL
-Version: 1.6.0
+Version: 2.0.0
 Summary: Point cloud data processing
 Home-page: http://pdal.io
 Author: Howard Butler
 Author-email: howard at hobu.co
 License: BSD
+Description-Content-Type: UNKNOWN
 Description: ================================================================================
         PDAL
         ================================================================================
@@ -14,6 +15,10 @@ Description: ===================================================================
         arrays. Additionally, you can use it to fetch `schema`_ and `metadata`_ from
         PDAL operations.
         
+        The repository for PDAL's Python extension is available at https://github.com/PDAL/python
+        
+        It is released independently from PDAL itself as of PDAL 1.7.
+        
         Usage
         --------------------------------------------------------------------------------
         
@@ -53,7 +58,7 @@ Description: ===================================================================
         Requirements
         ================================================================================
         
-        * PDAL 1.4+
+        * PDAL 1.7+
         * Python >=2.7 (including Python 3.x)
         
         


=====================================
README.rst
=====================================
--- a/README.rst
+++ b/README.rst
@@ -6,6 +6,10 @@ The PDAL Python extension allows you to process data with PDAL into `Numpy`_
 arrays. Additionally, you can use it to fetch `schema`_ and `metadata`_ from
 PDAL operations.
 
+The repository for PDAL's Python extension is available at https://github.com/PDAL/python
+
+It is released independently from PDAL itself as of PDAL 1.7.
+
 Usage
 --------------------------------------------------------------------------------
 
@@ -45,6 +49,6 @@ sorts it by the ``X`` dimension:
 Requirements
 ================================================================================
 
-* PDAL 1.4+
+* PDAL 1.7+
 * Python >=2.7 (including Python 3.x)
 

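The README text above describes passing a JSON pipeline to the extension and reading back numpy arrays, schema, and metadata. A minimal sketch of such a pipeline definition (the `input.las` filename is illustrative; the `pdal.Pipeline` calls shown in comments assume the extension is installed):

```python
import json

# A minimal pipeline of the kind the extension consumes: read a LAS file
# and sort it by the X dimension (the filename is illustrative).
pipeline = json.dumps({
    "pipeline": [
        "input.las",
        {"type": "filters.sort", "dimension": "X"},
    ]
})

# With the extension installed, the JSON would be executed like this:
#   p = pdal.Pipeline(pipeline)
#   count = p.execute()   # number of points processed
#   arrays = p.arrays     # list of numpy structured arrays
#   meta = p.metadata     # JSON metadata from the run
stages = json.loads(pipeline)["pipeline"]
```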

=====================================
VERSION.txt
=====================================
--- a/VERSION.txt
+++ b/VERSION.txt
@@ -1 +1 @@
-1.6.0
\ No newline at end of file
+2.0.0
\ No newline at end of file


=====================================
pdal/PyArray.hpp
=====================================
--- /dev/null
+++ b/pdal/PyArray.hpp
@@ -0,0 +1,212 @@
+/******************************************************************************
+* Copyright (c) 2011, Michael P. Gerlek (mpg at flaxen.com)
+*
+* All rights reserved.
+*
+* Redistribution and use in source and binary forms, with or without
+* modification, are permitted provided that the following
+* conditions are met:
+*
+*     * Redistributions of source code must retain the above copyright
+*       notice, this list of conditions and the following disclaimer.
+*     * Redistributions in binary form must reproduce the above copyright
+*       notice, this list of conditions and the following disclaimer in
+*       the documentation and/or other materials provided
+*       with the distribution.
+*     * Neither the name of Hobu, Inc. or Flaxen Geo Consulting nor the
+*       names of its contributors may be used to endorse or promote
+*       products derived from this software without specific prior
+*       written permission.
+*
+* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
+* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
+* COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
+* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
+* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
+* OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED
+* AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT
+* OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
+* OF SUCH DAMAGE.
+****************************************************************************/
+
+#pragma once
+
+#include <pdal/PointView.hpp>
+
+#include <algorithm>
+
+#pragma warning(disable: 4127) // conditional expression is constant
+
+
+#include <Python.h>
+#undef toupper
+#undef tolower
+#undef isspace
+
+#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
+#include <numpy/arrayobject.h>
+
+// forward declare PyObject so we don't need the python headers everywhere
+// see: http://mail.python.org/pipermail/python-dev/2003-August/037601.html
+#ifndef PyObject_HEAD
+struct _object;
+typedef _object PyObject;
+#endif
+
+namespace pdal
+{
+namespace python
+{
+
+
+class PDAL_DLL Array
+{
+public:
+
+    Array() : m_py_array(0)
+    {}
+
+    ~Array()
+    {
+        cleanup();
+    }
+
+
+    inline void update(PointViewPtr view)
+    {
+        typedef std::unique_ptr<std::vector<uint8_t>> DataPtr;
+        cleanup();
+        int nd = 1;
+        Dimension::IdList dims = view->dims();
+        npy_intp mydims = view->size();
+        npy_intp* ndims = &mydims;
+        std::vector<npy_intp> strides(dims.size());
+
+
+        DataPtr pdata( new std::vector<uint8_t>(view->pointSize()* view->size(), 0));
+
+        PyArray_Descr *dtype(0);
+        PyObject * dtype_dict = (PyObject*)buildNumpyDescription(view);
+        if (!dtype_dict)
+            throw pdal_error("Unable to build numpy dtype description dictionary");
+        int did_convert = PyArray_DescrConverter(dtype_dict, &dtype);
+        if (did_convert == NPY_FAIL)
+            throw pdal_error("Unable to build numpy dtype");
+        Py_XDECREF(dtype_dict);
+
+#ifdef NPY_ARRAY_CARRAY
+        int flags = NPY_ARRAY_CARRAY;
+#else
+        int flags = NPY_CARRAY;
+#endif
+        uint8_t* sp = pdata.get()->data();
+        PyObject * pyArray = PyArray_NewFromDescr(&PyArray_Type,
+                                                  dtype,
+                                                  nd,
+                                                  ndims,
+                                                  0,
+                                                  sp,
+                                                  flags,
+                                                  NULL);
+
+        // copy the data
+        uint8_t* p(sp);
+        DimTypeList types = view->dimTypes();
+        for (PointId idx = 0; idx < view->size(); idx++)
+        {
+            p = sp + (view->pointSize() * idx);
+            view->getPackedPoint(types, idx, (char*)p);
+        }
+
+        m_py_array = pyArray;
+        m_data_array = std::move(pdata);
+    }
+
+
+    inline PyObject* getPythonArray() const { return m_py_array; }
+
+
+private:
+
+    inline void cleanup()
+    {
+        PyObject* p = (PyObject*)(m_py_array);
+        Py_XDECREF(p);
+        m_data_array.reset();
+    }
+
+    inline PyObject* buildNumpyDescription(PointViewPtr view) const
+    {
+
+        // Build up a numpy dtype dictionary
+        //
+        // {'formats': ['f8', 'f8', 'f8', 'u2', 'u1', 'u1', 'u1', 'u1', 'u1', 'f4', 'u1', 'u2', 'f8', 'u2', 'u2', 'u2'],
+        // 'names': ['X', 'Y', 'Z', 'Intensity', 'ReturnNumber', 'NumberOfReturns',
+        // 'ScanDirectionFlag', 'EdgeOfFlightLine', 'Classification',
+        // 'ScanAngleRank', 'UserData', 'PointSourceId', 'GpsTime', 'Red', 'Green',
+        // 'Blue']}
+        //
+
+        std::stringstream oss;
+        Dimension::IdList dims = view->dims();
+
+        PyObject* dict = PyDict_New();
+        PyObject* sizes = PyList_New(dims.size());
+        PyObject* formats = PyList_New(dims.size());
+        PyObject* titles = PyList_New(dims.size());
+
+        for (Dimension::IdList::size_type i=0; i < dims.size(); ++i)
+        {
+            Dimension::Id id = (dims[i]);
+            Dimension::Type t = view->dimType(id);
+            npy_intp stride = view->dimSize(id);
+
+            std::string name = view->dimName(id);
+
+            std::string kind("i");
+            Dimension::BaseType b = Dimension::base(t);
+            if (b == Dimension::BaseType::Unsigned)
+                kind = "u";
+            else if (b == Dimension::BaseType::Floating)
+                kind = "f";
+            else
+            {
+                std::stringstream o;
+                o << "unable to map kind '" << kind << "' to PDAL dimension type";
+                throw pdal::pdal_error(o.str());
+            }
+
+            oss << kind << stride;
+            PyObject* pySize = PyLong_FromLong(stride);
+            PyObject* pyTitle = PyUnicode_FromString(name.c_str());
+            PyObject* pyFormat = PyUnicode_FromString(oss.str().c_str());
+
+            PyList_SetItem(sizes, i, pySize);
+            PyList_SetItem(titles, i, pyTitle);
+            PyList_SetItem(formats, i, pyFormat);
+
+            oss.str("");
+        }
+
+        PyDict_SetItemString(dict, "names", titles);
+        PyDict_SetItemString(dict, "formats", formats);
+
+    //     PyObject* obj = PyUnicode_AsASCIIString(PyObject_Str(dict));
+    //     const char* s = PyBytes_AsString(obj);
+    //     std::string output(s);
+    //     std::cout << "array: " << output << std::endl;
+        return dict;
+    }
+
+    PyObject* m_py_array;
+    std::unique_ptr<std::vector<uint8_t> > m_data_array;
+
+    Array& operator=(Array const& rhs);
+};
+
+} // namespace python
+} // namespace pdal
+


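The dictionary that `buildNumpyDescription()` assembles above is handed to `PyArray_DescrConverter`; its Python-level equivalent is passing the same dict to `numpy.dtype`. A small sketch with the field list shortened for brevity (the sample coordinate values are illustrative, taken from the test data used later in this commit):

```python
import numpy as np

# The C++ code builds {'names': [...], 'formats': [...]} and converts it to
# a structured dtype; numpy.dtype accepts the same dictionary directly.
description = {
    "names":   ["X", "Y", "Z", "Intensity"],
    "formats": ["f8", "f8", "f8", "u2"],
}
dtype = np.dtype(description)

# One packed record per point, matching the getPackedPoint() layout above.
points = np.zeros(2, dtype=dtype)
points[0] = (635619.85, 848899.70, 420.51, 112)
```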
=====================================
pdal/PyPipeline.cpp
=====================================
--- a/pdal/PyPipeline.cpp
+++ b/pdal/PyPipeline.cpp
@@ -37,21 +37,36 @@
 #include <pdal/XMLSchema.hpp>
 #endif
 
+#ifndef _WIN32
+#include <dlfcn.h>
+#endif
+
+#include <Python.h>
+#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
+#include <numpy/arrayobject.h>
+
+#include "PyArray.hpp"
 
 namespace libpdalpython
 {
 
+using namespace pdal::python;
+
 Pipeline::Pipeline(std::string const& json)
     : m_executor(json)
 {
-    auto initNumpy = []()
-    {
+    // Make the symbols in pdal_base global so that they're accessible
+    // to PDAL plugins.  Python dlopen's this extension with RTLD_LOCAL,
+    // which means that without this, symbols in libpdal_base aren't available
+    // for resolution of symbols during future runtime linking.  This is an
+    // issue on Alpine and other Linux variants that don't use UNIQUE symbols
+    // for C++ template statics.
+#ifndef _WIN32
+    ::dlopen("libpdal_base.so", RTLD_NOLOAD | RTLD_GLOBAL);
+#endif
 #undef NUMPY_IMPORT_ARRAY_RETVAL
 #define NUMPY_IMPORT_ARRAY_RETVAL
-        import_array();
-    };
-
-    initNumpy();
+    import_array();
 }
 
 Pipeline::~Pipeline()
@@ -80,9 +95,9 @@ bool Pipeline::validate()
     return m_executor.validate();
 }
 
-std::vector<PArray> Pipeline::getArrays() const
+std::vector<Array *> Pipeline::getArrays() const
 {
-    std::vector<PArray> output;
+    std::vector<Array *> output;
 
     if (!m_executor.executed())
         throw python_error("call execute() before fetching arrays");
@@ -91,7 +106,8 @@ std::vector<PArray> Pipeline::getArrays() const
 
     for (auto i: pvset)
     {
-        PArray array = new pdal::python::Array;
+        //ABELL - Leak?
+        Array *array = new pdal::python::Array;
         array->update(i);
         output.push_back(array);
     }

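The `RTLD_NOLOAD | RTLD_GLOBAL` promotion the constructor performs above can be sketched from Python with `ctypes`; `libpdal_base.so` mirrors the C++ call and is not expected to be present on every system, so the helper treats "not already loaded" as a no-op:

```python
import ctypes
import os
import sys

def promote_symbols(libname):
    """Re-open an already-loaded shared library with RTLD_GLOBAL so its
    symbols become visible to plugins loaded afterwards.  RTLD_NOLOAD
    makes this a no-op (returning False) if the library isn't loaded."""
    if sys.platform.startswith("win"):
        return False
    flags = ctypes.RTLD_GLOBAL | getattr(os, "RTLD_NOLOAD", 0)
    try:
        ctypes.CDLL(libname, mode=flags)
        return True
    except OSError:
        return False

# Mirrors the dlopen("libpdal_base.so", RTLD_NOLOAD | RTLD_GLOBAL) call.
promote_symbols("libpdal_base.so")
```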

=====================================
pdal/PyPipeline.hpp
=====================================
--- a/pdal/PyPipeline.hpp
+++ b/pdal/PyPipeline.hpp
@@ -37,7 +37,6 @@
 #include <pdal/PipelineManager.hpp>
 #include <pdal/PipelineWriter.hpp>
 #include <pdal/util/FileUtils.hpp>
-#include <pdal/PyArray.hpp>
 #include <pdal/PipelineExecutor.hpp>
 
 #include <string>
@@ -46,12 +45,13 @@
 #undef tolower
 #undef isspace
 
-#ifndef PY_ARRAY_UNIQUE_SYMBOL
-#define PY_ARRAY_UNIQUE_SYMBOL LIBPDALPYTHON_ARRAY_API
-#endif
-
-#include <numpy/arrayobject.h>
-
+namespace pdal
+{
+namespace python
+{
+    class Array;
+}
+}
 
 namespace libpdalpython
 {
@@ -63,8 +63,6 @@ public:
         {}
 };
 
-    typedef pdal::python::Array* PArray;
-
 class Pipeline {
 public:
     Pipeline(std::string const& xml);
@@ -88,16 +86,13 @@ public:
     {
         return m_executor.getLog();
     }
-    std::vector<PArray> getArrays() const;
-
+    std::vector<pdal::python::Array *> getArrays() const;
 
     void setLogLevel(int level);
     int getLogLevel() const;
 
 private:
-
     pdal::PipelineExecutor m_executor;
-
 };
 
 }


=====================================
pdal/__init__.py
=====================================
--- a/pdal/__init__.py
+++ b/pdal/__init__.py
@@ -1,3 +1,3 @@
-__version__='1.6.0'
+__version__='2.0.0'
 
 from .pipeline import Pipeline


=====================================
pdal/libpdalpython.cpp
=====================================
The diff for this file was not included because it is too large.

=====================================
pdal/libpdalpython.pyx
=====================================
--- a/pdal/libpdalpython.pyx
+++ b/pdal/libpdalpython.pyx
@@ -12,7 +12,7 @@ from cpython cimport PyObject, Py_INCREF
 from cython.operator cimport dereference as deref, preincrement as inc
 
 
-cdef extern from "pdal/PyArray.hpp" namespace "pdal::python":
+cdef extern from "PyArray.hpp" namespace "pdal::python":
     cdef cppclass Array:
         void* getPythonArray() except+
 


=====================================
setup.py
=====================================
--- a/setup.py
+++ b/setup.py
@@ -106,6 +106,12 @@ include_dirs = []
 library_dirs = []
 libraries = []
 extra_link_args = []
+extra_compile_args = []
+
+if os.name in ['nt']:
+    library_dirs = ['c:/OSGeo4W64/lib']
+    libraries = ['pdalcpp','pdal_util','ws2_32']
+    extra_compile_args = ['/DNOMINMAX',]
 
 from setuptools.extension import Extension as DistutilsExtension
 
@@ -144,11 +150,14 @@ if pdal_config and "clean" not in sys.argv:
             libraries.append(item[2:])
 
 include_dirs.append(numpy.get_include())
-extra_compile_args = ['-std=c++11',]
+
+if os.name != 'nt':
+    extra_compile_args = ['-std=c++11','-Wno-unknown-pragmas']
 
 DEBUG=False
 if DEBUG:
-    extra_compile_args += ['-g','-O0']
+    if os.name != 'nt':
+        extra_compile_args += ['-g','-O0']
 
 sources=['pdal/libpdalpython'+ext, "pdal/PyPipeline.cpp"  ]
 extensions = [DistutilsExtension("*",
@@ -190,6 +199,7 @@ setup_args = dict(
         'Topic :: Scientific/Engineering :: GIS',
     ],
     cmdclass           = {},
+    install_requires   = ['numpy', 'packaging'],
 )
 setup(ext_modules=extensions, **setup_args)
 

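The `pdal_config` handling in setup.py splits linker flags by prefix (the `item[2:]` slice visible in the hunk above strips `-l`). A standalone sketch of that pattern, with an illustrative flag string:

```python
def split_link_flags(output):
    """Split a pkg-config/pdal-config style --libs string into the pieces
    distutils wants: -l entries become libraries, -L entries become
    library_dirs, anything else is passed through as a link arg."""
    libraries, library_dirs, extra_link_args = [], [], []
    for item in output.split():
        if item.startswith("-l"):
            libraries.append(item[2:])
        elif item.startswith("-L"):
            library_dirs.append(item[2:])
        else:
            extra_link_args.append(item)
    return libraries, library_dirs, extra_link_args

libs, dirs, extra = split_link_flags("-L/usr/local/lib -lpdalcpp -lpdal_util")
```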

=====================================
test/test_pipeline.py
=====================================
--- a/test/test_pipeline.py
+++ b/test/test_pipeline.py
@@ -2,9 +2,7 @@ import unittest
 import pdal
 import os
 
-DATADIRECTORY = os.environ.get('PDAL_TEST_DIR')
-if not DATADIRECTORY:
-    DATADIRECTORY = "../test"
+DATADIRECTORY = "./test/data"
 
 bad_json = u"""
 {
@@ -18,11 +16,10 @@ bad_json = u"""
 }
 """
 
+print (os.path.abspath(os.path.join(DATADIRECTORY, 'sort.json')))
+
 class TestPipeline(unittest.TestCase):
 
-    DATADIRECTORY = os.environ.get('PDAL_TEST_DIR')
-    if not DATADIRECTORY:
-        DATADIRECTORY = "../test"
     def fetch_json(self, filename):
         import os
         fn = DATADIRECTORY + os.path.sep +  filename
@@ -31,18 +28,18 @@ class TestPipeline(unittest.TestCase):
             output = f.read().decode('UTF-8')
         return output
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/sort.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')),
                          "missing test data")
     def test_construction(self):
         """Can we construct a PDAL pipeline"""
-        json = self.fetch_json('/data/pipeline/sort.json')
+        json = self.fetch_json('sort.json')
         r = pdal.Pipeline(json)
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/sort.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')),
                          "missing test data")
     def test_execution(self):
         """Can we execute a PDAL pipeline"""
-        x = self.fetch_json('/data/pipeline/sort.json')
+        x = self.fetch_json('sort.json')
         r = pdal.Pipeline(x)
         r.execute()
         self.assertGreater(len(r.pipeline), 200)
@@ -53,11 +50,11 @@ class TestPipeline(unittest.TestCase):
         with self.assertRaises(RuntimeError):
             r.validate()
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/sort.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')),
                          "missing test data")
     def test_array(self):
         """Can we fetch PDAL data as a numpy array"""
-        json = self.fetch_json('/data/pipeline/sort.json')
+        json = self.fetch_json('sort.json')
         r = pdal.Pipeline(json)
         r.execute()
         arrays = r.arrays
@@ -67,11 +64,11 @@ class TestPipeline(unittest.TestCase):
         self.assertAlmostEqual(a[0][0], 635619.85, 7)
         self.assertAlmostEqual(a[1064][2], 456.92, 7)
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/sort.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')),
                          "missing test data")
     def test_metadata(self):
         """Can we fetch PDAL metadata"""
-        json = self.fetch_json('/data/pipeline/sort.json')
+        json = self.fetch_json('sort.json')
         r = pdal.Pipeline(json)
         r.execute()
         metadata = r.metadata
@@ -80,40 +77,40 @@ class TestPipeline(unittest.TestCase):
         self.assertEqual(j["metadata"]["readers.las"]["count"], 1065)
 
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/sort.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')),
                          "missing test data")
     def test_no_execute(self):
         """Does fetching arrays without executing throw an exception"""
-        json = self.fetch_json('/data/pipeline/sort.json')
+        json = self.fetch_json('sort.json')
         r = pdal.Pipeline(json)
         with self.assertRaises(RuntimeError):
             r.arrays
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/reproject.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'reproject.json')),
                          "missing test data")
     def test_logging(self):
         """Can we fetch log output"""
-        json = self.fetch_json('/data/pipeline/reproject.json')
+        json = self.fetch_json('reproject.json')
         r = pdal.Pipeline(json)
         r.loglevel = 8
         count = r.execute()
         self.assertEqual(count, 789)
         self.assertEqual(r.log.split()[0], '(pypipeline')
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/pipeline/sort.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'sort.json')),
                          "missing test data")
     def test_schema(self):
         """Fetching a schema works"""
-        json = self.fetch_json('/data/pipeline/sort.json')
+        json = self.fetch_json('sort.json')
         r = pdal.Pipeline(json)
         r.execute()
         self.assertEqual(r.schema['schema']['dimensions'][0]['name'], 'X')
 
-    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'data/filters/chip.json')),
+    @unittest.skipUnless(os.path.exists(os.path.join(DATADIRECTORY, 'chip.json')),
                          "missing test data")
     def test_merged_arrays(self):
         """Can we fetch multiple point views from merged PDAL data """
-        json = self.fetch_json('/data/filters/chip.json')
+        json = self.fetch_json('chip.json')
         r = pdal.Pipeline(json)
         r.execute()
         arrays = r.arrays


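The `fetch_json` helper in the test file joins paths by hand with `os.path.sep`; an equivalent standalone sketch using `os.path.join`, demonstrated against a throwaway directory standing in for `./test/data`:

```python
import os
import tempfile

def fetch_json(datadirectory, filename):
    """Read a pipeline JSON file as UTF-8 text (mirrors the test helper)."""
    fn = os.path.join(datadirectory, filename)
    with open(fn, "rb") as f:
        return f.read().decode("UTF-8")

# Demonstration with a temporary directory instead of ./test/data.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "sort.json"), "w") as f:
        f.write('{"pipeline": []}')
    text = fetch_json(d, "sort.json")
```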

View it on GitLab: https://salsa.debian.org/debian-gis-team/python-pdal/commit/ee79bb2d3752f5c10490d1b4f68b0609f59724d7



More information about the Pkg-grass-devel mailing list