[Python-modules-commits] [cloudpickle] 01/04: Import cloudpickle_0.2.1.orig.tar.gz

Diane Trout diane at moszumanska.debian.org
Mon Dec 12 21:31:39 UTC 2016


This is an automated email from the git hooks/post-receive script.

diane pushed a commit to branch master
in repository cloudpickle.

commit 2185ff9396606b75875a81621eaf1efe51568de1
Author: Diane Trout <diane at ghic.org>
Date:   Mon Dec 12 13:11:41 2016 -0800

    Import cloudpickle_0.2.1.orig.tar.gz
---
 LICENSE                                   |  32 ++
 MANIFEST.in                               |  12 +
 PKG-INFO                                  | 115 +++++
 README.md                                 |  90 ++++
 cloudpickle.egg-info/PKG-INFO             | 115 +++++
 cloudpickle.egg-info/SOURCES.txt          |  16 +
 cloudpickle.egg-info/dependency_links.txt |   1 +
 cloudpickle.egg-info/pbr.json             |   1 +
 cloudpickle.egg-info/top_level.txt        |   1 +
 cloudpickle/__init__.py                   |   5 +
 cloudpickle/cloudpickle.py                | 765 ++++++++++++++++++++++++++++++
 setup.cfg                                 |   8 +
 setup.py                                  |  37 ++
 tests/__init__.py                         |   0
 tests/cloudpickle_file_test.py            | 120 +++++
 tests/cloudpickle_test.py                 | 295 ++++++++++++
 tests/testutils.py                        |  72 +++
 17 files changed, 1685 insertions(+)

diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..d112c48
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,32 @@
+This module was extracted from the `cloud` package, developed by
+PiCloud, Inc.
+
+Copyright (c) 2015, Cloudpickle contributors.
+Copyright (c) 2012, Regents of the University of California.
+Copyright (c) 2009 PiCloud, Inc. http://www.picloud.com.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions
+are met:
+    * Redistributions of source code must retain the above copyright
+      notice, this list of conditions and the following disclaimer.
+    * Redistributions in binary form must reproduce the above copyright
+      notice, this list of conditions and the following disclaimer in the
+      documentation and/or other materials provided with the distribution.
+    * Neither the name of the University of California, Berkeley nor the
+      names of its contributors may be used to endorse or promote
+      products derived from this software without specific prior written
+      permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
+TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
+LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
+NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
+SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/MANIFEST.in b/MANIFEST.in
new file mode 100644
index 0000000..74f5b20
--- /dev/null
+++ b/MANIFEST.in
@@ -0,0 +1,12 @@
+include AUTHORS.rst
+include CONTRIBUTING.rst
+include HISTORY.rst
+include LICENSE
+include README.rst
+include README.md
+
+recursive-include tests *
+recursive-exclude * __pycache__
+recursive-exclude * *.py[co]
+
+recursive-include docs *.rst conf.py Makefile make.bat
diff --git a/PKG-INFO b/PKG-INFO
new file mode 100644
index 0000000..6aba4a1
--- /dev/null
+++ b/PKG-INFO
@@ -0,0 +1,115 @@
+Metadata-Version: 1.1
+Name: cloudpickle
+Version: 0.2.1
+Summary: Extended pickling support for Python objects
+Home-page: https://github.com/cloudpipe/cloudpickle
+Author: Cloudpipe
+Author-email: cloudpipe at googlegroups.com
+License: LICENSE.txt
+Description: # cloudpickle
+        
+        [![Build Status](https://travis-ci.org/cloudpipe/cloudpickle.svg?branch=master
+            )](https://travis-ci.org/cloudpipe/cloudpickle)
+        [![codecov.io](https://codecov.io/github/cloudpipe/cloudpickle/coverage.svg?branch=master)](https://codecov.io/github/cloudpipe/cloudpickle?branch=master)
+        
+        `cloudpickle` makes it possible to serialize Python constructs not supported
+        by the default `pickle` module from the Python standard library.
+        
+        `cloudpickle` is especially useful for cluster computing where Python
+        expressions are shipped over the network to execute on remote hosts, possibly
+        close to the data.
+        
+        Among other things, `cloudpickle` supports pickling for lambda expressions,
+        functions and classes defined interactively in the `__main__` module.
+        
+        
+        Installation
+        ------------
+        
+        The latest release of `cloudpickle` is available from
+        [pypi](https://pypi.python.org/pypi/cloudpickle):
+        
+            pip install cloudpickle
+        
+        
+        Examples
+        --------
+        
+        Pickling a lambda expression:
+        
+        ```python
+        >>> import cloudpickle
+        >>> squared = lambda x: x ** 2
+        >>> pickled_lambda = cloudpickle.dumps(squared)
+        
+        >>> import pickle
+        >>> new_squared = pickle.loads(pickled_lambda)
+        >>> new_squared(2)
+        4
+        ```
+        
+        Pickling a function interactively defined in a Python shell session
+        (in the `__main__` module):
+        
+        ```python
+        >>> CONSTANT = 42
+        >>> def my_function(data):
+        ...    return data + CONSTANT
+        ...
+        >>> pickled_function = cloudpickle.dumps(my_function)
+        >>> pickle.loads(pickled_function)(43)
+        85
+        ```
+        
+        Running the tests
+        -----------------
+        
+        - With `tox`, to run the tests for all the supported versions of
+          Python and PyPy:
+        
+              pip install tox
+              tox
+        
+          or alternatively for a specific environment:
+        
+              tox -e py27
+        
+        
+        - With `py.test`, to run the tests only for your current version of
+          Python:
+        
+              pip install -r dev-requirements.txt
+              PYTHONPATH='.:tests' py.test
+        
+        
+        History
+        -------
+        
+        `cloudpickle` was initially developed by picloud.com and shipped as part of
+        the client SDK.
+        
+        A copy of `cloudpickle.py` was included as part of PySpark, the Python
+        interface to [Apache Spark](https://spark.apache.org/). Davies Liu, Josh
+        Rosen, Thom Neale and other Apache Spark developers improved it significantly,
+        most notably to add support for PyPy and Python 3.
+        
+        The aim of the `cloudpickle` project is to make that work available to a wider
+        audience outside of the Spark ecosystem and to make it easier to improve it
+        further, notably with the help of a dedicated non-regression test suite.
+        
+Platform: UNKNOWN
+Classifier: Development Status :: 4 - Beta
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: POSIX
+Classifier: Operating System :: Microsoft :: Windows
+Classifier: Operating System :: MacOS :: MacOS X
+Classifier: Programming Language :: Python :: 2.6
+Classifier: Programming Language :: Python :: 2.7
+Classifier: Programming Language :: Python :: 3.3
+Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Topic :: Scientific/Engineering
+Classifier: Topic :: System :: Distributed Computing
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..4706d85
--- /dev/null
+++ b/README.md
@@ -0,0 +1,90 @@
+# cloudpickle
+
+[![Build Status](https://travis-ci.org/cloudpipe/cloudpickle.svg?branch=master
+    )](https://travis-ci.org/cloudpipe/cloudpickle)
+[![codecov.io](https://codecov.io/github/cloudpipe/cloudpickle/coverage.svg?branch=master)](https://codecov.io/github/cloudpipe/cloudpickle?branch=master)
+
+`cloudpickle` makes it possible to serialize Python constructs not supported
+by the default `pickle` module from the Python standard library.
+
+`cloudpickle` is especially useful for cluster computing where Python
+expressions are shipped over the network to execute on remote hosts, possibly
+close to the data.
+
+Among other things, `cloudpickle` supports pickling for lambda expressions,
+functions and classes defined interactively in the `__main__` module.
+
+
+Installation
+------------
+
+The latest release of `cloudpickle` is available from
+[pypi](https://pypi.python.org/pypi/cloudpickle):
+
+    pip install cloudpickle
+
+
+Examples
+--------
+
+Pickling a lambda expression:
+
+```python
+>>> import cloudpickle
+>>> squared = lambda x: x ** 2
+>>> pickled_lambda = cloudpickle.dumps(squared)
+
+>>> import pickle
+>>> new_squared = pickle.loads(pickled_lambda)
+>>> new_squared(2)
+4
+```
+
+Pickling a function interactively defined in a Python shell session
+(in the `__main__` module):
+
+```python
+>>> CONSTANT = 42
+>>> def my_function(data):
+...    return data + CONSTANT
+...
+>>> pickled_function = cloudpickle.dumps(my_function)
+>>> pickle.loads(pickled_function)(43)
+85
+```
+
+Running the tests
+-----------------
+
+- With `tox`, to run the tests for all the supported versions of
+  Python and PyPy:
+
+      pip install tox
+      tox
+
+  or alternatively for a specific environment:
+
+      tox -e py27
+
+
+- With `py.test`, to run the tests only for your current version of
+  Python:
+
+      pip install -r dev-requirements.txt
+      PYTHONPATH='.:tests' py.test
+
+
+History
+-------
+
+`cloudpickle` was initially developed by picloud.com and shipped as part of
+the client SDK.
+
+A copy of `cloudpickle.py` was included as part of PySpark, the Python
+interface to [Apache Spark](https://spark.apache.org/). Davies Liu, Josh
+Rosen, Thom Neale and other Apache Spark developers improved it significantly,
+most notably to add support for PyPy and Python 3.
+
+The aim of the `cloudpickle` project is to make that work available to a wider
+audience outside of the Spark ecosystem and to make it easier to improve it
+further, notably with the help of a dedicated non-regression test suite.
diff --git a/cloudpickle.egg-info/PKG-INFO b/cloudpickle.egg-info/PKG-INFO
new file mode 100644
index 0000000..6aba4a1
--- /dev/null
+++ b/cloudpickle.egg-info/PKG-INFO
@@ -0,0 +1,115 @@
+Metadata-Version: 1.1
+Name: cloudpickle
+Version: 0.2.1
+Summary: Extended pickling support for Python objects
+Home-page: https://github.com/cloudpipe/cloudpickle
+Author: Cloudpipe
+Author-email: cloudpipe at googlegroups.com
+License: LICENSE.txt
+Description: # cloudpickle
+        
+        [![Build Status](https://travis-ci.org/cloudpipe/cloudpickle.svg?branch=master
+            )](https://travis-ci.org/cloudpipe/cloudpickle)
+        [![codecov.io](https://codecov.io/github/cloudpipe/cloudpickle/coverage.svg?branch=master)](https://codecov.io/github/cloudpipe/cloudpickle?branch=master)
+        
+        `cloudpickle` makes it possible to serialize Python constructs not supported
+        by the default `pickle` module from the Python standard library.
+        
+        `cloudpickle` is especially useful for cluster computing where Python
+        expressions are shipped over the network to execute on remote hosts, possibly
+        close to the data.
+        
+        Among other things, `cloudpickle` supports pickling for lambda expressions,
+        functions and classes defined interactively in the `__main__` module.
+        
+        
+        Installation
+        ------------
+        
+        The latest release of `cloudpickle` is available from
+        [pypi](https://pypi.python.org/pypi/cloudpickle):
+        
+            pip install cloudpickle
+        
+        
+        Examples
+        --------
+        
+        Pickling a lambda expression:
+        
+        ```python
+        >>> import cloudpickle
+        >>> squared = lambda x: x ** 2
+        >>> pickled_lambda = cloudpickle.dumps(squared)
+        
+        >>> import pickle
+        >>> new_squared = pickle.loads(pickled_lambda)
+        >>> new_squared(2)
+        4
+        ```
+        
+        Pickling a function interactively defined in a Python shell session
+        (in the `__main__` module):
+        
+        ```python
+        >>> CONSTANT = 42
+        >>> def my_function(data):
+        ...    return data + CONSTANT
+        ...
+        >>> pickled_function = cloudpickle.dumps(my_function)
+        >>> pickle.loads(pickled_function)(43)
+        85
+        ```
+        
+        Running the tests
+        -----------------
+        
+        - With `tox`, to run the tests for all the supported versions of
+          Python and PyPy:
+        
+              pip install tox
+              tox
+        
+          or alternatively for a specific environment:
+        
+              tox -e py27
+        
+        
+        - With `py.test`, to run the tests only for your current version of
+          Python:
+        
+              pip install -r dev-requirements.txt
+              PYTHONPATH='.:tests' py.test
+        
+        
+        History
+        -------
+        
+        `cloudpickle` was initially developed by picloud.com and shipped as part of
+        the client SDK.
+        
+        A copy of `cloudpickle.py` was included as part of PySpark, the Python
+        interface to [Apache Spark](https://spark.apache.org/). Davies Liu, Josh
+        Rosen, Thom Neale and other Apache Spark developers improved it significantly,
+        most notably to add support for PyPy and Python 3.
+        
+        The aim of the `cloudpickle` project is to make that work available to a wider
+        audience outside of the Spark ecosystem and to make it easier to improve it
+        further, notably with the help of a dedicated non-regression test suite.
+        
+Platform: UNKNOWN
+Classifier: Development Status :: 4 - Beta
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: POSIX
+Classifier: Operating System :: Microsoft :: Windows
+Classifier: Operating System :: MacOS :: MacOS X
+Classifier: Programming Language :: Python :: 2.6
+Classifier: Programming Language :: Python :: 2.7
+Classifier: Programming Language :: Python :: 3.3
+Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Topic :: Scientific/Engineering
+Classifier: Topic :: System :: Distributed Computing
diff --git a/cloudpickle.egg-info/SOURCES.txt b/cloudpickle.egg-info/SOURCES.txt
new file mode 100644
index 0000000..8063c75
--- /dev/null
+++ b/cloudpickle.egg-info/SOURCES.txt
@@ -0,0 +1,16 @@
+LICENSE
+MANIFEST.in
+README.md
+setup.cfg
+setup.py
+cloudpickle/__init__.py
+cloudpickle/cloudpickle.py
+cloudpickle.egg-info/PKG-INFO
+cloudpickle.egg-info/SOURCES.txt
+cloudpickle.egg-info/dependency_links.txt
+cloudpickle.egg-info/pbr.json
+cloudpickle.egg-info/top_level.txt
+tests/__init__.py
+tests/cloudpickle_file_test.py
+tests/cloudpickle_test.py
+tests/testutils.py
\ No newline at end of file
diff --git a/cloudpickle.egg-info/dependency_links.txt b/cloudpickle.egg-info/dependency_links.txt
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/cloudpickle.egg-info/dependency_links.txt
@@ -0,0 +1 @@
+
diff --git a/cloudpickle.egg-info/pbr.json b/cloudpickle.egg-info/pbr.json
new file mode 100644
index 0000000..1784f4a
--- /dev/null
+++ b/cloudpickle.egg-info/pbr.json
@@ -0,0 +1 @@
+{"is_release": false, "git_version": "4e34fd2"}
\ No newline at end of file
diff --git a/cloudpickle.egg-info/top_level.txt b/cloudpickle.egg-info/top_level.txt
new file mode 100644
index 0000000..37d5682
--- /dev/null
+++ b/cloudpickle.egg-info/top_level.txt
@@ -0,0 +1 @@
+cloudpickle
diff --git a/cloudpickle/__init__.py b/cloudpickle/__init__.py
new file mode 100644
index 0000000..891a802
--- /dev/null
+++ b/cloudpickle/__init__.py
@@ -0,0 +1,5 @@
+from __future__ import absolute_import
+
+from cloudpickle.cloudpickle import *
+
+__version__ = '0.2.1'
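
The `cloudpickle.py` module imported below serializes lambdas and nested functions by capturing their compiled byte code, globals, and closure cells. A stdlib-only sketch of that core idea (illustrative names, not cloudpickle's actual helpers): a function can be rebuilt from its code object plus its environment, which is what `save_function_tuple`/`_make_skel_func` do with pickled pieces.

```python
import types

def make_adder(n):
    def add(x):
        return x + n
    return add

original = make_adder(5)

# Rebuild a function from its components, analogous to what the
# skeleton-function reconstruction in cloudpickle.py performs.
rebuilt = types.FunctionType(
    original.__code__,       # compiled byte code
    original.__globals__,    # global environment
    original.__name__,
    original.__defaults__,
    original.__closure__,    # closure cells carrying n=5
)
print(rebuilt(10))  # 15
```

cloudpickle additionally pickles the code object itself (see `save_codeobject` below), so the receiving process never needs the source file.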
diff --git a/cloudpickle/cloudpickle.py b/cloudpickle/cloudpickle.py
new file mode 100644
index 0000000..306c859
--- /dev/null
+++ b/cloudpickle/cloudpickle.py
@@ -0,0 +1,765 @@
+"""
+This class is defined to override standard pickle functionality
+
+The goals of it follow:
+-Serialize lambdas and nested functions to compiled byte code
+-Deal with main module correctly
+-Deal with other non-serializable objects
+
+It does not include an unpickler, as standard python unpickling suffices.
+
+This module was extracted from the `cloud` package, developed by `PiCloud, Inc.
+<http://www.picloud.com>`_.
+
+Copyright (c) 2012, Regents of the University of California.
+Copyright (c) 2009 `PiCloud, Inc. <http://www.picloud.com>`_.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions
+are met:
+    * Redistributions of source code must retain the above copyright
+      notice, this list of conditions and the following disclaimer.
+    * Redistributions in binary form must reproduce the above copyright
+      notice, this list of conditions and the following disclaimer in the
+      documentation and/or other materials provided with the distribution.
+    * Neither the name of the University of California, Berkeley nor the
+      names of its contributors may be used to endorse or promote
+      products derived from this software without specific prior written
+      permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
+TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
+LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
+NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
+SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+"""
+from __future__ import print_function
+
+import operator
+import io
+import imp
+import pickle
+import struct
+import sys
+import types
+from functools import partial
+import itertools
+import dis
+import traceback
+
+if sys.version < '3':
+    from pickle import Pickler
+    try:
+        from cStringIO import StringIO
+    except ImportError:
+        from StringIO import StringIO
+    PY3 = False
+else:
+    types.ClassType = type
+    from pickle import _Pickler as Pickler
+    from io import BytesIO as StringIO
+    PY3 = True
+
+#relevant opcodes
+STORE_GLOBAL = dis.opname.index('STORE_GLOBAL')
+DELETE_GLOBAL = dis.opname.index('DELETE_GLOBAL')
+LOAD_GLOBAL = dis.opname.index('LOAD_GLOBAL')
+GLOBAL_OPS = [STORE_GLOBAL, DELETE_GLOBAL, LOAD_GLOBAL]
+HAVE_ARGUMENT = dis.HAVE_ARGUMENT
+EXTENDED_ARG = dis.EXTENDED_ARG
+
+
+def islambda(func):
+    return getattr(func,'__name__') == '<lambda>'
+
+
+_BUILTIN_TYPE_NAMES = {}
+for k, v in types.__dict__.items():
+    if type(v) is type:
+        _BUILTIN_TYPE_NAMES[v] = k
+
+
+def _builtin_type(name):
+    return getattr(types, name)
+
+
+class CloudPickler(Pickler):
+
+    dispatch = Pickler.dispatch.copy()
+
+    def __init__(self, file, protocol=None):
+        Pickler.__init__(self, file, protocol)
+        # set of modules to unpickle
+        self.modules = set()
+        # map ids to dictionary. used to ensure that functions can share global env
+        self.globals_ref = {}
+
+    def dump(self, obj):
+        self.inject_addons()
+        try:
+            return Pickler.dump(self, obj)
+        except RuntimeError as e:
+            if 'recursion' in e.args[0]:
+                msg = """Could not pickle object as excessively deep recursion required."""
+                raise pickle.PicklingError(msg)
+
+    def save_memoryview(self, obj):
+        """Fallback to save_string"""
+        Pickler.save_string(self, str(obj))
+
+    def save_buffer(self, obj):
+        """Fallback to save_string"""
+        Pickler.save_string(self,str(obj))
+    if PY3:
+        dispatch[memoryview] = save_memoryview
+    else:
+        dispatch[buffer] = save_buffer
+
+    def save_unsupported(self, obj):
+        raise pickle.PicklingError("Cannot pickle objects of type %s" % type(obj))
+    dispatch[types.GeneratorType] = save_unsupported
+
+    # itertools objects do not pickle!
+    for v in itertools.__dict__.values():
+        if type(v) is type:
+            dispatch[v] = save_unsupported
+
+    def save_module(self, obj):
+        """
+        Save a module as an import
+        """
+        mod_name = obj.__name__
+        # If module is successfully found then it is not a dynamically created module
+        try:
+            _find_module(mod_name)
+            is_dynamic = False
+        except ImportError:
+            is_dynamic = True
+
+        self.modules.add(obj)
+        if is_dynamic:
+            self.save_reduce(dynamic_subimport, (obj.__name__, vars(obj)), obj=obj)
+        else:
+            self.save_reduce(subimport, (obj.__name__,), obj=obj)
+    dispatch[types.ModuleType] = save_module
+
+    def save_codeobject(self, obj):
+        """
+        Save a code object
+        """
+        if PY3:
+            args = (
+                obj.co_argcount, obj.co_kwonlyargcount, obj.co_nlocals, obj.co_stacksize,
+                obj.co_flags, obj.co_code, obj.co_consts, obj.co_names, obj.co_varnames,
+                obj.co_filename, obj.co_name, obj.co_firstlineno, obj.co_lnotab, obj.co_freevars,
+                obj.co_cellvars
+            )
+        else:
+            args = (
+                obj.co_argcount, obj.co_nlocals, obj.co_stacksize, obj.co_flags, obj.co_code,
+                obj.co_consts, obj.co_names, obj.co_varnames, obj.co_filename, obj.co_name,
+                obj.co_firstlineno, obj.co_lnotab, obj.co_freevars, obj.co_cellvars
+            )
+        self.save_reduce(types.CodeType, args, obj=obj)
+    dispatch[types.CodeType] = save_codeobject
+
+    def save_function(self, obj, name=None):
+        """ Registered with the dispatch to handle all function types.
+
+        Determines what kind of function obj is (e.g. lambda, defined at
+        interactive prompt, etc) and handles the pickling appropriately.
+        """
+        write = self.write
+
+        if name is None:
+            name = obj.__name__
+        modname = pickle.whichmodule(obj, name)
+        # print('which gives %s %s %s' % (modname, obj, name))
+        try:
+            themodule = sys.modules[modname]
+        except KeyError:
+            # eval'd items such as namedtuple give invalid items for their function __module__
+            modname = '__main__'
+
+        if modname == '__main__':
+            themodule = None
+
+        if themodule:
+            self.modules.add(themodule)
+            if getattr(themodule, name, None) is obj:
+                return self.save_global(obj, name)
+
+        # if func is lambda, def'ed at prompt, is in main, or is nested, then
+        # we'll pickle the actual function object rather than simply saving a
+        # reference (as is done in default pickler), via save_function_tuple.
+        if (islambda(obj)
+                or getattr(obj.__code__, 'co_filename', None) == '<stdin>'
+                or themodule is None):
+            self.save_function_tuple(obj)
+            return
+        else:
+            # func is nested
+            klass = getattr(themodule, name, None)
+            if klass is None or klass is not obj:
+                self.save_function_tuple(obj)
+                return
+
+        if obj.__dict__:
+            # essentially save_reduce, but workaround needed to avoid recursion
+            self.save(_restore_attr)
+            write(pickle.MARK + pickle.GLOBAL + modname + '\n' + name + '\n')
+            self.memoize(obj)
+            self.save(obj.__dict__)
+            write(pickle.TUPLE + pickle.REDUCE)
+        else:
+            write(pickle.GLOBAL + modname + '\n' + name + '\n')
+            self.memoize(obj)
+    dispatch[types.FunctionType] = save_function
+
+    def save_function_tuple(self, func):
+        """  Pickles an actual func object.
+
+        A func comprises: code, globals, defaults, closure, and dict.  We
+        extract and save these, injecting reducing functions at certain points
+        to recreate the func object.  Keep in mind that some of these pieces
+        can contain a ref to the func itself.  Thus, a naive save on these
+        pieces could trigger an infinite loop of save's.  To get around that,
+        we first create a skeleton func object using just the code (this is
+        safe, since this won't contain a ref to the func), and memoize it as
+        soon as it's created.  The other stuff can then be filled in later.
+        """
+        save = self.save
+        write = self.write
+
+        code, f_globals, defaults, closure, dct, base_globals = self.extract_func_data(func)
+
+        save(_fill_function)  # skeleton function updater
+        write(pickle.MARK)    # beginning of tuple that _fill_function expects
+
+        # create a skeleton function object and memoize it
+        save(_make_skel_func)
+        save((code, closure, base_globals))
+        write(pickle.REDUCE)
+        self.memoize(func)
+
+        # save the rest of the func data needed by _fill_function
+        save(f_globals)
+        save(defaults)
+        save(dct)
+        write(pickle.TUPLE)
+        write(pickle.REDUCE)  # applies _fill_function on the tuple
+
+    @staticmethod
+    def extract_code_globals(co):
+        """
+        Find all globals names read or written to by codeblock co
+        """
+
+        code = getattr(co, 'co_code', None)
+        if code is None:
+            return set()
+        if not PY3:
+            code = [ord(c) for c in code]
+        names = co.co_names
+        out_names = set()
+
+        n = len(code)
+        i = 0
+        extended_arg = 0
+        while i < n:
+            op = code[i]
+
+            i += 1
+            if op >= HAVE_ARGUMENT:
+                oparg = code[i] + code[i+1] * 256 + extended_arg
+                extended_arg = 0
+                i += 2
+                if op == EXTENDED_ARG:
+                    extended_arg = oparg*65536
+                if op in GLOBAL_OPS:
+                    out_names.add(names[oparg])
+
+        # see if nested function have any global refs
+        if co.co_consts:
+            for const in co.co_consts:
+                if type(const) is types.CodeType:
+                    out_names |= CloudPickler.extract_code_globals(const)
+
+        return out_names
+
+    def extract_func_data(self, func):
+        """
+        Turn the function into a tuple of data necessary to recreate it:
+            code, globals, defaults, closure, dict
+        """
+        code = func.__code__
+
+        # extract all global ref's
+        func_global_refs = self.extract_code_globals(code)
+
+        # process all variables referenced by global environment
+        f_globals = {}
+        for var in func_global_refs:
+            if var in func.__globals__:
+                f_globals[var] = func.__globals__[var]
+
+        # defaults requires no processing
+        defaults = func.__defaults__
+
+        # process closure
+        closure = [c.cell_contents for c in func.__closure__] if func.__closure__ else []
+
+        # save the dict
+        dct = func.__dict__
+
+        base_globals = self.globals_ref.get(id(func.__globals__), {})
+        self.globals_ref[id(func.__globals__)] = base_globals
+
+        return (code, f_globals, defaults, closure, dct, base_globals)
+
+    def save_builtin_function(self, obj):
+        if obj.__module__ == "__builtin__":
+            return self.save_global(obj)
+        return self.save_function(obj)
+    dispatch[types.BuiltinFunctionType] = save_builtin_function
+
+    def save_global(self, obj, name=None, pack=struct.pack):
+        if obj.__module__ == "__builtin__" or obj.__module__ == "builtins":
+            if obj in _BUILTIN_TYPE_NAMES:
+                return self.save_reduce(_builtin_type, (_BUILTIN_TYPE_NAMES[obj],), obj=obj)
+
+        if name is None:
+            name = obj.__name__
+
+        modname = getattr(obj, "__module__", None)
+        if modname is None:
+            modname = pickle.whichmodule(obj, name)
+
+        if modname == '__main__':
+            themodule = None
+        else:
+            __import__(modname)
+            themodule = sys.modules[modname]
+            self.modules.add(themodule)
+
+        if hasattr(themodule, name) and getattr(themodule, name) is obj:
+            return Pickler.save_global(self, obj, name)
+
+        typ = type(obj)
+        if typ is not obj and isinstance(obj, (type, types.ClassType)):
+            d = dict(obj.__dict__)  # copy dict proxy to a dict
+            if not isinstance(d.get('__dict__', None), property):
+                # don't extract dict that are properties
+                d.pop('__dict__', None)
+            d.pop('__weakref__', None)
+
+            # hack as __new__ is stored differently in the __dict__
+            new_override = d.get('__new__', None)
+            if new_override:
+                d['__new__'] = obj.__new__
+
+            self.save_reduce(typ, (obj.__name__, obj.__bases__, d), obj=obj)
+        else:
+            raise pickle.PicklingError("Can't pickle %r" % obj)
+
+    dispatch[type] = save_global
+    dispatch[types.ClassType] = save_global
+
+    def save_instancemethod(self, obj):
+        # Memoization rarely is ever useful due to python bounding
+        if obj.__self__ is None:
+            self.save_reduce(getattr, (obj.im_class, obj.__name__))
+        else:
+            if PY3:
+                self.save_reduce(types.MethodType, (obj.__func__, obj.__self__), obj=obj)
+            else:
+                self.save_reduce(types.MethodType, (obj.__func__, obj.__self__, obj.__self__.__class__),
+                         obj=obj)
+    dispatch[types.MethodType] = save_instancemethod
+
+    def save_inst(self, obj):
+        """Inner logic to save instance. Based off pickle.save_inst
+        Supports __transient__"""
+        cls = obj.__class__
+
+        memo = self.memo
+        write = self.write
+        save = self.save
+
+        if hasattr(obj, '__getinitargs__'):
+            args = obj.__getinitargs__()
+            len(args)  # XXX Assert it's a sequence
+            pickle._keep_alive(args, memo)
+        else:
+            args = ()
+
+        write(pickle.MARK)
+
+        if self.bin:
+            save(cls)
+            for arg in args:
+                save(arg)
+            write(pickle.OBJ)
+        else:
+            for arg in args:
+                save(arg)
+            write(pickle.INST + cls.__module__ + '\n' + cls.__name__ + '\n')
+
+        self.memoize(obj)
+
+        try:
+            getstate = obj.__getstate__
+        except AttributeError:
+            stuff = obj.__dict__
+            # remove transient items
+            if hasattr(obj, '__transient__'):
+                transient = obj.__transient__
+                stuff = stuff.copy()
+                for k in list(stuff.keys()):
+                    if k in transient:
+                        del stuff[k]
+        else:
+            stuff = getstate()
+            pickle._keep_alive(stuff, memo)
+        save(stuff)
+        write(pickle.BUILD)
+
+    if not PY3:
+        dispatch[types.InstanceType] = save_inst
+
+    def save_property(self, obj):
+        # properties are not serialized correctly by the default pickler
+        self.save_reduce(property, (obj.fget, obj.fset, obj.fdel, obj.__doc__), obj=obj)
+    dispatch[property] = save_property
+
+    def save_classmethod(self, obj):
+        try:
+            orig_func = obj.__func__
+        except AttributeError:  # Python 2.6
+            orig_func = obj.__get__(None, object)
+            if isinstance(obj, classmethod):
+                orig_func = orig_func.__func__  # Unbind
+        self.save_reduce(type(obj), (orig_func,), obj=obj)
+    dispatch[classmethod] = save_classmethod
+    dispatch[staticmethod] = save_classmethod
+
+    def save_itemgetter(self, obj):
+        """itemgetter serializer (needed for namedtuple support)"""
+        class Dummy:
+            def __getitem__(self, item):
+                return item
+        items = obj(Dummy())
+        if not isinstance(items, tuple):
+            items = (items, )
+        return self.save_reduce(operator.itemgetter, items)
+
+    if type(operator.itemgetter) is type:
+        dispatch[operator.itemgetter] = save_itemgetter
+
+    def save_attrgetter(self, obj):
+        """attrgetter serializer"""
+        class Dummy(object):
+            def __init__(self, attrs, index=None):
+                self.attrs = attrs
+                self.index = index
+            def __getattribute__(self, item):
+                attrs = object.__getattribute__(self, "attrs")
+                index = object.__getattribute__(self, "index")
+                if index is None:
+                    index = len(attrs)
+                    attrs.append(item)
+                else:
+                    attrs[index] = ".".join([attrs[index], item])
+                return type(self)(attrs, index)
+        attrs = []
+        obj(Dummy(attrs))
+        return self.save_reduce(operator.attrgetter, tuple(attrs))
+
+    if type(operator.attrgetter) is type:
+        dispatch[operator.attrgetter] = save_attrgetter
+
+    def save_reduce(self, func, args, state=None,
+                    listitems=None, dictitems=None, obj=None):
+        """Modified to support __transient__ on new objects
+        Change only affects protocol level 2 (which is always used by PiCloud"""
+        # Assert that args is a tuple
+        if not isinstance(args, tuple):
+            raise pickle.PicklingError("args from reduce() should be a tuple")
+
+        # Assert that func is callable
+        if not hasattr(func, '__call__'):
+            raise pickle.PicklingError("func from reduce should be callable")
+
+        save = self.save
+        write = self.write
+
+        # Protocol 2 special case: if func's name is __newobj__, use NEWOBJ
+        if self.proto >= 2 and getattr(func, "__name__", "") == "__newobj__":
+            # Added fix to allow __transient__
+            cls = args[0]
+            if not hasattr(cls, "__new__"):
+                raise pickle.PicklingError(
+                    "args[0] from __newobj__ args has no __new__")
+            if obj is not None and cls is not obj.__class__:
+                raise pickle.PicklingError(
+                    "args[0] from __newobj__ args has the wrong class")
+            args = args[1:]
+            save(cls)
+
+            # Don't pickle transient entries
+            if hasattr(obj, '__transient__'):
+                transient = obj.__transient__
+                state = state.copy()
... 811 lines suppressed ...
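The `save_instancemethod` reduce in the diff above pickles a bound method as the pair `(function, instance)` and rebinds them with `types.MethodType` on load. A minimal sketch of that round trip (the `Greeter` class is illustrative, not from cloudpickle):

```python
import types

class Greeter:
    def __init__(self, name):
        self.name = name

    def greet(self):
        return "hello, " + self.name

g = Greeter("world")
bound = g.greet

# A bound method decomposes into (function, instance); types.MethodType
# rebinds them, which is exactly the reduce cloudpickle emits.
func, inst = bound.__func__, bound.__self__
rebuilt = types.MethodType(func, inst)
print(rebuilt())  # → hello, world
```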
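`save_property` works because a `property` is fully described by its three accessors and docstring, so `property(fget, fset, fdel, doc)` reconstructs an equivalent descriptor. A small demonstration (the `Celsius` class is a made-up example):

```python
class Celsius:
    def _get(self):
        return self._t

    def _set(self, v):
        self._t = v

    temp = property(_get, _set, None, "temperature in C")

p = Celsius.temp
# Rebuild the property from its parts, mirroring save_property's reduce.
rebuilt = property(p.fget, p.fset, p.fdel, p.__doc__)

Celsius.temp2 = rebuilt
c = Celsius()
c.temp2 = 21
print(c.temp2, rebuilt.__doc__)  # → 21 temperature in C
```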
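`save_itemgetter` recovers the keys an `operator.itemgetter` was built with by calling it on a probe object whose `__getitem__` echoes back each key. A standalone sketch of the same trick (`_Probe` and `reduce_itemgetter` are illustrative names; cloudpickle uses an inline `Dummy` class):

```python
import operator

class _Probe:
    # __getitem__ echoes the key, so calling an itemgetter on a _Probe
    # reveals which indices/keys the getter was constructed with.
    def __getitem__(self, item):
        return item

def reduce_itemgetter(getter):
    """Return (callable, args) suitable for pickling an itemgetter."""
    items = getter(_Probe())
    if not isinstance(items, tuple):  # single-item getters return a scalar
        items = (items,)
    return operator.itemgetter, items

func, args = reduce_itemgetter(operator.itemgetter(1, 3))
rebuilt = func(*args)
print(rebuilt(["a", "b", "c", "d"]))  # → ('b', 'd')
```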
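`save_attrgetter` uses the same probing idea, but the probe must record dotted attribute chains: the first access in a chain appends a new entry, and each deeper access extends that entry with `"."`. A self-contained version of the recorder from the diff (`_AttrProbe` and `reduce_attrgetter` are illustrative names):

```python
import operator
from types import SimpleNamespace

class _AttrProbe(object):
    def __init__(self, attrs, index=None):
        self.attrs = attrs
        self.index = index

    def __getattribute__(self, item):
        attrs = object.__getattribute__(self, "attrs")
        index = object.__getattribute__(self, "index")
        if index is None:          # first link of a new chain
            index = len(attrs)
            attrs.append(item)
        else:                      # deeper link: extend the dotted name
            attrs[index] = ".".join([attrs[index], item])
        return _AttrProbe(attrs, index)

def reduce_attrgetter(getter):
    """Return (callable, args) suitable for pickling an attrgetter."""
    attrs = []
    getter(_AttrProbe(attrs))
    return operator.attrgetter, tuple(attrs)

func, args = reduce_attrgetter(operator.attrgetter("x.y", "z"))
rebuilt = func(*args)
ns = SimpleNamespace(x=SimpleNamespace(y=1), z=2)
print(args, rebuilt(ns))  # → ('x.y', 'z') (1, 2)
```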

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/python-modules/packages/cloudpickle.git


