[Git][debian-gis-team/donfig][master] 6 commits: New upstream version 0.7.0+dfsg

Antonio Valentino (@antonio.valentino) gitlab at salsa.debian.org
Mon Feb 7 07:18:01 GMT 2022



Antonio Valentino pushed to branch master at Debian GIS Project / donfig


Commits:
eabe372a by Antonio Valentino at 2022-02-07T07:04:41+00:00
New upstream version 0.7.0+dfsg
- - - - -
74f89db0 by Antonio Valentino at 2022-02-07T07:04:42+00:00
Update upstream source from tag 'upstream/0.7.0+dfsg'

Update to upstream version '0.7.0+dfsg'
with Debian dir 501cb2e03357a6b45e534352f1449308c44936f4
- - - - -
4578e94f by Antonio Valentino at 2022-02-07T07:07:06+00:00
New upstream release

- - - - -
a5b616b8 by Antonio Valentino at 2022-02-07T07:07:59+00:00
Update d/copyright

- - - - -
8a3fa796 by Antonio Valentino at 2022-02-07T07:13:45+00:00
Add dependency on cloudpickle

- - - - -
e527db2a by Antonio Valentino at 2022-02-07T07:14:20+00:00
Set distribution to unstable

- - - - -


21 changed files:

- + .pre-commit-config.yaml
- CHANGELOG.md
- README.rst
- RELEASING.md
- continuous_integration/environment.yaml
- debian/changelog
- debian/control
- debian/copyright
- debian/tests/control
- doc/conf.py
- doc/configuration.rst
- donfig/__init__.py
- + donfig/_lock.py
- donfig/config_obj.py
- donfig/tests/test_config.py
- + donfig/tests/test_lock.py
- donfig/utils.py
- donfig/version.py
- setup.cfg
- setup.py
- versioneer.py


Changes:

=====================================
.pre-commit-config.yaml
=====================================
@@ -0,0 +1,30 @@
+repos:
+  -   repo: https://github.com/pycqa/isort
+      rev: 5.9.3
+      hooks:
+        - id: isort
+          language_version: python3
+  - repo: https://github.com/asottile/pyupgrade
+    rev: v2.29.0
+    hooks:
+      - id: pyupgrade
+        args:
+          - --py37-plus
+  -   repo: https://github.com/psf/black
+      rev: 22.1.0
+      hooks:
+        - id: black
+          language_version: python3
+          args:
+            - --target-version=py37
+            - --exclude="(versioneer\.py|donfig\/version\.py)"
+  -   repo: https://gitlab.com/pycqa/flake8
+      rev: 3.9.2
+      hooks:
+        - id: flake8
+          language_version: python3
+
+ci:
+  # To trigger manually, comment on a pull request with "pre-commit.ci autofix"
+  autofix_prs: false
+  skip: []


=====================================
CHANGELOG.md
=====================================
@@ -1,3 +1,34 @@
+## Version 0.7.0 (2022/02/04)
+
+### Issues Closed
+
+* [Issue 17](https://github.com/pytroll/donfig/issues/17) - Threadlock TypeError when trying to pickle donfig object ([PR 22](https://github.com/pytroll/donfig/pull/22) by [@djhoese](https://github.com/djhoese))
+* [Issue 16](https://github.com/pytroll/donfig/issues/16) - Failure to initialize Config object ([PR 20](https://github.com/pytroll/donfig/pull/20) by [@djhoese](https://github.com/djhoese))
+* [Issue 14](https://github.com/pytroll/donfig/issues/14) - 0.6.0 release?
+* [Issue 13](https://github.com/pytroll/donfig/issues/13) - MNT: Stop using ci-helpers in appveyor.yml
+
+In this release 4 issues were closed.
+
+### Pull Requests Merged
+
+#### Bugs fixed
+
+* [PR 21](https://github.com/pytroll/donfig/pull/21) - Make `test__get_paths` robust to `site.PREFIXES` being set
+
+#### Features added
+
+* [PR 23](https://github.com/pytroll/donfig/pull/23) - Drop Python 3.6 support and add pre-commit
+* [PR 22](https://github.com/pytroll/donfig/pull/22) - Add SerializableLock from Dask to use in `Config.set` ([17](https://github.com/pytroll/donfig/issues/17))
+* [PR 19](https://github.com/pytroll/donfig/pull/19) - Refactor config default search path retrieval
+* [PR 18](https://github.com/pytroll/donfig/pull/18) - Expand YAML config search directories
+
+#### Documentation changes
+
+* [PR 20](https://github.com/pytroll/donfig/pull/20) - Fix inaccurate example of Config creation with defaults ([16](https://github.com/pytroll/donfig/issues/16))
+
+In this release 6 pull requests were closed.
+
+
 ## Version 0.6.0 (2021/01/17)
 
 ### Pull Requests Merged


=====================================
README.rst
=====================================
@@ -1,15 +1,19 @@
 Donfig
 ======
 
-.. image:: https://github.com/pytroll/donfig/workflows/CI/badge.svg?branch=master
+.. image:: https://github.com/pytroll/donfig/workflows/CI/badge.svg?branch=main
     :target: https://github.com/pytroll/donfig/actions?query=workflow%3A%22CI%22
 
-.. image:: https://codecov.io/gh/pytroll/donfig/branch/master/graph/badge.svg?token=xmvNtxzdoB
+.. image:: https://codecov.io/gh/pytroll/donfig/branch/main/graph/badge.svg?token=xmvNtxzdoB
    :target: https://codecov.io/gh/pytroll/donfig
 
 .. image:: https://anaconda.org/conda-forge/donfig/badges/version.svg
    :target: https://anaconda.org/conda-forge/donfig/
 
+.. image:: https://results.pre-commit.ci/badge/github/pytroll/donfig/main.svg
+   :target: https://results.pre-commit.ci/latest/github/pytroll/donfig/main
+   :alt: pre-commit.ci status
+
 Donfig is a python library meant to make configuration easier for other
 python packages. Donfig can be configured programmatically, by
 environment variables, or from YAML files in standard locations. The


=====================================
RELEASING.md
=====================================
@@ -1,6 +1,6 @@
 # Releasing Donfig
 
-1. checkout master
+1. checkout main branch
 2. pull from repo
 3. run the unittests
 4. run `loghub` and update the `CHANGELOG.md` file:


=====================================
continuous_integration/environment.yaml
=====================================
@@ -6,3 +6,4 @@ dependencies:
   - pytest
   - pytest-cov
   - python=3.9
+  - cloudpickle


=====================================
debian/changelog
=====================================
@@ -1,3 +1,11 @@
+donfig (0.7.0+dfsg-1) unstable; urgency=medium
+
+  * New upstream release.
+  * Update d/copyright.
+  * Add build/test dependency on python3-cloudpickle.
+
+ -- Antonio Valentino <antonio.valentino at tiscali.it>  Mon, 07 Feb 2022 07:14:01 +0000
+
 donfig (0.6.0+dfsg-3) unstable; urgency=medium
 
   * Team upload.


=====================================
debian/control
=====================================
@@ -7,6 +7,7 @@ Testsuite: autopkgtest-pkg-python
 Build-Depends: debhelper-compat (= 12),
                dh-python,
                python3-all,
+               python3-cloudpickle,
                python3-pytest,
                python3-setuptools,
                python3-yaml


=====================================
debian/copyright
=====================================
@@ -7,12 +7,12 @@ Comment: The upstream release is repacked in order to exclude minimized
 Files-Excluded: doc/_static/js-yaml.min.js
 
 Files: *
-Copyright: (c) 2018-2019 Donfig Developers
+Copyright: (c) 2018-2022 Donfig Developers
                2014-2018, Anaconda, Inc. and contributors
 License: Expat 
 
 Files: debian/*
-Copyright: 2021 Antonio Valentino <antonio.valentino at tiscali.it>
+Copyright: 2021-2022 Antonio Valentino <antonio.valentino at tiscali.it>
 License: Expat
 
 License: Expat


=====================================
debian/tests/control
=====================================
@@ -1,2 +1,2 @@
 Tests: python3
-Depends: @, python3-all, python3-pytest
+Depends: @, python3-all, python3-pytest, python3-cloudpickle


=====================================
doc/conf.py
=====================================
@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
 #
 # Configuration file for the Sphinx documentation builder.
 #
@@ -16,23 +15,22 @@
 #     sphinx-apidoc -f -T -o api ../donfig ../donfig/tests ../donfig/version.py
 import os
 import sys
-sys.path.insert(0, os.path.abspath('..'))
-from donfig import __version__
 
+sys.path.insert(0, os.path.abspath(".."))
+from donfig import __version__
 
 # -- Project information -----------------------------------------------------
 
-project = 'donfig'
-copyright = '2018, donfig Developers, 2014-2018, Anaconda, Inc. and contributors'
-author = 'donfig Developers'
+project = "donfig"
+copyright = "2018, donfig Developers, 2014-2018, Anaconda, Inc. and contributors"
+author = "donfig Developers"
 
 # The short X.Y version
-version = __version__.split('+')[0]
+version = __version__.split("+")[0]
 # The full version, including alpha/beta/rc tags.
 release = __version__
 
 
-
 # -- General configuration ---------------------------------------------------
 
 # If your documentation needs a minimal Sphinx version, state it here.
@@ -43,25 +41,25 @@ release = __version__
 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
 # ones.
 extensions = [
-    'sphinx.ext.autodoc',
-    'sphinx.ext.autosummary',
-    'sphinx.ext.doctest',
-    'sphinx.ext.intersphinx',
-    'sphinx.ext.viewcode',
-    'sphinx.ext.napoleon',
+    "sphinx.ext.autodoc",
+    "sphinx.ext.autosummary",
+    "sphinx.ext.doctest",
+    "sphinx.ext.intersphinx",
+    "sphinx.ext.viewcode",
+    "sphinx.ext.napoleon",
 ]
 
 # Add any paths that contain templates here, relative to this directory.
-templates_path = ['_templates']
+templates_path = ["_templates"]
 
 # The suffix(es) of source filenames.
 # You can specify multiple suffix as a list of string:
 #
 # source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+source_suffix = ".rst"
 
 # The master toctree document.
-master_doc = 'index'
+master_doc = "index"
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
@@ -73,7 +71,7 @@ language = None
 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
 # This pattern also affects html_static_path and html_extra_path.
-exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
+exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
 
 # The name of the Pygments (syntax highlighting) style to use.
 pygments_style = None
@@ -84,7 +82,7 @@ pygments_style = None
 # The theme to use for HTML and HTML Help pages.  See the documentation for
 # a list of builtin themes.
 #
-html_theme = 'alabaster'
+html_theme = "alabaster"
 
 # Theme options are theme-specific and customize the look and feel of a theme
 # further.  For a list of options available for each theme, see the
@@ -95,15 +93,15 @@ html_theme = 'alabaster'
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static']
+html_static_path = ["_static"]
 
 html_css_files = [
-    'style.css',
+    "style.css",
 ]
 
 html_js_files = [
-    'config_converter.js',
-    'js-yaml.min.js',
+    "config_converter.js",
+    "js-yaml.min.js",
 ]
 
 # Custom sidebar templates, must be a dictionary that maps document names
@@ -120,7 +118,7 @@ html_js_files = [
 # -- Options for HTMLHelp output ---------------------------------------------
 
 # Output file base name for HTML help builder.
-htmlhelp_basename = 'donfigdoc'
+htmlhelp_basename = "donfigdoc"
 
 
 # -- Options for LaTeX output ------------------------------------------------
@@ -129,15 +127,12 @@ latex_elements = {
     # The paper size ('letterpaper' or 'a4paper').
     #
     # 'papersize': 'letterpaper',
-
     # The font size ('10pt', '11pt' or '12pt').
     #
     # 'pointsize': '10pt',
-
     # Additional stuff for the LaTeX preamble.
     #
     # 'preamble': '',
-
     # Latex figure (float) alignment
     #
     # 'figure_align': 'htbp',
@@ -147,8 +142,7 @@ latex_elements = {
 # (source start file, target name, title,
 #  author, documentclass [howto, manual, or own class]).
 latex_documents = [
-    (master_doc, 'donfig.tex', 'donfig Documentation',
-     'donfig Developers', 'manual'),
+    (master_doc, "donfig.tex", "donfig Documentation", "donfig Developers", "manual"),
 ]
 
 
@@ -156,10 +150,7 @@ latex_documents = [
 
 # One entry per manual page. List of tuples
 # (source start file, name, description, authors, manual section).
-man_pages = [
-    (master_doc, 'donfig', 'donfig Documentation',
-     [author], 1)
-]
+man_pages = [(master_doc, "donfig", "donfig Documentation", [author], 1)]
 
 
 # -- Options for Texinfo output ----------------------------------------------
@@ -168,9 +159,15 @@ man_pages = [
 # (source start file, target name, title, author,
 #  dir menu entry, description, category)
 texinfo_documents = [
-    (master_doc, 'donfig', 'donfig Documentation',
-     author, 'donfig', 'One line description of project.',
-     'Miscellaneous'),
+    (
+        master_doc,
+        "donfig",
+        "donfig Documentation",
+        author,
+        "donfig",
+        "One line description of project.",
+        "Miscellaneous",
+    ),
 ]
 
 
@@ -189,7 +186,7 @@ epub_title = project
 # epub_uid = ''
 
 # A list of files that should not be packed into the epub file.
-epub_exclude_files = ['search.html']
+epub_exclude_files = ["search.html"]
 
 
 # -- Extension configuration -------------------------------------------------
@@ -198,6 +195,6 @@ epub_exclude_files = ['search.html']
 
 # Example configuration for intersphinx: refer to the Python standard library.
 intersphinx_mapping = {
-    'python': ('https://docs.python.org/', None),
-    'dask': ('https://dask.pydata.org/en/latest', None),
-}
\ No newline at end of file
+    "python": ("https://docs.python.org/", None),
+    "dask": ("https://dask.pydata.org/en/latest", None),
+}


=====================================
doc/configuration.rst
=====================================
@@ -45,7 +45,7 @@ search for YAML files can be customized as well as default options:
 
 .. code-block:: python
 
-    config = Config('mypkg', defaults={'key1': 'default_val'}, paths=['/usr/local/etc/'])
+    config = Config('mypkg', defaults=[{'key1': 'default_val'}], paths=['/usr/local/etc/'])
 
 Access Configuration
 --------------------
@@ -112,7 +112,9 @@ These files can live in any of the following locations:
 
 1.  The ``~/.config/mypkg`` directory in the user's home directory
 2.  The ``{sys.prefix}/etc/mypkg`` directory local to Python
-3.  The root directory (specified by the ``MYPKG_ROOT_CONFIG`` environment
+3.  The ``{prefix}/etc/mypkg`` directories with ``{prefix}`` in `site.PREFIXES
+    <https://docs.python.org/3/library/site.html#site.PREFIXES>`_
+4.  The root directory (specified by the ``MYPKG_ROOT_CONFIG`` environment
     variable or ``/etc/mypkg/`` by default)
 
 Donfig searches for *all* YAML files within each of these directories and merges
@@ -126,11 +128,6 @@ subprojects to manage configuration files separately, but have them merge
 into the same global configuration (ex. ``dask``, ``dask-kubernetes``,
 ``dask-ml``).
 
-.. note::
-
-    For historical reasons we also look in the ``~/.mypkg`` directory for
-    config files.  This is deprecated and will soon be removed.*
-
 Environment Variables
 ~~~~~~~~~~~~~~~~~~~~~
 


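The corrected snippet above reflects that ``defaults`` is a list of mappings rather than a single dict. A minimal sketch (not part of the diff) of how that plays out for a hypothetical 'mypkg' config, assuming no MYPKG_* environment variables or YAML files override the key:

    from donfig import Config

    config = Config('mypkg',
                    defaults=[{'key1': 'default_val'}],  # list of mappings, lowest priority
                    paths=['/usr/local/etc/'])

    assert config.get('key1') == 'default_val'
    with config.set(key1='override'):                    # temporary, highest priority
        assert config.get('key1') == 'override'
    assert config.get('key1') == 'default_val'           # restored on exit
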
=====================================
donfig/__init__.py
=====================================
@@ -1,6 +1,6 @@
-
 from .version import get_versions
-__version__ = get_versions()['version']
+
+__version__ = get_versions()["version"]
 del get_versions
 
 from .config_obj import Config  # noqa


=====================================
donfig/_lock.py
=====================================
@@ -0,0 +1,98 @@
+#!/usr/bin/env python
+#
+# Copyright (c) 2022 Donfig Developers
+# Copyright (c) 2014-2022, Anaconda, Inc. and contributors
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in all
+# copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+# SOFTWARE.
+"""Config syncronization locks ported from upstream dask.
+
+Originally part of Dask and stored in the dask/utils.py module. This module
+should be considered private and should not be imported directly by users.
+There are no guarantees that this module will exist in the future.
+
+"""
+
+import uuid
+from threading import Lock
+from weakref import WeakValueDictionary
+
+
+class SerializableLock:
+    """A Serializable per-process Lock.
+
+    This wraps a normal ``threading.Lock`` object and satisfies the same
+    interface.  However, this lock can also be serialized and sent to different
+    processes.  It will not block concurrent operations between processes (for
+    this you should look at ``multiprocessing.Lock`` or ``locket.lock_file``)
+    but will consistently deserialize into the same lock.
+
+    So if we make a lock in one process::
+
+        lock = SerializableLock()
+
+    And then send it over to another process multiple times::
+
+        bytes = pickle.dumps(lock)
+        a = pickle.loads(bytes)
+        b = pickle.loads(bytes)
+
+    Then the deserialized objects will operate as though they were the same
+    lock, and collide as appropriate.
+
+    This is useful for consistently protecting resources on a per-process
+    level.
+
+    The creation of locks is itself not threadsafe.
+    """
+
+    _locks = WeakValueDictionary()
+
+    def __init__(self, token=None):
+        self.token = token or str(uuid.uuid4())
+        if self.token in SerializableLock._locks:
+            self.lock = SerializableLock._locks[self.token]
+        else:
+            self.lock = Lock()
+            SerializableLock._locks[self.token] = self.lock
+
+    def acquire(self, *args, **kwargs):
+        return self.lock.acquire(*args, **kwargs)
+
+    def release(self, *args, **kwargs):
+        return self.lock.release(*args, **kwargs)
+
+    def __enter__(self):
+        self.lock.__enter__()
+
+    def __exit__(self, *args):
+        self.lock.__exit__(*args)
+
+    def locked(self):
+        return self.lock.locked()
+
+    def __getstate__(self):
+        return self.token
+
+    def __setstate__(self, token):
+        self.__init__(token)
+
+    def __str__(self):
+        return f"<{self.__class__.__name__}: {self.token}>"
+
+    __repr__ = __str__


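A quick illustration of what the new lock buys us (not part of the diff): pickled copies of a SerializableLock carry only the token, and deserializing maps them back onto the same underlying threading.Lock, so copies collide exactly as the docstring describes. A minimal sketch, importing the private module purely for illustration:

    import pickle
    from donfig._lock import SerializableLock

    lock = SerializableLock()                # token defaults to a random uuid4 string
    copy = pickle.loads(pickle.dumps(lock))  # only the token travels through pickle

    assert copy.token == lock.token
    assert copy.lock is lock.lock            # same threading.Lock behind both objects
    with lock:
        assert not copy.acquire(False)       # held via 'lock', so the copy cannot acquire
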
=====================================
donfig/config_obj.py
=====================================
@@ -1,5 +1,4 @@
 #!/usr/bin/env python
-# -*- coding: utf-8 -*-
 #
 # Copyright (c) 2018-2019 Donfig Developers
 # Copyright (c) 2014-2018, Anaconda, Inc. and contributors
@@ -23,17 +22,14 @@
 # SOFTWARE.
 import ast
 import os
-import sys
-import threading
 import pprint
-from copy import deepcopy
+import site
+import sys
 from collections.abc import Mapping
+from contextlib import nullcontext
+from copy import deepcopy
 
-try:
-    from contextlib import nullcontext
-except ImportError:
-    # <python 3.7
-    from .utils import nullcontext
+from ._lock import SerializableLock
 
 try:
     import yaml
@@ -41,18 +37,7 @@ except ImportError:
     yaml = None
 
 
-if sys.version_info[0] == 2:
-    # python 2
-    def makedirs(name, mode=0o777, exist_ok=True):
-        try:
-            os.makedirs(name, mode=mode)
-        except OSError:
-            if not exist_ok or not os.path.isdir(name):
-                raise
-else:
-    makedirs = os.makedirs
-
-no_default = '__no_default__'
+no_default = "__no_default__"
 
 
 def canonical_name(k, config):
@@ -70,7 +55,7 @@ def canonical_name(k, config):
         # config is not a mapping, return the same name as provided
         return k
 
-    altk = k.replace('_', '-') if '_' in k else k.replace('-', '_')
+    altk = k.replace("_", "-") if "_" in k else k.replace("-", "_")
 
     if altk in config:
         return altk
@@ -78,7 +63,7 @@ def canonical_name(k, config):
     return k
 
 
-def update(old, new, priority='new'):
+def update(old, new, priority="new"):
     """Update a nested dictionary with values from another
 
     This is like dict.update except that it smoothly merges nested values
@@ -116,7 +101,7 @@ def update(old, new, priority='new'):
                 old[k] = {}
             update(old[k], v, priority=priority)
         else:
-            if priority == 'new' or k not in old:
+            if priority == "new" or k not in old:
                 old[k] = v
 
     return old
@@ -158,11 +143,14 @@ def collect_yaml(paths):
         if os.path.exists(path):
             if os.path.isdir(path):
                 try:
-                    file_paths.extend(sorted([
-                        os.path.join(path, p)
-                        for p in os.listdir(path)
-                        if os.path.splitext(p)[1].lower() in ('.json', '.yaml', '.yml')
-                    ]))
+                    file_paths.extend(
+                        sorted(
+                            os.path.join(path, p)
+                            for p in os.listdir(path)
+                            if os.path.splitext(p)[1].lower()
+                            in (".json", ".yaml", ".yml")
+                        )
+                    )
                 except OSError:
                     # Ignore permission errors
                     pass
@@ -177,7 +165,7 @@ def collect_yaml(paths):
             with open(path) as f:
                 data = yaml.safe_load(f.read()) or {}
                 configs.append(data)
-        except (OSError, IOError):
+        except OSError:
             # Ignore permission errors
             pass
 
@@ -202,7 +190,7 @@ def collect_env(prefix, env=None):
     prefix_len = len(prefix)
     for name, value in env.items():
         if name.startswith(prefix):
-            varname = name[prefix_len:].lower().replace('__', '.')
+            varname = name[prefix_len:].lower().replace("__", ".")
             try:
                 d[varname] = ast.literal_eval(value)
             except (SyntaxError, ValueError):
@@ -215,7 +203,7 @@ def collect_env(prefix, env=None):
     return result
 
 
-class ConfigSet(object):
+class ConfigSet:
     """Temporarily set configuration values within a context manager
 
     Note, this class should be used directly from the `Config`
@@ -234,6 +222,7 @@ class ConfigSet(object):
     donfig.Config.get
 
     """
+
     def __init__(self, config, lock, arg=None, **kwargs):
         with lock:
             self.config = config
@@ -333,29 +322,41 @@ def expand_environment_variables(config):
         return config
 
 
-class Config(object):
-    def __init__(self, name, defaults=None, paths=None, env=None, env_var=None, root_env_var=None, env_prefix=None):
+class Config:
+    def __init__(
+        self,
+        name,
+        defaults=None,
+        paths=None,
+        env=None,
+        env_var=None,
+        root_env_var=None,
+        env_prefix=None,
+    ):
         if root_env_var is None:
-            root_env_var = '{}_ROOT_CONFIG'.format(name.upper())
+            root_env_var = f"{name.upper()}_ROOT_CONFIG"
         if paths is None:
             paths = [
-                os.getenv(root_env_var, '/etc/{}'.format(name)),
-                os.path.join(sys.prefix, 'etc', name),
-                os.path.join(os.path.expanduser('~'), '.config', name),
-                os.path.join(os.path.expanduser('~'), '.{}'.format(name))
+                os.getenv(root_env_var, f"/etc/{name}"),
+                os.path.join(sys.prefix, "etc", name),
+                *[os.path.join(prefix, "etc", name) for prefix in site.PREFIXES],
+                os.path.join(os.path.expanduser("~"), ".config", name),
             ]
 
         if env_prefix is None:
-            env_prefix = "{}_".format(name.upper())
+            env_prefix = f"{name.upper()}_"
         if env is None:
             env = os.environ
         if env_var is None:
-            env_var = '{}_CONFIG'.format(name.upper())
+            env_var = f"{name.upper()}_CONFIG"
         if env_var in os.environ:
             main_path = os.environ[env_var]
             paths.append(main_path)
         else:
-            main_path = os.path.join(os.path.expanduser('~'), '.config', name)
+            main_path = os.path.join(os.path.expanduser("~"), ".config", name)
+
+        # Remove duplicate paths while preserving ordering
+        paths = list(reversed(list(dict.fromkeys(reversed(paths)))))
 
         self.name = name
         self.env_prefix = env_prefix
@@ -364,7 +365,7 @@ class Config(object):
         self.paths = paths
         self.defaults = defaults or []
         self.config = {}
-        self.config_lock = threading.Lock()
+        self.config_lock = SerializableLock()
         self.refresh()
 
     def __contains__(self, item):
@@ -439,7 +440,7 @@ class Config(object):
         self.clear()
 
         for d in self.defaults:
-            update(self.config, d, priority='old')
+            update(self.config, d, priority="old")
 
         update(self.config, self.collect(**kwargs))
 
@@ -466,7 +467,7 @@ class Config(object):
         donfig.Config.set
 
         """
-        keys = key.split('.')
+        keys = key.split(".")
         result = self.config
         for k in keys:
             k = canonical_name(k, result)
@@ -490,7 +491,7 @@ class Config(object):
 
         """
         self.defaults.append(new)
-        update(self.config, new, priority='old')
+        update(self.config, new, priority="old")
 
     def to_dict(self):
         """Return dictionary copy of configuration.
@@ -516,7 +517,7 @@ class Config(object):
         """
         self.config = merge(self.config, dicts)
 
-    def update(self, new, priority='new'):
+    def update(self, new, priority="new"):
         """Update the internal configuration dictionary with `new`.
 
         See :func:`~donfig.config_obj.update` for more information.
@@ -625,27 +626,29 @@ class Config(object):
 
         try:
             if not os.path.exists(destination):
-                makedirs(directory, exist_ok=True)
+                os.makedirs(directory, exist_ok=True)
 
                 # Atomically create destination.  Parallel testing discovered
                 # a race condition where a process can be busy creating the
                 # destination while another process reads an empty config file.
-                tmp = '%s.tmp.%d' % (destination, os.getpid())
+                tmp = "%s.tmp.%d" % (destination, os.getpid())
                 with open(source) as f:
                     lines = list(f)
 
                 if comment:
-                    lines = ['# ' + line
-                             if line.strip() and not line.startswith('#')
-                             else line
-                             for line in lines]
+                    lines = [
+                        "# " + line
+                        if line.strip() and not line.startswith("#")
+                        else line
+                        for line in lines
+                    ]
 
-                with open(tmp, 'w') as f:
-                    f.write(''.join(lines))
+                with open(tmp, "w") as f:
+                    f.write("".join(lines))
 
                 try:
                     os.rename(tmp, destination)
                 except OSError:
                     os.remove(tmp)
-        except (IOError, OSError):
+        except OSError:
             pass


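One hunk above deserves a note: the new de-duplication line keeps the last occurrence of each path while preserving overall order, presumably so a path that also arrives via the *_CONFIG environment variable keeps its later, higher-priority position. A standalone sketch of the idiom with hypothetical paths:

    paths = ['/etc/mypkg',
             '/usr/local/etc/mypkg',
             '/home/user/.config/mypkg',
             '/home/user/.config/mypkg']   # duplicate, e.g. appended via MYPKG_CONFIG

    # dict.fromkeys() keeps first occurrences; reversing before and after makes it
    # keep the *last* occurrence of each duplicate instead, preserving order otherwise.
    deduped = list(reversed(list(dict.fromkeys(reversed(paths)))))

    assert deduped == ['/etc/mypkg', '/usr/local/etc/mypkg', '/home/user/.config/mypkg']
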
=====================================
donfig/tests/test_config.py
=====================================
@@ -1,5 +1,4 @@
 #!/usr/bin/env python
-# -*- coding: utf-8 -*-
 #
 # Copyright (c) 2018 Donfig Developers
 # Copyright (c) 2014-2018, Anaconda, Inc. and contributors
@@ -21,76 +20,80 @@
 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 # SOFTWARE.
-import yaml
 import os
+import site
 import stat
+import subprocess
 import sys
+from collections import OrderedDict
+from contextlib import contextmanager
 
+import cloudpickle
 import pytest
+import yaml
 
-from donfig.config_obj import (Config, update, merge, collect_yaml,
-                               collect_env, expand_environment_variables,
-                               canonical_name)
+from donfig.config_obj import (
+    Config,
+    canonical_name,
+    collect_env,
+    collect_yaml,
+    expand_environment_variables,
+    merge,
+    update,
+)
 from donfig.utils import tmpfile
-from collections import OrderedDict
-from contextlib import contextmanager
 
-CONFIG_NAME = 'mytest'
-ENV_PREFIX = CONFIG_NAME.upper() + '_'
+CONFIG_NAME = "mytest"
+ENV_PREFIX = CONFIG_NAME.upper() + "_"
 
 
 def test_canonical_name():
-    c = {'foo-bar': 1,
-         'fizz_buzz': 2}
-    assert canonical_name('foo-bar', c) == 'foo-bar'
-    assert canonical_name('foo_bar', c) == 'foo-bar'
-    assert canonical_name('fizz-buzz', c) == 'fizz_buzz'
-    assert canonical_name('fizz_buzz', c) == 'fizz_buzz'
-    assert canonical_name('new-key', c) == 'new-key'
-    assert canonical_name('new_key', c) == 'new_key'
+    c = {"foo-bar": 1, "fizz_buzz": 2}
+    assert canonical_name("foo-bar", c) == "foo-bar"
+    assert canonical_name("foo_bar", c) == "foo-bar"
+    assert canonical_name("fizz-buzz", c) == "fizz_buzz"
+    assert canonical_name("fizz_buzz", c) == "fizz_buzz"
+    assert canonical_name("new-key", c) == "new-key"
+    assert canonical_name("new_key", c) == "new_key"
 
 
 def test_update():
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 2, 'z': 3, 'y': OrderedDict({'b': 2})}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 2, "z": 3, "y": OrderedDict({"b": 2})}
     update(b, a)
-    assert b == {'x': 1, 'y': {'a': 1, 'b': 2}, 'z': 3}
+    assert b == {"x": 1, "y": {"a": 1, "b": 2}, "z": 3}
 
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 2, 'z': 3, 'y': {'a': 3, 'b': 2}}
-    update(b, a, priority='old')
-    assert b == {'x': 2, 'y': {'a': 3, 'b': 2}, 'z': 3}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 2, "z": 3, "y": {"a": 3, "b": 2}}
+    update(b, a, priority="old")
+    assert b == {"x": 2, "y": {"a": 3, "b": 2}, "z": 3}
 
 
 def test_merge():
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 2, 'z': 3, 'y': {'b': 2}}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 2, "z": 3, "y": {"b": 2}}
 
-    expected = {
-        'x': 2,
-        'y': {'a': 1, 'b': 2},
-        'z': 3
-    }
+    expected = {"x": 2, "y": {"a": 1, "b": 2}, "z": 3}
 
     c = merge(a, b)
     assert c == expected
 
 
 def test_collect_yaml_paths():
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 2, 'z': 3, 'y': {'b': 2}}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 2, "z": 3, "y": {"b": 2}}
 
     expected = {
-        'x': 2,
-        'y': {'a': 1, 'b': 2},
-        'z': 3,
+        "x": 2,
+        "y": {"a": 1, "b": 2},
+        "z": 3,
     }
 
-    with tmpfile(extension='yaml') as fn1:
-        with tmpfile(extension='yaml') as fn2:
-            with open(fn1, 'w') as f:
+    with tmpfile(extension="yaml") as fn1:
+        with tmpfile(extension="yaml") as fn2:
+            with open(fn1, "w") as f:
                 yaml.dump(a, f)
-            with open(fn2, 'w') as f:
+            with open(fn2, "w") as f:
                 yaml.dump(b, f)
 
             config = merge(*collect_yaml(paths=[fn1, fn2]))
@@ -98,20 +101,20 @@ def test_collect_yaml_paths():
 
 
 def test_collect_yaml_dir():
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 2, 'z': 3, 'y': {'b': 2}}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 2, "z": 3, "y": {"b": 2}}
 
     expected = {
-        'x': 2,
-        'y': {'a': 1, 'b': 2},
-        'z': 3,
+        "x": 2,
+        "y": {"a": 1, "b": 2},
+        "z": 3,
     }
 
     with tmpfile() as dirname:
         os.mkdir(dirname)
-        with open(os.path.join(dirname, 'a.yaml'), mode='w') as f:
+        with open(os.path.join(dirname, "a.yaml"), mode="w") as f:
             yaml.dump(a, f)
-        with open(os.path.join(dirname, 'b.yaml'), mode='w') as f:
+        with open(os.path.join(dirname, "b.yaml"), mode="w") as f:
             yaml.dump(b, f)
 
         config = merge(*collect_yaml(paths=[dirname]))
@@ -129,23 +132,24 @@ def no_read_permissions(path):
         os.chmod(path, perm_orig)
 
 
-@pytest.mark.skipif(sys.platform == 'win32',
-                    reason="Can't make writeonly file on windows")
-@pytest.mark.parametrize('kind', ['directory', 'file'])
+@pytest.mark.skipif(
+    sys.platform == "win32", reason="Can't make writeonly file on windows"
+)
+@pytest.mark.parametrize("kind", ["directory", "file"])
 def test_collect_yaml_permission_errors(tmpdir, kind):
-    a = {'x': 1, 'y': 2}
-    b = {'y': 3, 'z': 4}
+    a = {"x": 1, "y": 2}
+    b = {"y": 3, "z": 4}
 
     dir_path = str(tmpdir)
-    a_path = os.path.join(dir_path, 'a.yaml')
-    b_path = os.path.join(dir_path, 'b.yaml')
+    a_path = os.path.join(dir_path, "a.yaml")
+    b_path = os.path.join(dir_path, "b.yaml")
 
-    with open(a_path, mode='w') as f:
+    with open(a_path, mode="w") as f:
         yaml.dump(a, f)
-    with open(b_path, mode='w') as f:
+    with open(b_path, mode="w") as f:
         yaml.dump(b, f)
 
-    if kind == 'directory':
+    if kind == "directory":
         cant_read = dir_path
         expected = {}
     else:
@@ -158,47 +162,48 @@ def test_collect_yaml_permission_errors(tmpdir, kind):
 
 
 def test_env():
-    env = {ENV_PREFIX + 'A_B': '123',
-           ENV_PREFIX + 'C': 'True',
-           ENV_PREFIX + 'D': 'hello',
-           ENV_PREFIX + 'E__X': '123',
-           ENV_PREFIX + 'E__Y': '456',
-           ENV_PREFIX + 'F': '[1, 2, "3"]',
-           ENV_PREFIX + 'G': '/not/parsable/as/literal',
-           'FOO': 'not included',
-           }
+    env = {
+        ENV_PREFIX + "A_B": "123",
+        ENV_PREFIX + "C": "True",
+        ENV_PREFIX + "D": "hello",
+        ENV_PREFIX + "E__X": "123",
+        ENV_PREFIX + "E__Y": "456",
+        ENV_PREFIX + "F": '[1, 2, "3"]',
+        ENV_PREFIX + "G": "/not/parsable/as/literal",
+        "FOO": "not included",
+    }
 
     expected = {
-        'a_b': 123,
-        'c': True,
-        'd': 'hello',
-        'e': {'x': 123, 'y': 456},
-        'f': [1, 2, "3"],
-        'g': '/not/parsable/as/literal',
+        "a_b": 123,
+        "c": True,
+        "d": "hello",
+        "e": {"x": 123, "y": 456},
+        "f": [1, 2, "3"],
+        "g": "/not/parsable/as/literal",
     }
 
-    res = collect_env(CONFIG_NAME.upper() + '_', env)
+    res = collect_env(CONFIG_NAME.upper() + "_", env)
     assert res == expected
 
 
 def test_collect():
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 2, 'z': 3, 'y': {'b': 2}}
-    env = {ENV_PREFIX + 'W': 4}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 2, "z": 3, "y": {"b": 2}}
+    env = {ENV_PREFIX + "W": 4}
 
     expected = {
-        'w': 4,
-        'x': 2,
-        'y': {'a': 1, 'b': 2},
-        'z': 3,
+        "w": 4,
+        "x": 2,
+        "y": {"a": 1, "b": 2},
+        "z": 3,
     }
 
     config = Config(CONFIG_NAME)
-    with tmpfile(extension='yaml') as fn1:
-        with tmpfile(extension='yaml') as fn2:
-            with open(fn1, 'w') as f:
+    with tmpfile(extension="yaml") as fn1:
+        with tmpfile(extension="yaml") as fn2:
+            with open(fn1, "w") as f:
                 yaml.dump(a, f)
-            with open(fn2, 'w') as f:
+            with open(fn2, "w") as f:
                 yaml.dump(b, f)
 
             config = config.collect([fn1, fn2], env=env)
@@ -206,48 +211,48 @@ def test_collect():
 
 
 def test_collect_env_none():
-    os.environ[ENV_PREFIX + 'FOO'] = 'bar'
+    os.environ[ENV_PREFIX + "FOO"] = "bar"
     config = Config(CONFIG_NAME)
     try:
         config = config.collect([])
-        assert config == {'foo': 'bar'}
+        assert config == {"foo": "bar"}
     finally:
-        del os.environ[ENV_PREFIX + 'FOO']
+        del os.environ[ENV_PREFIX + "FOO"]
 
 
 def test_get():
     test_config = Config(CONFIG_NAME)
-    test_config.config = {'x': 1, 'y': {'a': 2}}
+    test_config.config = {"x": 1, "y": {"a": 2}}
 
-    assert test_config.get('x') == 1
-    assert test_config['x'] == 1
-    assert test_config.get('y.a') == 2
-    assert test_config['y.a'] == 2
-    assert test_config.get('y.b', 123) == 123
+    assert test_config.get("x") == 1
+    assert test_config["x"] == 1
+    assert test_config.get("y.a") == 2
+    assert test_config["y.a"] == 2
+    assert test_config.get("y.b", 123) == 123
     with pytest.raises(KeyError):
-        test_config.get('y.b')
+        test_config.get("y.b")
     with pytest.raises(KeyError):
-        test_config['y.b']
+        test_config["y.b"]
 
 
 def test_contains():
     test_config = Config(CONFIG_NAME)
-    test_config.config = {'x': 1, 'y': {'a': 2}}
+    test_config.config = {"x": 1, "y": {"a": 2}}
 
-    assert 'x' in test_config
-    assert 'y.a' in test_config
-    assert 'y.b' not in test_config
+    assert "x" in test_config
+    assert "y.a" in test_config
+    assert "y.b" not in test_config
 
 
 def test_ensure_file(tmpdir):
-    a = {'x': 1, 'y': {'a': 1}}
-    b = {'x': 123}
+    a = {"x": 1, "y": {"a": 1}}
+    b = {"x": 123}
 
-    source = os.path.join(str(tmpdir), 'source.yaml')
-    dest = os.path.join(str(tmpdir), 'dest')
-    destination = os.path.join(dest, 'source.yaml')
+    source = os.path.join(str(tmpdir), "source.yaml")
+    dest = os.path.join(str(tmpdir), "dest")
+    destination = os.path.join(dest, "source.yaml")
 
-    with open(source, 'w') as f:
+    with open(source, "w") as f:
         yaml.dump(a, f)
 
     config = Config(CONFIG_NAME)
@@ -258,7 +263,7 @@ def test_ensure_file(tmpdir):
     assert result == a
 
     # don't overwrite old config files
-    with open(source, 'w') as f:
+    with open(source, "w") as f:
         yaml.dump(b, f)
 
     config.ensure_file(source=source, destination=dest, comment=False)
@@ -274,7 +279,7 @@ def test_ensure_file(tmpdir):
 
     with open(destination) as f:
         text = f.read()
-    assert '123' in text
+    assert "123" in text
 
     with open(destination) as f:
         result = yaml.safe_load(f)
@@ -284,24 +289,24 @@ def test_ensure_file(tmpdir):
 def test_set():
     config = Config(CONFIG_NAME)
     with config.set(abc=123):
-        assert config.config['abc'] == 123
+        assert config.config["abc"] == 123
         with config.set(abc=456):
-            assert config.config['abc'] == 456
-        assert config.config['abc'] == 123
+            assert config.config["abc"] == 456
+        assert config.config["abc"] == 123
 
-    assert 'abc' not in config.config
+    assert "abc" not in config.config
 
-    with config.set({'abc': 123}):
-        assert config.config['abc'] == 123
-    assert 'abc' not in config.config
+    with config.set({"abc": 123}):
+        assert config.config["abc"] == 123
+    assert "abc" not in config.config
 
-    with config.set({'abc.x': 1, 'abc.y': 2, 'abc.z.a': 3}):
-        assert config.config['abc'] == {'x': 1, 'y': 2, 'z': {'a': 3}}
-    assert 'abc' not in config.config
+    with config.set({"abc.x": 1, "abc.y": 2, "abc.z.a": 3}):
+        assert config.config["abc"] == {"x": 1, "y": 2, "z": {"a": 3}}
+    assert "abc" not in config.config
 
     config.config = {}
-    config.set({'abc.x': 123})
-    assert config.config['abc']['x'] == 123
+    config.set({"abc.x": 123})
+    assert config.config["abc"]["x"] == 123
 
 
 def test_set_kwargs():
@@ -323,30 +328,31 @@ def test_set_kwargs():
 
 def test_set_nested():
     config = Config(CONFIG_NAME)
-    with config.set({'abc': {'x': 123}}):
-        assert config.config['abc'] == {'x': 123}
-        with config.set({'abc.y': 456}):
-            assert config.config['abc'] == {'x': 123, 'y': 456}
-        assert config.config['abc'] == {'x': 123}
-    assert 'abc' not in config.config
+    with config.set({"abc": {"x": 123}}):
+        assert config.config["abc"] == {"x": 123}
+        with config.set({"abc.y": 456}):
+            assert config.config["abc"] == {"x": 123, "y": 456}
+        assert config.config["abc"] == {"x": 123}
+    assert "abc" not in config.config
 
 
 def test_set_hard_to_copyables():
     import threading
+
     config = Config(CONFIG_NAME)
     with config.set(x=threading.Lock()):
         with config.set(y=1):
             pass
 
 
-@pytest.mark.parametrize('mkdir', [True, False])
+@pytest.mark.parametrize("mkdir", [True, False])
 def test_ensure_file_directory(mkdir, tmpdir):
-    a = {'x': 1, 'y': {'a': 1}}
+    a = {"x": 1, "y": {"a": 1}}
 
-    source = os.path.join(str(tmpdir), 'source.yaml')
-    dest = os.path.join(str(tmpdir), 'dest')
+    source = os.path.join(str(tmpdir), "source.yaml")
+    dest = os.path.join(str(tmpdir), "dest")
 
-    with open(source, 'w') as f:
+    with open(source, "w") as f:
         yaml.dump(a, f)
 
     if mkdir:
@@ -356,17 +362,17 @@ def test_ensure_file_directory(mkdir, tmpdir):
     config.ensure_file(source=source, destination=dest)
 
     assert os.path.isdir(dest)
-    assert os.path.exists(os.path.join(dest, 'source.yaml'))
+    assert os.path.exists(os.path.join(dest, "source.yaml"))
 
 
 def test_ensure_file_defaults_to_TEST_CONFIG_directory(tmpdir):
-    a = {'x': 1, 'y': {'a': 1}}
-    source = os.path.join(str(tmpdir), 'source.yaml')
-    with open(source, 'w') as f:
+    a = {"x": 1, "y": {"a": 1}}
+    source = os.path.join(str(tmpdir), "source.yaml")
+    with open(source, "w") as f:
         yaml.dump(a, f)
 
-    config = Config('test')
-    destination = os.path.join(str(tmpdir), 'test')
+    config = Config("test")
+    destination = os.path.join(str(tmpdir), "test")
     PATH = config.main_path
     try:
         config.main_path = destination
@@ -381,88 +387,91 @@ def test_ensure_file_defaults_to_TEST_CONFIG_directory(tmpdir):
 
 def test_rename():
     config = Config(CONFIG_NAME)
-    aliases = {'foo_bar': 'foo.bar'}
-    config.config = {'foo-bar': 123}
+    aliases = {"foo_bar": "foo.bar"}
+    config.config = {"foo-bar": 123}
     config.rename(aliases)
-    assert config.config == {'foo': {'bar': 123}}
+    assert config.config == {"foo": {"bar": 123}}
 
 
 def test_refresh():
     defaults = []
     config = Config(CONFIG_NAME, defaults=defaults)
 
-    config.update_defaults({'a': 1})
-    assert config.config == {'a': 1}
-
-    config.refresh(paths=[], env={ENV_PREFIX + 'B': '2'})
-    assert config.config == {'a': 1, 'b': 2}
-
-    config.refresh(paths=[], env={ENV_PREFIX + 'C': '3'})
-    assert config.config == {'a': 1, 'c': 3}
-
-
-@pytest.mark.parametrize('inp,out', [
-    ('1', '1'),
-    (1, 1),
-    ('$FOO', 'foo'),
-    ([1, '$FOO'], [1, 'foo']),
-    ((1, '$FOO'), (1, 'foo')),
-    ({1, '$FOO'}, {1, 'foo'}),
-    ({'a': '$FOO'}, {'a': 'foo'}),
-    ({'a': 'A', 'b': [1, '2', '$FOO']}, {'a': 'A', 'b': [1, '2', 'foo']})
-])
+    config.update_defaults({"a": 1})
+    assert config.config == {"a": 1}
+
+    config.refresh(paths=[], env={ENV_PREFIX + "B": "2"})
+    assert config.config == {"a": 1, "b": 2}
+
+    config.refresh(paths=[], env={ENV_PREFIX + "C": "3"})
+    assert config.config == {"a": 1, "c": 3}
+
+
+@pytest.mark.parametrize(
+    "inp,out",
+    [
+        ("1", "1"),
+        (1, 1),
+        ("$FOO", "foo"),
+        ([1, "$FOO"], [1, "foo"]),
+        ((1, "$FOO"), (1, "foo")),
+        ({1, "$FOO"}, {1, "foo"}),
+        ({"a": "$FOO"}, {"a": "foo"}),
+        ({"a": "A", "b": [1, "2", "$FOO"]}, {"a": "A", "b": [1, "2", "foo"]}),
+    ],
+)
 def test_expand_environment_variables(inp, out):
     try:
         os.environ["FOO"] = "foo"
         assert expand_environment_variables(inp) == out
     finally:
-        del os.environ['FOO']
+        del os.environ["FOO"]
 
 
 def test_env_var_canonical_name(monkeypatch):
     value = 3
-    monkeypatch.setenv(ENV_PREFIX + 'A_B', str(value))
+    monkeypatch.setenv(ENV_PREFIX + "A_B", str(value))
     config = Config(CONFIG_NAME)
 
-    assert config.get('a_b') == value
-    assert config.get('a-b') == value
+    assert config.get("a_b") == value
+    assert config.get("a-b") == value
 
 
 def test_get_set_canonical_name():
-    c = {'x-y': {'a_b': 123}}
+    c = {"x-y": {"a_b": 123}}
     config = Config(CONFIG_NAME)
     config.update(c)
 
-    keys = ['x_y.a_b', 'x-y.a-b', 'x_y.a-b']
+    keys = ["x_y.a_b", "x-y.a-b", "x_y.a-b"]
     for k in keys:
         assert config.get(k) == 123
 
-    with config.set({'x_y': {'a-b': 456}}):
+    with config.set({"x_y": {"a-b": 456}}):
         for k in keys:
             assert config.get(k) == 456
 
     # No change to new keys in sub dicts
-    with config.set({'x_y': {'a-b': {'c_d': 1}, 'e-f': 2}}):
-        assert config.get('x_y.a-b') == {'c_d': 1}
-        assert config.get('x_y.e_f') == 2
+    with config.set({"x_y": {"a-b": {"c_d": 1}, "e-f": 2}}):
+        assert config.get("x_y.a-b") == {"c_d": 1}
+        assert config.get("x_y.e_f") == 2
 
 
-@pytest.mark.parametrize('key', ['custom_key', 'custom-key'])
+@pytest.mark.parametrize("key", ["custom_key", "custom-key"])
 def test_get_set_roundtrip(key):
     value = 123
     config = Config(CONFIG_NAME)
     with config.set({key: value}):
-        assert config.get('custom_key') == value
-        assert config.get('custom-key') == value
+        assert config.get("custom_key") == value
+        assert config.get("custom-key") == value
 
 
 def test_merge_none_to_dict():
-    assert merge({'a': None, 'c': 0}, {'a': {'b': 1}}) == {'a': {'b': 1}, 'c': 0}
+    assert merge({"a": None, "c": 0}, {"a": {"b": 1}}) == {"a": {"b": 1}, "c": 0}
 
 
 def test_pprint(capsys):
     test_config = Config(CONFIG_NAME)
-    test_config.config = {'x': 1, 'y': {'a': 2}}
+    test_config.config = {"x": 1, "y": {"a": 2}}
     test_config.pprint()
     cap_out = capsys.readouterr()[0]
     assert cap_out == """{'x': 1, 'y': {'a': 2}}\n"""
@@ -470,17 +479,73 @@ def test_pprint(capsys):
 
 def test_to_dict():
     test_config = Config(CONFIG_NAME)
-    test_config.config = {'x': 1, 'y': {'a': 2}}
+    test_config.config = {"x": 1, "y": {"a": 2}}
     d = test_config.to_dict()
     assert d == test_config.config
     # make sure we copied
-    d['z'] = 3
-    d['y']['b'] = 4
+    d["z"] = 3
+    d["y"]["b"] = 4
     assert d != test_config.config
-    assert d['y'] != test_config.config['y']
-
-
-if __name__ == '__main__':
-    import sys
-    import pytest
-    sys.exit(pytest.main(sys.argv))
+    assert d["y"] != test_config.config["y"]
+
+
+def test_path_includes_site_prefix():
+
+    command = (
+        "import site, os; "
+        'prefix = os.path.join("include", "this", "path"); '
+        "site.PREFIXES.append(prefix); "
+        "from donfig import Config; "
+        f"config = Config('{CONFIG_NAME}'); "
+        "print(config.paths); "
+        f'assert os.path.join(prefix, "etc", "{CONFIG_NAME}") in config.paths'
+    )
+
+    subprocess.check_call([sys.executable, "-c", command])
+
+
+def test__get_paths(monkeypatch):
+    # These settings, if present, would interfere with these tests
+    # We temporarily remove them to avoid interference from the
+    # machine where tests are being run.
+    monkeypatch.delenv("MYPKG_CONFIG", raising=False)
+    monkeypatch.delenv("MYPKG_ROOT_CONFIG", raising=False)
+    monkeypatch.setattr(site, "PREFIXES", [])
+
+    expected = [
+        "/etc/mypkg",
+        os.path.join(sys.prefix, "etc", "mypkg"),
+        os.path.join(os.path.expanduser("~"), ".config", "mypkg"),
+    ]
+    config = Config("mypkg")
+    assert config.paths == expected
+    assert len(config.paths) == len(set(config.paths))  # No duplicate paths
+
+    with monkeypatch.context() as m:
+        m.setenv("MYPKG_CONFIG", "foo-bar")
+        config = Config("mypkg")
+        paths = config.paths
+        assert paths == expected + ["foo-bar"]
+        assert len(paths) == len(set(paths))
+
+    with monkeypatch.context() as m:
+        m.setenv("MYPKG_ROOT_CONFIG", "foo-bar")
+        config = Config("mypkg")
+        paths = config.paths
+        assert paths == ["foo-bar"] + expected[1:]
+        assert len(paths) == len(set(paths))
+
+    with monkeypatch.context() as m:
+        prefix = os.path.join("include", "this", "path")
+        m.setattr(site, "PREFIXES", site.PREFIXES + [prefix])
+        config = Config("mypkg")
+        paths = config.paths
+        assert os.path.join(prefix, "etc", "mypkg") in paths
+        assert len(paths) == len(set(paths))
+
+
+def test_serialization():
+    config = Config(CONFIG_NAME)
+    config.set(one_key="one_value")
+    new_config = cloudpickle.loads(cloudpickle.dumps(config))
+    assert new_config.get("one_key") == "one_value"


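As the updated test_env expectations above show, environment variables with the config prefix are lowercased, double underscores become nested keys, and values go through ast.literal_eval when possible. A condensed sketch of that behaviour:

    from donfig.config_obj import collect_env

    env = {
        "MYTEST_E__X": "123",    # '__' nests; literal_eval turns "123" into 123
        "MYTEST_D": "hello",     # not parseable as a literal, kept as a string
        "FOO": "ignored",        # wrong prefix, dropped
    }
    assert collect_env("MYTEST_", env) == {"e": {"x": 123}, "d": "hello"}
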
=====================================
donfig/tests/test_lock.py
=====================================
@@ -0,0 +1,87 @@
+#!/usr/bin/env python
+#
+# Copyright (c) 2022 Donfig Developers
+# Copyright (c) 2014-2018, Anaconda, Inc. and contributors
+#
+# Permission is hereby granted, free of charge, to any person obtaining a copy
+# of this software and associated documentation files (the "Software"), to deal
+# in the Software without restriction, including without limitation the rights
+# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+# copies of the Software, and to permit persons to whom the Software is
+# furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included in all
+# copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+# SOFTWARE.
+
+import pickle
+
+from .._lock import SerializableLock
+
+
+def test_SerializableLock():
+    a = SerializableLock()
+    b = SerializableLock()
+    with a:
+        pass
+
+    with a:
+        with b:
+            pass
+
+    with a:
+        assert not a.acquire(False)
+
+    a2 = pickle.loads(pickle.dumps(a))
+    a3 = pickle.loads(pickle.dumps(a))
+    a4 = pickle.loads(pickle.dumps(a2))
+
+    for x in [a, a2, a3, a4]:
+        for y in [a, a2, a3, a4]:
+            with x:
+                assert not y.acquire(False)
+
+    b2 = pickle.loads(pickle.dumps(b))
+    b3 = pickle.loads(pickle.dumps(b2))
+
+    for x in [a, a2, a3, a4]:
+        for y in [b, b2, b3]:
+            with x:
+                with y:
+                    pass
+            with y:
+                with x:
+                    pass
+
+
+def test_SerializableLock_name_collision():
+    a = SerializableLock("a")
+    b = SerializableLock("b")
+    c = SerializableLock("a")
+    d = SerializableLock()
+
+    assert a.lock is not b.lock
+    assert a.lock is c.lock
+    assert d.lock not in (a.lock, b.lock, c.lock)
+
+
+def test_SerializableLock_locked():
+    a = SerializableLock("a")
+    assert not a.locked()
+    with a:
+        assert a.locked()
+    assert not a.locked()
+
+
+def test_SerializableLock_acquire_blocking():
+    a = SerializableLock("a")
+    assert a.acquire(blocking=True)
+    assert not a.acquire(blocking=False)
+    a.release()


=====================================
donfig/utils.py
=====================================
@@ -1,5 +1,4 @@
 #!/usr/bin/env python
-# -*- coding: utf-8 -*-
 #
 # Copyright (c) 2018 Donfig Developers
 # Copyright (c) 2014-2018, Anaconda, Inc. and contributors
@@ -22,30 +21,14 @@
 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 # SOFTWARE.
 import os
-import tempfile
 import shutil
-from contextlib import contextmanager
-
-try:
-    from contextlib import AbstractContextManager
-except ImportError:
-    AbstractContextManager = object
-
-try:
-    from contextlib import suppress
-except ImportError:
-    # Python <3.4
-    @contextmanager
-    def suppress(*exceptions):
-        try:
-            yield
-        except exceptions:
-            pass
+import tempfile
+from contextlib import contextmanager, suppress
 
 
 @contextmanager
-def tmpfile(extension='', dir=None):
-    extension = '.' + extension.lstrip('.')
+def tmpfile(extension="", dir=None):
+    extension = "." + extension.lstrip(".")
     handle, filename = tempfile.mkstemp(extension, dir=dir)
     os.close(handle)
     os.remove(filename)
@@ -59,26 +42,3 @@ def tmpfile(extension='', dir=None):
             else:
                 with suppress(OSError):
                     os.remove(filename)
-
-
-# copied from cpython 3.7 source
-class nullcontext(AbstractContextManager):
-    """Context manager that does no additional processing.
-
-    Used as a stand-in for a normal context manager, when a particular
-    block of code is only sometimes used with a normal context manager::
-
-        cm = optional_cm if condition else nullcontext()
-        with cm:
-            # Perform operation, using optional_cm if condition is True
-
-    """
-
-    def __init__(self, enter_result=None):
-        self.enter_result = enter_result
-
-    def __enter__(self):
-        return self.enter_result
-
-    def __exit__(self, *excinfo):
-        pass


=====================================
donfig/version.py
=====================================
@@ -1,4 +1,3 @@
-
 # This file helps to compute a version number in source trees obtained from
 # git-archive tarball (such as those provided by githubs download-from-tag
 # feature). Distribution tarballs (built by setup.py sdist) and build
@@ -23,9 +22,9 @@ def get_keywords():
     # setup.py/versioneer.py will grep for the variable names, so they must
     # each be defined on a line of their own. _version.py will just call
     # get_keywords().
-    git_refnames = " (tag: v0.6.0)"
-    git_full = "4ad0cc483044369b2ef5608ae24bad5ba50bb7f2"
-    git_date = "2021-01-17 19:57:14 -0600"
+    git_refnames = " (HEAD -> main, tag: v0.7.0)"
+    git_full = "37a400cfbacd2dcff3e7dc323aaf1aa55862341c"
+    git_date = "2022-02-04 21:06:55 -0600"
     keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
     return keywords
 
@@ -58,17 +57,18 @@ HANDLERS = {}
 
 def register_vcs_handler(vcs, method):  # decorator
     """Decorator to mark a method as the handler for a particular VCS."""
+
     def decorate(f):
         """Store f in HANDLERS[vcs][method]."""
         if vcs not in HANDLERS:
             HANDLERS[vcs] = {}
         HANDLERS[vcs][method] = f
         return f
+
     return decorate
 
 
-def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
-                env=None):
+def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, env=None):
     """Call the given command(s)."""
     assert isinstance(commands, list)
     p = None
@@ -76,12 +76,15 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
         try:
             dispcmd = str([c] + args)
             # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
+            p = subprocess.Popen(
+                [c] + args,
+                cwd=cwd,
+                env=env,
+                stdout=subprocess.PIPE,
+                stderr=(subprocess.PIPE if hide_stderr else None),
+            )
             break
-        except EnvironmentError:
+        except OSError:
             e = sys.exc_info()[1]
             if e.errno == errno.ENOENT:
                 continue
@@ -91,7 +94,7 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
             return None, None
     else:
         if verbose:
-            print("unable to find command, tried %s" % (commands,))
+            print(f"unable to find command, tried {commands}")
         return None, None
     stdout = p.communicate()[0].strip()
     if sys.version_info[0] >= 3:
@@ -116,16 +119,22 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
     for i in range(3):
         dirname = os.path.basename(root)
         if dirname.startswith(parentdir_prefix):
-            return {"version": dirname[len(parentdir_prefix):],
-                    "full-revisionid": None,
-                    "dirty": False, "error": None, "date": None}
+            return {
+                "version": dirname[len(parentdir_prefix) :],
+                "full-revisionid": None,
+                "dirty": False,
+                "error": None,
+                "date": None,
+            }
         else:
             rootdirs.append(root)
             root = os.path.dirname(root)  # up a level
 
     if verbose:
-        print("Tried directories %s but none started with prefix %s" %
-              (str(rootdirs), parentdir_prefix))
+        print(
+            "Tried directories %s but none started with prefix %s"
+            % (str(rootdirs), parentdir_prefix)
+        )
     raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
 
 
@@ -138,7 +147,7 @@ def git_get_keywords(versionfile_abs):
     # _version.py.
     keywords = {}
     try:
-        f = open(versionfile_abs, "r")
+        f = open(versionfile_abs)
         for line in f.readlines():
             if line.strip().startswith("git_refnames ="):
                 mo = re.search(r'=\s*"(.*)"', line)
@@ -153,7 +162,7 @@ def git_get_keywords(versionfile_abs):
                 if mo:
                     keywords["date"] = mo.group(1)
         f.close()
-    except EnvironmentError:
+    except OSError:
         pass
     return keywords
 
@@ -177,11 +186,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         if verbose:
             print("keywords are unexpanded, not using")
         raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
+    refs = {r.strip() for r in refnames.strip("()").split(",")}
     # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
     # just "foo-1.0". If we see a "tag: " prefix, prefer those.
     TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
+    tags = {r[len(TAG) :] for r in refs if r.startswith(TAG)}
     if not tags:
         # Either we're using git < 1.8.3, or there really are no tags. We use
         # a heuristic: assume all version tags have a digit. The old git %d
@@ -190,7 +199,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # between branches and tags. By ignoring refnames without digits, we
         # filter out many common branch names like "release" and
         # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
+        tags = {r for r in refs if re.search(r"\d", r)}
         if verbose:
             print("discarding '%s', no digits" % ",".join(refs - tags))
     if verbose:
@@ -198,19 +207,26 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
     for ref in sorted(tags):
         # sorting will prefer e.g. "2.0" over "2.0rc1"
         if ref.startswith(tag_prefix):
-            r = ref[len(tag_prefix):]
+            r = ref[len(tag_prefix) :]
             if verbose:
                 print("picking %s" % r)
-            return {"version": r,
-                    "full-revisionid": keywords["full"].strip(),
-                    "dirty": False, "error": None,
-                    "date": date}
+            return {
+                "version": r,
+                "full-revisionid": keywords["full"].strip(),
+                "dirty": False,
+                "error": None,
+                "date": date,
+            }
     # no suitable tags, so version is "0+unknown", but full hex is still there
     if verbose:
         print("no suitable tags, using unknown + full revision id")
-    return {"version": "0+unknown",
-            "full-revisionid": keywords["full"].strip(),
-            "dirty": False, "error": "no suitable tags", "date": None}
+    return {
+        "version": "0+unknown",
+        "full-revisionid": keywords["full"].strip(),
+        "dirty": False,
+        "error": "no suitable tags",
+        "date": None,
+    }
 
 
 @register_vcs_handler("git", "pieces_from_vcs")
@@ -225,8 +241,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     if sys.platform == "win32":
         GITS = ["git.cmd", "git.exe"]
 
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
+    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root, hide_stderr=True)
     if rc != 0:
         if verbose:
             print("Directory %s not under git control" % root)
@@ -234,10 +249,19 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
 
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%s*" % tag_prefix],
-                                   cwd=root)
+    describe_out, rc = run_command(
+        GITS,
+        [
+            "describe",
+            "--tags",
+            "--dirty",
+            "--always",
+            "--long",
+            "--match",
+            "%s*" % tag_prefix,
+        ],
+        cwd=root,
+    )
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
@@ -260,17 +284,16 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     dirty = git_describe.endswith("-dirty")
     pieces["dirty"] = dirty
     if dirty:
-        git_describe = git_describe[:git_describe.rindex("-dirty")]
+        git_describe = git_describe[: git_describe.rindex("-dirty")]
 
     # now we have TAG-NUM-gHEX or HEX
 
     if "-" in git_describe:
         # TAG-NUM-gHEX
-        mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
+        mo = re.search(r"^(.+)-(\d+)-g([0-9a-f]+)$", git_describe)
         if not mo:
             # unparseable. Maybe git-describe is misbehaving?
-            pieces["error"] = ("unable to parse git-describe output: '%s'"
-                               % describe_out)
+            pieces["error"] = "unable to parse git-describe output: '%s'" % describe_out
             return pieces
 
         # tag
@@ -279,10 +302,12 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
             if verbose:
                 fmt = "tag '%s' doesn't start with prefix '%s'"
                 print(fmt % (full_tag, tag_prefix))
-            pieces["error"] = ("tag '%s' doesn't start with prefix '%s'"
-                               % (full_tag, tag_prefix))
+            pieces["error"] = "tag '{}' doesn't start with prefix '{}'".format(
+                full_tag,
+                tag_prefix,
+            )
             return pieces
-        pieces["closest-tag"] = full_tag[len(tag_prefix):]
+        pieces["closest-tag"] = full_tag[len(tag_prefix) :]
 
         # distance: number of commits since tag
         pieces["distance"] = int(mo.group(2))
@@ -293,13 +318,13 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
+        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
         pieces["distance"] = int(count_out)  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
-                       cwd=root)[0].strip()
+    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[
+        0
+    ].strip()
     pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
 
     return pieces
@@ -330,8 +355,7 @@ def render_pep440(pieces):
                 rendered += ".dirty"
     else:
         # exception #1
-        rendered = "0+untagged.%d.g%s" % (pieces["distance"],
-                                          pieces["short"])
+        rendered = "0+untagged.%d.g%s" % (pieces["distance"], pieces["short"])
         if pieces["dirty"]:
             rendered += ".dirty"
     return rendered
@@ -445,11 +469,13 @@ def render_git_describe_long(pieces):
 def render(pieces, style):
     """Render the given version pieces into the requested style."""
     if pieces["error"]:
-        return {"version": "unknown",
-                "full-revisionid": pieces.get("long"),
-                "dirty": None,
-                "error": pieces["error"],
-                "date": None}
+        return {
+            "version": "unknown",
+            "full-revisionid": pieces.get("long"),
+            "dirty": None,
+            "error": pieces["error"],
+            "date": None,
+        }
 
     if not style or style == "default":
         style = "pep440"  # the default
@@ -469,9 +495,13 @@ def render(pieces, style):
     else:
         raise ValueError("unknown style '%s'" % style)
 
-    return {"version": rendered, "full-revisionid": pieces["long"],
-            "dirty": pieces["dirty"], "error": None,
-            "date": pieces.get("date")}
+    return {
+        "version": rendered,
+        "full-revisionid": pieces["long"],
+        "dirty": pieces["dirty"],
+        "error": None,
+        "date": pieces.get("date"),
+    }
 
 
 def get_versions():
@@ -485,8 +515,7 @@ def get_versions():
     verbose = cfg.verbose
 
     try:
-        return git_versions_from_keywords(get_keywords(), cfg.tag_prefix,
-                                          verbose)
+        return git_versions_from_keywords(get_keywords(), cfg.tag_prefix, verbose)
     except NotThisMethod:
         pass
 
@@ -495,13 +524,16 @@ def get_versions():
         # versionfile_source is the relative path from the top of the source
         # tree (where the .git directory might live) to this file. Invert
         # this to find the root from __file__.
-        for i in cfg.versionfile_source.split('/'):
+        for i in cfg.versionfile_source.split("/"):
             root = os.path.dirname(root)
     except NameError:
-        return {"version": "0+unknown", "full-revisionid": None,
-                "dirty": None,
-                "error": "unable to find root of source tree",
-                "date": None}
+        return {
+            "version": "0+unknown",
+            "full-revisionid": None,
+            "dirty": None,
+            "error": "unable to find root of source tree",
+            "date": None,
+        }
 
     try:
         pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)
@@ -515,6 +547,10 @@ def get_versions():
     except NotThisMethod:
         pass
 
-    return {"version": "0+unknown", "full-revisionid": None,
-            "dirty": None,
-            "error": "unable to compute version", "date": None}
+    return {
+        "version": "0+unknown",
+        "full-revisionid": None,
+        "dirty": None,
+        "error": "unable to compute version",
+        "date": None,
+    }
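
The reformatted return dictionaries above all carry the same keys. A minimal sketch (not
part of this commit) of how the generated module is typically queried:

    from donfig import version

    info = version.get_versions()
    print(info["version"])          # e.g. "0.7.0", or "0+unknown" if detection failed
    print(info["full-revisionid"])  # full git SHA, or None
    print(info["dirty"], info["error"], info["date"])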


=====================================
setup.cfg
=====================================
@@ -2,8 +2,33 @@
 universal=1
 
 [flake8]
+# References:
+# https://flake8.readthedocs.io/en/latest/user/configuration.html
+# https://flake8.readthedocs.io/en/latest/user/error-codes.html
+# https://pycodestyle.pycqa.org/en/latest/intro.html#error-codes
+exclude = __init__.py,versioneer.py
+ignore =
+    E20,   # Extra space in brackets
+    E231,E241,  # Multiple spaces around ","
+    E26,   # Comments
+    E4,    # Import formatting
+    E721,  # Comparing types instead of isinstance
+    E731,  # Assigning lambda expression
+    E741,  # Ambiguous variable names
+    W503,  # line break before binary operator
+    W504,  # line break after binary operator
+    F811,  # redefinition of unused 'loop' from line 10
 max-line-length = 120
 
+[isort]
+sections = FUTURE,STDLIB,THIRDPARTY,FIRSTPARTY,LOCALFOLDER
+profile = black
+skip_gitignore = true
+force_to_top = true
+default_section = THIRDPARTY
+known_first_party = donfig
+skip = versioneer.py,donfig/version.py
+
 [versioneer]
 VCS = git
 style = pep440
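
A minimal sketch (not part of this commit) of the import grouping the new [isort]
settings produce inside donfig's own test modules; the imports below are only
illustrative:

    import os                    # STDLIB

    import yaml                  # THIRDPARTY (pyyaml is donfig's runtime dependency)

    from donfig import Config    # FIRSTPARTY (known_first_party = donfig)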


=====================================
setup.py
=====================================
@@ -1,5 +1,4 @@
 #!/usr/bin/env python
-# -*- coding: utf-8 -*-
 #
 # Copyright (c) 2018 Donfig Developers
 #
@@ -21,28 +20,32 @@
 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
 # SOFTWARE.
 
-from setuptools import setup, find_packages
+from setuptools import find_packages, setup
+
 import versioneer
 
-NAME = 'donfig'
-README = open('README.rst', 'r').read()
+NAME = "donfig"
+README = open("README.rst").read()
 
-setup(name=NAME,
-      version=versioneer.get_version(),
-      cmdclass=versioneer.get_cmdclass(),
-      description='Python package for configuring a python package',
-      long_description=README,
-      author='Donfig Developers',
-      classifiers=["Development Status :: 3 - Alpha",
-                   "Intended Audience :: Developers",
-                   "License :: OSI Approved :: MIT License",
-                   "Operating System :: OS Independent",
-                   "Programming Language :: Python",
-                   "Topic :: Scientific/Engineering"],
-      url="https://github.com/pytroll/donfig",
-      packages=find_packages(),
-      zip_safe=False,
-      install_requires=['pyyaml'],
-      tests_require=['pytest'],
-      python_requires='>=3.6',
-      )
+setup(
+    name=NAME,
+    version=versioneer.get_version(),
+    cmdclass=versioneer.get_cmdclass(),
+    description="Python package for configuring a python package",
+    long_description=README,
+    author="Donfig Developers",
+    classifiers=[
+        "Development Status :: 3 - Alpha",
+        "Intended Audience :: Developers",
+        "License :: OSI Approved :: MIT License",
+        "Operating System :: OS Independent",
+        "Programming Language :: Python",
+        "Topic :: Scientific/Engineering",
+    ],
+    url="https://github.com/pytroll/donfig",
+    packages=find_packages(),
+    zip_safe=False,
+    install_requires=["pyyaml"],
+    tests_require=["pytest", "cloudpickle"],
+    python_requires=">=3.7",
+)
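
cloudpickle joins the test requirements above. A minimal sketch of round-tripping a
Config object through it (an assumption about how the dependency is exercised, not code
from this commit; the package name is hypothetical):

    import cloudpickle
    from donfig import Config

    config = Config("mypkg")
    restored = cloudpickle.loads(cloudpickle.dumps(config))
    with restored.set(answer=42):
        assert restored.get("answer") == 42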


=====================================
versioneer.py
=====================================
@@ -1,4 +1,3 @@
-
 # Version: 0.18
 
 """The Versioneer - like a rocketeer, but for versions.
@@ -276,7 +275,6 @@ https://creativecommons.org/publicdomain/zero/1.0/ .
 
 """
 
-from __future__ import print_function
 try:
     import configparser
 except ImportError:
@@ -308,11 +306,13 @@ def get_root():
         setup_py = os.path.join(root, "setup.py")
         versioneer_py = os.path.join(root, "versioneer.py")
     if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)):
-        err = ("Versioneer was unable to run the project root directory. "
-               "Versioneer requires setup.py to be executed from "
-               "its immediate directory (like 'python setup.py COMMAND'), "
-               "or in a way that lets it use sys.argv[0] to find the root "
-               "(like 'python path/to/setup.py COMMAND').")
+        err = (
+            "Versioneer was unable to run the project root directory. "
+            "Versioneer requires setup.py to be executed from "
+            "its immediate directory (like 'python setup.py COMMAND'), "
+            "or in a way that lets it use sys.argv[0] to find the root "
+            "(like 'python path/to/setup.py COMMAND')."
+        )
         raise VersioneerBadRootError(err)
     try:
         # Certain runtime workflows (setup.py install/develop in a setuptools
@@ -325,8 +325,10 @@ def get_root():
         me_dir = os.path.normcase(os.path.splitext(me)[0])
         vsr_dir = os.path.normcase(os.path.splitext(versioneer_py)[0])
         if me_dir != vsr_dir:
-            print("Warning: build in %s is using versioneer.py from %s"
-                  % (os.path.dirname(me), versioneer_py))
+            print(
+                "Warning: build in %s is using versioneer.py from %s"
+                % (os.path.dirname(me), versioneer_py)
+            )
     except NameError:
         pass
     return root
@@ -340,7 +342,7 @@ def get_config_from_root(root):
     # the top of versioneer.py for instructions on writing your setup.cfg .
     setup_cfg = os.path.join(root, "setup.cfg")
     parser = configparser.SafeConfigParser()
-    with open(setup_cfg, "r") as f:
+    with open(setup_cfg) as f:
         parser.readfp(f)
     VCS = parser.get("versioneer", "VCS")  # mandatory
 
@@ -348,6 +350,7 @@ def get_config_from_root(root):
         if parser.has_option("versioneer", name):
             return parser.get("versioneer", name)
         return None
+
     cfg = VersioneerConfig()
     cfg.VCS = VCS
     cfg.style = get(parser, "style") or ""
@@ -372,17 +375,18 @@ HANDLERS = {}
 
 def register_vcs_handler(vcs, method):  # decorator
     """Decorator to mark a method as the handler for a particular VCS."""
+
     def decorate(f):
         """Store f in HANDLERS[vcs][method]."""
         if vcs not in HANDLERS:
             HANDLERS[vcs] = {}
         HANDLERS[vcs][method] = f
         return f
+
     return decorate
 
 
-def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
-                env=None):
+def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, env=None):
     """Call the given command(s)."""
     assert isinstance(commands, list)
     p = None
@@ -390,12 +394,15 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
         try:
             dispcmd = str([c] + args)
             # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
+            p = subprocess.Popen(
+                [c] + args,
+                cwd=cwd,
+                env=env,
+                stdout=subprocess.PIPE,
+                stderr=(subprocess.PIPE if hide_stderr else None),
+            )
             break
-        except EnvironmentError:
+        except OSError:
             e = sys.exc_info()[1]
             if e.errno == errno.ENOENT:
                 continue
@@ -405,7 +412,7 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
             return None, None
     else:
         if verbose:
-            print("unable to find command, tried %s" % (commands,))
+            print(f"unable to find command, tried {commands}")
         return None, None
     stdout = p.communicate()[0].strip()
     if sys.version_info[0] >= 3:
@@ -418,7 +425,9 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
     return stdout, p.returncode
 
 
-LONG_VERSION_PY['git'] = '''
+LONG_VERSION_PY[
+    "git"
+] = r'''
 # This file helps to compute a version number in source trees obtained from
 # git-archive tarball (such as those provided by githubs download-from-tag
 # feature). Distribution tarballs (built by setup.py sdist) and build
@@ -950,7 +959,7 @@ def git_get_keywords(versionfile_abs):
     # _version.py.
     keywords = {}
     try:
-        f = open(versionfile_abs, "r")
+        f = open(versionfile_abs)
         for line in f.readlines():
             if line.strip().startswith("git_refnames ="):
                 mo = re.search(r'=\s*"(.*)"', line)
@@ -965,7 +974,7 @@ def git_get_keywords(versionfile_abs):
                 if mo:
                     keywords["date"] = mo.group(1)
         f.close()
-    except EnvironmentError:
+    except OSError:
         pass
     return keywords
 
@@ -989,11 +998,11 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         if verbose:
             print("keywords are unexpanded, not using")
         raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
+    refs = {r.strip() for r in refnames.strip("()").split(",")}
     # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
     # just "foo-1.0". If we see a "tag: " prefix, prefer those.
     TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
+    tags = {r[len(TAG) :] for r in refs if r.startswith(TAG)}
     if not tags:
         # Either we're using git < 1.8.3, or there really are no tags. We use
         # a heuristic: assume all version tags have a digit. The old git %d
@@ -1002,7 +1011,7 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
         # between branches and tags. By ignoring refnames without digits, we
         # filter out many common branch names like "release" and
         # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
+        tags = {r for r in refs if re.search(r"\d", r)}
         if verbose:
             print("discarding '%s', no digits" % ",".join(refs - tags))
     if verbose:
@@ -1010,19 +1019,26 @@ def git_versions_from_keywords(keywords, tag_prefix, verbose):
     for ref in sorted(tags):
         # sorting will prefer e.g. "2.0" over "2.0rc1"
         if ref.startswith(tag_prefix):
-            r = ref[len(tag_prefix):]
+            r = ref[len(tag_prefix) :]
             if verbose:
                 print("picking %s" % r)
-            return {"version": r,
-                    "full-revisionid": keywords["full"].strip(),
-                    "dirty": False, "error": None,
-                    "date": date}
+            return {
+                "version": r,
+                "full-revisionid": keywords["full"].strip(),
+                "dirty": False,
+                "error": None,
+                "date": date,
+            }
     # no suitable tags, so version is "0+unknown", but full hex is still there
     if verbose:
         print("no suitable tags, using unknown + full revision id")
-    return {"version": "0+unknown",
-            "full-revisionid": keywords["full"].strip(),
-            "dirty": False, "error": "no suitable tags", "date": None}
+    return {
+        "version": "0+unknown",
+        "full-revisionid": keywords["full"].strip(),
+        "dirty": False,
+        "error": "no suitable tags",
+        "date": None,
+    }
 
 
 @register_vcs_handler("git", "pieces_from_vcs")
@@ -1037,8 +1053,7 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     if sys.platform == "win32":
         GITS = ["git.cmd", "git.exe"]
 
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
+    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root, hide_stderr=True)
     if rc != 0:
         if verbose:
             print("Directory %s not under git control" % root)
@@ -1046,10 +1061,19 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
 
     # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
     # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%s*" % tag_prefix],
-                                   cwd=root)
+    describe_out, rc = run_command(
+        GITS,
+        [
+            "describe",
+            "--tags",
+            "--dirty",
+            "--always",
+            "--long",
+            "--match",
+            "%s*" % tag_prefix,
+        ],
+        cwd=root,
+    )
     # --long was added in git-1.5.5
     if describe_out is None:
         raise NotThisMethod("'git describe' failed")
@@ -1072,17 +1096,16 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     dirty = git_describe.endswith("-dirty")
     pieces["dirty"] = dirty
     if dirty:
-        git_describe = git_describe[:git_describe.rindex("-dirty")]
+        git_describe = git_describe[: git_describe.rindex("-dirty")]
 
     # now we have TAG-NUM-gHEX or HEX
 
     if "-" in git_describe:
         # TAG-NUM-gHEX
-        mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
+        mo = re.search(r"^(.+)-(\d+)-g([0-9a-f]+)$", git_describe)
         if not mo:
             # unparseable. Maybe git-describe is misbehaving?
-            pieces["error"] = ("unable to parse git-describe output: '%s'"
-                               % describe_out)
+            pieces["error"] = "unable to parse git-describe output: '%s'" % describe_out
             return pieces
 
         # tag
@@ -1091,10 +1114,12 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
             if verbose:
                 fmt = "tag '%s' doesn't start with prefix '%s'"
                 print(fmt % (full_tag, tag_prefix))
-            pieces["error"] = ("tag '%s' doesn't start with prefix '%s'"
-                               % (full_tag, tag_prefix))
+            pieces["error"] = "tag '{}' doesn't start with prefix '{}'".format(
+                full_tag,
+                tag_prefix,
+            )
             return pieces
-        pieces["closest-tag"] = full_tag[len(tag_prefix):]
+        pieces["closest-tag"] = full_tag[len(tag_prefix) :]
 
         # distance: number of commits since tag
         pieces["distance"] = int(mo.group(2))
@@ -1105,13 +1130,13 @@ def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
     else:
         # HEX: no tags
         pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
+        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
         pieces["distance"] = int(count_out)  # total number of commits
 
     # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
-                       cwd=root)[0].strip()
+    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[
+        0
+    ].strip()
     pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
 
     return pieces
@@ -1139,13 +1164,13 @@ def do_vcs_install(manifest_in, versionfile_source, ipy):
     files.append(versioneer_file)
     present = False
     try:
-        f = open(".gitattributes", "r")
+        f = open(".gitattributes")
         for line in f.readlines():
             if line.strip().startswith(versionfile_source):
                 if "export-subst" in line.strip().split()[1:]:
                     present = True
         f.close()
-    except EnvironmentError:
+    except OSError:
         pass
     if not present:
         f = open(".gitattributes", "a+")
@@ -1167,16 +1192,22 @@ def versions_from_parentdir(parentdir_prefix, root, verbose):
     for i in range(3):
         dirname = os.path.basename(root)
         if dirname.startswith(parentdir_prefix):
-            return {"version": dirname[len(parentdir_prefix):],
-                    "full-revisionid": None,
-                    "dirty": False, "error": None, "date": None}
+            return {
+                "version": dirname[len(parentdir_prefix) :],
+                "full-revisionid": None,
+                "dirty": False,
+                "error": None,
+                "date": None,
+            }
         else:
             rootdirs.append(root)
             root = os.path.dirname(root)  # up a level
 
     if verbose:
-        print("Tried directories %s but none started with prefix %s" %
-              (str(rootdirs), parentdir_prefix))
+        print(
+            "Tried directories %s but none started with prefix %s"
+            % (str(rootdirs), parentdir_prefix)
+        )
     raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
 
 
@@ -1203,13 +1234,15 @@ def versions_from_file(filename):
     try:
         with open(filename) as f:
             contents = f.read()
-    except EnvironmentError:
+    except OSError:
         raise NotThisMethod("unable to read _version.py")
-    mo = re.search(r"version_json = '''\n(.*)'''  # END VERSION_JSON",
-                   contents, re.M | re.S)
+    mo = re.search(
+        r"version_json = '''\n(.*)'''  # END VERSION_JSON", contents, re.M | re.S
+    )
     if not mo:
-        mo = re.search(r"version_json = '''\r\n(.*)'''  # END VERSION_JSON",
-                       contents, re.M | re.S)
+        mo = re.search(
+            r"version_json = '''\r\n(.*)'''  # END VERSION_JSON", contents, re.M | re.S
+        )
     if not mo:
         raise NotThisMethod("no version_json in _version.py")
     return json.loads(mo.group(1))
@@ -1218,12 +1251,11 @@ def versions_from_file(filename):
 def write_to_version_file(filename, versions):
     """Write the given version number to the given _version.py file."""
     os.unlink(filename)
-    contents = json.dumps(versions, sort_keys=True,
-                          indent=1, separators=(",", ": "))
+    contents = json.dumps(versions, sort_keys=True, indent=1, separators=(",", ": "))
     with open(filename, "w") as f:
         f.write(SHORT_VERSION_PY % contents)
 
-    print("set %s to '%s'" % (filename, versions["version"]))
+    print("set {} to '{}'".format(filename, versions["version"]))
 
 
 def plus_or_dot(pieces):
@@ -1251,8 +1283,7 @@ def render_pep440(pieces):
                 rendered += ".dirty"
     else:
         # exception #1
-        rendered = "0+untagged.%d.g%s" % (pieces["distance"],
-                                          pieces["short"])
+        rendered = "0+untagged.%d.g%s" % (pieces["distance"], pieces["short"])
         if pieces["dirty"]:
             rendered += ".dirty"
     return rendered
@@ -1366,11 +1397,13 @@ def render_git_describe_long(pieces):
 def render(pieces, style):
     """Render the given version pieces into the requested style."""
     if pieces["error"]:
-        return {"version": "unknown",
-                "full-revisionid": pieces.get("long"),
-                "dirty": None,
-                "error": pieces["error"],
-                "date": None}
+        return {
+            "version": "unknown",
+            "full-revisionid": pieces.get("long"),
+            "dirty": None,
+            "error": pieces["error"],
+            "date": None,
+        }
 
     if not style or style == "default":
         style = "pep440"  # the default
@@ -1390,9 +1423,13 @@ def render(pieces, style):
     else:
         raise ValueError("unknown style '%s'" % style)
 
-    return {"version": rendered, "full-revisionid": pieces["long"],
-            "dirty": pieces["dirty"], "error": None,
-            "date": pieces.get("date")}
+    return {
+        "version": rendered,
+        "full-revisionid": pieces["long"],
+        "dirty": pieces["dirty"],
+        "error": None,
+        "date": pieces.get("date"),
+    }
 
 
 class VersioneerBadRootError(Exception):
@@ -1415,8 +1452,9 @@ def get_versions(verbose=False):
     handlers = HANDLERS.get(cfg.VCS)
     assert handlers, "unrecognized VCS '%s'" % cfg.VCS
     verbose = verbose or cfg.verbose
-    assert cfg.versionfile_source is not None, \
-        "please set versioneer.versionfile_source"
+    assert (
+        cfg.versionfile_source is not None
+    ), "please set versioneer.versionfile_source"
     assert cfg.tag_prefix is not None, "please set versioneer.tag_prefix"
 
     versionfile_abs = os.path.join(root, cfg.versionfile_source)
@@ -1442,7 +1480,7 @@ def get_versions(verbose=False):
     try:
         ver = versions_from_file(versionfile_abs)
         if verbose:
-            print("got version from file %s %s" % (versionfile_abs, ver))
+            print(f"got version from file {versionfile_abs} {ver}")
         return ver
     except NotThisMethod:
         pass
@@ -1470,9 +1508,13 @@ def get_versions(verbose=False):
     if verbose:
         print("unable to compute version")
 
-    return {"version": "0+unknown", "full-revisionid": None,
-            "dirty": None, "error": "unable to compute version",
-            "date": None}
+    return {
+        "version": "0+unknown",
+        "full-revisionid": None,
+        "dirty": None,
+        "error": "unable to compute version",
+        "date": None,
+    }
 
 
 def get_version():
@@ -1521,6 +1563,7 @@ def get_cmdclass():
             print(" date: %s" % vers.get("date"))
             if vers["error"]:
                 print(" error: %s" % vers["error"])
+
     cmds["version"] = cmd_version
 
     # we override "build_py" in both distutils and setuptools
@@ -1553,14 +1596,15 @@ def get_cmdclass():
             # now locate _version.py in the new build/ directory and replace
             # it with an updated value
             if cfg.versionfile_build:
-                target_versionfile = os.path.join(self.build_lib,
-                                                  cfg.versionfile_build)
+                target_versionfile = os.path.join(self.build_lib, cfg.versionfile_build)
                 print("UPDATING %s" % target_versionfile)
                 write_to_version_file(target_versionfile, versions)
+
     cmds["build_py"] = cmd_build_py
 
     if "cx_Freeze" in sys.modules:  # cx_freeze enabled?
         from cx_Freeze.dist import build_exe as _build_exe
+
         # nczeczulin reports that py2exe won't like the pep440-style string
         # as FILEVERSION, but it can be used for PRODUCTVERSION, e.g.
         # setup(console=[{
@@ -1581,17 +1625,21 @@ def get_cmdclass():
                 os.unlink(target_versionfile)
                 with open(cfg.versionfile_source, "w") as f:
                     LONG = LONG_VERSION_PY[cfg.VCS]
-                    f.write(LONG %
-                            {"DOLLAR": "$",
-                             "STYLE": cfg.style,
-                             "TAG_PREFIX": cfg.tag_prefix,
-                             "PARENTDIR_PREFIX": cfg.parentdir_prefix,
-                             "VERSIONFILE_SOURCE": cfg.versionfile_source,
-                             })
+                    f.write(
+                        LONG
+                        % {
+                            "DOLLAR": "$",
+                            "STYLE": cfg.style,
+                            "TAG_PREFIX": cfg.tag_prefix,
+                            "PARENTDIR_PREFIX": cfg.parentdir_prefix,
+                            "VERSIONFILE_SOURCE": cfg.versionfile_source,
+                        }
+                    )
+
         cmds["build_exe"] = cmd_build_exe
         del cmds["build_py"]
 
-    if 'py2exe' in sys.modules:  # py2exe enabled?
+    if "py2exe" in sys.modules:  # py2exe enabled?
         try:
             from py2exe.distutils_buildexe import py2exe as _py2exe  # py3
         except ImportError:
@@ -1610,13 +1658,17 @@ def get_cmdclass():
                 os.unlink(target_versionfile)
                 with open(cfg.versionfile_source, "w") as f:
                     LONG = LONG_VERSION_PY[cfg.VCS]
-                    f.write(LONG %
-                            {"DOLLAR": "$",
-                             "STYLE": cfg.style,
-                             "TAG_PREFIX": cfg.tag_prefix,
-                             "PARENTDIR_PREFIX": cfg.parentdir_prefix,
-                             "VERSIONFILE_SOURCE": cfg.versionfile_source,
-                             })
+                    f.write(
+                        LONG
+                        % {
+                            "DOLLAR": "$",
+                            "STYLE": cfg.style,
+                            "TAG_PREFIX": cfg.tag_prefix,
+                            "PARENTDIR_PREFIX": cfg.parentdir_prefix,
+                            "VERSIONFILE_SOURCE": cfg.versionfile_source,
+                        }
+                    )
+
         cmds["py2exe"] = cmd_py2exe
 
     # we override different "sdist" commands for both environments
@@ -1643,8 +1695,10 @@ def get_cmdclass():
             # updated value
             target_versionfile = os.path.join(base_dir, cfg.versionfile_source)
             print("UPDATING %s" % target_versionfile)
-            write_to_version_file(target_versionfile,
-                                  self._versioneer_generated_versions)
+            write_to_version_file(
+                target_versionfile, self._versioneer_generated_versions
+            )
+
     cmds["sdist"] = cmd_sdist
 
     return cmds
@@ -1699,11 +1753,9 @@ def do_setup():
     root = get_root()
     try:
         cfg = get_config_from_root(root)
-    except (EnvironmentError, configparser.NoSectionError,
-            configparser.NoOptionError) as e:
+    except (OSError, configparser.NoSectionError, configparser.NoOptionError) as e:
         if isinstance(e, (EnvironmentError, configparser.NoSectionError)):
-            print("Adding sample versioneer config to setup.cfg",
-                  file=sys.stderr)
+            print("Adding sample versioneer config to setup.cfg", file=sys.stderr)
             with open(os.path.join(root, "setup.cfg"), "a") as f:
                 f.write(SAMPLE_CONFIG)
         print(CONFIG_ERROR, file=sys.stderr)
@@ -1712,20 +1764,23 @@ def do_setup():
     print(" creating %s" % cfg.versionfile_source)
     with open(cfg.versionfile_source, "w") as f:
         LONG = LONG_VERSION_PY[cfg.VCS]
-        f.write(LONG % {"DOLLAR": "$",
-                        "STYLE": cfg.style,
-                        "TAG_PREFIX": cfg.tag_prefix,
-                        "PARENTDIR_PREFIX": cfg.parentdir_prefix,
-                        "VERSIONFILE_SOURCE": cfg.versionfile_source,
-                        })
-
-    ipy = os.path.join(os.path.dirname(cfg.versionfile_source),
-                       "__init__.py")
+        f.write(
+            LONG
+            % {
+                "DOLLAR": "$",
+                "STYLE": cfg.style,
+                "TAG_PREFIX": cfg.tag_prefix,
+                "PARENTDIR_PREFIX": cfg.parentdir_prefix,
+                "VERSIONFILE_SOURCE": cfg.versionfile_source,
+            }
+        )
+
+    ipy = os.path.join(os.path.dirname(cfg.versionfile_source), "__init__.py")
     if os.path.exists(ipy):
         try:
-            with open(ipy, "r") as f:
+            with open(ipy) as f:
                 old = f.read()
-        except EnvironmentError:
+        except OSError:
             old = ""
         if INIT_PY_SNIPPET not in old:
             print(" appending to %s" % ipy)
@@ -1744,12 +1799,12 @@ def do_setup():
     manifest_in = os.path.join(root, "MANIFEST.in")
     simple_includes = set()
     try:
-        with open(manifest_in, "r") as f:
+        with open(manifest_in) as f:
             for line in f:
                 if line.startswith("include "):
                     for include in line.split()[1:]:
                         simple_includes.add(include)
-    except EnvironmentError:
+    except OSError:
         pass
     # That doesn't cover everything MANIFEST.in can do
     # (http://docs.python.org/2/distutils/sourcedist.html#commands), so
@@ -1762,8 +1817,10 @@ def do_setup():
     else:
         print(" 'versioneer.py' already in MANIFEST.in")
     if cfg.versionfile_source not in simple_includes:
-        print(" appending versionfile_source ('%s') to MANIFEST.in" %
-              cfg.versionfile_source)
+        print(
+            " appending versionfile_source ('%s') to MANIFEST.in"
+            % cfg.versionfile_source
+        )
         with open(manifest_in, "a") as f:
             f.write("include %s\n" % cfg.versionfile_source)
     else:
@@ -1781,7 +1838,7 @@ def scan_setup_py():
     found = set()
     setters = False
     errors = 0
-    with open("setup.py", "r") as f:
+    with open("setup.py") as f:
         for line in f.readlines():
             if "import versioneer" in line:
                 found.add("import")



View it on GitLab: https://salsa.debian.org/debian-gis-team/donfig/-/compare/3c35c1c74f0ed421bda3b19f3a338af5f7d8f00b...e527db2aba4eea811babc2ed68684c02f26e1fed
