[med-svn] [Git][med-team/python-questplus][upstream] New upstream version 2023.1

Andreas Tille (@tille) gitlab@salsa.debian.org
Tue Jul 16 10:36:51 BST 2024



Andreas Tille pushed to branch upstream at Debian Med / python-questplus


Commits:
d015787f by Andreas Tille at 2024-07-16T10:37:37+02:00
New upstream version 2023.1
- - - - -


21 changed files:

- − .appveyor.yml
- + .github/workflows/main.yml
- + .github/workflows/release.yml
- .readthedocs.yml
- − .travis.yml
- + BUILDING.md
- CHANGES.md
- MANIFEST.in
- doc/source/conf.py
- doc/source/installation.md
- + pyproject.toml
- questplus/__init__.py
- − questplus/_version.py
- − questplus/demos/__init__.py
- questplus/psychometric_function.py
- questplus/qp.py
- questplus/tests/test_qp.py
- questplus/utils.py
- − setup.cfg
- − setup.py
- − versioneer.py


Changes:

=====================================
.appveyor.yml deleted
=====================================
@@ -1,37 +0,0 @@
-build: off
-
-environment:
-  MINICONDA: C:\\Miniconda3-x64
-  matrix:
-    - CONDA_PYTHON_VERSION: "3.6"
-    - CONDA_PYTHON_VERSION: "3.7"
-    - CONDA_PYTHON_VERSION: "3.8"
-
-init:
-  - "ECHO %CONDA_PYTHON_VERSION% %MINICONDA%"
-  - call %MINICONDA%\Scripts\activate.bat
-  - conda config --set always_yes yes --set changeps1 no
-  - conda update -q conda
-  - conda info -a
-  - "conda create -n questplus -c conda-forge python=%CONDA_PYTHON_VERSION% pytest numpy scipy xarray json_tricks"
-  - call conda activate questplus
-#  - conda list
-
-install:
-  - python setup.py build
-
-  # Build & install sdist.
-  - python setup.py sdist --formats=zip
-#  - pip install --no-deps dist/questplus-*.zip
-#  - pip uninstall --yes questplus
-
-  # Build & install wheel.
-  - python setup.py bdist_wheel
-#  - pip install --no-deps dist/questplus-*.whl
-#  - pip uninstall --yes questplus
-
-  - ps: Remove-Item –path dist, build –recurse
-  - pip install --no-deps .
-
-test_script:
-  - py.test


=====================================
.github/workflows/main.yml
=====================================
@@ -0,0 +1,60 @@
+# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
+# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions
+
+name: Unit tests
+
+on:
+  # push:
+  #   branches: ['**']
+  pull_request:
+    branches: ['**']
+  create:
+    branches: [main]
+    tags: ['**']
+  # schedule:
+  #   - cron: "0 4 * * *"
+
+jobs:
+  test:
+
+    strategy:
+      fail-fast: false
+      matrix:
+        python-version: ["3.8", "3.9", "3.10", "3.11"]
+        os: [ubuntu-latest, windows-latest, macos-latest]
+
+    runs-on: ${{ matrix.os }}
+
+    defaults:
+      run:
+        shell: bash
+
+    steps:
+    - uses: actions/checkout@v3
+    - name: Set up Python ${{ matrix.python-version }}
+      uses: actions/setup-python@v4
+      with:
+        python-version: ${{ matrix.python-version }}
+    - name: Install dependencies
+      run: |
+        python -m pip install --upgrade pip wheel build
+        python -m pip install pytest
+    - name: Build sdist
+      run: python -m build --sdist
+    - name: Install sdist
+      run: |
+        pip install --no-deps dist/questplus-*.*
+        pip uninstall --yes questplus
+        rm -rf dist
+    - name: Build wheel
+      run: python -m build --wheel
+    - name: Install wheel
+      run: |
+        pip install --no-deps dist/questplus-*.*
+        pip uninstall --yes questplus
+        rm -rf dist
+    - name: Install questplus
+      run: pip install .
+    - name: Test with pytest
+      run: |
+        pytest


=====================================
.github/workflows/release.yml
=====================================
@@ -0,0 +1,56 @@
+# Upload a Python Package using Twine when a release is created
+
+name: Build
+on:
+  release:
+    types: [published]
+  push:
+    branches:
+      - main
+  pull_request:
+    branches:
+      - main
+
+permissions:
+  contents: read
+
+jobs:
+  package:
+    runs-on: ubuntu-latest
+    steps:
+    - uses: actions/checkout@v3
+    - name: Set up Python
+      uses: actions/setup-python@v4
+      with:
+        python-version: '3.11'
+    - name: Install dependencies
+      run: |
+        python -m pip install --upgrade pip
+        pip install build twine
+    - name: Build package
+      run: python -m build --sdist --wheel
+    - name: Check package
+      run: twine check --strict dist/*
+    - name: Check env vars
+      run: |
+        echo "Triggered by: ${{ github.event_name }}"
+    - uses: actions/upload-artifact@v3
+      with:
+        name: dist
+        path: dist
+
+  # PyPI on release
+  pypi:
+    needs: package
+    runs-on: ubuntu-latest
+    if: github.event_name == 'release'
+    steps:
+    - uses: actions/download-artifact@v3
+      with:
+        name: dist
+        path: dist
+    - name: Publish to PyPI
+      uses: pypa/gh-action-pypi-publish@release/v1
+      with:
+        user: __token__
+        password: ${{ secrets.PYPI_API_TOKEN }}


=====================================
.readthedocs.yml
=====================================
@@ -4,7 +4,7 @@ sphinx:
   configuration: doc/source/conf.py
 
 python:
-  version: 3.7
+  version: 3.10
   install:
     - requirements: doc/requirements-rtd.txt
     - method: pip


=====================================
.travis.yml deleted
=====================================
@@ -1,56 +0,0 @@
-matrix:
-  include:
-    - os: linux
-      env: CONDA_PYTHON_VERSION=3.6
-
-    - os: linux
-      env: CONDA_PYTHON_VERSION=3.7
- 
-    - os: linux
-      env: CONDA_PYTHON_VERSION=3.8
-
-    - os: osx
-      env: CONDA_PYTHON_VERSION=3.6
-
-    - os: osx
-      env: CONDA_PYTHON_VERSION=3.7
-
-    - os: osx
-      env: CONDA_PYTHON_VERSION=3.8
-
-before_install:
-  - echo "Installing Miniconda environment..."
-  - if [ $TRAVIS_OS_NAME == 'linux' ];
-    then export MINICONDA_URL=https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh;
-    else export MINICONDA_URL=https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh;
-    fi
-  - wget $MINICONDA_URL -O miniconda.sh
-  - bash miniconda.sh -b -p $HOME/miniconda
-  - source ~/miniconda/etc/profile.d/conda.sh # Initialize shell.
-  - hash -r
-  - conda config --set always_yes yes --set changeps1 no
-  - conda update -q conda
-  - conda info -a
-  - conda create -n questplus -c conda-forge python=$CONDA_PYTHON_VERSION pytest numpy scipy xarray json_tricks
-  - conda activate questplus
-  - conda list
-
-install:
-  - python setup.py build
-
-  # Build & install sdist.
-  - python setup.py sdist
-  - pip install --no-deps dist/questplus-*.tar.gz
-  - pip uninstall --yes questplus
-
-  # Build & install wheel.
-  - python setup.py bdist_wheel
-  - pip install --no-deps dist/questplus-*.whl
-  - pip uninstall --yes questplus
-
-  - rm -rf dist/ build/
-
-  - pip install --no-deps .
-
-script:
-  - py.test


=====================================
BUILDING.md
=====================================
@@ -0,0 +1,8 @@
+## Building a release
+
+* Create `sdist` and `wheel` distributions:
+  ```bash
+  python -m build --sdist --wheel
+  ```
+  (optionally append `--no-isolation` if running into group policy
+   permission errors on managed Windows systems)


=====================================
CHANGES.md
=====================================
@@ -1,13 +1,27 @@
+Changes
+=======
+
+v2023.1
+--------
+
+* Fix definition of `norm_cdf` psychometric function, by [Alex Forrence](https://github.com/aforren1)
+* Ensure compatibility with the latest NumPy and xarray versions
+* Add Thurstone scaling
+* Minimum required Python version is now 3.8
+
 v2019.4
 -------
+
 * Allow JSON serialization of random number generator
 
 v2019.3
 -------
+
 * Allow to pass a prior when instantiating `QuestPlusWeibull`
 
 v2019.2
 -------
+
 * Allow passing a random seed via `stim_selection_options` keyword
   argument
 * Better handling of `stim_selection_options` defaults (now allows
@@ -15,6 +29,7 @@ v2019.2
 
 v2019.1
 -------
+
 * Allow to pass priors for only some parameters
   (the remaining parameters will be assigned an uninformative prior)
 * Add more docstrings, fix typo in usage example
@@ -22,5 +37,6 @@ v2019.1
 
 v0.0.5
 ------
+
 * Allow retrieval of marginal posterior PDFs via `QuestPlus.marginal_posterior`
 * Allow `nan` values in JSON output (violating JSON std but useful)


=====================================
MANIFEST.in
=====================================
@@ -1,5 +1,3 @@
 include LICENSE
 include AUTHORS
 include README.md
-include versioneer.py
-include questplus/_version.py


=====================================
doc/source/conf.py
=====================================
@@ -7,28 +7,29 @@
 import os
 import sys
 
-abs_path = os.path.abspath('../../questplus')
+abs_path = os.path.abspath("../../questplus")
 assert os.path.exists(abs_path)
 sys.path.insert(0, abs_path)
 
 
 # -- Project information -----------------------------------------------------
 
-project = 'questplus'
-copyright = '2019, Richard Höchenberger'
-author = 'Richard Höchenberger'
+project = "questplus"
+copyright = "2019, Richard Höchenberger"
+author = "Richard Höchenberger"
 
-extensions = ['recommonmark',  # markdown support
-              'sphinx.ext.autodoc',
-              'sphinx.ext.autosummary',
-              # 'sphinx.ext.viewcode',
-              'sphinx.ext.napoleon',
-              'sphinx_autodoc_typehints',  # needs to be loaded AFTER napoleon
-              # 'sphinx.ext.coverage'
-              ]
+extensions = [
+    "recommonmark",  # markdown support
+    "sphinx.ext.autodoc",
+    "sphinx.ext.autosummary",
+    # 'sphinx.ext.viewcode',
+    "sphinx.ext.napoleon",
+    "sphinx_autodoc_typehints",  # needs to be loaded AFTER napoleon
+    # 'sphinx.ext.coverage'
+]
 
 # Add any paths that contain templates here, relative to this directory.
-templates_path = ['_templates']
+templates_path = ["_templates"]
 
 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
@@ -41,11 +42,11 @@ exclude_patterns = []
 # The theme to use for HTML and HTML Help pages.  See the documentation for
 # a list of builtin themes.
 
-html_theme = 'sphinx_rtd_theme'
+html_theme = "sphinx_rtd_theme"
 
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['_static']
+html_static_path = ["_static"]
 
-autoclass_content = 'both'  # Document __init__() methods as well.
+autoclass_content = "both"  # Document __init__() methods as well.


=====================================
doc/source/installation.md
=====================================
@@ -2,7 +2,7 @@
 
 ### Requirements
 
-- Python 3.6+
+- Python 3.8+
 - `xarray`
 - `scipy`
 - `json_tricks`


=====================================
pyproject.toml
=====================================
@@ -0,0 +1,43 @@
+[project]
+name = "questplus"
+description = "A QUEST+ implementation in Python."
+readme = "README.md"
+requires-python = ">=3.8"
+license = {file = "LICENSE"}
+keywords = ["science", "neuroscience", "psychology", "staircase"]
+authors = [
+  {name = "Richard Höchenberger"},
+  {email = "richard.hoechenberger@gmail.com"}
+]
+classifiers = [
+  "Intended Audience :: Science/Research",
+  "Programming Language :: Python"
+]
+dependencies = [
+  "numpy",
+  "scipy",
+  "xarray",
+  "json_tricks"
+]
+dynamic = ["version"]
+
+[build-system]
+requires = ["setuptools>=45", "setuptools_scm[toml]>=6.2", "wheel"]
+build-backend = "setuptools.build_meta"
+
+[tool.setuptools_scm]
+
+[tool.black]
+target-version = ['py38']
+include = '\.pyi?$'
+# 'extend-exclude' excludes files or directories in addition to the defaults
+extend-exclude = '''
+(
+  | ^/questplus/tests/
+)
+'''
+
+[tool.pytest.ini_options]
+filterwarnings = [
+    "error"
+]


=====================================
questplus/__init__.py
=====================================
@@ -1,5 +1,9 @@
-from .qp import QuestPlus, QuestPlusWeibull
+from importlib.metadata import version, PackageNotFoundError
 
-from ._version import get_versions
-__version__ = get_versions()['version']
-del get_versions
+try:
+    __version__ = version("questplus")
+except PackageNotFoundError:
+    # package is not installed
+    pass
+
+from .qp import QuestPlus, QuestPlusWeibull, QuestPlusThurstone  # noqa: F401


=====================================
questplus/_version.py deleted
=====================================
@@ -1,520 +0,0 @@
-
-# This file helps to compute a version number in source trees obtained from
-# git-archive tarball (such as those provided by githubs download-from-tag
-# feature). Distribution tarballs (built by setup.py sdist) and build
-# directories (produced by setup.py build) will contain a much shorter file
-# that just contains the computed version number.
-
-# This file is released into the public domain. Generated by
-# versioneer-0.18 (https://github.com/warner/python-versioneer)
-
-"""Git implementation of _version.py."""
-
-import errno
-import os
-import re
-import subprocess
-import sys
-
-
-def get_keywords():
-    """Get the keywords needed to look up the version information."""
-    # these strings will be replaced by git during git-archive.
-    # setup.py/versioneer.py will grep for the variable names, so they must
-    # each be defined on a line of their own. _version.py will just call
-    # get_keywords().
-    git_refnames = " (HEAD -> master, tag: 2019.4)"
-    git_full = "9abd28b7a64e1a7ded530b9603361842ee873859"
-    git_date = "2019-12-24 19:07:26 +0100"
-    keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
-    return keywords
-
-
-class VersioneerConfig:
-    """Container for Versioneer configuration parameters."""
-
-
-def get_config():
-    """Create, populate and return the VersioneerConfig() object."""
-    # these strings are filled in when 'setup.py versioneer' creates
-    # _version.py
-    cfg = VersioneerConfig()
-    cfg.VCS = "git"
-    cfg.style = "pep440"
-    cfg.tag_prefix = ""
-    cfg.parentdir_prefix = "None"
-    cfg.versionfile_source = "questplus/_version.py"
-    cfg.verbose = False
-    return cfg
-
-
-class NotThisMethod(Exception):
-    """Exception raised if a method is not valid for the current scenario."""
-
-
-LONG_VERSION_PY = {}
-HANDLERS = {}
-
-
-def register_vcs_handler(vcs, method):  # decorator
-    """Decorator to mark a method as the handler for a particular VCS."""
-    def decorate(f):
-        """Store f in HANDLERS[vcs][method]."""
-        if vcs not in HANDLERS:
-            HANDLERS[vcs] = {}
-        HANDLERS[vcs][method] = f
-        return f
-    return decorate
-
-
-def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
-                env=None):
-    """Call the given command(s)."""
-    assert isinstance(commands, list)
-    p = None
-    for c in commands:
-        try:
-            dispcmd = str([c] + args)
-            # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
-            break
-        except EnvironmentError:
-            e = sys.exc_info()[1]
-            if e.errno == errno.ENOENT:
-                continue
-            if verbose:
-                print("unable to run %s" % dispcmd)
-                print(e)
-            return None, None
-    else:
-        if verbose:
-            print("unable to find command, tried %s" % (commands,))
-        return None, None
-    stdout = p.communicate()[0].strip()
-    if sys.version_info[0] >= 3:
-        stdout = stdout.decode()
-    if p.returncode != 0:
-        if verbose:
-            print("unable to run %s (error)" % dispcmd)
-            print("stdout was %s" % stdout)
-        return None, p.returncode
-    return stdout, p.returncode
-
-
-def versions_from_parentdir(parentdir_prefix, root, verbose):
-    """Try to determine the version from the parent directory name.
-
-    Source tarballs conventionally unpack into a directory that includes both
-    the project name and a version string. We will also support searching up
-    two directory levels for an appropriately named parent directory
-    """
-    rootdirs = []
-
-    for i in range(3):
-        dirname = os.path.basename(root)
-        if dirname.startswith(parentdir_prefix):
-            return {"version": dirname[len(parentdir_prefix):],
-                    "full-revisionid": None,
-                    "dirty": False, "error": None, "date": None}
-        else:
-            rootdirs.append(root)
-            root = os.path.dirname(root)  # up a level
-
-    if verbose:
-        print("Tried directories %s but none started with prefix %s" %
-              (str(rootdirs), parentdir_prefix))
-    raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
-
-
- at register_vcs_handler("git", "get_keywords")
-def git_get_keywords(versionfile_abs):
-    """Extract version information from the given file."""
-    # the code embedded in _version.py can just fetch the value of these
-    # keywords. When used from setup.py, we don't want to import _version.py,
-    # so we do it with a regexp instead. This function is not used from
-    # _version.py.
-    keywords = {}
-    try:
-        f = open(versionfile_abs, "r")
-        for line in f.readlines():
-            if line.strip().startswith("git_refnames ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["refnames"] = mo.group(1)
-            if line.strip().startswith("git_full ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["full"] = mo.group(1)
-            if line.strip().startswith("git_date ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["date"] = mo.group(1)
-        f.close()
-    except EnvironmentError:
-        pass
-    return keywords
-
-
- at register_vcs_handler("git", "keywords")
-def git_versions_from_keywords(keywords, tag_prefix, verbose):
-    """Get version information from git keywords."""
-    if not keywords:
-        raise NotThisMethod("no keywords at all, weird")
-    date = keywords.get("date")
-    if date is not None:
-        # git-2.2.0 added "%cI", which expands to an ISO-8601 -compliant
-        # datestamp. However we prefer "%ci" (which expands to an "ISO-8601
-        # -like" string, which we must then edit to make compliant), because
-        # it's been around since git-1.5.3, and it's too difficult to
-        # discover which version we're using, or to work around using an
-        # older one.
-        date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
-    refnames = keywords["refnames"].strip()
-    if refnames.startswith("$Format"):
-        if verbose:
-            print("keywords are unexpanded, not using")
-        raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
-    # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
-    # just "foo-1.0". If we see a "tag: " prefix, prefer those.
-    TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
-    if not tags:
-        # Either we're using git < 1.8.3, or there really are no tags. We use
-        # a heuristic: assume all version tags have a digit. The old git %d
-        # expansion behaves like git log --decorate=short and strips out the
-        # refs/heads/ and refs/tags/ prefixes that would let us distinguish
-        # between branches and tags. By ignoring refnames without digits, we
-        # filter out many common branch names like "release" and
-        # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
-        if verbose:
-            print("discarding '%s', no digits" % ",".join(refs - tags))
-    if verbose:
-        print("likely tags: %s" % ",".join(sorted(tags)))
-    for ref in sorted(tags):
-        # sorting will prefer e.g. "2.0" over "2.0rc1"
-        if ref.startswith(tag_prefix):
-            r = ref[len(tag_prefix):]
-            if verbose:
-                print("picking %s" % r)
-            return {"version": r,
-                    "full-revisionid": keywords["full"].strip(),
-                    "dirty": False, "error": None,
-                    "date": date}
-    # no suitable tags, so version is "0+unknown", but full hex is still there
-    if verbose:
-        print("no suitable tags, using unknown + full revision id")
-    return {"version": "0+unknown",
-            "full-revisionid": keywords["full"].strip(),
-            "dirty": False, "error": "no suitable tags", "date": None}
-
-
- at register_vcs_handler("git", "pieces_from_vcs")
-def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
-    """Get version from 'git describe' in the root of the source tree.
-
-    This only gets called if the git-archive 'subst' keywords were *not*
-    expanded, and _version.py hasn't already been rewritten with a short
-    version string, meaning we're inside a checked out source tree.
-    """
-    GITS = ["git"]
-    if sys.platform == "win32":
-        GITS = ["git.cmd", "git.exe"]
-
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
-    if rc != 0:
-        if verbose:
-            print("Directory %s not under git control" % root)
-        raise NotThisMethod("'git rev-parse --git-dir' returned error")
-
-    # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
-    # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%s*" % tag_prefix],
-                                   cwd=root)
-    # --long was added in git-1.5.5
-    if describe_out is None:
-        raise NotThisMethod("'git describe' failed")
-    describe_out = describe_out.strip()
-    full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
-    if full_out is None:
-        raise NotThisMethod("'git rev-parse' failed")
-    full_out = full_out.strip()
-
-    pieces = {}
-    pieces["long"] = full_out
-    pieces["short"] = full_out[:7]  # maybe improved later
-    pieces["error"] = None
-
-    # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
-    # TAG might have hyphens.
-    git_describe = describe_out
-
-    # look for -dirty suffix
-    dirty = git_describe.endswith("-dirty")
-    pieces["dirty"] = dirty
-    if dirty:
-        git_describe = git_describe[:git_describe.rindex("-dirty")]
-
-    # now we have TAG-NUM-gHEX or HEX
-
-    if "-" in git_describe:
-        # TAG-NUM-gHEX
-        mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
-        if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
-            pieces["error"] = ("unable to parse git-describe output: '%s'"
-                               % describe_out)
-            return pieces
-
-        # tag
-        full_tag = mo.group(1)
-        if not full_tag.startswith(tag_prefix):
-            if verbose:
-                fmt = "tag '%s' doesn't start with prefix '%s'"
-                print(fmt % (full_tag, tag_prefix))
-            pieces["error"] = ("tag '%s' doesn't start with prefix '%s'"
-                               % (full_tag, tag_prefix))
-            return pieces
-        pieces["closest-tag"] = full_tag[len(tag_prefix):]
-
-        # distance: number of commits since tag
-        pieces["distance"] = int(mo.group(2))
-
-        # commit: short hex revision ID
-        pieces["short"] = mo.group(3)
-
-    else:
-        # HEX: no tags
-        pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
-        pieces["distance"] = int(count_out)  # total number of commits
-
-    # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
-                       cwd=root)[0].strip()
-    pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
-
-    return pieces
-
-
-def plus_or_dot(pieces):
-    """Return a + if we don't already have one, else return a ."""
-    if "+" in pieces.get("closest-tag", ""):
-        return "."
-    return "+"
-
-
-def render_pep440(pieces):
-    """Build up version string, with post-release "local version identifier".
-
-    Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
-    get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty
-
-    Exceptions:
-    1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += plus_or_dot(pieces)
-            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
-            if pieces["dirty"]:
-                rendered += ".dirty"
-    else:
-        # exception #1
-        rendered = "0+untagged.%d.g%s" % (pieces["distance"],
-                                          pieces["short"])
-        if pieces["dirty"]:
-            rendered += ".dirty"
-    return rendered
-
-
-def render_pep440_pre(pieces):
-    """TAG[.post.devDISTANCE] -- No -dirty.
-
-    Exceptions:
-    1: no tags. 0.post.devDISTANCE
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"]:
-            rendered += ".post.dev%d" % pieces["distance"]
-    else:
-        # exception #1
-        rendered = "0.post.dev%d" % pieces["distance"]
-    return rendered
-
-
-def render_pep440_post(pieces):
-    """TAG[.postDISTANCE[.dev0]+gHEX] .
-
-    The ".dev0" means dirty. Note that .dev0 sorts backwards
-    (a dirty tree will appear "older" than the corresponding clean one),
-    but you shouldn't be releasing software with -dirty anyways.
-
-    Exceptions:
-    1: no tags. 0.postDISTANCE[.dev0]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += ".post%d" % pieces["distance"]
-            if pieces["dirty"]:
-                rendered += ".dev0"
-            rendered += plus_or_dot(pieces)
-            rendered += "g%s" % pieces["short"]
-    else:
-        # exception #1
-        rendered = "0.post%d" % pieces["distance"]
-        if pieces["dirty"]:
-            rendered += ".dev0"
-        rendered += "+g%s" % pieces["short"]
-    return rendered
-
-
-def render_pep440_old(pieces):
-    """TAG[.postDISTANCE[.dev0]] .
-
-    The ".dev0" means dirty.
-
-    Eexceptions:
-    1: no tags. 0.postDISTANCE[.dev0]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += ".post%d" % pieces["distance"]
-            if pieces["dirty"]:
-                rendered += ".dev0"
-    else:
-        # exception #1
-        rendered = "0.post%d" % pieces["distance"]
-        if pieces["dirty"]:
-            rendered += ".dev0"
-    return rendered
-
-
-def render_git_describe(pieces):
-    """TAG[-DISTANCE-gHEX][-dirty].
-
-    Like 'git describe --tags --dirty --always'.
-
-    Exceptions:
-    1: no tags. HEX[-dirty]  (note: no 'g' prefix)
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"]:
-            rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
-    else:
-        # exception #1
-        rendered = pieces["short"]
-    if pieces["dirty"]:
-        rendered += "-dirty"
-    return rendered
-
-
-def render_git_describe_long(pieces):
-    """TAG-DISTANCE-gHEX[-dirty].
-
-    Like 'git describe --tags --dirty --always -long'.
-    The distance/hash is unconditional.
-
-    Exceptions:
-    1: no tags. HEX[-dirty]  (note: no 'g' prefix)
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
-    else:
-        # exception #1
-        rendered = pieces["short"]
-    if pieces["dirty"]:
-        rendered += "-dirty"
-    return rendered
-
-
-def render(pieces, style):
-    """Render the given version pieces into the requested style."""
-    if pieces["error"]:
-        return {"version": "unknown",
-                "full-revisionid": pieces.get("long"),
-                "dirty": None,
-                "error": pieces["error"],
-                "date": None}
-
-    if not style or style == "default":
-        style = "pep440"  # the default
-
-    if style == "pep440":
-        rendered = render_pep440(pieces)
-    elif style == "pep440-pre":
-        rendered = render_pep440_pre(pieces)
-    elif style == "pep440-post":
-        rendered = render_pep440_post(pieces)
-    elif style == "pep440-old":
-        rendered = render_pep440_old(pieces)
-    elif style == "git-describe":
-        rendered = render_git_describe(pieces)
-    elif style == "git-describe-long":
-        rendered = render_git_describe_long(pieces)
-    else:
-        raise ValueError("unknown style '%s'" % style)
-
-    return {"version": rendered, "full-revisionid": pieces["long"],
-            "dirty": pieces["dirty"], "error": None,
-            "date": pieces.get("date")}
-
-
-def get_versions():
-    """Get version information or return default if unable to do so."""
-    # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
-    # __file__, we can work backwards from there to the root. Some
-    # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which
-    # case we can only use expanded keywords.
-
-    cfg = get_config()
-    verbose = cfg.verbose
-
-    try:
-        return git_versions_from_keywords(get_keywords(), cfg.tag_prefix,
-                                          verbose)
-    except NotThisMethod:
-        pass
-
-    try:
-        root = os.path.realpath(__file__)
-        # versionfile_source is the relative path from the top of the source
-        # tree (where the .git directory might live) to this file. Invert
-        # this to find the root from __file__.
-        for i in cfg.versionfile_source.split('/'):
-            root = os.path.dirname(root)
-    except NameError:
-        return {"version": "0+unknown", "full-revisionid": None,
-                "dirty": None,
-                "error": "unable to find root of source tree",
-                "date": None}
-
-    try:
-        pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)
-        return render(pieces, cfg.style)
-    except NotThisMethod:
-        pass
-
-    try:
-        if cfg.parentdir_prefix:
-            return versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
-    except NotThisMethod:
-        pass
-
-    return {"version": "0+unknown", "full-revisionid": None,
-            "dirty": None,
-            "error": "unable to compute version", "date": None}


=====================================
questplus/demos/__init__.py deleted
=====================================


=====================================
questplus/psychometric_function.py
=====================================
@@ -1,17 +1,20 @@
 from typing import Union, Iterable
 import numpy as np
+from numpy.typing import ArrayLike
 import scipy.stats
 import xarray as xr
 
 
-def weibull(*,
-            intensity: Union[float, Iterable[float]],
-            threshold: Union[float, Iterable[float]],
-            slope: Union[float, Iterable[float]] = 3.5,
-            lower_asymptote: Union[float, Iterable[float]] = 0.01,
-            lapse_rate: Union[float, Iterable[float]] = 0.01,
-            scale: str = 'log10') -> xr.DataArray:
-    """
+def weibull(
+    *,
+    intensity: Union[float, Iterable[float]],
+    threshold: Union[float, Iterable[float]],
+    slope: Union[float, Iterable[float]] = 3.5,
+    lower_asymptote: Union[float, Iterable[float]] = 0.01,
+    lapse_rate: Union[float, Iterable[float]] = 0.01,
+    scale: str = "log10",
+) -> xr.DataArray:
+    r"""
     A Weibull psychometric function.
 
     Parameters
@@ -75,46 +78,53 @@ def weibull(*,
     #                                        lapse_rate,
     #                                        indexing='ij', sparse=True)
 
-    x = xr.DataArray(data=intensity, dims=['intensity'],
-                     coords=dict(intensity=intensity))
-    t = xr.DataArray(data=threshold, dims=['threshold'],
-                     coords=dict(threshold=threshold))
-    beta = xr.DataArray(data=slope, dims=['slope'],
-                        coords=dict(slope=slope))
-    gamma = xr.DataArray(data=lower_asymptote, dims=['lower_asymptote'],
-                         coords=dict(lower_asymptote=lower_asymptote))
-    delta = xr.DataArray(data=lapse_rate, dims=['lapse_rate'],
-                         coords=dict(lapse_rate=lapse_rate))
+    x = xr.DataArray(
+        data=intensity, dims=["intensity"], coords=dict(intensity=intensity)
+    )
+    t = xr.DataArray(
+        data=threshold, dims=["threshold"], coords=dict(threshold=threshold)
+    )
+    beta = xr.DataArray(data=slope, dims=["slope"], coords=dict(slope=slope))
+    gamma = xr.DataArray(
+        data=lower_asymptote,
+        dims=["lower_asymptote"],
+        coords=dict(lower_asymptote=lower_asymptote),
+    )
+    delta = xr.DataArray(
+        data=lapse_rate, dims=["lapse_rate"], coords=dict(lapse_rate=lapse_rate)
+    )
     assert np.atleast_1d(x.squeeze()).shape == np.atleast_1d(intensity).shape
     assert np.atleast_1d(t.squeeze()).shape == np.atleast_1d(threshold).shape
     assert np.atleast_1d(beta.squeeze()).shape == np.atleast_1d(slope).shape
     assert np.atleast_1d(gamma.squeeze()).shape == np.atleast_1d(lower_asymptote).shape
     assert np.atleast_1d(delta.squeeze()).shape == np.atleast_1d(lapse_rate).shape
 
-    if scale == 'linear':
-        p = 1 - delta - (1 - gamma - delta) * np.exp(-(x / t)**beta)
-    elif scale == 'log10':
-        p = 1 - delta - (1 - gamma - delta) * np.exp(-10 ** (beta * (x - t)))
-    elif scale == 'dB':
-        p = 1 - delta - (1 - gamma - delta) * np.exp(-10 ** (beta * (x - t) / 20))
+    if scale == "linear":
+        p = 1 - delta - (1 - gamma - delta) * np.exp(-((x / t) ** beta))
+    elif scale == "log10":
+        p = 1 - delta - (1 - gamma - delta) * np.exp(-(10 ** (beta * (x - t))))
+    elif scale == "dB":
+        p = 1 - delta - (1 - gamma - delta) * np.exp(-(10 ** (beta * (x - t) / 20)))
     else:
-        raise ValueError('Invalid scale specified.')
+        raise ValueError("Invalid scale specified.")
 
     return p
 
 
-def csf(*,
-        contrast: Union[float, Iterable[float]],
-        spatial_freq: Union[float, Iterable[float]],
-        temporal_freq: Union[float, Iterable[float]],
-        c0: Union[float, Iterable[float]],
-        cf: Union[float, Iterable[float]],
-        cw: Union[float, Iterable[float]],
-        min_thresh: Union[float, Iterable[float]],
-        slope: Union[float, Iterable[float]] = 3.5,
-        lower_asymptote: Union[float, Iterable[float]] = 0.01,
-        lapse_rate: Union[float, Iterable[float]] = 0.01,
-        scale: str = 'log10') -> np.ndarray:
+def csf(
+    *,
+    contrast: Union[float, Iterable[float]],
+    spatial_freq: Union[float, Iterable[float]],
+    temporal_freq: Union[float, Iterable[float]],
+    c0: Union[float, Iterable[float]],
+    cf: Union[float, Iterable[float]],
+    cw: Union[float, Iterable[float]],
+    min_thresh: Union[float, Iterable[float]],
+    slope: Union[float, Iterable[float]] = 3.5,
+    lower_asymptote: Union[float, Iterable[float]] = 0.01,
+    lapse_rate: Union[float, Iterable[float]] = 0.01,
+    scale: str = "log10",
+) -> np.ndarray:
     """
     The spatio-temporal contrast sensitivity function.
 
@@ -154,26 +164,30 @@ def csf(*,
     #     slope, lower_asymptote, lapse_rate,
     #     indexing='ij', sparse=True)
 
-    x = xr.DataArray(data=contrast, dims=['contrast'],
-                     coords=dict(contrast=contrast))
-    f = xr.DataArray(data=spatial_freq, dims=['spatial_freq'],
-                     coords=dict(spatial_freq=spatial_freq))
-    w = xr.DataArray(data=temporal_freq, dims=['temporal_freq'],
-                     coords=dict(temporal_freq=temporal_freq))
-    c0_ = xr.DataArray(data=c0, dims=['c0'],
-                       coords=dict(c0=c0))
-    cf_ = xr.DataArray(data=cf, dims=['cf'],
-                       coords=dict(cf=cf))
-    cw_ = xr.DataArray(data=cw, dims=['cw'],
-                       coords=dict(cw=cw))
-    min_t = xr.DataArray(data=min_thresh, dims=['min_thresh'],
-                         coords=dict(min_thresh=min_thresh))
-    beta = xr.DataArray(data=slope, dims=['slope'],
-                        coords=dict(slope=slope))
-    gamma = xr.DataArray(data=lower_asymptote, dims=['lower_asymptote'],
-                         coords=dict(lower_asymptote=lower_asymptote))
-    delta = xr.DataArray(data=lapse_rate, dims=['lapse_rate'],
-                         coords=dict(lapse_rate=lapse_rate))
+    x = xr.DataArray(data=contrast, dims=["contrast"], coords=dict(contrast=contrast))
+    f = xr.DataArray(
+        data=spatial_freq, dims=["spatial_freq"], coords=dict(spatial_freq=spatial_freq)
+    )
+    w = xr.DataArray(
+        data=temporal_freq,
+        dims=["temporal_freq"],
+        coords=dict(temporal_freq=temporal_freq),
+    )
+    c0_ = xr.DataArray(data=c0, dims=["c0"], coords=dict(c0=c0))
+    cf_ = xr.DataArray(data=cf, dims=["cf"], coords=dict(cf=cf))
+    cw_ = xr.DataArray(data=cw, dims=["cw"], coords=dict(cw=cw))
+    min_t = xr.DataArray(
+        data=min_thresh, dims=["min_thresh"], coords=dict(min_thresh=min_thresh)
+    )
+    beta = xr.DataArray(data=slope, dims=["slope"], coords=dict(slope=slope))
+    gamma = xr.DataArray(
+        data=lower_asymptote,
+        dims=["lower_asymptote"],
+        coords=dict(lower_asymptote=lower_asymptote),
+    )
+    delta = xr.DataArray(
+        data=lapse_rate, dims=["lapse_rate"], coords=dict(lapse_rate=lapse_rate)
+    )
 
     t = np.maximum(min_t, c0_ + cf_ * f + cw_ * w)
 
@@ -184,25 +198,27 @@ def csf(*,
     #             lapse_rate=lapse_rate,
     #             scale=scale)
 
-    if scale == 'linear':
-        p = 1 - delta - (1 - gamma - delta) * np.exp(-(x / t)**beta)
-    elif scale == 'log10':
-        p = 1 - delta - (1 - gamma - delta) * np.exp(-10 ** (beta * (x - t)))
-    elif scale == 'dB':
-        p = 1 - delta - (1 - gamma - delta) * np.exp(-10 ** (beta * (x - t) / 20))
+    if scale == "linear":
+        p = 1 - delta - (1 - gamma - delta) * np.exp(-((x / t) ** beta))
+    elif scale == "log10":
+        p = 1 - delta - (1 - gamma - delta) * np.exp(-(10 ** (beta * (x - t))))
+    elif scale == "dB":
+        p = 1 - delta - (1 - gamma - delta) * np.exp(-(10 ** (beta * (x - t) / 20)))
     else:
-        raise ValueError('Invalid scale specified.')
+        raise ValueError("Invalid scale specified.")
 
     return p
 
 
-def norm_cdf(*,
-             intensity: Union[float, Iterable[float]],
-             mean: Union[float, Iterable[float]],
-             sd: Union[float, Iterable[float]],
-             lower_asymptote: Union[float, Iterable[float]] = 0.01,
-             lapse_rate: Union[float, Iterable[float]] = 0.01,
-             scale: str = 'linear'):
+def norm_cdf(
+    *,
+    intensity: Union[float, Iterable[float]],
+    mean: Union[float, Iterable[float]],
+    sd: Union[float, Iterable[float]],
+    lower_asymptote: Union[float, Iterable[float]] = 0.01,
+    lapse_rate: Union[float, Iterable[float]] = 0.01,
+    scale: str = "linear",
+):
     """
     The cumulate normal distribution.
 
@@ -219,9 +235,11 @@ def norm_cdf(*,
     -------
 
     """
-    if scale != 'linear':
-        msg = ('Currently, only linear stimulus scaling is supported for this '
-               'psychometric function.')
+    if scale != "linear":
+        msg = (
+            "Currently, only linear stimulus scaling is supported for this "
+            "psychometric function."
+        )
         raise ValueError(msg)
 
     intensity = np.atleast_1d(intensity)
@@ -230,16 +248,19 @@ def norm_cdf(*,
     lower_asymptote = np.atleast_1d(lower_asymptote)
     lapse_rate = np.atleast_1d(lapse_rate)
 
-    x = xr.DataArray(data=intensity, dims=['intensity'],
-                     coords=dict(intensity=intensity))
-    mu = xr.DataArray(data=mean, dims=['mean'],
-                      coords=dict(mean=mean))
-    sd_ = xr.DataArray(data=sd, dims=['sd'],
-                       coords=dict(sd=sd))
-    gamma = xr.DataArray(data=lower_asymptote, dims=['lower_asymptote'],
-                         coords=dict(lower_asymptote=lower_asymptote))
-    delta = xr.DataArray(data=lapse_rate, dims=['lapse_rate'],
-                         coords=dict(lapse_rate=lapse_rate))
+    x = xr.DataArray(
+        data=intensity, dims=["intensity"], coords=dict(intensity=intensity)
+    )
+    mu = xr.DataArray(data=mean, dims=["mean"], coords=dict(mean=mean))
+    sd_ = xr.DataArray(data=sd, dims=["sd"], coords=dict(sd=sd))
+    gamma = xr.DataArray(
+        data=lower_asymptote,
+        dims=["lower_asymptote"],
+        coords=dict(lower_asymptote=lower_asymptote),
+    )
+    delta = xr.DataArray(
+        data=lapse_rate, dims=["lapse_rate"], coords=dict(lapse_rate=lapse_rate)
+    )
 
     # x, mu, sd_, delta = np.meshgrid(intensity,
     #                                 mean,
@@ -256,18 +277,20 @@ def norm_cdf(*,
 
     def _mu_func(x, mu, sd_, gamma, delta):
         norm = scipy.stats.norm(loc=mu, scale=sd_)
-        return delta + (1 - gamma - delta) * norm.cdf(x)
+        return gamma + (1 - gamma - delta) * norm.cdf(x)
 
     p = xr.apply_ufunc(_mu_func, x, mu, sd_, gamma, delta)
     return p
 
 
-def norm_cdf_2(*,
-               intensity: Union[float, Iterable[float]],
-               mean: Union[float, Iterable[float]],
-               sd: Union[float, Iterable[float]],
-               lapse_rate: Union[float, Iterable[float]] = 0.01,
-               scale: str = 'linear'):
+def norm_cdf_2(
+    *,
+    intensity: Union[float, Iterable[float]],
+    mean: Union[float, Iterable[float]],
+    sd: Union[float, Iterable[float]],
+    lapse_rate: Union[float, Iterable[float]] = 0.01,
+    scale: str = "linear",
+):
     """
     The cumulative normal distribution with lapse rate equal to lower
     asymptote.
@@ -284,9 +307,11 @@ def norm_cdf_2(*,
     -------
 
     """
-    if scale != 'linear':
-        msg = ('Currently, only linear stimulus scaling is supported for this '
-               'psychometric function.')
+    if scale != "linear":
+        msg = (
+            "Currently, only linear stimulus scaling is supported for this "
+            "psychometric function."
+        )
         raise ValueError(msg)
 
     intensity = np.atleast_1d(intensity)
@@ -294,18 +319,170 @@ def norm_cdf_2(*,
     sd = np.atleast_1d(sd)
     lapse_rate = np.atleast_1d(lapse_rate)
 
-    x = xr.DataArray(data=intensity, dims=['intensity'],
-                     coords=dict(intensity=intensity))
-    mu = xr.DataArray(data=mean, dims=['mean'],
-                      coords=dict(mean=mean))
-    sd_ = xr.DataArray(data=sd, dims=['sd'],
-                       coords=dict(sd=sd))
-    delta = xr.DataArray(data=lapse_rate, dims=['lapse_rate'],
-                         coords=dict(lapse_rate=lapse_rate))
+    x = xr.DataArray(
+        data=intensity, dims=["intensity"], coords=dict(intensity=intensity)
+    )
+    mu = xr.DataArray(data=mean, dims=["mean"], coords=dict(mean=mean))
+    sd_ = xr.DataArray(data=sd, dims=["sd"], coords=dict(sd=sd))
+    delta = xr.DataArray(
+        data=lapse_rate, dims=["lapse_rate"], coords=dict(lapse_rate=lapse_rate)
+    )
 
     def _mu_func(x, mu, sd_, delta):
         norm = scipy.stats.norm(loc=mu, scale=sd_)
-        return delta + (1 - 2*delta) * norm.cdf(x)
+        return delta + (1 - 2 * delta) * norm.cdf(x)
 
     p = xr.apply_ufunc(_mu_func, x, mu, sd_, delta)
     return p
+
+
+def scaling_function(
+    *,
+    x: Union[ArrayLike, float],
+    m: float,
+    mag_min: float = 0,
+    mag_max: float = 1,
+    t: float,
+    q: float,
+) -> ArrayLike:
+    """
+    The scaling function.
+
+    Parameters
+    ----------
+    x
+        The physical stimulus magnitude(s).
+    m
+        The maximum value of the subjective scale.
+    mag_min
+        The minimum value of the physical stimulus magnitude.
+    mag_max
+        The maximum value of the physical stimulus magnitude.
+    t
+        The threshold value (physical stimulus magnitude at which the
+        participant starts to perceive the stimulus).
+    q
+        The power exponent.
+
+    Returns
+    -------
+    result
+        The subjectively perceived intensities corresponding to the physical
+        stimulus magnitudes.
+    """
+    # x = np.atleast_1d(x)
+    # m = np.atleast_1d(m)
+    # mag_min = np.atleast_1d(mag_min)
+    # mag_max = np.atleast_1d(mag_max)
+    # t = np.atleast_1d(t)
+    # q = np.atleast_1d(q)
+    #
+    # assert len(mag_min) == len(mag_max) == 1
+
+    nom = np.maximum(mag_min, x - t)
+    denom = mag_max - t
+
+    result = m * (nom / denom) ** q
+    return result
+
+
+def thurstone_scaling_function(
+    *,
+    physical_magnitudes_stim_1: Union[ArrayLike, float],
+    physical_magnitudes_stim_2: Union[ArrayLike, float],
+    threshold: Union[ArrayLike, float],
+    power: Union[ArrayLike, float],
+    perceptual_scale_max: Union[ArrayLike, float],
+) -> ArrayLike:
+    """
+    The Thurstone scaling function.
+
+    Parameters
+    ----------
+    physical_magnitudes_stim_1, physical_magnitudes_stim_2
+        The physical stimulus magnitudes the participant is asked to
+        compare. All possible pairings will be generated automatically.
+        The values in each array must be unique.
+    threshold
+        The threshold value (physical stimulus magnitude at which the
+        participant starts to perceive the stimulus).
+    power
+        The power exponent.
+    perceptual_scale_max
+        The maximum value of the subjective perceptual scale (in JND / S.D.).
+
+    Returns
+    -------
+    """
+    physical_magnitudes_stim_1 = np.atleast_1d(physical_magnitudes_stim_1)
+    physical_magnitudes_stim_2 = np.atleast_1d(physical_magnitudes_stim_2)
+    threshold = np.atleast_1d(threshold)
+    power = np.atleast_1d(power)
+    perceptual_scale_max = np.atleast_1d(perceptual_scale_max)
+
+    # assert np.allclose(physical_magnitudes_stim_1, physical_magnitudes_stim_2)
+    # mag_min = x1.min()
+    # mag_max = x2.max()
+
+    # assert len(physical_magnitudes_stim_1) == len(physical_magnitudes_stim_2)
+    # assert np.allclose(physical_magnitudes_stim_1.min(), physical_magnitudes_stim_2.min())
+    # assert np.allclose(physical_magnitudes_stim_1.max(), physical_magnitudes_stim_2.max())
+
+    if not np.array_equal(
+        np.unique(physical_magnitudes_stim_1), np.sort(physical_magnitudes_stim_1)
+    ):
+        raise ValueError(f"Values in physical_magnitudes_stim_1 must be unique.")
+    if not np.array_equal(
+        np.unique(physical_magnitudes_stim_2), np.sort(physical_magnitudes_stim_2)
+    ):
+        raise ValueError(f"Values in physical_magnitudes_stim_2 must be unique.")
+
+    # mag_min = np.min([physical_magnitudes_stim_1, physical_magnitudes_stim_2])
+    mag_min = 0
+    mag_max = np.hstack([
+        physical_magnitudes_stim_1,
+        physical_magnitudes_stim_2
+    ]).max()
+
+    physical_magnitudes_stim_1 = xr.DataArray(
+        data=physical_magnitudes_stim_1,
+        dims=["physical_magnitude_stim_1"],
+        coords={"physical_magnitude_stim_1": physical_magnitudes_stim_1},
+    )
+    physical_magnitudes_stim_2 = xr.DataArray(
+        data=physical_magnitudes_stim_2,
+        dims=["physical_magnitude_stim_2"],
+        coords={"physical_magnitude_stim_2": physical_magnitudes_stim_2},
+    )
+    threshold = xr.DataArray(
+        data=threshold, dims=["threshold"], coords={"threshold": threshold}
+    )
+    power = xr.DataArray(data=power, dims=["power"], coords={"power": power})
+    perceptual_scale_max = xr.DataArray(
+        data=perceptual_scale_max,
+        dims=["perceptual_scale_max"],
+        coords={"perceptual_scale_max": perceptual_scale_max},
+    )
+
+    scale_x1 = scaling_function(
+        x=physical_magnitudes_stim_1,
+        m=perceptual_scale_max,
+        mag_min=mag_min,
+        mag_max=mag_max,
+        t=threshold,
+        q=power,
+    )
+    scale_x2 = scaling_function(
+        x=physical_magnitudes_stim_2,
+        m=perceptual_scale_max,
+        mag_min=mag_min,
+        mag_max=mag_max,
+        t=threshold,
+        q=power,
+    )
+
+    def _mu_func(scale_x1, scale_x2):
+        return scipy.stats.norm.cdf((scale_x1 - scale_x2) / np.sqrt(2))
+
+    result = xr.apply_ufunc(_mu_func, scale_x1, scale_x2)
+    return result
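
Two of the changes to questplus/psychometric_function.py above are easier to follow numerically: the corrected `norm_cdf` now uses the guess rate (lower_asymptote) as its lower bound instead of the lapse rate, and the new Thurstone scaling functions map physical magnitudes onto a perceptual scale before comparing them through a cumulative normal. The following is a minimal standalone sketch of those formulas with purely illustrative parameter values; it is not part of the upstream diff and does not use the questplus API.

    import numpy as np
    import scipy.stats

    # Corrected norm_cdf (sketch): the lower bound is the guess rate gamma,
    # the upper bound is 1 - delta, where delta is the lapse rate.
    def norm_cdf_sketch(x, mean, sd, gamma=0.01, delta=0.01):
        return gamma + (1 - gamma - delta) * scipy.stats.norm(loc=mean, scale=sd).cdf(x)

    print(norm_cdf_sketch(np.array([-10.0, 0.0, 10.0]), mean=0.0, sd=1.0))
    # -> approximately [0.01, 0.5, 0.99]

    # Thurstone scaling (sketch): perceived magnitude
    #   s(x) = m * (max(0, x - t) / (mag_max - t)) ** q
    # and the probability of judging stimulus 1 as stronger than stimulus 2 is
    #   Phi((s(x1) - s(x2)) / sqrt(2)).
    def perceived(x, t, q, m, mag_max):
        return m * (np.maximum(0.0, x - t) / (mag_max - t)) ** q

    def p_stim1_stronger(x1, x2, t=0.1, q=0.5, m=5.0, mag_max=1.0):
        s1 = perceived(x1, t, q, m, mag_max)
        s2 = perceived(x2, t, q, m, mag_max)
        return scipy.stats.norm.cdf((s1 - s2) / np.sqrt(2))

    print(p_stim1_stronger(0.8, 0.4))  # > 0.5: stimulus 1 is usually judged stronger
    print(p_stim1_stronger(0.4, 0.8))  # < 0.5: the comparison is symmetric
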


=====================================
questplus/qp.py
=====================================
@@ -1,23 +1,27 @@
-from typing import Optional, Sequence
-import xarray as xr
+from typing import Optional, Sequence, Literal
+from copy import deepcopy
+
 import numpy as np
+import xarray as xr
 import json_tricks
-from copy import deepcopy
 
 from questplus import psychometric_function
 
 
 class QuestPlus:
-    def __init__(self, *,
-                 stim_domain: dict,
-                 param_domain: dict,
-                 outcome_domain: dict,
-                 prior: Optional[dict] = None,
-                 func: str,
-                 stim_scale: str,
-                 stim_selection_method: str = 'min_entropy',
-                 stim_selection_options: Optional[dict] = None,
-                 param_estimation_method: str = 'mean'):
+    def __init__(
+        self,
+        *,
+        stim_domain: dict,
+        param_domain: dict,
+        outcome_domain: dict,
+        prior: Optional[dict] = None,
+        func: Literal["weibull", "csf", "norm_cdf", "norm_cdf_2", "thurstone_scaling"],
+        stim_scale: Optional[Literal["log10", "dB", "linear"]],
+        stim_selection_method: str = "min_entropy",
+        stim_selection_options: Optional[dict] = None,
+        param_estimation_method: str = "mean",
+    ):
         """
         A QUEST+ staircase procedure.
 
@@ -46,13 +50,11 @@ class QuestPlus:
             A-priori probabilities of parameter values.
 
         func
-            The psychometric function whose parameters to estimate. Currently
-            supported are the Weibull function, `weibull`, and the spatio-
-            temporal contrast sensitivity function, `csf`.
+            The psychometric function whose parameters to estimate.
 
         stim_scale
-            The scale on which the stimuli are provided. Currently supported
-            are the decadic logarithm, `log10`; and decibels, `dB`.
+            The scale on which the stimuli are provided. Has no effect for the
+            Thurstonian scaling function.
 
         stim_selection_method
             How to select the next stimulus. `min_entropy` picks the stimulus
@@ -77,6 +79,12 @@ class QuestPlus:
             the posterior distribution).
 
         """
+        if func == "thurstone_scaling" and stim_scale is not None:
+            raise ValueError(
+                "The Thurstonian scaling function cannot be used with "
+                "a stim_scale parameter."
+            )
+
         self.func = func
         self.stim_scale = stim_scale
         self.stim_domain = self._ensure_ndarray(stim_domain)
@@ -89,28 +97,34 @@ class QuestPlus:
 
         self.stim_selection = stim_selection_method
 
-        if self.stim_selection == 'min_n_entropy':
-            from ._constants import (DEFAULT_N, DEFAULT_RANDOM_SEED,
-                                     DEFAULT_MAX_CONSECUTIVE_REPS)
+        if self.stim_selection == "min_n_entropy":
+            from ._constants import (
+                DEFAULT_N,
+                DEFAULT_RANDOM_SEED,
+                DEFAULT_MAX_CONSECUTIVE_REPS,
+            )
 
             if stim_selection_options is None:
                 self.stim_selection_options = dict(
                     n=DEFAULT_N,
                     max_consecutive_reps=DEFAULT_MAX_CONSECUTIVE_REPS,
-                    random_seed=DEFAULT_RANDOM_SEED)
+                    random_seed=DEFAULT_RANDOM_SEED,
+                )
             else:
                 self.stim_selection_options = stim_selection_options.copy()
 
-                if 'n' not in stim_selection_options:
-                    self.stim_selection_options['n'] = DEFAULT_N
-                if 'max_consecutive_reps' not in stim_selection_options:
-                    self.stim_selection_options['max_consecutive_reps'] = DEFAULT_MAX_CONSECUTIVE_REPS
-                if 'random_seed' not in stim_selection_options:
-                    self.stim_selection_options['random_seed'] = DEFAULT_RANDOM_SEED
+                if "n" not in stim_selection_options:
+                    self.stim_selection_options["n"] = DEFAULT_N
+                if "max_consecutive_reps" not in stim_selection_options:
+                    self.stim_selection_options[
+                        "max_consecutive_reps"
+                    ] = DEFAULT_MAX_CONSECUTIVE_REPS
+                if "random_seed" not in stim_selection_options:
+                    self.stim_selection_options["random_seed"] = DEFAULT_RANDOM_SEED
 
             del DEFAULT_N, DEFAULT_MAX_CONSECUTIVE_REPS, DEFAULT_RANDOM_SEED
 
-            seed = self.stim_selection_options['random_seed']
+            seed = self.stim_selection_options["random_seed"]
             self._rng = np.random.RandomState(seed=seed)
             del seed
         else:
@@ -131,8 +145,7 @@ class QuestPlus:
 
         return x
 
-    def _gen_prior(self, *,
-                   prior: dict) -> xr.DataArray:
+    def _gen_prior(self, *, prior: dict) -> xr.DataArray:
         """
         Raises
         ------
@@ -147,11 +160,13 @@ class QuestPlus:
             # Uninformative prior.
             prior = np.ones([len(x) for x in self.param_domain.values()])
         elif set(prior_orig.keys()) - set(self.param_domain.keys()):
-            msg = (f'Mismatch between specified parameter domain and supplied '
-                   f'prior.\n'
-                   f'You specified priors for the following parameters that '
-                   f'do not appear in the parameter domain: '
-                   f'{set(prior_orig.keys()) - set(self.param_domain.keys())}')
+            msg = (
+                f"Mismatch between specified parameter domain and supplied "
+                f"prior.\n"
+                f"You specified priors for the following parameters that "
+                f"do not appear in the parameter domain: "
+                f"{set(prior_orig.keys()) - set(self.param_domain.keys())}"
+            )
             raise ValueError(msg)
         elif set(self.param_domain.keys()) - set(prior_orig.keys()):
             # The user specified prior probabilities for only a subset
@@ -163,28 +178,30 @@ class QuestPlus:
                     prior_vals = np.atleast_1d(prior_orig[param_name])
                 else:
                     prior_vals = np.ones(len(param_vals))
-          
+
                 grid_dims.append(prior_vals)
 
-            prior_grid = np.meshgrid(*grid_dims,
-                                     sparse=True, indexing='ij')
-            prior = np.prod(prior_grid)
+            prior_grid = np.meshgrid(*grid_dims, sparse=True, indexing="ij")
+            prior = np.prod(
+                np.array(prior_grid, dtype="object")  # avoid warning re "ragged" array
+            )
         else:
             # A "proper" prior was specified (i.e., prior probabilities for
             # all parameters.)
-            prior_grid = np.meshgrid(*list(prior_orig.values()),
-                                     sparse=True, indexing='ij')
-            prior = np.prod(prior_grid)
+            prior_grid = np.meshgrid(
+                *list(prior_orig.values()), sparse=True, indexing="ij"
+            )
+            prior = np.prod(
+                np.array(prior_grid, dtype="object")  # avoid warning re "ragged" array
+            )
 
         # Normalize.
         prior /= prior.sum()
 
         # Create the prior object we are actually going to use.
-        dims = *self.param_domain.keys(),
+        dims = (*self.param_domain.keys(),)
         coords = dict(**self.param_domain)
-        prior_ = xr.DataArray(data=prior,
-                              dims=dims,
-                              coords=coords)
+        prior_ = xr.DataArray(data=prior, dims=dims, coords=coords)
 
         return prior_
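A note on the prior construction in _gen_prior above: the joint prior is simply the outer product of the per-parameter priors, normalized to sum to one. The sparse-meshgrid trick can be shown in isolation; the parameter names and values below are made up for this sketch and are not part of the upstream change.

    import numpy as np
    import xarray as xr

    # Illustrative per-parameter domains and priors (hypothetical values).
    param_domain = {"threshold": np.array([0.1, 0.2, 0.3]),
                    "slope": np.array([1.0, 2.0])}
    priors = {"threshold": np.array([0.2, 0.5, 0.3]),
              "slope": np.array([0.5, 0.5])}

    # sparse=True yields arrays of shape (3, 1) and (1, 2); multiplying them
    # broadcasts to the full (3, 2) outer product.
    grids = np.meshgrid(*priors.values(), sparse=True, indexing="ij")
    joint = grids[0] * grids[1]
    joint = joint / joint.sum()  # normalize

    prior = xr.DataArray(data=joint,
                         dims=tuple(param_domain.keys()),
                         coords=param_domain)

The np.array(..., dtype="object") wrapper in the diff serves the same purpose for an arbitrary number of parameters while silencing NumPy's "ragged array" warning, as its inline comment notes.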
 
@@ -192,43 +209,54 @@ class QuestPlus:
         outcome_dim_name = list(self.outcome_domain.keys())[0]
         outcome_values = list(self.outcome_domain.values())[0]
 
-        if self.func in ['weibull', 'csf', 'norm_cdf', 'norm_cdf_2']:
-            if self.func == 'weibull':
-                f = psychometric_function.weibull
-            elif self.func == 'csf':
-                f = psychometric_function.csf
-            elif self.func == 'norm_cdf':
-                f = psychometric_function.norm_cdf
-            else:
-                f = psychometric_function.norm_cdf_2
-
-            prop_correct = f(**self.stim_domain,
-                             **self.param_domain,
-                             scale=self.stim_scale)
-
-            prop_incorrect = 1 - prop_correct
-
-            # Now this is a bit awkward. We concatenate the psychometric
-            # functions for the different responses. To do that, we first have
-            # to add an additional dimension.
-            # TODO: There's got to be a neater way to do this?!
-            corr_resp_dim = {outcome_dim_name: [outcome_values[0]]}
-            inccorr_resp_dim = {outcome_dim_name: [outcome_values[1]]}
-
-            prop_correct = prop_correct.expand_dims(corr_resp_dim)
-            prop_incorrect = prop_incorrect.expand_dims(inccorr_resp_dim)
-
-            pf_values = xr.concat([prop_correct, prop_incorrect],
-                                  dim=outcome_dim_name,
-                                  coords=self.outcome_domain)
+        if self.func not in [
+            "weibull",
+            "csf",
+            "norm_cdf",
+            "norm_cdf_2",
+            "thurstone_scaling",
+        ]:
+            raise ValueError(
+                f"Unknown psychometric function name specified: {self.func}"
+            )
+
+        if self.func == "weibull":
+            f = psychometric_function.weibull
+        elif self.func == "csf":
+            f = psychometric_function.csf
+        elif self.func == "norm_cdf":
+            f = psychometric_function.norm_cdf
+        elif self.func == "norm_cdf_2":
+            f = psychometric_function.norm_cdf_2
+        elif self.func == "thurstone_scaling":
+            f = psychometric_function.thurstone_scaling_function
+
+        if self.func == "thurstone_scaling":
+            prop_correct = f(**self.stim_domain, **self.param_domain)
         else:
-            raise ValueError('Unknown psychometric function name specified.')
-
+            prop_correct = f(
+                **self.stim_domain, **self.param_domain, scale=self.stim_scale
+            )
+        prop_incorrect = 1 - prop_correct
+
+        # Now this is a bit awkward. We concatenate the psychometric
+        # functions for the different responses. To do that, we first have
+        # to add an additional dimension.
+        # TODO: There's got to be a neater way to do this?!
+        corr_resp_dim = {outcome_dim_name: [outcome_values[0]]}
+        inccorr_resp_dim = {outcome_dim_name: [outcome_values[1]]}
+
+        prop_correct = prop_correct.expand_dims(corr_resp_dim)
+        prop_incorrect = prop_incorrect.expand_dims(inccorr_resp_dim)
+
+        pf_values = xr.concat(
+            [prop_correct, prop_incorrect],
+            dim=outcome_dim_name,
+            coords=self.outcome_domain,
+        )
         return pf_values
 
-    def update(self, *,
-               stim: dict,
-               outcome: dict) -> None:
+    def update(self, *, stim: dict, outcome: dict) -> None:
         """
         Inform QUEST+ about a newly gathered measurement outcome for a given
         stimulus parameter set, and update the posterior accordingly.
@@ -242,8 +270,7 @@ class QuestPlus:
             The observed outcome.
 
         """
-        likelihood = (self.likelihoods
-                      .sel(**stim, **outcome))
+        likelihood = self.likelihoods.sel(**stim, **outcome)
 
         self.posterior = self.posterior * likelihood
         self.posterior /= self.posterior.sum()
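The expand_dims/concat idiom above (flagged as "a bit awkward" in the code) stacks the correct- and incorrect-response surfaces along a new outcome dimension. A minimal sketch of the same pattern, with made-up dimension names and values:

    import numpy as np
    import xarray as xr

    p_correct = xr.DataArray(np.array([0.6, 0.8, 0.95]),
                             dims=("intensity",),
                             coords={"intensity": [1, 2, 3]})
    p_incorrect = 1 - p_correct

    # Give each array a length-1 "response" dimension, then concatenate
    # along it; the result sums to 1 along "response" at every intensity.
    p_correct = p_correct.expand_dims({"response": ["Correct"]})
    p_incorrect = p_incorrect.expand_dims({"response": ["Incorrect"]})
    pf_values = xr.concat([p_correct, p_incorrect], dim="response")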
@@ -268,25 +295,37 @@ class QuestPlus:
         new_posterior /= pk
 
         # Entropies.
-        # Note that np.log(0) returns nan; xr.DataArray.sum() has special
-        # handling for this case.
-        H = -((new_posterior * np.log(new_posterior))
-              .sum(dim=self.param_domain.keys()))
+        #
+        # Note:
+        #   - np.log(0) returns -inf (division by zero)
+        #   - the multiplication of new_posterior with -inf values generates
+        #     NaN's
+        #   - xr.DataArray.sum() has special handling for NaN's.
+        #
+        # NumPy also emits a warning, which we suppress here.
+        with np.errstate(divide="ignore"):
+            H = -(
+                (new_posterior * np.log(new_posterior)).sum(
+                    dim=self.param_domain.keys()
+                )
+            )
 
         # Expected entropies for all possible stimulus parameters.
         EH = (pk * H).sum(dim=list(self.outcome_domain.keys()))
 
-        if self.stim_selection == 'min_entropy':
-            # Get coordinates of stimulus properties that minimize entropy.
-            index = np.unravel_index(EH.argmin(), EH.shape)
-            coords = EH[index].coords
-            stim = {stim_property: stim_val.item()
-                    for stim_property, stim_val in coords.items()}
+        if self.stim_selection == "min_entropy":
+            # Get the stimulus properties that minimize entropy.
+            indices = EH.argmin(dim=...)
+            stim = dict()
+            for stim_property, index in indices.items():
+                stim_val = EH[stim_property][index].item()
+                stim[stim_property] = stim_val
+
             self.entropy = EH.min().item()
-        elif self.stim_selection == 'min_n_entropy':
+        elif self.stim_selection == "min_n_entropy":
             # Number of stimuli to include (the n stimuli that yield the lowest
             # entropies)
-            n_stim = self.stim_selection_options['n']
+            n_stim = self.stim_selection_options["n"]
 
             indices = np.unravel_index(EH.argsort(), EH.shape)[0]
             indices = indices[:n_stim]
@@ -296,21 +335,24 @@ class QuestPlus:
                 # (stimulus parameters).
                 candidate_index = self._rng.choice(indices)
                 coords = EH[candidate_index].coords
-                stim = {stim_property: stim_val.item()
-                        for stim_property, stim_val in coords.items()}
+                stim = {
+                    stim_property: stim_val.item()
+                    for stim_property, stim_val in coords.items()
+                }
 
-                max_reps = self.stim_selection_options['max_consecutive_reps']
+                max_reps = self.stim_selection_options["max_consecutive_reps"]
 
                 if len(self.stim_history) < 2:
                     break
-                elif all([stim == prev_stim
-                          for prev_stim in self.stim_history[-max_reps:]]):
+                elif all(
+                    [stim == prev_stim for prev_stim in self.stim_history[-max_reps:]]
+                ):
                     # Shuffle again.
                     continue
                 else:
                     break
         else:
-            raise ValueError('Unknown stim_selection supplied.')
+            raise ValueError("Unknown stim_selection supplied.")
 
         return stim
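In plain terms, the selection rule above computes, for every candidate stimulus x and every possible outcome r, the posterior that would result and its entropy H(x, r), weighted by the predictive probability of that outcome: EH(x) = sum_r p(r | x) * H(x, r). With stim_selection="min_entropy" the stimulus minimizing EH is presented next; with "min_n_entropy" one of the n lowest-EH stimuli is drawn at random, re-drawing if the candidate would repeat the same stimulus more than max_consecutive_reps times in a row.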
 
@@ -332,18 +374,18 @@ class QuestPlus:
             params = list(self.param_domain.keys())
             params.remove(param_name)
 
-            if method == 'mean':
-                param_estimates[param_name] = ((self.posterior.sum(dim=params) *
-                                                self.param_domain[param_name])
-                                               .sum()
-                                               .item())
-            elif method == 'mode':
-                index = np.unravel_index(self.posterior.argmax(),
-                                         self.posterior.shape)
-                coords = self.posterior[index].coords
+            if method == "mean":
+                param_estimates[param_name] = (
+                    (self.posterior.sum(dim=params) * self.param_domain[param_name])
+                    .sum()
+                    .item()
+                )
+            elif method == "mode":
+                indices = self.posterior.argmax(dim=...)
+                coords = self.posterior[indices]
                 param_estimates[param_name] = coords[param_name].item()
             else:
-                raise ValueError('Unknown method parameter.')
+                raise ValueError("Unknown method parameter.")
 
         return param_estimates
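For reference, the two estimation methods above work as follows: with method="mean", each parameter estimate is the expectation of its marginal posterior, sum_theta p(theta) * theta (the joint posterior summed over all other parameters, multiplied by that parameter's domain values); with method="mode", all estimates are read off the coordinates of the joint posterior's maximum.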
 
@@ -361,9 +403,9 @@ class QuestPlus:
         for param_name in self.param_domain.keys():
             marginalized_out_params = list(self.param_domain.keys())
             marginalized_out_params.remove(param_name)
-            marginal_posterior[param_name] = (self.posterior
-                                              .sum(dim=marginalized_out_params)
-                                              .values)
+            marginal_posterior[param_name] = self.posterior.sum(
+                dim=marginalized_out_params
+            ).values
 
         return marginal_posterior
 
@@ -435,18 +477,21 @@ class QuestPlus:
             return False
 
         for param_name in self.param_domain.keys():
-            if not np.array_equal(self.param_domain[param_name],
-                                  other.param_domain[param_name]):
+            if not np.array_equal(
+                self.param_domain[param_name], other.param_domain[param_name]
+            ):
                 return False
 
         for stim_property in self.stim_domain.keys():
-            if not np.array_equal(self.stim_domain[stim_property],
-                                  other.stim_domain[stim_property]):
+            if not np.array_equal(
+                self.stim_domain[stim_property], other.stim_domain[stim_property]
+            ):
                 return False
 
         for outcome_name in self.outcome_domain.keys():
-            if not np.array_equal(self.outcome_domain[outcome_name],
-                                  other.outcome_domain[outcome_name]):
+            if not np.array_equal(
+                self.outcome_domain[outcome_name], other.outcome_domain[outcome_name]
+            ):
                 return False
 
         if self.stim_selection != other.stim_selection:
@@ -474,83 +519,92 @@ class QuestPlus:
 
 
 class QuestPlusWeibull(QuestPlus):
-    def __init__(self, *,
-                 intensities: Sequence,
-                 thresholds: Sequence,
-                 slopes: Sequence,
-                 lower_asymptotes: Sequence,
-                 lapse_rates: Sequence,
-                 prior: Optional[dict] = None,
-                 responses: Sequence = ('Yes', 'No'),
-                 stim_scale: str = 'log10',
-                 stim_selection_method: str = 'min_entropy',
-                 stim_selection_options: Optional[dict] = None,
-                 param_estimation_method: str = 'mean'):
-        super().__init__(stim_domain=dict(intensity=intensities),
-                         param_domain=dict(threshold=thresholds,
-                                           slope=slopes,
-                                           lower_asymptote=lower_asymptotes,
-                                           lapse_rate=lapse_rates),
-                         outcome_domain=dict(response=responses),
-                         prior=prior,
-                         stim_scale=stim_scale,
-                         stim_selection_method=stim_selection_method,
-                         stim_selection_options=stim_selection_options,
-                         param_estimation_method=param_estimation_method,
-                         func='weibull')
+    def __init__(
+        self,
+        *,
+        intensities: Sequence,
+        thresholds: Sequence,
+        slopes: Sequence,
+        lower_asymptotes: Sequence,
+        lapse_rates: Sequence,
+        prior: Optional[dict] = None,
+        responses: Sequence = ("Yes", "No"),
+        stim_scale: str = "log10",
+        stim_selection_method: str = "min_entropy",
+        stim_selection_options: Optional[dict] = None,
+        param_estimation_method: str = "mean",
+    ):
+        """QUEST+ using the Weibull distribution function.
+
+        This is a convenience class that wraps `QuestPlus`.
+        """
+        super().__init__(
+            stim_domain=dict(intensity=intensities),
+            param_domain=dict(
+                threshold=thresholds,
+                slope=slopes,
+                lower_asymptote=lower_asymptotes,
+                lapse_rate=lapse_rates,
+            ),
+            outcome_domain=dict(response=responses),
+            prior=prior,
+            stim_scale=stim_scale,
+            stim_selection_method=stim_selection_method,
+            stim_selection_options=stim_selection_options,
+            param_estimation_method=param_estimation_method,
+            func="weibull",
+        )
 
     @property
     def intensities(self) -> np.ndarray:
         """
         Stimulus intensity or contrast domain.
         """
-        return self.stim_domain['intensity']
+        return self.stim_domain["intensity"]
 
     @property
     def thresholds(self) -> np.ndarray:
         """
         The threshold parameter domain.
         """
-        return self.param_domain['threshold']
+        return self.param_domain["threshold"]
 
     @property
     def slopes(self) -> np.ndarray:
         """
         The slope parameter domain.
         """
-        return self.param_domain['slope']
+        return self.param_domain["slope"]
 
     @property
     def lower_asymptotes(self) -> np.ndarray:
         """
         The lower asymptote parameter domain.
         """
-        return self.param_domain['lower_asymptote']
+        return self.param_domain["lower_asymptote"]
 
     @property
     def lapse_rates(self) -> np.ndarray:
         """
         The lapse rate parameter domain.
         """
-        return self.param_domain['lapse_rate']
+        return self.param_domain["lapse_rate"]
 
     @property
     def responses(self) -> np.ndarray:
         """
         The response (outcome) domain.
         """
-        return self.outcome_domain['response']
+        return self.outcome_domain["response"]
 
     @property
     def next_intensity(self) -> float:
         """
         The intensity or contrast to present next.
         """
-        return super().next_stim['intensity']
+        return super().next_stim["intensity"]
 
-    def update(self, *,
-               intensity: float,
-               response: str) -> None:
+    def update(self, *, intensity: float, response: str) -> None:
         """
         Inform QUEST+ about a newly gathered measurement outcome for a given
         stimulus intensity or contrast, and update the posterior accordingly.
@@ -564,5 +618,101 @@ class QuestPlusWeibull(QuestPlus):
             The observed response.
 
         """
-        super().update(stim=dict(intensity=intensity),
-                       outcome=dict(response=response))
+        super().update(stim=dict(intensity=intensity), outcome=dict(response=response))
+
+
+class QuestPlusThurstone(QuestPlus):
+    def __init__(
+        self,
+        *,
+        physical_magnitudes_stim_1: Sequence,
+        physical_magnitudes_stim_2: Sequence,
+        thresholds: Sequence,
+        powers: Sequence,
+        perceptual_scale_maxs: Sequence,
+        prior: Optional[dict] = None,
+        responses: Sequence = ("First", "Second"),
+        stim_selection_method: str = "min_entropy",
+        stim_selection_options: Optional[dict] = None,
+        param_estimation_method: str = "mean",
+    ):
+        """QUEST+ for Thurstonian scaling.
+
+        This is a convenience class that wraps `QuestPlus`.
+        """
+        super().__init__(
+            stim_domain={
+                "physical_magnitudes_stim_1": physical_magnitudes_stim_1,
+                "physical_magnitudes_stim_2": physical_magnitudes_stim_2,
+            },
+            param_domain={
+                "threshold": thresholds,
+                "power": powers,
+                "perceptual_scale_max": perceptual_scale_maxs,
+            },
+            outcome_domain={"response": responses},
+            prior=prior,
+            stim_scale=None,
+            stim_selection_method=stim_selection_method,
+            stim_selection_options=stim_selection_options,
+            param_estimation_method=param_estimation_method,
+            func="thurstone_scaling",
+        )
+
+    @property
+    def physical_magnitudes_stim_1(self) -> np.ndarray:
+        """
+        Physical magnitudes of the first stimulus.
+        """
+        return self.stim_domain["physical_magnitudes_stim_1"]
+
+    @property
+    def physical_magnitudes_stim_2(self) -> np.ndarray:
+        """
+        Physical magnitudes of the second stimulus.
+        """
+        return self.stim_domain["physical_magnitudes_stim_2"]
+
+    @property
+    def thresholds(self) -> np.ndarray:
+        """
+        The threshold parameter domain.
+        """
+        return self.param_domain["threshold"]
+
+    @property
+    def powers(self) -> np.ndarray:
+        """
+        The power parameter domain.
+        """
+        return self.param_domain["power"]
+
+    @property
+    def perceptual_scale_maxs(self) -> np.ndarray:
+        """
+        The "maximum value of the subjective perceptual scale" parameter domain.
+        """
+        return self.param_domain["perceptual_scale_max"]
+
+    @property
+    def responses(self) -> np.ndarray:
+        """
+        The response (outcome) domain.
+        """
+        return self.outcome_domain["response"]
+
+    def update(self, *, stim: dict, response: str) -> None:
+        """
+        Inform QUEST+ about a newly gathered measurement outcome for a given
+        stimulus parameter set, and update the posterior accordingly.
+
+        Parameters
+        ----------
+        stim
+            The stimulus that was used to generate the given outcome.
+
+        response
+            The observed response.
+
+        """
+        super().update(stim=stim, outcome=dict(response=response))
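The new QuestPlusThurstone convenience class added above can be driven in a simple adaptive loop. Below is a minimal sketch based only on the API visible in this diff: the domains mirror the Thurstone-scaling test further down, the observer's responses are placeholders, and the stim_selection_options values are merely illustrative.

    import numpy as np
    from questplus.qp import QuestPlusThurstone

    qp = QuestPlusThurstone(
        physical_magnitudes_stim_1=np.arange(0, 1 + 0.1, 0.1),
        physical_magnitudes_stim_2=np.arange(0, 1 + 0.1, 0.1),
        thresholds=np.arange(0, 0.9 + 0.1, 0.1),
        powers=np.arange(0.1, 1 + 0.1, 0.1),
        perceptual_scale_maxs=np.arange(1, 10 + 1),
        stim_selection_method="min_n_entropy",
        stim_selection_options={"n": 4, "max_consecutive_reps": 2,
                                "random_seed": 0},  # illustrative values
    )

    for _ in range(5):
        stim = qp.next_stim        # the pair of physical magnitudes to present
        response = "First"         # placeholder for the observer's choice
        qp.update(stim=stim, response=response)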


=====================================
questplus/tests/test_qp.py
=====================================
@@ -1,7 +1,7 @@
 import pytest
 import scipy.stats
 import numpy as np
-from questplus.qp import QuestPlus, QuestPlusWeibull
+from questplus.qp import QuestPlus, QuestPlusWeibull, QuestPlusThurstone
 from questplus import _constants
 
 
@@ -373,6 +373,82 @@ def test_spatial_contrast_sensitivity():
                        expected_mode_cf)
 
 
+def test_thurstone_scaling():
+    """
+    Watson 2017, Example 6:
+    "Thurstone scaling {2, 3, 2}"
+    """
+    stim_magnitudes = np.arange(0, 1+0.1, 0.1)
+    perceptual_scale_maxs = np.arange(1, 10+1)
+    thresholds = np.arange(0, 0.9+0.1, 0.1)
+    powers = np.arange(0.1, 1+0.1, 0.1)
+
+    # Due to differences in rounding, the order of stimuli (1 or 2) is swapped on some trials
+    # compared to the paper. We therefore have to swap the example response as well.
+    #
+    # We're only testing the first 22 trials here.
+    responses = ['Second'] * 6
+    responses.extend(['Second'])       # rounding difference
+    responses.extend(['Second'] * 13)
+    responses.extend(['Second'])       # rounding difference
+    responses.extend(['First'])
+
+    expected_stims = [
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.7},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.6},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.5},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.4},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.0},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.4},
+        {'physical_magnitude_stim_1': 0.0, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.5, 'physical_magnitude_stim_2': 1.0},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.5, 'physical_magnitude_stim_2': 1.0},
+        {'physical_magnitude_stim_1': 0.5, 'physical_magnitude_stim_2': 1.0},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.5, 'physical_magnitude_stim_2': 1.0},
+        {'physical_magnitude_stim_1': 0.6, 'physical_magnitude_stim_2': 1.0},
+        {'physical_magnitude_stim_1': 0.2, 'physical_magnitude_stim_2': 0.3},
+        {'physical_magnitude_stim_1': 0.6, 'physical_magnitude_stim_2': 1.0},
+    ]
+
+    qp = QuestPlusThurstone(
+        physical_magnitudes_stim_1=stim_magnitudes,
+        physical_magnitudes_stim_2=stim_magnitudes,
+        thresholds=thresholds,
+        powers=powers,
+        perceptual_scale_maxs=perceptual_scale_maxs
+    )
+
+    for trial_idx, x in enumerate(zip(expected_stims, responses)):
+        expected_stim, response = x
+        
+        expected_stim_1 = expected_stim['physical_magnitude_stim_1']
+        expected_stim_2 = expected_stim['physical_magnitude_stim_2']
+        
+        next_stim_1 =  qp.next_stim['physical_magnitude_stim_1']
+        next_stim_2 =  qp.next_stim['physical_magnitude_stim_2']
+
+        if trial_idx in (6, 20):
+            # Rounding errors make the algorithm behave differently on different platforms.
+            if (
+                expected_stim_1 == next_stim_2 and
+                expected_stim_2 == next_stim_1
+            ):
+                expected_stim_1, expected_stim_2 = expected_stim_2, expected_stim_1
+                response = 'First' if response == 'Second' else 'Second'
+
+        assert np.isclose(next_stim_1, expected_stim_1)
+        assert np.isclose(next_stim_2, expected_stim_2)
+        qp.update(stim=qp.next_stim, response=response)
+
+
 def test_weibull():
     threshold = np.arange(-40, 0 + 1)
     slope, guess, lapse = 3.5, 0.5, 0.02


=====================================
questplus/utils.py
=====================================
@@ -3,12 +3,14 @@ import numpy as np
 from questplus import psychometric_function
 
 
-def simulate_response(*,
-                      func: str = 'weibull',
-                      stimulus: dict,
-                      params: dict,
-                      response_domain: Sequence = ('Correct', 'Incorrect'),
-                      stim_scale: str = 'log10') -> Union[float, str]:
+def simulate_response(
+    *,
+    func: str = "weibull",
+    stimulus: dict,
+    params: dict,
+    response_domain: Sequence = ("Correct", "Incorrect"),
+    stim_scale: str = "log10",
+) -> Union[float, str]:
     """
     Simulate an observer with the given psychometric function parameters.
 
@@ -36,14 +38,14 @@ def simulate_response(*,
         A simulated response for the given stimulus.
 
     """
-    if func == 'weibull':
+    if func == "weibull":
         f = psychometric_function.weibull
-        p_correct = f(intensity=stimulus['intensity'],
-                      **params, scale=stim_scale).squeeze()
+        p_correct = f(
+            intensity=stimulus["intensity"], **params, scale=stim_scale
+        ).squeeze()
 
-        response = np.random.choice(response_domain,
-                                    p=[p_correct, 1-p_correct])
+        response = np.random.choice(response_domain, p=[p_correct, 1 - p_correct])
     else:
-        raise ValueError('Invalid function specified.')
+        raise ValueError("Invalid function specified.")
 
     return response
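The reformatted simulate_response() helper pairs naturally with the QuestPlusWeibull class for closed-loop simulations. A small sketch based on the signatures visible in this diff; the observer parameters and trial count are made up:

    import numpy as np
    from questplus.qp import QuestPlusWeibull
    from questplus.utils import simulate_response

    qp = QuestPlusWeibull(
        intensities=np.arange(-40, 0 + 1),
        thresholds=np.arange(-40, 0 + 1),
        slopes=[3.5],
        lower_asymptotes=[0.5],
        lapse_rates=[0.02],
        responses=("Correct", "Incorrect"),
    )

    # Hypothetical ground-truth observer for the simulation.
    true_params = dict(threshold=-20, slope=3.5, lower_asymptote=0.5,
                       lapse_rate=0.02)

    for _ in range(20):
        intensity = qp.next_intensity
        response = simulate_response(
            func="weibull",
            stimulus={"intensity": intensity},
            params=true_params,
            response_domain=("Correct", "Incorrect"),
            stim_scale="log10",
        )
        qp.update(intensity=intensity, response=response)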


=====================================
setup.cfg deleted
=====================================
@@ -1,38 +0,0 @@
-[metadata]
-name = questplus
-author = Richard Höchenberger <richard.hoechenberger at gmail.com>
-author_email = richard.hoechenberger at gmail.com
-url = https://github.com/hoechenberger/questplus
-project_urls =
-    Source Code = https://github.com/hoechenberger/questplus
-    Bug Tracker = https://github.com/hoechenberger/questplus/issues
-license = GPL v3
-license_file = LICENSE
-description = A QUEST+ implementation in Python.
-long_description = file: README.md
-long_description_content_type = text/markdown
-classifiers =
-    Intended Audience :: Science/Research
-    Programming Language :: Python
-    Programming Language :: Python :: 3
-    Programming Language :: Python :: 3.6
-    Programming Language :: Python :: 3.7
-    Programming Language :: Python :: 3.8
-
-[options]
-python_requires = >=3.6
-install_requires =
-    numpy
-    scipy
-    xarray
-    json_tricks
-
-[bdist_wheel]
-universal = 1
-
-[versioneer]
-VCS = git
-style = pep440
-versionfile_source = questplus/_version.py
-versionfile_build = questplus/_version.py
-tag_prefix = ''


=====================================
setup.py deleted
=====================================
@@ -1,15 +0,0 @@
-import sys
-import versioneer
-
-
-try:
-    from setuptools import setup, find_packages
-except ImportError:
-    raise sys.exit('Could not import setuptools.')
-
-
-setup(
-    version=versioneer.get_version(),
-    cmdclass=versioneer.get_cmdclass(),
-    packages=find_packages()
-)


=====================================
versioneer.py deleted
=====================================
@@ -1,1822 +0,0 @@
-
-# Version: 0.18
-
-"""The Versioneer - like a rocketeer, but for versions.
-
-The Versioneer
-==============
-
-* like a rocketeer, but for versions!
-* https://github.com/warner/python-versioneer
-* Brian Warner
-* License: Public Domain
-* Compatible With: python2.6, 2.7, 3.2, 3.3, 3.4, 3.5, 3.6, and pypy
-* [![Latest Version]
-(https://pypip.in/version/versioneer/badge.svg?style=flat)
-](https://pypi.python.org/pypi/versioneer/)
-* [![Build Status]
-(https://travis-ci.org/warner/python-versioneer.png?branch=master)
-](https://travis-ci.org/warner/python-versioneer)
-
-This is a tool for managing a recorded version number in distutils-based
-python projects. The goal is to remove the tedious and error-prone "update
-the embedded version string" step from your release process. Making a new
-release should be as easy as recording a new tag in your version-control
-system, and maybe making new tarballs.
-
-
-## Quick Install
-
-* `pip install versioneer` to somewhere to your $PATH
-* add a `[versioneer]` section to your setup.cfg (see below)
-* run `versioneer install` in your source tree, commit the results
-
-## Version Identifiers
-
-Source trees come from a variety of places:
-
-* a version-control system checkout (mostly used by developers)
-* a nightly tarball, produced by build automation
-* a snapshot tarball, produced by a web-based VCS browser, like github's
-  "tarball from tag" feature
-* a release tarball, produced by "setup.py sdist", distributed through PyPI
-
-Within each source tree, the version identifier (either a string or a number,
-this tool is format-agnostic) can come from a variety of places:
-
-* ask the VCS tool itself, e.g. "git describe" (for checkouts), which knows
-  about recent "tags" and an absolute revision-id
-* the name of the directory into which the tarball was unpacked
-* an expanded VCS keyword ($Id$, etc)
-* a `_version.py` created by some earlier build step
-
-For released software, the version identifier is closely related to a VCS
-tag. Some projects use tag names that include more than just the version
-string (e.g. "myproject-1.2" instead of just "1.2"), in which case the tool
-needs to strip the tag prefix to extract the version identifier. For
-unreleased software (between tags), the version identifier should provide
-enough information to help developers recreate the same tree, while also
-giving them an idea of roughly how old the tree is (after version 1.2, before
-version 1.3). Many VCS systems can report a description that captures this,
-for example `git describe --tags --dirty --always` reports things like
-"0.7-1-g574ab98-dirty" to indicate that the checkout is one revision past the
-0.7 tag, has a unique revision id of "574ab98", and is "dirty" (it has
-uncommitted changes.
-
-The version identifier is used for multiple purposes:
-
-* to allow the module to self-identify its version: `myproject.__version__`
-* to choose a name and prefix for a 'setup.py sdist' tarball
-
-## Theory of Operation
-
-Versioneer works by adding a special `_version.py` file into your source
-tree, where your `__init__.py` can import it. This `_version.py` knows how to
-dynamically ask the VCS tool for version information at import time.
-
-`_version.py` also contains `$Revision$` markers, and the installation
-process marks `_version.py` to have this marker rewritten with a tag name
-during the `git archive` command. As a result, generated tarballs will
-contain enough information to get the proper version.
-
-To allow `setup.py` to compute a version too, a `versioneer.py` is added to
-the top level of your source tree, next to `setup.py` and the `setup.cfg`
-that configures it. This overrides several distutils/setuptools commands to
-compute the version when invoked, and changes `setup.py build` and `setup.py
-sdist` to replace `_version.py` with a small static file that contains just
-the generated version data.
-
-## Installation
-
-See [INSTALL.md](./INSTALL.md) for detailed installation instructions.
-
-## Version-String Flavors
-
-Code which uses Versioneer can learn about its version string at runtime by
-importing `_version` from your main `__init__.py` file and running the
-`get_versions()` function. From the "outside" (e.g. in `setup.py`), you can
-import the top-level `versioneer.py` and run `get_versions()`.
-
-Both functions return a dictionary with different flavors of version
-information:
-
-* `['version']`: A condensed version string, rendered using the selected
-  style. This is the most commonly used value for the project's version
-  string. The default "pep440" style yields strings like `0.11`,
-  `0.11+2.g1076c97`, or `0.11+2.g1076c97.dirty`. See the "Styles" section
-  below for alternative styles.
-
-* `['full-revisionid']`: detailed revision identifier. For Git, this is the
-  full SHA1 commit id, e.g. "1076c978a8d3cfc70f408fe5974aa6c092c949ac".
-
-* `['date']`: Date and time of the latest `HEAD` commit. For Git, it is the
-  commit date in ISO 8601 format. This will be None if the date is not
-  available.
-
-* `['dirty']`: a boolean, True if the tree has uncommitted changes. Note that
-  this is only accurate if run in a VCS checkout, otherwise it is likely to
-  be False or None
-
-* `['error']`: if the version string could not be computed, this will be set
-  to a string describing the problem, otherwise it will be None. It may be
-  useful to throw an exception in setup.py if this is set, to avoid e.g.
-  creating tarballs with a version string of "unknown".
-
-Some variants are more useful than others. Including `full-revisionid` in a
-bug report should allow developers to reconstruct the exact code being tested
-(or indicate the presence of local changes that should be shared with the
-developers). `version` is suitable for display in an "about" box or a CLI
-`--version` output: it can be easily compared against release notes and lists
-of bugs fixed in various releases.
-
-The installer adds the following text to your `__init__.py` to place a basic
-version in `YOURPROJECT.__version__`:
-
-    from ._version import get_versions
-    __version__ = get_versions()['version']
-    del get_versions
-
-## Styles
-
-The setup.cfg `style=` configuration controls how the VCS information is
-rendered into a version string.
-
-The default style, "pep440", produces a PEP440-compliant string, equal to the
-un-prefixed tag name for actual releases, and containing an additional "local
-version" section with more detail for in-between builds. For Git, this is
-TAG[+DISTANCE.gHEX[.dirty]] , using information from `git describe --tags
---dirty --always`. For example "0.11+2.g1076c97.dirty" indicates that the
-tree is like the "1076c97" commit but has uncommitted changes (".dirty"), and
-that this commit is two revisions ("+2") beyond the "0.11" tag. For released
-software (exactly equal to a known tag), the identifier will only contain the
-stripped tag, e.g. "0.11".
-
-Other styles are available. See [details.md](details.md) in the Versioneer
-source tree for descriptions.
-
-## Debugging
-
-Versioneer tries to avoid fatal errors: if something goes wrong, it will tend
-to return a version of "0+unknown". To investigate the problem, run `setup.py
-version`, which will run the version-lookup code in a verbose mode, and will
-display the full contents of `get_versions()` (including the `error` string,
-which may help identify what went wrong).
-
-## Known Limitations
-
-Some situations are known to cause problems for Versioneer. This details the
-most significant ones. More can be found on Github
-[issues page](https://github.com/warner/python-versioneer/issues).
-
-### Subprojects
-
-Versioneer has limited support for source trees in which `setup.py` is not in
-the root directory (e.g. `setup.py` and `.git/` are *not* siblings). The are
-two common reasons why `setup.py` might not be in the root:
-
-* Source trees which contain multiple subprojects, such as
-  [Buildbot](https://github.com/buildbot/buildbot), which contains both
-  "master" and "slave" subprojects, each with their own `setup.py`,
-  `setup.cfg`, and `tox.ini`. Projects like these produce multiple PyPI
-  distributions (and upload multiple independently-installable tarballs).
-* Source trees whose main purpose is to contain a C library, but which also
-  provide bindings to Python (and perhaps other langauges) in subdirectories.
-
-Versioneer will look for `.git` in parent directories, and most operations
-should get the right version string. However `pip` and `setuptools` have bugs
-and implementation details which frequently cause `pip install .` from a
-subproject directory to fail to find a correct version string (so it usually
-defaults to `0+unknown`).
-
-`pip install --editable .` should work correctly. `setup.py install` might
-work too.
-
-Pip-8.1.1 is known to have this problem, but hopefully it will get fixed in
-some later version.
-
-[Bug #38](https://github.com/warner/python-versioneer/issues/38) is tracking
-this issue. The discussion in
-[PR #61](https://github.com/warner/python-versioneer/pull/61) describes the
-issue from the Versioneer side in more detail.
-[pip PR#3176](https://github.com/pypa/pip/pull/3176) and
-[pip PR#3615](https://github.com/pypa/pip/pull/3615) contain work to improve
-pip to let Versioneer work correctly.
-
-Versioneer-0.16 and earlier only looked for a `.git` directory next to the
-`setup.cfg`, so subprojects were completely unsupported with those releases.
-
-### Editable installs with setuptools <= 18.5
-
-`setup.py develop` and `pip install --editable .` allow you to install a
-project into a virtualenv once, then continue editing the source code (and
-test) without re-installing after every change.
-
-"Entry-point scripts" (`setup(entry_points={"console_scripts": ..})`) are a
-convenient way to specify executable scripts that should be installed along
-with the python package.
-
-These both work as expected when using modern setuptools. When using
-setuptools-18.5 or earlier, however, certain operations will cause
-`pkg_resources.DistributionNotFound` errors when running the entrypoint
-script, which must be resolved by re-installing the package. This happens
-when the install happens with one version, then the egg_info data is
-regenerated while a different version is checked out. Many setup.py commands
-cause egg_info to be rebuilt (including `sdist`, `wheel`, and installing into
-a different virtualenv), so this can be surprising.
-
-[Bug #83](https://github.com/warner/python-versioneer/issues/83) describes
-this one, but upgrading to a newer version of setuptools should probably
-resolve it.
-
-### Unicode version strings
-
-While Versioneer works (and is continually tested) with both Python 2 and
-Python 3, it is not entirely consistent with bytes-vs-unicode distinctions.
-Newer releases probably generate unicode version strings on py2. It's not
-clear that this is wrong, but it may be surprising for applications when then
-write these strings to a network connection or include them in bytes-oriented
-APIs like cryptographic checksums.
-
-[Bug #71](https://github.com/warner/python-versioneer/issues/71) investigates
-this question.
-
-
-## Updating Versioneer
-
-To upgrade your project to a new release of Versioneer, do the following:
-
-* install the new Versioneer (`pip install -U versioneer` or equivalent)
-* edit `setup.cfg`, if necessary, to include any new configuration settings
-  indicated by the release notes. See [UPGRADING](./UPGRADING.md) for details.
-* re-run `versioneer install` in your source tree, to replace
-  `SRC/_version.py`
-* commit any changed files
-
-## Future Directions
-
-This tool is designed to make it easily extended to other version-control
-systems: all VCS-specific components are in separate directories like
-src/git/ . The top-level `versioneer.py` script is assembled from these
-components by running make-versioneer.py . In the future, make-versioneer.py
-will take a VCS name as an argument, and will construct a version of
-`versioneer.py` that is specific to the given VCS. It might also take the
-configuration arguments that are currently provided manually during
-installation by editing setup.py . Alternatively, it might go the other
-direction and include code from all supported VCS systems, reducing the
-number of intermediate scripts.
-
-
-## License
-
-To make Versioneer easier to embed, all its code is dedicated to the public
-domain. The `_version.py` that it creates is also in the public domain.
-Specifically, both are released under the Creative Commons "Public Domain
-Dedication" license (CC0-1.0), as described in
-https://creativecommons.org/publicdomain/zero/1.0/ .
-
-"""
-
-from __future__ import print_function
-try:
-    import configparser
-except ImportError:
-    import ConfigParser as configparser
-import errno
-import json
-import os
-import re
-import subprocess
-import sys
-
-
-class VersioneerConfig:
-    """Container for Versioneer configuration parameters."""
-
-
-def get_root():
-    """Get the project root directory.
-
-    We require that all commands are run from the project root, i.e. the
-    directory that contains setup.py, setup.cfg, and versioneer.py .
-    """
-    root = os.path.realpath(os.path.abspath(os.getcwd()))
-    setup_py = os.path.join(root, "setup.py")
-    versioneer_py = os.path.join(root, "versioneer.py")
-    if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)):
-        # allow 'python path/to/setup.py COMMAND'
-        root = os.path.dirname(os.path.realpath(os.path.abspath(sys.argv[0])))
-        setup_py = os.path.join(root, "setup.py")
-        versioneer_py = os.path.join(root, "versioneer.py")
-    if not (os.path.exists(setup_py) or os.path.exists(versioneer_py)):
-        err = ("Versioneer was unable to run the project root directory. "
-               "Versioneer requires setup.py to be executed from "
-               "its immediate directory (like 'python setup.py COMMAND'), "
-               "or in a way that lets it use sys.argv[0] to find the root "
-               "(like 'python path/to/setup.py COMMAND').")
-        raise VersioneerBadRootError(err)
-    try:
-        # Certain runtime workflows (setup.py install/develop in a setuptools
-        # tree) execute all dependencies in a single python process, so
-        # "versioneer" may be imported multiple times, and python's shared
-        # module-import table will cache the first one. So we can't use
-        # os.path.dirname(__file__), as that will find whichever
-        # versioneer.py was first imported, even in later projects.
-        me = os.path.realpath(os.path.abspath(__file__))
-        me_dir = os.path.normcase(os.path.splitext(me)[0])
-        vsr_dir = os.path.normcase(os.path.splitext(versioneer_py)[0])
-        if me_dir != vsr_dir:
-            print("Warning: build in %s is using versioneer.py from %s"
-                  % (os.path.dirname(me), versioneer_py))
-    except NameError:
-        pass
-    return root
-
-
-def get_config_from_root(root):
-    """Read the project setup.cfg file to determine Versioneer config."""
-    # This might raise EnvironmentError (if setup.cfg is missing), or
-    # configparser.NoSectionError (if it lacks a [versioneer] section), or
-    # configparser.NoOptionError (if it lacks "VCS="). See the docstring at
-    # the top of versioneer.py for instructions on writing your setup.cfg .
-    setup_cfg = os.path.join(root, "setup.cfg")
-    parser = configparser.SafeConfigParser()
-    with open(setup_cfg, "r") as f:
-        parser.readfp(f)
-    VCS = parser.get("versioneer", "VCS")  # mandatory
-
-    def get(parser, name):
-        if parser.has_option("versioneer", name):
-            return parser.get("versioneer", name)
-        return None
-    cfg = VersioneerConfig()
-    cfg.VCS = VCS
-    cfg.style = get(parser, "style") or ""
-    cfg.versionfile_source = get(parser, "versionfile_source")
-    cfg.versionfile_build = get(parser, "versionfile_build")
-    cfg.tag_prefix = get(parser, "tag_prefix")
-    if cfg.tag_prefix in ("''", '""'):
-        cfg.tag_prefix = ""
-    cfg.parentdir_prefix = get(parser, "parentdir_prefix")
-    cfg.verbose = get(parser, "verbose")
-    return cfg
-
-
-class NotThisMethod(Exception):
-    """Exception raised if a method is not valid for the current scenario."""
-
-
-# these dictionaries contain VCS-specific tools
-LONG_VERSION_PY = {}
-HANDLERS = {}
-
-
-def register_vcs_handler(vcs, method):  # decorator
-    """Decorator to mark a method as the handler for a particular VCS."""
-    def decorate(f):
-        """Store f in HANDLERS[vcs][method]."""
-        if vcs not in HANDLERS:
-            HANDLERS[vcs] = {}
-        HANDLERS[vcs][method] = f
-        return f
-    return decorate
-
-
-def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
-                env=None):
-    """Call the given command(s)."""
-    assert isinstance(commands, list)
-    p = None
-    for c in commands:
-        try:
-            dispcmd = str([c] + args)
-            # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
-            break
-        except EnvironmentError:
-            e = sys.exc_info()[1]
-            if e.errno == errno.ENOENT:
-                continue
-            if verbose:
-                print("unable to run %s" % dispcmd)
-                print(e)
-            return None, None
-    else:
-        if verbose:
-            print("unable to find command, tried %s" % (commands,))
-        return None, None
-    stdout = p.communicate()[0].strip()
-    if sys.version_info[0] >= 3:
-        stdout = stdout.decode()
-    if p.returncode != 0:
-        if verbose:
-            print("unable to run %s (error)" % dispcmd)
-            print("stdout was %s" % stdout)
-        return None, p.returncode
-    return stdout, p.returncode
-
-
-LONG_VERSION_PY['git'] = '''
-# This file helps to compute a version number in source trees obtained from
-# git-archive tarball (such as those provided by githubs download-from-tag
-# feature). Distribution tarballs (built by setup.py sdist) and build
-# directories (produced by setup.py build) will contain a much shorter file
-# that just contains the computed version number.
-
-# This file is released into the public domain. Generated by
-# versioneer-0.18 (https://github.com/warner/python-versioneer)
-
-"""Git implementation of _version.py."""
-
-import errno
-import os
-import re
-import subprocess
-import sys
-
-
-def get_keywords():
-    """Get the keywords needed to look up the version information."""
-    # these strings will be replaced by git during git-archive.
-    # setup.py/versioneer.py will grep for the variable names, so they must
-    # each be defined on a line of their own. _version.py will just call
-    # get_keywords().
-    git_refnames = "%(DOLLAR)sFormat:%%d%(DOLLAR)s"
-    git_full = "%(DOLLAR)sFormat:%%H%(DOLLAR)s"
-    git_date = "%(DOLLAR)sFormat:%%ci%(DOLLAR)s"
-    keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
-    return keywords
-
-
-class VersioneerConfig:
-    """Container for Versioneer configuration parameters."""
-
-
-def get_config():
-    """Create, populate and return the VersioneerConfig() object."""
-    # these strings are filled in when 'setup.py versioneer' creates
-    # _version.py
-    cfg = VersioneerConfig()
-    cfg.VCS = "git"
-    cfg.style = "%(STYLE)s"
-    cfg.tag_prefix = "%(TAG_PREFIX)s"
-    cfg.parentdir_prefix = "%(PARENTDIR_PREFIX)s"
-    cfg.versionfile_source = "%(VERSIONFILE_SOURCE)s"
-    cfg.verbose = False
-    return cfg
-
-
-class NotThisMethod(Exception):
-    """Exception raised if a method is not valid for the current scenario."""
-
-
-LONG_VERSION_PY = {}
-HANDLERS = {}
-
-
-def register_vcs_handler(vcs, method):  # decorator
-    """Decorator to mark a method as the handler for a particular VCS."""
-    def decorate(f):
-        """Store f in HANDLERS[vcs][method]."""
-        if vcs not in HANDLERS:
-            HANDLERS[vcs] = {}
-        HANDLERS[vcs][method] = f
-        return f
-    return decorate
-
-
-def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,
-                env=None):
-    """Call the given command(s)."""
-    assert isinstance(commands, list)
-    p = None
-    for c in commands:
-        try:
-            dispcmd = str([c] + args)
-            # remember shell=False, so use git.cmd on windows, not just git
-            p = subprocess.Popen([c] + args, cwd=cwd, env=env,
-                                 stdout=subprocess.PIPE,
-                                 stderr=(subprocess.PIPE if hide_stderr
-                                         else None))
-            break
-        except EnvironmentError:
-            e = sys.exc_info()[1]
-            if e.errno == errno.ENOENT:
-                continue
-            if verbose:
-                print("unable to run %%s" %% dispcmd)
-                print(e)
-            return None, None
-    else:
-        if verbose:
-            print("unable to find command, tried %%s" %% (commands,))
-        return None, None
-    stdout = p.communicate()[0].strip()
-    if sys.version_info[0] >= 3:
-        stdout = stdout.decode()
-    if p.returncode != 0:
-        if verbose:
-            print("unable to run %%s (error)" %% dispcmd)
-            print("stdout was %%s" %% stdout)
-        return None, p.returncode
-    return stdout, p.returncode
-
-
-def versions_from_parentdir(parentdir_prefix, root, verbose):
-    """Try to determine the version from the parent directory name.
-
-    Source tarballs conventionally unpack into a directory that includes both
-    the project name and a version string. We will also support searching up
-    two directory levels for an appropriately named parent directory
-    """
-    rootdirs = []
-
-    for i in range(3):
-        dirname = os.path.basename(root)
-        if dirname.startswith(parentdir_prefix):
-            return {"version": dirname[len(parentdir_prefix):],
-                    "full-revisionid": None,
-                    "dirty": False, "error": None, "date": None}
-        else:
-            rootdirs.append(root)
-            root = os.path.dirname(root)  # up a level
-
-    if verbose:
-        print("Tried directories %%s but none started with prefix %%s" %%
-              (str(rootdirs), parentdir_prefix))
-    raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
-
-
- at register_vcs_handler("git", "get_keywords")
-def git_get_keywords(versionfile_abs):
-    """Extract version information from the given file."""
-    # the code embedded in _version.py can just fetch the value of these
-    # keywords. When used from setup.py, we don't want to import _version.py,
-    # so we do it with a regexp instead. This function is not used from
-    # _version.py.
-    keywords = {}
-    try:
-        f = open(versionfile_abs, "r")
-        for line in f.readlines():
-            if line.strip().startswith("git_refnames ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["refnames"] = mo.group(1)
-            if line.strip().startswith("git_full ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["full"] = mo.group(1)
-            if line.strip().startswith("git_date ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["date"] = mo.group(1)
-        f.close()
-    except EnvironmentError:
-        pass
-    return keywords
-
-
- at register_vcs_handler("git", "keywords")
-def git_versions_from_keywords(keywords, tag_prefix, verbose):
-    """Get version information from git keywords."""
-    if not keywords:
-        raise NotThisMethod("no keywords at all, weird")
-    date = keywords.get("date")
-    if date is not None:
-        # git-2.2.0 added "%%cI", which expands to an ISO-8601 -compliant
-        # datestamp. However we prefer "%%ci" (which expands to an "ISO-8601
-        # -like" string, which we must then edit to make compliant), because
-        # it's been around since git-1.5.3, and it's too difficult to
-        # discover which version we're using, or to work around using an
-        # older one.
-        date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
-    refnames = keywords["refnames"].strip()
-    if refnames.startswith("$Format"):
-        if verbose:
-            print("keywords are unexpanded, not using")
-        raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
-    # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
-    # just "foo-1.0". If we see a "tag: " prefix, prefer those.
-    TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
-    if not tags:
-        # Either we're using git < 1.8.3, or there really are no tags. We use
-        # a heuristic: assume all version tags have a digit. The old git %%d
-        # expansion behaves like git log --decorate=short and strips out the
-        # refs/heads/ and refs/tags/ prefixes that would let us distinguish
-        # between branches and tags. By ignoring refnames without digits, we
-        # filter out many common branch names like "release" and
-        # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
-        if verbose:
-            print("discarding '%%s', no digits" %% ",".join(refs - tags))
-    if verbose:
-        print("likely tags: %%s" %% ",".join(sorted(tags)))
-    for ref in sorted(tags):
-        # sorting will prefer e.g. "2.0" over "2.0rc1"
-        if ref.startswith(tag_prefix):
-            r = ref[len(tag_prefix):]
-            if verbose:
-                print("picking %%s" %% r)
-            return {"version": r,
-                    "full-revisionid": keywords["full"].strip(),
-                    "dirty": False, "error": None,
-                    "date": date}
-    # no suitable tags, so version is "0+unknown", but full hex is still there
-    if verbose:
-        print("no suitable tags, using unknown + full revision id")
-    return {"version": "0+unknown",
-            "full-revisionid": keywords["full"].strip(),
-            "dirty": False, "error": "no suitable tags", "date": None}
-
-
- at register_vcs_handler("git", "pieces_from_vcs")
-def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
-    """Get version from 'git describe' in the root of the source tree.
-
-    This only gets called if the git-archive 'subst' keywords were *not*
-    expanded, and _version.py hasn't already been rewritten with a short
-    version string, meaning we're inside a checked out source tree.
-    """
-    GITS = ["git"]
-    if sys.platform == "win32":
-        GITS = ["git.cmd", "git.exe"]
-
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
-    if rc != 0:
-        if verbose:
-            print("Directory %%s not under git control" %% root)
-        raise NotThisMethod("'git rev-parse --git-dir' returned error")
-
-    # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
-    # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%%s*" %% tag_prefix],
-                                   cwd=root)
-    # --long was added in git-1.5.5
-    if describe_out is None:
-        raise NotThisMethod("'git describe' failed")
-    describe_out = describe_out.strip()
-    full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
-    if full_out is None:
-        raise NotThisMethod("'git rev-parse' failed")
-    full_out = full_out.strip()
-
-    pieces = {}
-    pieces["long"] = full_out
-    pieces["short"] = full_out[:7]  # maybe improved later
-    pieces["error"] = None
-
-    # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
-    # TAG might have hyphens.
-    git_describe = describe_out
-
-    # look for -dirty suffix
-    dirty = git_describe.endswith("-dirty")
-    pieces["dirty"] = dirty
-    if dirty:
-        git_describe = git_describe[:git_describe.rindex("-dirty")]
-
-    # now we have TAG-NUM-gHEX or HEX
-
-    if "-" in git_describe:
-        # TAG-NUM-gHEX
-        mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
-        if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
-            pieces["error"] = ("unable to parse git-describe output: '%%s'"
-                               %% describe_out)
-            return pieces
-
-        # tag
-        full_tag = mo.group(1)
-        if not full_tag.startswith(tag_prefix):
-            if verbose:
-                fmt = "tag '%%s' doesn't start with prefix '%%s'"
-                print(fmt %% (full_tag, tag_prefix))
-            pieces["error"] = ("tag '%%s' doesn't start with prefix '%%s'"
-                               %% (full_tag, tag_prefix))
-            return pieces
-        pieces["closest-tag"] = full_tag[len(tag_prefix):]
-
-        # distance: number of commits since tag
-        pieces["distance"] = int(mo.group(2))
-
-        # commit: short hex revision ID
-        pieces["short"] = mo.group(3)
-
-    else:
-        # HEX: no tags
-        pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
-        pieces["distance"] = int(count_out)  # total number of commits
-
-    # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%%ci", "HEAD"],
-                       cwd=root)[0].strip()
-    pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
-
-    return pieces
-
-
-def plus_or_dot(pieces):
-    """Return a + if we don't already have one, else return a ."""
-    if "+" in pieces.get("closest-tag", ""):
-        return "."
-    return "+"
-
-
-def render_pep440(pieces):
-    """Build up version string, with post-release "local version identifier".
-
-    Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
-    get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty
-
-    Exceptions:
-    1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += plus_or_dot(pieces)
-            rendered += "%%d.g%%s" %% (pieces["distance"], pieces["short"])
-            if pieces["dirty"]:
-                rendered += ".dirty"
-    else:
-        # exception #1
-        rendered = "0+untagged.%%d.g%%s" %% (pieces["distance"],
-                                          pieces["short"])
-        if pieces["dirty"]:
-            rendered += ".dirty"
-    return rendered
-
-
-def render_pep440_pre(pieces):
-    """TAG[.post.devDISTANCE] -- No -dirty.
-
-    Exceptions:
-    1: no tags. 0.post.devDISTANCE
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"]:
-            rendered += ".post.dev%%d" %% pieces["distance"]
-    else:
-        # exception #1
-        rendered = "0.post.dev%%d" %% pieces["distance"]
-    return rendered
-
-
-def render_pep440_post(pieces):
-    """TAG[.postDISTANCE[.dev0]+gHEX] .
-
-    The ".dev0" means dirty. Note that .dev0 sorts backwards
-    (a dirty tree will appear "older" than the corresponding clean one),
-    but you shouldn't be releasing software with -dirty anyways.
-
-    Exceptions:
-    1: no tags. 0.postDISTANCE[.dev0]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += ".post%%d" %% pieces["distance"]
-            if pieces["dirty"]:
-                rendered += ".dev0"
-            rendered += plus_or_dot(pieces)
-            rendered += "g%%s" %% pieces["short"]
-    else:
-        # exception #1
-        rendered = "0.post%%d" %% pieces["distance"]
-        if pieces["dirty"]:
-            rendered += ".dev0"
-        rendered += "+g%%s" %% pieces["short"]
-    return rendered
-
-
-def render_pep440_old(pieces):
-    """TAG[.postDISTANCE[.dev0]] .
-
-    The ".dev0" means dirty.
-
-    Exceptions:
-    1: no tags. 0.postDISTANCE[.dev0]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += ".post%%d" %% pieces["distance"]
-            if pieces["dirty"]:
-                rendered += ".dev0"
-    else:
-        # exception #1
-        rendered = "0.post%%d" %% pieces["distance"]
-        if pieces["dirty"]:
-            rendered += ".dev0"
-    return rendered
-
-
-def render_git_describe(pieces):
-    """TAG[-DISTANCE-gHEX][-dirty].
-
-    Like 'git describe --tags --dirty --always'.
-
-    Exceptions:
-    1: no tags. HEX[-dirty]  (note: no 'g' prefix)
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"]:
-            rendered += "-%%d-g%%s" %% (pieces["distance"], pieces["short"])
-    else:
-        # exception #1
-        rendered = pieces["short"]
-    if pieces["dirty"]:
-        rendered += "-dirty"
-    return rendered
-
-
-def render_git_describe_long(pieces):
-    """TAG-DISTANCE-gHEX[-dirty].
-
-    Like 'git describe --tags --dirty --always --long'.
-    The distance/hash is unconditional.
-
-    Exceptions:
-    1: no tags. HEX[-dirty]  (note: no 'g' prefix)
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        rendered += "-%%d-g%%s" %% (pieces["distance"], pieces["short"])
-    else:
-        # exception #1
-        rendered = pieces["short"]
-    if pieces["dirty"]:
-        rendered += "-dirty"
-    return rendered
-
-
-def render(pieces, style):
-    """Render the given version pieces into the requested style."""
-    if pieces["error"]:
-        return {"version": "unknown",
-                "full-revisionid": pieces.get("long"),
-                "dirty": None,
-                "error": pieces["error"],
-                "date": None}
-
-    if not style or style == "default":
-        style = "pep440"  # the default
-
-    if style == "pep440":
-        rendered = render_pep440(pieces)
-    elif style == "pep440-pre":
-        rendered = render_pep440_pre(pieces)
-    elif style == "pep440-post":
-        rendered = render_pep440_post(pieces)
-    elif style == "pep440-old":
-        rendered = render_pep440_old(pieces)
-    elif style == "git-describe":
-        rendered = render_git_describe(pieces)
-    elif style == "git-describe-long":
-        rendered = render_git_describe_long(pieces)
-    else:
-        raise ValueError("unknown style '%%s'" %% style)
-
-    return {"version": rendered, "full-revisionid": pieces["long"],
-            "dirty": pieces["dirty"], "error": None,
-            "date": pieces.get("date")}
-
-
-def get_versions():
-    """Get version information or return default if unable to do so."""
-    # I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
-    # __file__, we can work backwards from there to the root. Some
-    # py2exe/bbfreeze/non-CPython implementations don't do __file__, in which
-    # case we can only use expanded keywords.
-
-    cfg = get_config()
-    verbose = cfg.verbose
-
-    try:
-        return git_versions_from_keywords(get_keywords(), cfg.tag_prefix,
-                                          verbose)
-    except NotThisMethod:
-        pass
-
-    try:
-        root = os.path.realpath(__file__)
-        # versionfile_source is the relative path from the top of the source
-        # tree (where the .git directory might live) to this file. Invert
-        # this to find the root from __file__.
-        for i in cfg.versionfile_source.split('/'):
-            root = os.path.dirname(root)
-    except NameError:
-        return {"version": "0+unknown", "full-revisionid": None,
-                "dirty": None,
-                "error": "unable to find root of source tree",
-                "date": None}
-
-    try:
-        pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)
-        return render(pieces, cfg.style)
-    except NotThisMethod:
-        pass
-
-    try:
-        if cfg.parentdir_prefix:
-            return versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
-    except NotThisMethod:
-        pass
-
-    return {"version": "0+unknown", "full-revisionid": None,
-            "dirty": None,
-            "error": "unable to compute version", "date": None}
-'''
-
-
-@register_vcs_handler("git", "get_keywords")
-def git_get_keywords(versionfile_abs):
-    """Extract version information from the given file."""
-    # the code embedded in _version.py can just fetch the value of these
-    # keywords. When used from setup.py, we don't want to import _version.py,
-    # so we do it with a regexp instead. This function is not used from
-    # _version.py.
-    keywords = {}
-    try:
-        f = open(versionfile_abs, "r")
-        for line in f.readlines():
-            if line.strip().startswith("git_refnames ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["refnames"] = mo.group(1)
-            if line.strip().startswith("git_full ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["full"] = mo.group(1)
-            if line.strip().startswith("git_date ="):
-                mo = re.search(r'=\s*"(.*)"', line)
-                if mo:
-                    keywords["date"] = mo.group(1)
-        f.close()
-    except EnvironmentError:
-        pass
-    return keywords
-
-
-@register_vcs_handler("git", "keywords")
-def git_versions_from_keywords(keywords, tag_prefix, verbose):
-    """Get version information from git keywords."""
-    if not keywords:
-        raise NotThisMethod("no keywords at all, weird")
-    date = keywords.get("date")
-    if date is not None:
-        # git-2.2.0 added "%cI", which expands to an ISO-8601 -compliant
-        # datestamp. However we prefer "%ci" (which expands to an "ISO-8601
-        # -like" string, which we must then edit to make compliant), because
-        # it's been around since git-1.5.3, and it's too difficult to
-        # discover which version we're using, or to work around using an
-        # older one.
-        date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
-    refnames = keywords["refnames"].strip()
-    if refnames.startswith("$Format"):
-        if verbose:
-            print("keywords are unexpanded, not using")
-        raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
-    refs = set([r.strip() for r in refnames.strip("()").split(",")])
-    # starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
-    # just "foo-1.0". If we see a "tag: " prefix, prefer those.
-    TAG = "tag: "
-    tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])
-    if not tags:
-        # Either we're using git < 1.8.3, or there really are no tags. We use
-        # a heuristic: assume all version tags have a digit. The old git %d
-        # expansion behaves like git log --decorate=short and strips out the
-        # refs/heads/ and refs/tags/ prefixes that would let us distinguish
-        # between branches and tags. By ignoring refnames without digits, we
-        # filter out many common branch names like "release" and
-        # "stabilization", as well as "HEAD" and "master".
-        tags = set([r for r in refs if re.search(r'\d', r)])
-        if verbose:
-            print("discarding '%s', no digits" % ",".join(refs - tags))
-    if verbose:
-        print("likely tags: %s" % ",".join(sorted(tags)))
-    for ref in sorted(tags):
-        # sorting will prefer e.g. "2.0" over "2.0rc1"
-        if ref.startswith(tag_prefix):
-            r = ref[len(tag_prefix):]
-            if verbose:
-                print("picking %s" % r)
-            return {"version": r,
-                    "full-revisionid": keywords["full"].strip(),
-                    "dirty": False, "error": None,
-                    "date": date}
-    # no suitable tags, so version is "0+unknown", but full hex is still there
-    if verbose:
-        print("no suitable tags, using unknown + full revision id")
-    return {"version": "0+unknown",
-            "full-revisionid": keywords["full"].strip(),
-            "dirty": False, "error": "no suitable tags", "date": None}
-
-
-@register_vcs_handler("git", "pieces_from_vcs")
-def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
-    """Get version from 'git describe' in the root of the source tree.
-
-    This only gets called if the git-archive 'subst' keywords were *not*
-    expanded, and _version.py hasn't already been rewritten with a short
-    version string, meaning we're inside a checked out source tree.
-    """
-    GITS = ["git"]
-    if sys.platform == "win32":
-        GITS = ["git.cmd", "git.exe"]
-
-    out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root,
-                          hide_stderr=True)
-    if rc != 0:
-        if verbose:
-            print("Directory %s not under git control" % root)
-        raise NotThisMethod("'git rev-parse --git-dir' returned error")
-
-    # if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
-    # if there isn't one, this yields HEX[-dirty] (no NUM)
-    describe_out, rc = run_command(GITS, ["describe", "--tags", "--dirty",
-                                          "--always", "--long",
-                                          "--match", "%s*" % tag_prefix],
-                                   cwd=root)
-    # --long was added in git-1.5.5
-    if describe_out is None:
-        raise NotThisMethod("'git describe' failed")
-    describe_out = describe_out.strip()
-    full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
-    if full_out is None:
-        raise NotThisMethod("'git rev-parse' failed")
-    full_out = full_out.strip()
-
-    pieces = {}
-    pieces["long"] = full_out
-    pieces["short"] = full_out[:7]  # maybe improved later
-    pieces["error"] = None
-
-    # parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
-    # TAG might have hyphens.
-    git_describe = describe_out
-
-    # look for -dirty suffix
-    dirty = git_describe.endswith("-dirty")
-    pieces["dirty"] = dirty
-    if dirty:
-        git_describe = git_describe[:git_describe.rindex("-dirty")]
-
-    # now we have TAG-NUM-gHEX or HEX
-
-    if "-" in git_describe:
-        # TAG-NUM-gHEX
-        mo = re.search(r'^(.+)-(\d+)-g([0-9a-f]+)$', git_describe)
-        if not mo:
-            # unparseable. Maybe git-describe is misbehaving?
-            pieces["error"] = ("unable to parse git-describe output: '%s'"
-                               % describe_out)
-            return pieces
-
-        # tag
-        full_tag = mo.group(1)
-        if not full_tag.startswith(tag_prefix):
-            if verbose:
-                fmt = "tag '%s' doesn't start with prefix '%s'"
-                print(fmt % (full_tag, tag_prefix))
-            pieces["error"] = ("tag '%s' doesn't start with prefix '%s'"
-                               % (full_tag, tag_prefix))
-            return pieces
-        pieces["closest-tag"] = full_tag[len(tag_prefix):]
-
-        # distance: number of commits since tag
-        pieces["distance"] = int(mo.group(2))
-
-        # commit: short hex revision ID
-        pieces["short"] = mo.group(3)
-
-    else:
-        # HEX: no tags
-        pieces["closest-tag"] = None
-        count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"],
-                                    cwd=root)
-        pieces["distance"] = int(count_out)  # total number of commits
-
-    # commit date: see ISO-8601 comment in git_versions_from_keywords()
-    date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"],
-                       cwd=root)[0].strip()
-    pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
-
-    return pieces
-
-
-def do_vcs_install(manifest_in, versionfile_source, ipy):
-    """Git-specific installation logic for Versioneer.
-
-    For Git, this means creating/changing .gitattributes to mark _version.py
-    for export-subst keyword substitution.
-    """
-    GITS = ["git"]
-    if sys.platform == "win32":
-        GITS = ["git.cmd", "git.exe"]
-    files = [manifest_in, versionfile_source]
-    if ipy:
-        files.append(ipy)
-    try:
-        me = __file__
-        if me.endswith(".pyc") or me.endswith(".pyo"):
-            me = os.path.splitext(me)[0] + ".py"
-        versioneer_file = os.path.relpath(me)
-    except NameError:
-        versioneer_file = "versioneer.py"
-    files.append(versioneer_file)
-    present = False
-    try:
-        f = open(".gitattributes", "r")
-        for line in f.readlines():
-            if line.strip().startswith(versionfile_source):
-                if "export-subst" in line.strip().split()[1:]:
-                    present = True
-        f.close()
-    except EnvironmentError:
-        pass
-    if not present:
-        f = open(".gitattributes", "a+")
-        f.write("%s export-subst\n" % versionfile_source)
-        f.close()
-        files.append(".gitattributes")
-    run_command(GITS, ["add", "--"] + files)
-
-
-def versions_from_parentdir(parentdir_prefix, root, verbose):
-    """Try to determine the version from the parent directory name.
-
-    Source tarballs conventionally unpack into a directory that includes both
-    the project name and a version string. We will also support searching up
-    two directory levels for an appropriately named parent directory
-    """
-    rootdirs = []
-
-    for i in range(3):
-        dirname = os.path.basename(root)
-        if dirname.startswith(parentdir_prefix):
-            return {"version": dirname[len(parentdir_prefix):],
-                    "full-revisionid": None,
-                    "dirty": False, "error": None, "date": None}
-        else:
-            rootdirs.append(root)
-            root = os.path.dirname(root)  # up a level
-
-    if verbose:
-        print("Tried directories %s but none started with prefix %s" %
-              (str(rootdirs), parentdir_prefix))
-    raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
-
-
-SHORT_VERSION_PY = """
-# This file was generated by 'versioneer.py' (0.18) from
-# revision-control system data, or from the parent directory name of an
-# unpacked source archive. Distribution tarballs contain a pre-generated copy
-# of this file.
-
-import json
-
-version_json = '''
-%s
-'''  # END VERSION_JSON
-
-
-def get_versions():
-    return json.loads(version_json)
-"""
-
-
-def versions_from_file(filename):
-    """Try to determine the version from _version.py if present."""
-    try:
-        with open(filename) as f:
-            contents = f.read()
-    except EnvironmentError:
-        raise NotThisMethod("unable to read _version.py")
-    mo = re.search(r"version_json = '''\n(.*)'''  # END VERSION_JSON",
-                   contents, re.M | re.S)
-    if not mo:
-        mo = re.search(r"version_json = '''\r\n(.*)'''  # END VERSION_JSON",
-                       contents, re.M | re.S)
-    if not mo:
-        raise NotThisMethod("no version_json in _version.py")
-    return json.loads(mo.group(1))
-
-
-def write_to_version_file(filename, versions):
-    """Write the given version number to the given _version.py file."""
-    os.unlink(filename)
-    contents = json.dumps(versions, sort_keys=True,
-                          indent=1, separators=(",", ": "))
-    with open(filename, "w") as f:
-        f.write(SHORT_VERSION_PY % contents)
-
-    print("set %s to '%s'" % (filename, versions["version"]))
-
-
-def plus_or_dot(pieces):
-    """Return a + if we don't already have one, else return a ."""
-    if "+" in pieces.get("closest-tag", ""):
-        return "."
-    return "+"
-
-
-def render_pep440(pieces):
-    """Build up version string, with post-release "local version identifier".
-
-    Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
-    get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty
-
-    Exceptions:
-    1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += plus_or_dot(pieces)
-            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
-            if pieces["dirty"]:
-                rendered += ".dirty"
-    else:
-        # exception #1
-        rendered = "0+untagged.%d.g%s" % (pieces["distance"],
-                                          pieces["short"])
-        if pieces["dirty"]:
-            rendered += ".dirty"
-    return rendered
-
-
-def render_pep440_pre(pieces):
-    """TAG[.post.devDISTANCE] -- No -dirty.
-
-    Exceptions:
-    1: no tags. 0.post.devDISTANCE
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"]:
-            rendered += ".post.dev%d" % pieces["distance"]
-    else:
-        # exception #1
-        rendered = "0.post.dev%d" % pieces["distance"]
-    return rendered
-
-
-def render_pep440_post(pieces):
-    """TAG[.postDISTANCE[.dev0]+gHEX] .
-
-    The ".dev0" means dirty. Note that .dev0 sorts backwards
-    (a dirty tree will appear "older" than the corresponding clean one),
-    but you shouldn't be releasing software with -dirty anyways.
-
-    Exceptions:
-    1: no tags. 0.postDISTANCE[.dev0]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += ".post%d" % pieces["distance"]
-            if pieces["dirty"]:
-                rendered += ".dev0"
-            rendered += plus_or_dot(pieces)
-            rendered += "g%s" % pieces["short"]
-    else:
-        # exception #1
-        rendered = "0.post%d" % pieces["distance"]
-        if pieces["dirty"]:
-            rendered += ".dev0"
-        rendered += "+g%s" % pieces["short"]
-    return rendered
-
-
-def render_pep440_old(pieces):
-    """TAG[.postDISTANCE[.dev0]] .
-
-    The ".dev0" means dirty.
-
-    Exceptions:
-    1: no tags. 0.postDISTANCE[.dev0]
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"] or pieces["dirty"]:
-            rendered += ".post%d" % pieces["distance"]
-            if pieces["dirty"]:
-                rendered += ".dev0"
-    else:
-        # exception #1
-        rendered = "0.post%d" % pieces["distance"]
-        if pieces["dirty"]:
-            rendered += ".dev0"
-    return rendered
-
-
-def render_git_describe(pieces):
-    """TAG[-DISTANCE-gHEX][-dirty].
-
-    Like 'git describe --tags --dirty --always'.
-
-    Exceptions:
-    1: no tags. HEX[-dirty]  (note: no 'g' prefix)
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        if pieces["distance"]:
-            rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
-    else:
-        # exception #1
-        rendered = pieces["short"]
-    if pieces["dirty"]:
-        rendered += "-dirty"
-    return rendered
-
-
-def render_git_describe_long(pieces):
-    """TAG-DISTANCE-gHEX[-dirty].
-
-    Like 'git describe --tags --dirty --always --long'.
-    The distance/hash is unconditional.
-
-    Exceptions:
-    1: no tags. HEX[-dirty]  (note: no 'g' prefix)
-    """
-    if pieces["closest-tag"]:
-        rendered = pieces["closest-tag"]
-        rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
-    else:
-        # exception #1
-        rendered = pieces["short"]
-    if pieces["dirty"]:
-        rendered += "-dirty"
-    return rendered
-
-
-def render(pieces, style):
-    """Render the given version pieces into the requested style."""
-    if pieces["error"]:
-        return {"version": "unknown",
-                "full-revisionid": pieces.get("long"),
-                "dirty": None,
-                "error": pieces["error"],
-                "date": None}
-
-    if not style or style == "default":
-        style = "pep440"  # the default
-
-    if style == "pep440":
-        rendered = render_pep440(pieces)
-    elif style == "pep440-pre":
-        rendered = render_pep440_pre(pieces)
-    elif style == "pep440-post":
-        rendered = render_pep440_post(pieces)
-    elif style == "pep440-old":
-        rendered = render_pep440_old(pieces)
-    elif style == "git-describe":
-        rendered = render_git_describe(pieces)
-    elif style == "git-describe-long":
-        rendered = render_git_describe_long(pieces)
-    else:
-        raise ValueError("unknown style '%s'" % style)
-
-    return {"version": rendered, "full-revisionid": pieces["long"],
-            "dirty": pieces["dirty"], "error": None,
-            "date": pieces.get("date")}
-
-
-class VersioneerBadRootError(Exception):
-    """The project root directory is unknown or missing key files."""
-
-
-def get_versions(verbose=False):
-    """Get the project version from whatever source is available.
-
-    Returns dict with two keys: 'version' and 'full'.
-    """
-    if "versioneer" in sys.modules:
-        # see the discussion in cmdclass.py:get_cmdclass()
-        del sys.modules["versioneer"]
-
-    root = get_root()
-    cfg = get_config_from_root(root)
-
-    assert cfg.VCS is not None, "please set [versioneer]VCS= in setup.cfg"
-    handlers = HANDLERS.get(cfg.VCS)
-    assert handlers, "unrecognized VCS '%s'" % cfg.VCS
-    verbose = verbose or cfg.verbose
-    assert cfg.versionfile_source is not None, \
-        "please set versioneer.versionfile_source"
-    assert cfg.tag_prefix is not None, "please set versioneer.tag_prefix"
-
-    versionfile_abs = os.path.join(root, cfg.versionfile_source)
-
-    # extract version from first of: _version.py, VCS command (e.g. 'git
-    # describe'), parentdir. This is meant to work for developers using a
-    # source checkout, for users of a tarball created by 'setup.py sdist',
-    # and for users of a tarball/zipball created by 'git archive' or github's
-    # download-from-tag feature or the equivalent in other VCSes.
-
-    get_keywords_f = handlers.get("get_keywords")
-    from_keywords_f = handlers.get("keywords")
-    if get_keywords_f and from_keywords_f:
-        try:
-            keywords = get_keywords_f(versionfile_abs)
-            ver = from_keywords_f(keywords, cfg.tag_prefix, verbose)
-            if verbose:
-                print("got version from expanded keyword %s" % ver)
-            return ver
-        except NotThisMethod:
-            pass
-
-    try:
-        ver = versions_from_file(versionfile_abs)
-        if verbose:
-            print("got version from file %s %s" % (versionfile_abs, ver))
-        return ver
-    except NotThisMethod:
-        pass
-
-    from_vcs_f = handlers.get("pieces_from_vcs")
-    if from_vcs_f:
-        try:
-            pieces = from_vcs_f(cfg.tag_prefix, root, verbose)
-            ver = render(pieces, cfg.style)
-            if verbose:
-                print("got version from VCS %s" % ver)
-            return ver
-        except NotThisMethod:
-            pass
-
-    try:
-        if cfg.parentdir_prefix:
-            ver = versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
-            if verbose:
-                print("got version from parentdir %s" % ver)
-            return ver
-    except NotThisMethod:
-        pass
-
-    if verbose:
-        print("unable to compute version")
-
-    return {"version": "0+unknown", "full-revisionid": None,
-            "dirty": None, "error": "unable to compute version",
-            "date": None}
-
-
-def get_version():
-    """Get the short version string for this project."""
-    return get_versions()["version"]
-
-
-def get_cmdclass():
-    """Get the custom setuptools/distutils subclasses used by Versioneer."""
-    if "versioneer" in sys.modules:
-        del sys.modules["versioneer"]
-        # this fixes the "python setup.py develop" case (also 'install' and
-        # 'easy_install .'), in which subdependencies of the main project are
-        # built (using setup.py bdist_egg) in the same python process. Assume
-        # a main project A and a dependency B, which use different versions
-        # of Versioneer. A's setup.py imports A's Versioneer, leaving it in
-        # sys.modules by the time B's setup.py is executed, causing B to run
-        # with the wrong versioneer. Setuptools wraps the sub-dep builds in a
-        # sandbox that restores sys.modules to its pre-build state, so the
-        # parent is protected against the child's "import versioneer". By
-        # removing ourselves from sys.modules here, before the child build
-        # happens, we protect the child from the parent's versioneer too.
-        # Also see https://github.com/warner/python-versioneer/issues/52
-
-    cmds = {}
-
-    # we add "version" to both distutils and setuptools
-    from distutils.core import Command
-
-    class cmd_version(Command):
-        description = "report generated version string"
-        user_options = []
-        boolean_options = []
-
-        def initialize_options(self):
-            pass
-
-        def finalize_options(self):
-            pass
-
-        def run(self):
-            vers = get_versions(verbose=True)
-            print("Version: %s" % vers["version"])
-            print(" full-revisionid: %s" % vers.get("full-revisionid"))
-            print(" dirty: %s" % vers.get("dirty"))
-            print(" date: %s" % vers.get("date"))
-            if vers["error"]:
-                print(" error: %s" % vers["error"])
-    cmds["version"] = cmd_version
-
-    # we override "build_py" in both distutils and setuptools
-    #
-    # most invocation pathways end up running build_py:
-    #  distutils/build -> build_py
-    #  distutils/install -> distutils/build ->..
-    #  setuptools/bdist_wheel -> distutils/install ->..
-    #  setuptools/bdist_egg -> distutils/install_lib -> build_py
-    #  setuptools/install -> bdist_egg ->..
-    #  setuptools/develop -> ?
-    #  pip install:
-    #   copies source tree to a tempdir before running egg_info/etc
-    #   if .git isn't copied too, 'git describe' will fail
-    #   then does setup.py bdist_wheel, or sometimes setup.py install
-    #  setup.py egg_info -> ?
-
-    # we override different "build_py" commands for both environments
-    if "setuptools" in sys.modules:
-        from setuptools.command.build_py import build_py as _build_py
-    else:
-        from distutils.command.build_py import build_py as _build_py
-
-    class cmd_build_py(_build_py):
-        def run(self):
-            root = get_root()
-            cfg = get_config_from_root(root)
-            versions = get_versions()
-            _build_py.run(self)
-            # now locate _version.py in the new build/ directory and replace
-            # it with an updated value
-            if cfg.versionfile_build:
-                target_versionfile = os.path.join(self.build_lib,
-                                                  cfg.versionfile_build)
-                print("UPDATING %s" % target_versionfile)
-                write_to_version_file(target_versionfile, versions)
-    cmds["build_py"] = cmd_build_py
-
-    if "cx_Freeze" in sys.modules:  # cx_freeze enabled?
-        from cx_Freeze.dist import build_exe as _build_exe
-        # nczeczulin reports that py2exe won't like the pep440-style string
-        # as FILEVERSION, but it can be used for PRODUCTVERSION, e.g.
-        # setup(console=[{
-        #   "version": versioneer.get_version().split("+", 1)[0], # FILEVERSION
-        #   "product_version": versioneer.get_version(),
-        #   ...
-
-        class cmd_build_exe(_build_exe):
-            def run(self):
-                root = get_root()
-                cfg = get_config_from_root(root)
-                versions = get_versions()
-                target_versionfile = cfg.versionfile_source
-                print("UPDATING %s" % target_versionfile)
-                write_to_version_file(target_versionfile, versions)
-
-                _build_exe.run(self)
-                os.unlink(target_versionfile)
-                with open(cfg.versionfile_source, "w") as f:
-                    LONG = LONG_VERSION_PY[cfg.VCS]
-                    f.write(LONG %
-                            {"DOLLAR": "$",
-                             "STYLE": cfg.style,
-                             "TAG_PREFIX": cfg.tag_prefix,
-                             "PARENTDIR_PREFIX": cfg.parentdir_prefix,
-                             "VERSIONFILE_SOURCE": cfg.versionfile_source,
-                             })
-        cmds["build_exe"] = cmd_build_exe
-        del cmds["build_py"]
-
-    if 'py2exe' in sys.modules:  # py2exe enabled?
-        try:
-            from py2exe.distutils_buildexe import py2exe as _py2exe  # py3
-        except ImportError:
-            from py2exe.build_exe import py2exe as _py2exe  # py2
-
-        class cmd_py2exe(_py2exe):
-            def run(self):
-                root = get_root()
-                cfg = get_config_from_root(root)
-                versions = get_versions()
-                target_versionfile = cfg.versionfile_source
-                print("UPDATING %s" % target_versionfile)
-                write_to_version_file(target_versionfile, versions)
-
-                _py2exe.run(self)
-                os.unlink(target_versionfile)
-                with open(cfg.versionfile_source, "w") as f:
-                    LONG = LONG_VERSION_PY[cfg.VCS]
-                    f.write(LONG %
-                            {"DOLLAR": "$",
-                             "STYLE": cfg.style,
-                             "TAG_PREFIX": cfg.tag_prefix,
-                             "PARENTDIR_PREFIX": cfg.parentdir_prefix,
-                             "VERSIONFILE_SOURCE": cfg.versionfile_source,
-                             })
-        cmds["py2exe"] = cmd_py2exe
-
-    # we override different "sdist" commands for both environments
-    if "setuptools" in sys.modules:
-        from setuptools.command.sdist import sdist as _sdist
-    else:
-        from distutils.command.sdist import sdist as _sdist
-
-    class cmd_sdist(_sdist):
-        def run(self):
-            versions = get_versions()
-            self._versioneer_generated_versions = versions
-            # unless we update this, the command will keep using the old
-            # version
-            self.distribution.metadata.version = versions["version"]
-            return _sdist.run(self)
-
-        def make_release_tree(self, base_dir, files):
-            root = get_root()
-            cfg = get_config_from_root(root)
-            _sdist.make_release_tree(self, base_dir, files)
-            # now locate _version.py in the new base_dir directory
-            # (remembering that it may be a hardlink) and replace it with an
-            # updated value
-            target_versionfile = os.path.join(base_dir, cfg.versionfile_source)
-            print("UPDATING %s" % target_versionfile)
-            write_to_version_file(target_versionfile,
-                                  self._versioneer_generated_versions)
-    cmds["sdist"] = cmd_sdist
-
-    return cmds
-
-
-CONFIG_ERROR = """
-setup.cfg is missing the necessary Versioneer configuration. You need
-a section like:
-
- [versioneer]
- VCS = git
- style = pep440
- versionfile_source = src/myproject/_version.py
- versionfile_build = myproject/_version.py
- tag_prefix =
- parentdir_prefix = myproject-
-
-You will also need to edit your setup.py to use the results:
-
- import versioneer
- setup(version=versioneer.get_version(),
-       cmdclass=versioneer.get_cmdclass(), ...)
-
-Please read the docstring in ./versioneer.py for configuration instructions,
-edit setup.cfg, and re-run the installer or 'python versioneer.py setup'.
-"""
-
-SAMPLE_CONFIG = """
-# See the docstring in versioneer.py for instructions. Note that you must
-# re-run 'versioneer.py setup' after changing this section, and commit the
-# resulting files.
-
-[versioneer]
-#VCS = git
-#style = pep440
-#versionfile_source =
-#versionfile_build =
-#tag_prefix =
-#parentdir_prefix =
-
-"""
-
-INIT_PY_SNIPPET = """
-from ._version import get_versions
-__version__ = get_versions()['version']
-del get_versions
-"""
-
-
-def do_setup():
-    """Main VCS-independent setup function for installing Versioneer."""
-    root = get_root()
-    try:
-        cfg = get_config_from_root(root)
-    except (EnvironmentError, configparser.NoSectionError,
-            configparser.NoOptionError) as e:
-        if isinstance(e, (EnvironmentError, configparser.NoSectionError)):
-            print("Adding sample versioneer config to setup.cfg",
-                  file=sys.stderr)
-            with open(os.path.join(root, "setup.cfg"), "a") as f:
-                f.write(SAMPLE_CONFIG)
-        print(CONFIG_ERROR, file=sys.stderr)
-        return 1
-
-    print(" creating %s" % cfg.versionfile_source)
-    with open(cfg.versionfile_source, "w") as f:
-        LONG = LONG_VERSION_PY[cfg.VCS]
-        f.write(LONG % {"DOLLAR": "$",
-                        "STYLE": cfg.style,
-                        "TAG_PREFIX": cfg.tag_prefix,
-                        "PARENTDIR_PREFIX": cfg.parentdir_prefix,
-                        "VERSIONFILE_SOURCE": cfg.versionfile_source,
-                        })
-
-    ipy = os.path.join(os.path.dirname(cfg.versionfile_source),
-                       "__init__.py")
-    if os.path.exists(ipy):
-        try:
-            with open(ipy, "r") as f:
-                old = f.read()
-        except EnvironmentError:
-            old = ""
-        if INIT_PY_SNIPPET not in old:
-            print(" appending to %s" % ipy)
-            with open(ipy, "a") as f:
-                f.write(INIT_PY_SNIPPET)
-        else:
-            print(" %s unmodified" % ipy)
-    else:
-        print(" %s doesn't exist, ok" % ipy)
-        ipy = None
-
-    # Make sure both the top-level "versioneer.py" and versionfile_source
-    # (PKG/_version.py, used by runtime code) are in MANIFEST.in, so
-    # they'll be copied into source distributions. Pip won't be able to
-    # install the package without this.
-    manifest_in = os.path.join(root, "MANIFEST.in")
-    simple_includes = set()
-    try:
-        with open(manifest_in, "r") as f:
-            for line in f:
-                if line.startswith("include "):
-                    for include in line.split()[1:]:
-                        simple_includes.add(include)
-    except EnvironmentError:
-        pass
-    # That doesn't cover everything MANIFEST.in can do
-    # (http://docs.python.org/2/distutils/sourcedist.html#commands), so
-    # it might give some false negatives. Appending redundant 'include'
-    # lines is safe, though.
-    if "versioneer.py" not in simple_includes:
-        print(" appending 'versioneer.py' to MANIFEST.in")
-        with open(manifest_in, "a") as f:
-            f.write("include versioneer.py\n")
-    else:
-        print(" 'versioneer.py' already in MANIFEST.in")
-    if cfg.versionfile_source not in simple_includes:
-        print(" appending versionfile_source ('%s') to MANIFEST.in" %
-              cfg.versionfile_source)
-        with open(manifest_in, "a") as f:
-            f.write("include %s\n" % cfg.versionfile_source)
-    else:
-        print(" versionfile_source already in MANIFEST.in")
-
-    # Make VCS-specific changes. For git, this means creating/changing
-    # .gitattributes to mark _version.py for export-subst keyword
-    # substitution.
-    do_vcs_install(manifest_in, cfg.versionfile_source, ipy)
-    return 0
-
-
-def scan_setup_py():
-    """Validate the contents of setup.py against Versioneer's expectations."""
-    found = set()
-    setters = False
-    errors = 0
-    with open("setup.py", "r") as f:
-        for line in f.readlines():
-            if "import versioneer" in line:
-                found.add("import")
-            if "versioneer.get_cmdclass()" in line:
-                found.add("cmdclass")
-            if "versioneer.get_version()" in line:
-                found.add("get_version")
-            if "versioneer.VCS" in line:
-                setters = True
-            if "versioneer.versionfile_source" in line:
-                setters = True
-    if len(found) != 3:
-        print("")
-        print("Your setup.py appears to be missing some important items")
-        print("(but I might be wrong). Please make sure it has something")
-        print("roughly like the following:")
-        print("")
-        print(" import versioneer")
-        print(" setup( version=versioneer.get_version(),")
-        print("        cmdclass=versioneer.get_cmdclass(),  ...)")
-        print("")
-        errors += 1
-    if setters:
-        print("You should remove lines like 'versioneer.VCS = ' and")
-        print("'versioneer.versionfile_source = ' . This configuration")
-        print("now lives in setup.cfg, and should be removed from setup.py")
-        print("")
-        errors += 1
-    return errors
-
-
-if __name__ == "__main__":
-    cmd = sys.argv[1]
-    if cmd == "setup":
-        errors = do_setup()
-        errors += scan_setup_py()
-        if errors:
-            sys.exit(1)
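
For orientation, the render_* helpers above all consume the same "pieces" dict that git_pieces_from_vcs() builds from 'git describe'. A minimal sketch of what the different styles produce, using hypothetical values (the tag, distance, and date below are assumptions for illustration, not data from this commit):

    # Illustrative only -- hypothetical pieces dict, shaped like the output of git_pieces_from_vcs()
    pieces = {
        "closest-tag": "2023.1",   # assumed tag name (tag_prefix already stripped)
        "distance": 3,             # assumed number of commits since that tag
        "short": "d015787",        # abbreviated commit hash
        "long": "d015787fb84d239ebb100a9aecadee79567b2a21",
        "dirty": False,
        "error": None,
        "date": "2024-07-16T10:37:37+0200",
    }

    # render() dispatches to the per-style helpers and returns a dict whose
    # "version" key holds the rendered string:
    #   render(pieces, "pep440")["version"]             -> "2023.1+3.gd015787"
    #   render(pieces, "pep440-pre")["version"]         -> "2023.1.post.dev3"
    #   render(pieces, "pep440-post")["version"]        -> "2023.1.post3+gd015787"
    #   render(pieces, "pep440-old")["version"]         -> "2023.1.post3"
    #   render(pieces, "git-describe")["version"]       -> "2023.1-3-gd015787"
    #   render(pieces, "git-describe-long")["version"]  -> "2023.1-3-gd015787"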



View it on GitLab: https://salsa.debian.org/med-team/python-questplus/-/commit/d015787fb84d239ebb100a9aecadee79567b2a21

-- 
This project does not include diff previews in email notifications.
View it on GitLab: https://salsa.debian.org/med-team/python-questplus/-/commit/d015787fb84d239ebb100a9aecadee79567b2a21
You're receiving this email because of your account on salsa.debian.org.

