[Git][debian-gis-team/flox][upstream] New upstream version 0.6.7

Antonio Valentino (@antonio.valentino) gitlab@salsa.debian.org
Wed Jan 18 07:46:05 GMT 2023



Antonio Valentino pushed to branch upstream at Debian GIS Project / flox


Commits:
314ead24 by Antonio Valentino at 2023-01-18T07:36:12+00:00
New upstream version 0.6.7
- - - - -


9 changed files:

- .github/workflows/ci-additional.yaml
- .github/workflows/ci.yaml
- .pre-commit-config.yaml
- README.md
- ci/environment.yml
- codecov.yml
- flox/aggregate_flox.py
- flox/xrdtypes.py
- tests/test_core.py


Changes:

=====================================
.github/workflows/ci-additional.yaml
=====================================
@@ -71,7 +71,15 @@ jobs:
           conda list
       - name: Run doctests
         run: |
-          python -m pytest --doctest-modules flox --ignore flox/tests
+          python -m pytest --doctest-modules flox --ignore flox/tests --cov=./ --cov-report=xml
+      - name: Upload code coverage to Codecov
+        uses: codecov/codecov-action@v3.1.1
+        with:
+          file: ./coverage.xml
+          flags: unittests
+          env_vars: RUNNER_OS
+          name: codecov-umbrella
+          fail_ci_if_error: false
 
   mypy:
     name: Mypy
@@ -97,12 +105,12 @@ jobs:
         uses: mamba-org/provision-with-micromamba@v14
         with:
           environment-file: ${{env.CONDA_ENV_FILE}}
-          environment-name: xarray-tests
+          environment-name: flox-tests
           extra-specs: |
             python=${{env.PYTHON_VERSION}}
           cache-env: true
           cache-env-key: "${{runner.os}}-${{runner.arch}}-py${{env.PYTHON_VERSION}}-${{env.TODAY}}-${{hashFiles(env.CONDA_ENV_FILE)}}"
-      - name: Install xarray
+      - name: Install flox
         run: |
           python -m pip install --no-deps -e .
       - name: Version info
@@ -115,4 +123,13 @@ jobs:
 
       - name: Run mypy
         run: |
-          python -m mypy --install-types --non-interactive
+          python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
+
+      - name: Upload mypy coverage to Codecov
+        uses: codecov/codecov-action@v3.1.1
+        with:
+          file: mypy_report/cobertura.xml
+          flags: mypy
+          env_vars: PYTHON_VERSION
+          name: codecov-umbrella
+          fail_ci_if_error: false

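A note on the two steps above: the doctest run now also writes ./coverage.xml via pytest-cov, and the mypy run writes mypy_report/cobertura.xml via --cobertura-xml-report, which needs lxml (hence the ci/environment.yml addition below). A minimal sketch of regenerating both artifacts locally, assuming pytest-cov, mypy, and lxml are installed:

# Sketch: regenerate the two coverage files the workflow uploads.
import subprocess

# Doctest run; --cov/--cov-report come from pytest-cov and write ./coverage.xml.
subprocess.run(
    ["python", "-m", "pytest", "--doctest-modules", "flox",
     "--ignore", "flox/tests", "--cov=./", "--cov-report=xml"],
    check=True,
)

# Type-check run; --cobertura-xml-report (requires lxml) writes
# mypy_report/cobertura.xml for the "mypy"-flagged Codecov upload.
subprocess.run(
    ["python", "-m", "mypy", "--install-types", "--non-interactive",
     "--cobertura-xml-report", "mypy_report"],
    check=True,
)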

=====================================
.github/workflows/ci.yaml
=====================================
@@ -89,7 +89,15 @@ jobs:
           python -m pip install --no-deps -e .
       - name: Run tests
         run: |
-          python -m pytest -n auto
+          python -m pytest -n auto --cov=./ --cov-report=xml
+      - name: Upload code coverage to Codecov
+        uses: codecov/codecov-action@v3.1.1
+        with:
+          file: ./coverage.xml
+          flags: unittests
+          env_vars: RUNNER_OS
+          name: codecov-umbrella
+          fail_ci_if_error: false
 
   upstream-dev:
     name: upstream-dev

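The main test workflow gains the same coverage upload. A sketch of the updated invocation, assuming pytest-xdist (for -n auto) and pytest-cov (for --cov) are installed:

# Equivalent to: python -m pytest -n auto --cov=./ --cov-report=xml
# pytest-xdist parallelises across CPUs; pytest-cov writes ./coverage.xml.
import pytest

raise SystemExit(pytest.main(["-n", "auto", "--cov=./", "--cov-report=xml"]))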

=====================================
.pre-commit-config.yaml
=====================================
@@ -3,24 +3,25 @@ ci:
 
 repos:
     - repo: https://github.com/pre-commit/pre-commit-hooks
-      rev: v4.3.0
+      rev: v4.4.0
       hooks:
+        - id: check-yaml
         - id: trailing-whitespace
         - id: end-of-file-fixer
         - id: check-docstring-first
 
     - repo: https://github.com/psf/black
-      rev: 22.8.0
+      rev: 22.12.0
       hooks:
         - id: black
 
     - repo: https://github.com/PyCQA/flake8
-      rev: 5.0.4
+      rev: 6.0.0
       hooks:
         - id: flake8
 
     - repo: https://github.com/PyCQA/isort
-      rev: 5.10.1
+      rev: 5.11.4
       hooks:
         - id: isort
 


=====================================
README.md
=====================================
@@ -1,4 +1,4 @@
-[![GitHub Workflow CI Status](https://img.shields.io/github/workflow/status/xarray-contrib/flox/CI?logo=github&style=flat)](https://github.com/xarray-contrib/flox/actions)
+[![GitHub Workflow CI Status](https://img.shields.io/github/actions/workflow/status/xarray-contrib/flox/ci.yaml?branch=main&logo=github&style=flat)](https://github.com/xarray-contrib/flox/actions)
 [![pre-commit.ci status](https://results.pre-commit.ci/badge/github/xarray-contrib/flox/main.svg)](https://results.pre-commit.ci/latest/github/xarray-contrib/flox/main)
 [![image](https://img.shields.io/codecov/c/github/xarray-contrib/flox.svg?style=flat)](https://codecov.io/gh/xarray-contrib/flox)
 [![Documentation Status](https://readthedocs.org/projects/flox/badge/?version=latest)](https://flox.readthedocs.io/en/latest/?badge=latest)


=====================================
ci/environment.yml
=====================================
@@ -9,6 +9,7 @@ dependencies:
   - netcdf4
   - pandas
   - numpy>=1.20
+  - lxml  # for mypy coverage report
   - matplotlib
   - pip
   - pytest


=====================================
codecov.yml
=====================================
@@ -5,6 +5,7 @@ codecov:
 comment: false
 
 ignore:
+  - 'benchmarks/*.py'
   - 'tests/*.py'
   - 'setup.py'
 


=====================================
flox/aggregate_flox.py
=====================================
@@ -107,7 +107,8 @@ def mean(group_idx, array, *, axis=-1, size=None, fill_value=None, dtype=None):
     if fill_value is None:
         fill_value = 0
     out = sum(group_idx, array, axis=axis, size=size, dtype=dtype, fill_value=fill_value)
-    out /= nanlen(group_idx, array, size=size, axis=axis, fill_value=0)
+    with np.errstate(invalid="ignore", divide="ignore"):
+        out /= nanlen(group_idx, array, size=size, axis=axis, fill_value=0)
     return out
 
 
@@ -115,5 +116,6 @@ def nanmean(group_idx, array, *, axis=-1, size=None, fill_value=None, dtype=None
     if fill_value is None:
         fill_value = 0
     out = nansum(group_idx, array, size=size, axis=axis, dtype=dtype, fill_value=fill_value)
-    out /= nanlen(group_idx, array, size=size, axis=axis, fill_value=0)
+    with np.errstate(invalid="ignore", divide="ignore"):
+        out /= nanlen(group_idx, array, size=size, axis=axis, fill_value=0)
     return out

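The new errstate guard handles empty groups: their non-NaN count from nanlen is zero, so the in-place division computes 0/0, which would otherwise emit a RuntimeWarning. A minimal standalone sketch with hypothetical values (not the flox API):

import numpy as np

sums = np.array([6.0, 0.0])   # per-group sums; the second group is empty
counts = np.array([3, 0])     # per-group non-NaN counts, as from nanlen
with np.errstate(invalid="ignore", divide="ignore"):
    means = sums / counts     # [2.0, nan], with no warning emitted
print(means)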

=====================================
flox/xrdtypes.py
=====================================
@@ -152,28 +152,3 @@ def get_neg_infinity(dtype, min_for_int=False):
 def is_datetime_like(dtype):
     """Check if a dtype is a subclass of the numpy datetime types"""
     return np.issubdtype(dtype, np.datetime64) or np.issubdtype(dtype, np.timedelta64)
-
-
-def result_type(*arrays_and_dtypes):
-    """Like np.result_type, but with type promotion rules matching pandas.
-
-    Examples of changed behavior:
-    number + string -> object (not string)
-    bytes + unicode -> object (not unicode)
-
-    Parameters
-    ----------
-    *arrays_and_dtypes : list of arrays and dtypes
-        The dtype is extracted from both numpy and dask arrays.
-
-    Returns
-    -------
-    numpy.dtype for the result.
-    """
-    types = {np.result_type(t).type for t in arrays_and_dtypes}
-
-    for left, right in PROMOTE_TO_OBJECT:
-        if any(issubclass(t, left) for t in types) and any(issubclass(t, right) for t in types):
-            return np.dtype(object)
-
-    return np.result_type(*arrays_and_dtypes)

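For context on the removed helper, a small illustration of the numpy-vs-pandas promotion difference its docstring described:

import numpy as np
import pandas as pd

# numpy promotes a mixed number/string input to a string dtype...
print(np.array([1, "a"]).dtype)   # <U21, a unicode string dtype
# ...while pandas falls back to object, the rule the helper emulated.
print(pd.Series([1, "a"]).dtype)  # object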

=====================================
tests/test_core.py
=====================================
@@ -140,7 +140,7 @@ def test_groupby_reduce(
     elif func == "sum":
         expected_result = np.array(expected, dtype=dtype)
     elif func == "count":
-        expected_result = np.array(expected, dtype=np.int64)
+        expected_result = np.array(expected, dtype=np.intp)
 
     result, groups, = groupby_reduce(
         array,
@@ -284,7 +284,7 @@ def test_groupby_reduce_count():
     array = np.array([0, 0, np.nan, np.nan, np.nan, 1, 1])
     labels = np.array(["a", "b", "b", "b", "c", "c", "c"])
     result, _ = groupby_reduce(array, labels, func="count")
-    assert_equal(result, np.array([1, 1, 2], dtype=np.int64))
+    assert_equal(result, np.array([1, 1, 2], dtype=np.intp))
 
 
 def test_func_is_aggregation():
@@ -393,12 +393,12 @@ def test_groupby_agg_dask(func, shape, array_chunks, group_chunks, add_nan, dtyp
     kwargs["expected_groups"] = [0, 2, 1]
     with raise_if_dask_computes():
         actual, groups = groupby_reduce(array, by, engine=engine, **kwargs, sort=False)
-    assert_equal(groups, np.array([0, 2, 1], dtype=np.intp))
+    assert_equal(groups, np.array([0, 2, 1], dtype=np.int64))
     assert_equal(expected, actual[..., [0, 2, 1]])
 
     with raise_if_dask_computes():
         actual, groups = groupby_reduce(array, by, engine=engine, **kwargs, sort=True)
-    assert_equal(groups, np.array([0, 1, 2], np.intp))
+    assert_equal(groups, np.array([0, 1, 2], np.int64))
     assert_equal(expected, actual)
 
 
@@ -408,29 +408,29 @@ def test_numpy_reduce_axis_subset(engine):
     array = np.ones_like(by, dtype=np.int64)
     kwargs = dict(func="count", engine=engine, fill_value=0)
     result, _ = groupby_reduce(array, by, **kwargs, axis=1)
-    assert_equal(result, np.array([[2, 3], [2, 3]], dtype=np.int64))
+    assert_equal(result, np.array([[2, 3], [2, 3]], dtype=np.intp))
 
     by = np.broadcast_to(labels2d, (3, *labels2d.shape))
     array = np.ones_like(by)
     result, _ = groupby_reduce(array, by, **kwargs, axis=1)
-    subarr = np.array([[1, 1], [1, 1], [0, 2], [1, 1], [1, 1]], dtype=np.int64)
+    subarr = np.array([[1, 1], [1, 1], [0, 2], [1, 1], [1, 1]], dtype=np.intp)
     expected = np.tile(subarr, (3, 1, 1))
     assert_equal(result, expected)
 
     result, _ = groupby_reduce(array, by, **kwargs, axis=2)
-    subarr = np.array([[2, 3], [2, 3]], dtype=np.int64)
+    subarr = np.array([[2, 3], [2, 3]], dtype=np.intp)
     expected = np.tile(subarr, (3, 1, 1))
     assert_equal(result, expected)
 
     result, _ = groupby_reduce(array, by, **kwargs, axis=(1, 2))
-    expected = np.array([[4, 6], [4, 6], [4, 6]], dtype=np.int64)
+    expected = np.array([[4, 6], [4, 6], [4, 6]], dtype=np.intp)
     assert_equal(result, expected)
 
     result, _ = groupby_reduce(array, by, **kwargs, axis=(2, 1))
     assert_equal(result, expected)
 
     result, _ = groupby_reduce(array, by[0, ...], **kwargs, axis=(1, 2))
-    expected = np.array([[4, 6], [4, 6], [4, 6]], dtype=np.int64)
+    expected = np.array([[4, 6], [4, 6], [4, 6]], dtype=np.intp)
     assert_equal(result, expected)
 
 
@@ -447,11 +447,11 @@ def test_dask_reduce_axis_subset():
             axis=1,
             expected_groups=[0, 2],
         )
-    assert_equal(result, np.array([[2, 3], [2, 3]], dtype=np.int64))
+    assert_equal(result, np.array([[2, 3], [2, 3]], dtype=np.intp))
 
     by = np.broadcast_to(labels2d, (3, *labels2d.shape))
     array = np.ones_like(by)
-    subarr = np.array([[1, 1], [1, 1], [123, 2], [1, 1], [1, 1]], dtype=np.int64)
+    subarr = np.array([[1, 1], [1, 1], [123, 2], [1, 1], [1, 1]], dtype=np.intp)
     expected = np.tile(subarr, (3, 1, 1))
     with raise_if_dask_computes():
         result, _ = groupby_reduce(
@@ -464,7 +464,7 @@ def test_dask_reduce_axis_subset():
         )
     assert_equal(result, expected)
 
-    subarr = np.array([[2, 3], [2, 3]], dtype=np.int64)
+    subarr = np.array([[2, 3], [2, 3]], dtype=np.intp)
     expected = np.tile(subarr, (3, 1, 1))
     with raise_if_dask_computes():
         result, _ = groupby_reduce(
@@ -672,7 +672,7 @@ def test_groupby_bins(chunk_labels, chunks, engine, method) -> None:
             engine=engine,
             method=method,
         )
-    expected = np.array([3, 1, 0], dtype=np.int64)
+    expected = np.array([3, 1, 0], dtype=np.intp)
     for left, right in zip(groups, pd.IntervalIndex.from_arrays([1, 2, 4], [2, 4, 5]).to_numpy()):
         assert left == right
     assert_equal(actual, expected)
@@ -801,7 +801,7 @@ def test_cohorts_map_reduce_consistent_dtypes(method, dtype, labels_dtype):
 
     actual, actual_groups = groupby_reduce(array, labels, func="count", method=method)
     assert_equal(actual_groups, np.arange(6, dtype=labels.dtype))
-    assert_equal(actual, repeats.astype(np.int64))
+    assert_equal(actual, repeats.astype(np.intp))
 
     actual, actual_groups = groupby_reduce(array, labels, func="sum", method=method)
     assert_equal(actual_groups, np.arange(6, dtype=labels.dtype))
@@ -955,7 +955,7 @@ def test_group_by_datetime(engine, method):
 
 
 def test_factorize_values_outside_bins():
-
+    # pd.factorize returns intp
     vals = factorize_(
         (np.arange(10).reshape(5, 2), np.arange(10).reshape(5, 2)),
         axis=(0, 1),
@@ -967,7 +967,7 @@ def test_factorize_values_outside_bins():
         fastpath=True,
     )
     actual = vals[0]
-    expected = np.array([[-1, -1], [-1, 0], [6, 12], [18, 24], [-1, -1]], np.int64)
+    expected = np.array([[-1, -1], [-1, 0], [6, 12], [18, 24], [-1, -1]], np.intp)
     assert_equal(expected, actual)
 
 
@@ -991,7 +991,8 @@ def test_multiple_groupers_bins(chunk) -> None:
         ),
         func="count",
     )
-    expected = np.eye(5, 5, dtype=np.int64)
+    # output from `count` is intp
+    expected = np.eye(5, 5, dtype=np.intp)
     assert_equal(expected, actual)
 
 
@@ -1020,7 +1021,8 @@ def test_multiple_groupers(chunk, by1, by2, expected_groups) -> None:
     if chunk:
         by2 = dask.array.from_array(by2)
 
-    expected = np.ones((5, 2), dtype=np.int64)
+    # output from `count` is intp
+    expected = np.ones((5, 2), dtype=np.intp)
     actual, *_ = groupby_reduce(
         array, by1, by2, axis=(0, 1), func="count", expected_groups=expected_groups
     )
@@ -1059,6 +1061,7 @@ def test_validate_expected_groups_not_none_dask() -> None:
 
 
 def test_factorize_reindex_sorting_strings():
+    # pd.factorize seems to return intp so int32 on 32bit arch
     kwargs = dict(
         by=(np.array(["El-Nino", "La-Nina", "boo", "Neutral"]),),
         axis=-1,
@@ -1066,19 +1069,20 @@ def test_factorize_reindex_sorting_strings():
     )
 
     expected = factorize_(**kwargs, reindex=True, sort=True)[0]
-    assert_equal(expected, np.array([0, 1, 4, 2], dtype=np.int64))
+    assert_equal(expected, np.array([0, 1, 4, 2], dtype=np.intp))
 
     expected = factorize_(**kwargs, reindex=True, sort=False)[0]
-    assert_equal(expected, np.array([0, 3, 4, 1], dtype=np.int64))
+    assert_equal(expected, np.array([0, 3, 4, 1], dtype=np.intp))
 
     expected = factorize_(**kwargs, reindex=False, sort=False)[0]
-    assert_equal(expected, np.array([0, 1, 2, 3], dtype=np.int64))
+    assert_equal(expected, np.array([0, 1, 2, 3], dtype=np.intp))
 
     expected = factorize_(**kwargs, reindex=False, sort=True)[0]
-    assert_equal(expected, np.array([0, 1, 3, 2], dtype=np.int64))
+    assert_equal(expected, np.array([0, 1, 3, 2], dtype=np.intp))
 
 
 def test_factorize_reindex_sorting_ints():
+    # pd.factorize seems to return intp so int32 on 32bit arch
     kwargs = dict(
         by=(np.array([-10, 1, 10, 2, 3, 5]),),
         axis=-1,
@@ -1086,18 +1090,18 @@ def test_factorize_reindex_sorting_ints():
     )
 
     expected = factorize_(**kwargs, reindex=True, sort=True)[0]
-    assert_equal(expected, np.array([6, 1, 6, 2, 3, 5], dtype=np.int64))
+    assert_equal(expected, np.array([6, 1, 6, 2, 3, 5], dtype=np.intp))
 
     expected = factorize_(**kwargs, reindex=True, sort=False)[0]
-    assert_equal(expected, np.array([6, 1, 6, 2, 3, 5], dtype=np.int64))
+    assert_equal(expected, np.array([6, 1, 6, 2, 3, 5], dtype=np.intp))
 
     kwargs["expected_groups"] = (np.arange(5, -1, -1),)
 
     expected = factorize_(**kwargs, reindex=True, sort=True)[0]
-    assert_equal(expected, np.array([6, 1, 6, 2, 3, 5], dtype=np.int64))
+    assert_equal(expected, np.array([6, 1, 6, 2, 3, 5], dtype=np.intp))
 
     expected = factorize_(**kwargs, reindex=True, sort=False)[0]
-    assert_equal(expected, np.array([6, 4, 6, 3, 2, 0], dtype=np.int64))
+    assert_equal(expected, np.array([6, 4, 6, 3, 2, 0], dtype=np.intp))
 
 
 @requires_dask

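The recurring int64 -> intp switch in the expectations above makes the tests portable: np.intp is the platform's pointer-sized integer (int32 on 32-bit architectures, int64 on 64-bit ones), which is what factorized codes and counts actually carry. A minimal sketch:

import numpy as np
import pandas as pd

# np.intp follows the platform: int64 on amd64, int32 on 32-bit ports,
# which matters for Debian's 32-bit architectures.
print(np.dtype(np.intp))

codes, uniques = pd.factorize(np.array(["a", "b", "b", "a"]))
print(codes.dtype)  # int64 on a 64-bit build; the upstream comments
                    # note this is int32 on 32-bit architectures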


View it on GitLab: https://salsa.debian.org/debian-gis-team/flox/-/commit/314ead24d41259d48e73b9163951f2d690a0ccb5
