[Debian-med-packaging] Bug#992676: scipy breaks python-skbio autopkgtest: Unsupported dtype object

Paul Gevers elbrus at debian.org
Sun Aug 22 09:32:44 BST 2021


Source: scipy, python-skbio
Control: found -1 scipy/1.7.1-1
Control: found -1 python-skbio/0.5.6-4
Severity: serious
Tags: sid bookworm
X-Debbugs-CC: debian-ci at lists.debian.org
User: debian-ci at lists.debian.org
Usertags: breaks needs-update

Dear maintainer(s),

With a recent upload of scipy, the autopkgtest of python-skbio fails in
testing when that autopkgtest is run with the binary packages of scipy
from unstable. It passes when run with only packages from testing. In
tabular form:

                       pass            fail
scipy                  from testing    1.7.1-1
python-skbio           from testing    0.5.6-4
all others             from testing    from testing

I copied some of the output at the bottom of this report.

Currently this regression is blocking the migration of scipy to testing
[1]. Due to the nature of this issue, I filed this bug report against
both packages. Can you please investigate the situation and reassign the
bug to the right package?

More information about this bug and the reason for filing it can be found on
https://wiki.debian.org/ContinuousIntegration/RegressionEmailInformation

Paul

[1] https://qa.debian.org/excuses.php?package=scipy

https://ci.debian.net/data/autopkgtest/testing/amd64/p/python-skbio/14750991/log.gz
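
The failure itself is easy to reproduce outside the test suite. A minimal
sketch (my own example values, not taken from skbio; it assumes numpy and
the scipy 1.7.1 from unstable):

    import numpy as np
    from scipy.spatial.distance import cdist

    # An object-dtype array holding plain floats, like the XA values shown
    # in the failures below.
    XA = np.array([[0.1, 0.2, 0.3],
                   [0.4, 0.5, 0.6]], dtype=object)
    XB = np.array([[0.0, 0.0, 0.0]])

    # With scipy 1.7.1 this raises "ValueError: Unsupported dtype object";
    # with the scipy currently in testing the same call appears to succeed,
    # which matches the pass/fail pattern above.
    cdist(XA, XB, 'euclidean')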

=================================== FAILURES ===================================
____________________ testPERMDISP.test_centroids_eq_groups _____________________

self = <skbio.stats.distance.tests.test_permdisp.testPERMDISP testMethod=test_centroids_eq_groups>

    def test_centroids_eq_groups(self):
        exp = [[1.2886811963240687, 1.890538910062923, 1.490527658097728],
               [2.17349240061718, 2.3192679626679946, 2.028338553903792]]
        exp_stat, _ = f_oneway(*exp)

        dm = pcoa(self.eq_mat)
        dm = dm.samples

>       obs = _compute_groups(dm, 'centroid', self.grouping_eq)

skbio/stats/distance/tests/test_permdisp.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
skbio/stats/distance/_permdisp.py:251: in _compute_groups
    groups.append(cdist(df.values[:, :-1], [centroids.loc[label].values],
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

XA = array([[-1.7249342172942905, 0.4245950661770306, -0.4421150498378117,
        -0.8800986337268075, 0.0, 0.0],
       [...   [-1.813911896152713, 0.25012463135432966, 0.42415638246654996,
        0.9643046100830307, 0.0, 0.0]], dtype=object)
XB = array([[-1.19938106, -0.22670737, -0.50629061,  0.0977458 ,  0.        ,
         0.        ]])
metric = 'euclidean', out = None, kwargs = {}, s = (3, 6), sB = (1, 6), mA = 3
mB = 1, n = 6, mstr = 'euclidean'

    def cdist(XA, XB, metric='euclidean', *, out=None, **kwargs):
        """
        Compute distance between each pair of the two collections of inputs.

        See Notes for common calling conventions.

        Parameters
        ----------
        XA : array_like
            An :math:`m_A` by :math:`n` array of :math:`m_A`
            original observations in an :math:`n`-dimensional space.
            Inputs are converted to float type.
        XB : array_like
            An :math:`m_B` by :math:`n` array of :math:`m_B`
            original observations in an :math:`n`-dimensional space.
            Inputs are converted to float type.
        metric : str or callable, optional
            The distance metric to use. If a string, the distance function
            can be 'braycurtis', 'canberra', 'chebyshev', 'cityblock',
            'correlation', 'cosine', 'dice', 'euclidean', 'hamming',
            'jaccard', 'jensenshannon', 'kulsinski', 'mahalanobis',
            'matching', 'minkowski', 'rogerstanimoto', 'russellrao',
            'seuclidean', 'sokalmichener', 'sokalsneath', 'sqeuclidean',
            'wminkowski', 'yule'.
        **kwargs : dict, optional
            Extra arguments to `metric`: refer to each metric documentation
            for a list of all possible arguments.

            Some possible arguments:

            p : scalar
            The p-norm to apply for Minkowski, weighted and unweighted.
            Default: 2.

            w : array_like
            The weight vector for metrics that support weights
            (e.g., Minkowski).

            V : array_like
            The variance vector for standardized Euclidean.
            Default: var(vstack([XA, XB]), axis=0, ddof=1)

            VI : array_like
            The inverse of the covariance matrix for Mahalanobis.
            Default: inv(cov(vstack([XA, XB].T))).T

            out : ndarray
            The output array
            If not None, the distance matrix Y is stored in this array.

        Returns
        -------
        Y : ndarray
            A :math:`m_A` by :math:`m_B` distance matrix is returned.
            For each :math:`i` and :math:`j`, the metric
            ``dist(u=XA[i], v=XB[j])`` is computed and stored in the
            :math:`ij` th entry.

        Raises
        ------
        ValueError
            An exception is thrown if `XA` and `XB` do not have
            the same number of columns.

        Notes
        -----
        The following are common calling conventions:

        1. ``Y = cdist(XA, XB, 'euclidean')``

           Computes the distance between :math:`m` points using
           Euclidean distance (2-norm) as the distance metric between the
           points. The points are arranged as :math:`m`
           :math:`n`-dimensional row vectors in the matrix X.

        2. ``Y = cdist(XA, XB, 'minkowski', p=2.)``

           Computes the distances using the Minkowski distance
           :math:`||u-v||_p` (:math:`p`-norm) where :math:`p \\geq 1`.

        3. ``Y = cdist(XA, XB, 'cityblock')``

           Computes the city block or Manhattan distance between the
           points.

        4. ``Y = cdist(XA, XB, 'seuclidean', V=None)``

           Computes the standardized Euclidean distance. The standardized
           Euclidean distance between two n-vectors ``u`` and ``v`` is

           .. math::

              \\sqrt{\\sum {(u_i-v_i)^2 / V[x_i]}}.

           V is the variance vector; V[i] is the variance computed over all
           the i'th components of the points. If not passed, it is
           automatically computed.

        5. ``Y = cdist(XA, XB, 'sqeuclidean')``

           Computes the squared Euclidean distance :math:`||u-v||_2^2`
           between the vectors.

        6. ``Y = cdist(XA, XB, 'cosine')``

           Computes the cosine distance between vectors u and v,

           .. math::

              1 - \\frac{u \\cdot v}
                       {{||u||}_2 {||v||}_2}

           where :math:`||*||_2` is the 2-norm of its argument ``*``, and
           :math:`u \\cdot v` is the dot product of :math:`u` and :math:`v`.

        7. ``Y = cdist(XA, XB, 'correlation')``

           Computes the correlation distance between vectors u and v.
           This is

           .. math::

              1 - \\frac{(u - \\bar{u}) \\cdot (v - \\bar{v})}
                       {{||(u - \\bar{u})||}_2 {||(v - \\bar{v})||}_2}

           where :math:`\\bar{v}` is the mean of the elements of vector v,
           and :math:`x \\cdot y` is the dot product of :math:`x` and
           :math:`y`.


        8. ``Y = cdist(XA, XB, 'hamming')``

           Computes the normalized Hamming distance, or the proportion of
           those vector elements between two n-vectors ``u`` and ``v``
           which disagree. To save memory, the matrix ``X`` can be of type
           boolean.

        9. ``Y = cdist(XA, XB, 'jaccard')``

           Computes the Jaccard distance between the points. Given two
           vectors, ``u`` and ``v``, the Jaccard distance is the
           proportion of those elements ``u[i]`` and ``v[i]`` that
           disagree where at least one of them is non-zero.

        10. ``Y = cdist(XA, XB, 'jensenshannon')``

            Computes the Jensen-Shannon distance between two probability
            arrays.
            Given two probability vectors, :math:`p` and :math:`q`, the
            Jensen-Shannon distance is

            .. math::

               \\sqrt{\\frac{D(p \\parallel m) + D(q \\parallel m)}{2}}

            where :math:`m` is the pointwise mean of :math:`p` and :math:`q`
            and :math:`D` is the Kullback-Leibler divergence.

        11. ``Y = cdist(XA, XB, 'chebyshev')``

            Computes the Chebyshev distance between the points. The
            Chebyshev distance between two n-vectors ``u`` and ``v`` is the
            maximum norm-1 distance between their respective elements. More
            precisely, the distance is given by

            .. math::

               d(u,v) = \\max_i {|u_i-v_i|}.

        12. ``Y = cdist(XA, XB, 'canberra')``

            Computes the Canberra distance between the points. The
            Canberra distance between two points ``u`` and ``v`` is

            .. math::

              d(u,v) = \\sum_i \\frac{|u_i-v_i|}
                                   {|u_i|+|v_i|}.

        13. ``Y = cdist(XA, XB, 'braycurtis')``

            Computes the Bray-Curtis distance between the points. The
            Bray-Curtis distance between two points ``u`` and ``v`` is


            .. math::

                 d(u,v) = \\frac{\\sum_i (|u_i-v_i|)}
                               {\\sum_i (|u_i+v_i|)}

        14. ``Y = cdist(XA, XB, 'mahalanobis', VI=None)``

            Computes the Mahalanobis distance between the points. The
            Mahalanobis distance between two points ``u`` and ``v`` is
            :math:`\\sqrt{(u-v)(1/V)(u-v)^T}` where :math:`(1/V)` (the
            ``VI`` variable) is the inverse covariance. If ``VI`` is not
            None, ``VI`` will be used as the inverse covariance matrix.

        15. ``Y = cdist(XA, XB, 'yule')``

            Computes the Yule distance between the boolean
            vectors. (see `yule` function documentation)

        16. ``Y = cdist(XA, XB, 'matching')``

            Synonym for 'hamming'.

        17. ``Y = cdist(XA, XB, 'dice')``

            Computes the Dice distance between the boolean vectors. (see
            `dice` function documentation)

        18. ``Y = cdist(XA, XB, 'kulsinski')``

            Computes the Kulsinski distance between the boolean
            vectors. (see `kulsinski` function documentation)

        19. ``Y = cdist(XA, XB, 'rogerstanimoto')``

            Computes the Rogers-Tanimoto distance between the boolean
            vectors. (see `rogerstanimoto` function documentation)

        20. ``Y = cdist(XA, XB, 'russellrao')``

            Computes the Russell-Rao distance between the boolean
            vectors. (see `russellrao` function documentation)

        21. ``Y = cdist(XA, XB, 'sokalmichener')``

            Computes the Sokal-Michener distance between the boolean
            vectors. (see `sokalmichener` function documentation)

        22. ``Y = cdist(XA, XB, 'sokalsneath')``

            Computes the Sokal-Sneath distance between the vectors. (see
            `sokalsneath` function documentation)


        23. ``Y = cdist(XA, XB, 'wminkowski', p=2., w=w)``

            Computes the weighted Minkowski distance between the
            vectors. (see `wminkowski` function documentation)

            'wminkowski' is deprecated and will be removed in SciPy 1.8.0.
            Use 'minkowski' instead.

        24. ``Y = cdist(XA, XB, f)``

            Computes the distance between all pairs of vectors in X
            using the user supplied 2-arity function f. For example,
            Euclidean distance between the vectors could be computed
            as follows::

              dm = cdist(XA, XB, lambda u, v: np.sqrt(((u-v)**2).sum()))

            Note that you should avoid passing a reference to one of
            the distance functions defined in this library. For example,::

              dm = cdist(XA, XB, sokalsneath)

            would calculate the pair-wise distances between the vectors in
            X using the Python function `sokalsneath`. This would result in
            sokalsneath being called :math:`{n \\choose 2}` times, which
            is inefficient. Instead, the optimized C version is more
            efficient, and we call it using the following syntax::

              dm = cdist(XA, XB, 'sokalsneath')

        Examples
        --------
        Find the Euclidean distances between four 2-D coordinates:

        >>> from scipy.spatial import distance
        >>> coords = [(35.0456, -85.2672),
        ...           (35.1174, -89.9711),
        ...           (35.9728, -83.9422),
        ...           (36.1667, -86.7833)]
        >>> distance.cdist(coords, coords, 'euclidean')
        array([[ 0.    ,  4.7044,  1.6172,  1.8856],
               [ 4.7044,  0.    ,  6.0893,  3.3561],
               [ 1.6172,  6.0893,  0.    ,  2.8477],
               [ 1.8856,  3.3561,  2.8477,  0.    ]])


        Find the Manhattan distance from a 3-D point to the corners of the
        unit cube:

        >>> a = np.array([[0, 0, 0],
        ...               [0, 0, 1],
        ...               [0, 1, 0],
        ...               [0, 1, 1],
        ...               [1, 0, 0],
        ...               [1, 0, 1],
        ...               [1, 1, 0],
        ...               [1, 1, 1]])
        >>> b = np.array([[ 0.1,  0.2,  0.4]])
        >>> distance.cdist(a, b, 'cityblock')
        array([[ 0.7],
               [ 0.9],
               [ 1.3],
               [ 1.5],
               [ 1.5],
               [ 1.7],
               [ 2.1],
               [ 2.3]])

        """
        # You can also call this as:
        #     Y = cdist(XA, XB, 'test_abc')
        # where 'abc' is the metric being tested.  This computes the distance
        # between all pairs of vectors in XA and XB using the distance metric
        # 'abc' but with a more succinct, verifiable, but less efficient
        # implementation.

        XA = np.asarray(XA)
        XB = np.asarray(XB)

        s = XA.shape
        sB = XB.shape

        if len(s) != 2:
            raise ValueError('XA must be a 2-dimensional array.')
        if len(sB) != 2:
            raise ValueError('XB must be a 2-dimensional array.')
        if s[1] != sB[1]:
            raise ValueError('XA and XB must have the same number of columns '
                             '(i.e. feature dimension.)')

        mA = s[0]
        mB = sB[0]
        n = s[1]

        if callable(metric):
            mstr = getattr(metric, '__name__', 'Unknown')
            metric_info = _METRIC_ALIAS.get(mstr, None)
            if metric_info is not None:
                XA, XB, typ, kwargs = _validate_cdist_input(
                    XA, XB, mA, mB, n, metric_info, **kwargs)
            return _cdist_callable(XA, XB, metric=metric, out=out, **kwargs)
        elif isinstance(metric, str):
            mstr = metric.lower()
            metric_info = _METRIC_ALIAS.get(mstr, None)
            if metric_info is not None:
                cdist_fn = metric_info.cdist_func
>               return cdist_fn(XA, XB, out=out, **kwargs)
E               ValueError: Unsupported dtype object

/usr/lib/python3/dist-packages/scipy/spatial/distance.py:2954: ValueError
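
In all four failures the object-dtype XA comes from the call at
skbio/stats/distance/_permdisp.py:251, where df.values[:, :-1] is passed
straight to cdist; presumably the DataFrame also carries a non-numeric
grouping column, so .values falls back to dtype=object even though the
coordinate columns themselves are floats. If that reading is right, a
possible fix on the skbio side (just a sketch, not a tested patch) would be
to force a float dtype before calling cdist:

    import numpy as np
    import pandas as pd
    from scipy.spatial.distance import cdist

    # Toy stand-in for the DataFrame handled in _compute_groups: float
    # coordinate columns plus a string grouping column, so .values becomes
    # an object-dtype array.
    df = pd.DataFrame({'PC1': [0.1, 0.4],
                       'PC2': [0.2, 0.5],
                       'grouping': ['a', 'b']})
    centroid = np.array([[0.0, 0.0]])

    coords = df.values[:, :-1]                     # dtype=object, rejected by scipy 1.7.1
    coords = np.asarray(coords, dtype=np.float64)  # restores the old behaviour

    print(cdist(coords, centroid, 'euclidean'))

Alternatively, the implicit coercion could be restored on the scipy side,
since the docstring quoted above still says "Inputs are converted to float
type".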
___________________ testPERMDISP.test_centroids_mixedgroups ____________________

self = <skbio.stats.distance.tests.test_permdisp.testPERMDISP testMethod=test_centroids_mixedgroups>

    def test_centroids_mixedgroups(self):
        exp = [[2.5847022428144935, 2.285624595858895,
                1.7022431146340287],
               [1.724817266046108, 1.724817266046108],
               [2.4333280644972795, 2.389000390879655,
                2.8547180589306036, 3.218568759338847]]
        dm = pcoa(self.uneq_mat)
        dm = dm.samples

        exp_stat, _ = f_oneway(*exp)

>       obs_mixed = _compute_groups(dm, 'centroid', self.grouping_un_mixed)

skbio/stats/distance/tests/test_permdisp.py:158:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
skbio/stats/distance/_permdisp.py:251: in _compute_groups
    groups.append(cdist(df.values[:, :-1], [centroids.loc[label].values],
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

XA = array([[-1.0559100095567946, 1.5297925821761993, 1.5885925999277593,
        -0.27423308995024576, -0.9980655246587443...8692, -1.8614901635882695,
        0.018065319797415144, 0.18781865676804516, 0.0, 0.0, 0.0, 0.0]],
      dtype=object)
XB = array([[-0.96679999,  0.24357552, -0.42381734,  0.41217343, -0.29276908,
         0.        ,  0.        ,  0.        ,  0.        ]])
metric = 'euclidean', out = None, kwargs = {}, s = (3, 9), sB = (1, 9), mA = 3
mB = 1, n = 9, mstr = 'euclidean'

    [cdist source and docstring identical to the first failure above; elided]
>               return cdist_fn(XA, XB, out=out, **kwargs)
E               ValueError: Unsupported dtype object

/usr/lib/python3/dist-packages/scipy/spatial/distance.py:2954: ValueError
_______________________ testPERMDISP.test_centroids_null _______________________

self = <skbio.stats.distance.tests.test_permdisp.testPERMDISP testMethod=test_centroids_null>

    def test_centroids_null(self):
        dm = pcoa(self.null_mat)
        dm = dm.samples

>       obs_null = _compute_groups(dm, 'centroid', self.grouping_eq)

skbio/stats/distance/tests/test_permdisp.py:165:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
skbio/stats/distance/_permdisp.py:251: in _compute_groups
    groups.append(cdist(df.values[:, :-1], [centroids.loc[label].values],
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

XA = array([[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
       [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
       [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]], dtype=object)
XB = array([[0., 0., 0., 0., 0., 0.]]), metric = 'euclidean', out = None
kwargs = {}, s = (3, 6), sB = (1, 6), mA = 3, mB = 1, n = 6, mstr = 'euclidean'

    [cdist source and docstring identical to the first failure above; elided]
>               return cdist_fn(XA, XB, out=out, **kwargs)
E               ValueError: Unsupported dtype object

/usr/lib/python3/dist-packages/scipy/spatial/distance.py:2954: ValueError
___________________ testPERMDISP.test_centroids_uneq_groups ____________________

self = <skbio.stats.distance.tests.test_permdisp.testPERMDISP testMethod=test_centroids_uneq_groups>

    def test_centroids_uneq_groups(self):
        """
        the expected result here was calculated by hand
        """
        exp = [[2.5847022428144935, 2.285624595858895,
                1.7022431146340287],
               [1.724817266046108, 1.724817266046108],
               [2.4333280644972795, 2.389000390879655,
                2.8547180589306036, 3.218568759338847]]
        exp_stat, _ = f_oneway(*exp)

        dm = pcoa(self.uneq_mat)
        dm = dm.samples

>       obs = _compute_groups(dm, 'centroid', self.grouping_uneq)

skbio/stats/distance/tests/test_permdisp.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
skbio/stats/distance/_permdisp.py:251: in _compute_groups
    groups.append(cdist(df.values[:, :-1], [centroids.loc[label].values],
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

XA = array([[-1.0559100095567946, 1.5297925821761993, 1.5885925999277593,
        -0.27423308995024576, -0.9980655246587443...8692, -1.8614901635882695,
        0.018065319797415144, 0.18781865676804516, 0.0, 0.0, 0.0, 0.0]],
      dtype=object)
XB = array([[-0.96679999,  0.24357552, -0.42381734,  0.41217343, -0.29276908,
         0.        ,  0.        ,  0.        ,  0.        ]])
metric = 'euclidean', out = None, kwargs = {}, s = (3, 9), sB = (1, 9), mA = 3
mB = 1, n = 9, mstr = 'euclidean'

    [cdist source and docstring identical to the first failure above; elided]
>               return cdist_fn(XA, XB, out=out, **kwargs)
E               ValueError: Unsupported dtype object

/usr/lib/python3/dist-packages/scipy/spatial/distance.py:2954: ValueError
