Bug#1101852: joblib: FTBFS: failing tests

Santiago Vila sanvila at debian.org
Tue Apr 1 18:12:42 BST 2025


Package: src:joblib
Version: 1.4.2-3
Severity: serious
Tags: ftbfs trixie sid

Dear maintainer:

During a rebuild of all packages in unstable, your package failed to build:

--------------------------------------------------------------------------------
[...]
 debian/rules clean
dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
   dh_autoreconf_clean -O--buildsystem=pybuild
   dh_clean -O--buildsystem=pybuild
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
   dh_auto_build -O--buildsystem=pybuild
I: pybuild plugin_pyproject:129: Building wheel for python3.13 with "build" module
I: pybuild base:311: python3.13 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib  
* Building wheel...

[... snipped ...]

[Worker 72411] Mean for slice 24 is 0.499804
[Worker 72411] Mean for slice 25 is 0.499862
[Worker 72411] Mean for slice 26 is 0.500288
[Worker 72411] Mean for slice 27 is 0.500574
[Worker 72411] Mean for slice 36 is 0.499939
[Worker 72411] Mean for slice 37 is 0.499588
[Worker 72411] Mean for slice 38 is 0.499484
[Worker 72411] Mean for slice 39 is 0.499655
[Worker 72411] Mean for slice 40 is 0.499515
[Worker 72411] Mean for slice 41 is 0.499870
[Worker 72411] Mean for slice 42 is 0.499825
[Worker 72411] Mean for slice 43 is 0.500291
[Worker 72411] Mean for slice 52 is 0.500135
[Worker 72411] Mean for slice 53 is 0.499961
[Worker 72411] Mean for slice 54 is 0.500484
[Worker 72411] Mean for slice 55 is 0.500101
[Worker 72411] Mean for slice 56 is 0.499932
[Worker 72411] Mean for slice 57 is 0.499655
[Worker 72411] Mean for slice 58 is 0.499578
[Worker 72411] Mean for slice 59 is 0.499419
[Worker 72411] Mean for slice 76 is 0.500085
[Worker 72411] Mean for slice 77 is 0.500091
[Worker 72411] Mean for slice 78 is 0.499649
[Worker 72411] Mean for slice 79 is 0.499709
[Worker 72411] Mean for slice 80 is 0.499443
[Worker 72411] Mean for slice 81 is 0.499264
[Worker 72411] Mean for slice 82 is 0.499219
[Worker 72411] Mean for slice 83 is 0.499209
[Worker 72411] Mean for slice 84 is 0.499090
[Worker 72411] Mean for slice 85 is 0.499414
[Worker 72411] Mean for slice 86 is 0.499436
[Worker 72411] Mean for slice 87 is 0.499538
[Worker 72411] Mean for slice 88 is 0.499982
[Worker 72411] Mean for slice 89 is 0.499890
[Worker 72411] Mean for slice 90 is 0.499794
[Worker 72411] Mean for slice 91 is 0.500261
[Worker 72411] Mean for slice 93 is 0.499729
[Worker 72410] Mean for slice 1 is 0.499822
[Worker 72410] Mean for slice 2 is 0.500192
[Worker 72410] Mean for slice 4 is 0.499882
[Worker 72410] Mean for slice 5 is 0.500227
[Worker 72410] Mean for slice 8 is 0.500768
[Worker 72410] Mean for slice 9 is 0.500827
[Worker 72410] Mean for slice 12 is 0.500622
[Worker 72410] Mean for slice 13 is 0.500309
[Worker 72410] Mean for slice 14 is 0.500301
[Worker 72410] Mean for slice 15 is 0.500188
[Worker 72410] Mean for slice 20 is 0.499801
[Worker 72410] Mean for slice 21 is 0.499498
[Worker 72410] Mean for slice 22 is 0.499277
[Worker 72410] Mean for slice 23 is 0.499704
[Worker 72410] Mean for slice 28 is 0.500372
[Worker 72410] Mean for slice 29 is 0.500678
[Worker 72410] Mean for slice 30 is 0.500472
[Worker 72410] Mean for slice 31 is 0.500405
[Worker 72410] Mean for slice 32 is 0.500430
[Worker 72410] Mean for slice 33 is 0.500550
[Worker 72410] Mean for slice 34 is 0.500390
[Worker 72410] Mean for slice 35 is 0.500548
[Worker 72410] Mean for slice 44 is 0.500116
[Worker 72410] Mean for slice 45 is 0.500156
[Worker 72410] Mean for slice 46 is 0.500287
[Worker 72410] Mean for slice 47 is 0.500436
[Worker 72410] Mean for slice 48 is 0.500211
[Worker 72410] Mean for slice 49 is 0.499870
[Worker 72410] Mean for slice 50 is 0.499956
[Worker 72410] Mean for slice 51 is 0.500031
[Worker 72410] Mean for slice 60 is 0.499742
[Worker 72410] Mean for slice 61 is 0.499482
[Worker 72410] Mean for slice 62 is 0.499713
[Worker 72410] Mean for slice 63 is 0.499787
[Worker 72410] Mean for slice 64 is 0.499745
[Worker 72410] Mean for slice 65 is 0.499882
[Worker 72410] Mean for slice 66 is 0.500111
[Worker 72410] Mean for slice 67 is 0.500058
[Worker 72410] Mean for slice 68 is 0.500018
[Worker 72410] Mean for slice 69 is 0.500003
[Worker 72410] Mean for slice 70 is 0.499966
[Worker 72410] Mean for slice 71 is 0.500159
[Worker 72410] Mean for slice 72 is 0.500172
[Worker 72410] Mean for slice 73 is 0.500485
[Worker 72410] Mean for slice 74 is 0.500487
[Worker 72410] Mean for slice 75 is 0.500367
[Worker 72410] Mean for slice 92 is 0.500122
[Worker 72410] Mean for slice 94 is 0.499784
Exception ignored in: <function ResourceTracker.__del__ at 0x7f44966825c0>
Traceback (most recent call last):
  File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 82, in __del__
  File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 91, in _stop
  File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 116, in _stop_locked
ChildProcessError: [Errno 10] No child processes
Exception ignored in: <function ResourceTracker.__del__ at 0x7fdbab0825c0>
Traceback (most recent call last):
  File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 82, in __del__
  File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 91, in _stop
  File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 116, in _stop_locked
ChildProcessError: [Errno 10] No child processes
make[2]: Leaving directory '/<<PKGBUILDDIR>>/doc'
make[1]: Leaving directory '/<<PKGBUILDDIR>>'
   dh_auto_test -O--buildsystem=pybuild
I: pybuild pybuild:308: cp -v debian/conftest.py /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build
'debian/conftest.py' -> '/<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/conftest.py'
I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build; python3.13 -m pytest -k "not test_nested_loop_error_in_grandchild_resource_tracker_silent and not test_resource_tracker_silent_when_reference_cycles and not test_parallel_with_interactively_defined_functions_default_backend and not test_joblib_pickle_across_python_versions" --confcutdir=.
============================= test session starts ==============================
platform linux -- Python 3.13.2, pytest-8.3.5, pluggy-1.5.0
rootdir: /<<PKGBUILDDIR>>
configfile: pyproject.toml
plugins: typeguard-4.4.2
collected 1487 items / 4 deselected / 2 skipped / 1483 selected

joblib/__init__.py .                                                     [  0%]
joblib/_utils.py .                                                       [  0%]
joblib/parallel.py ...                                                   [  0%]
joblib/test/data/create_numpy_pickle.py .                                [  0%]
joblib/test/test_backports.py .....                                      [  0%]
joblib/test/test_cloudpickle_wrapper.py .                                [  0%]
joblib/test/test_config.py ......................                        [  2%]
joblib/test/test_disk.py .........                                       [  2%]
joblib/test/test_func_inspect.py ......................................  [  5%]
joblib/test/test_hashing.py ............................................ [  8%]
........................................................................ [ 13%]
........................................................................ [ 18%]
........................................................................ [ 22%]
........................................................................ [ 27%]
........................................................................ [ 32%]
........................................................................ [ 37%]
.....................................                                    [ 40%]
joblib/test/test_init.py .                                               [ 40%]
joblib/test/test_logger.py .                                             [ 40%]
joblib/test/test_memmapping.py .s.........F.....F.F....X................ [ 42%]
........                                                                 [ 43%]
joblib/test/test_memory.py .xx.......................................... [ 46%]
.....................                                                    [ 47%]
joblib/test/test_memory_async.py sssss                                   [ 48%]
joblib/test/test_missing_multiprocessing.py .                            [ 48%]
joblib/test/test_module.py ....                                          [ 48%]
joblib/test/test_numpy_pickle.py ....................................... [ 51%]
..........s............................................................. [ 56%]
..............s..                                                        [ 57%]
joblib/test/test_numpy_pickle_compat.py .                                [ 57%]
joblib/test/test_numpy_pickle_utils.py ..                                [ 57%]
joblib/test/test_parallel.py ........................................... [ 60%]
........................................................................ [ 65%]
........................................................................ [ 70%]
...........................XXX......................s................... [ 74%]
...................................................................s.ss. [ 79%]
s...s.ss.s...s.ss.s..............ssss................................... [ 84%]
.............................ssssssss................................... [ 89%]
........................................................................ [ 94%]
.................ss....ss.....ssssssssssssssssssss...............        [ 98%]
joblib/test/test_store_backends.py .....                                 [ 99%]
joblib/test/test_testing.py .....                                        [ 99%]
joblib/test/test_utils.py .........                                      [100%]

=================================== FAILURES ===================================
__________ test_permission_error_windows_memmap_sent_to_parent[loky] ___________

backend = 'loky'

    @with_numpy
    @with_multiprocessing
    @parametrize("backend", ["multiprocessing", "loky"])
    def test_permission_error_windows_memmap_sent_to_parent(backend):
        # Second non-regression test for:
        # https://github.com/joblib/joblib/issues/806
        # previously, child process would not convert temporary memmaps to numpy
        # arrays when sending the data back to the parent process. This would lead
        # to permission errors on windows when deleting joblib's temporary folder,
        # as the memmaped files handles would still opened in the parent process.
        cmd = '''if 1:
            import os
            import time
    
            import numpy as np
    
            from joblib import Parallel, delayed
            from testutils import return_slice_of_data
    
            data = np.ones(int(2e6))
    
            if __name__ == '__main__':
                # warm-up call to launch the workers and start the resource_tracker
                _ = Parallel(n_jobs=2, verbose=5, backend='{b}')(
                    delayed(id)(i) for i in range(20))
    
                time.sleep(0.5)
    
                slice_of_data = Parallel(n_jobs=2, verbose=5, backend='{b}')(
                    delayed(return_slice_of_data)(data, 0, 20) for _ in range(10))
        '''.format(b=backend)
    
        for _ in range(3):
            env = os.environ.copy()
            env['PYTHONPATH'] = os.path.dirname(__file__)
            p = subprocess.Popen([sys.executable, '-c', cmd],
                                 stderr=subprocess.PIPE,
                                 stdout=subprocess.PIPE, env=env)
            p.wait()
            out, err = p.communicate()
            assert p.returncode == 0, err
            assert out == b''
            if sys.version_info[:3] not in [(3, 8, 0), (3, 8, 1)]:
                # In early versions of Python 3.8, a reference leak
                # https://github.com/cloudpipe/cloudpickle/issues/327, holds
                # references to pickled objects, generating race condition during
                # cleanup finalizers of joblib and noisy resource_tracker outputs.
>               assert b'resource_tracker' not in err
E               assert b'resource_tracker' not in b'[Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers.\n[Parallel(n_jobs=2)]: Done  12 out of  2...13/multiprocessing/resource_tracker.py", line 116, in _stop_locked\nChildProcessError: [Errno 10] No child processes\n'

joblib/test/test_memmapping.py:441: AssertionError
_______ test_multithreaded_parallel_termination_resource_tracker_silent ________

    @with_numpy
    @with_multiprocessing
    def test_multithreaded_parallel_termination_resource_tracker_silent():
        # test that concurrent termination attempts of a same executor does not
        # emit any spurious error from the resource_tracker. We test various
        # situations making 0, 1 or both parallel call sending a task that will
        # make the worker (and thus the whole Parallel call) error out.
        cmd = '''if 1:
            import os
            import numpy as np
            from joblib import Parallel, delayed
            from joblib.externals.loky.backend import resource_tracker
            from concurrent.futures import ThreadPoolExecutor, wait
    
            resource_tracker.VERBOSE = 0
    
            array = np.arange(int(1e2))
    
            temp_dirs_thread_1 = set()
            temp_dirs_thread_2 = set()
    
    
            def raise_error(array):
                raise ValueError
    
    
            def parallel_get_filename(array, temp_dirs):
                with Parallel(backend="loky", n_jobs=2, max_nbytes=10) as p:
                    for i in range(10):
                        [filename] = p(
                            delayed(getattr)(array, "filename") for _ in range(1)
                        )
                        temp_dirs.add(os.path.dirname(filename))
    
    
            def parallel_raise(array, temp_dirs):
                with Parallel(backend="loky", n_jobs=2, max_nbytes=10) as p:
                    for i in range(10):
                        [filename] = p(
                            delayed(raise_error)(array) for _ in range(1)
                        )
                        temp_dirs.add(os.path.dirname(filename))
    
    
            executor = ThreadPoolExecutor(max_workers=2)
    
            # both function calls will use the same loky executor, but with a
            # different Parallel object.
            future_1 = executor.submit({f1}, array, temp_dirs_thread_1)
            future_2 = executor.submit({f2}, array, temp_dirs_thread_2)
    
            # Wait for both threads to terminate their backend
            wait([future_1, future_2])
    
            future_1.result()
            future_2.result()
        '''
        functions_and_returncodes = [
            ("parallel_get_filename", "parallel_get_filename", 0),
            ("parallel_get_filename", "parallel_raise", 1),
            ("parallel_raise", "parallel_raise", 1)
        ]
    
        for f1, f2, returncode in functions_and_returncodes:
            p = subprocess.Popen([sys.executable, '-c', cmd.format(f1=f1, f2=f2)],
                                 stderr=subprocess.PIPE, stdout=subprocess.PIPE)
            p.wait()
            out, err = p.communicate()
            assert p.returncode == returncode, out.decode()
>           assert b"resource_tracker" not in err, err.decode()
E           AssertionError: Exception ignored in: <function ResourceTracker.__del__ at 0x7ff0a639a5c0>
E             Traceback (most recent call last):
E               File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 82, in __del__
E             Exception ignored in: <function ResourceTracker.__del__ at 0x7f737c48a5c0>
E             Traceback (most recent call last):
E               File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 82, in __del__
E               File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 91, in _stop
E               File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 116, in _stop_locked
E               File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 91, in _stop
E             ChildProcessError: [Errno 10] No child processes
E               File "/usr/lib/python3.13/multiprocessing/resource_tracker.py", line 116, in _stop_locked
E             ChildProcessError: [Errno 10] No child processes
E             
E           assert b'resource_tracker' not in b'Exception ignored in: <function ResourceTracker.__del__ at 0x7ff0a639a5c0>\nTraceback (most recent call last):\n  Fi...13/multiprocessing/resource_tracker.py", line 116, in _stop_locked\nChildProcessError: [Errno 10] No child processes\n'

joblib/test/test_memmapping.py:585: AssertionError
________________ test_many_parallel_calls_on_same_object[loky] _________________

backend = 'loky'

    @with_numpy
    @with_multiprocessing
    @parametrize("backend", ["multiprocessing", "loky"])
    def test_many_parallel_calls_on_same_object(backend):
        # After #966 got merged, consecutive Parallel objects were sharing temp
        # folder, which would lead to race conditions happening during the
        # temporary resources management with the resource_tracker. This is a
        # non-regression test that makes sure that consecutive Parallel operations
        # on the same object do not error out.
        cmd = '''if 1:
            import os
            import time
    
            import numpy as np
    
            from joblib import Parallel, delayed
            from testutils import return_slice_of_data
    
            data = np.ones(100)
    
            if __name__ == '__main__':
                for i in range(5):
                    slice_of_data = Parallel(
                        n_jobs=2, max_nbytes=1, backend='{b}')(
                            delayed(return_slice_of_data)(data, 0, 20)
                            for _ in range(10)
                        )
        '''.format(b=backend)
        env = os.environ.copy()
        env['PYTHONPATH'] = os.path.dirname(__file__)
        p = subprocess.Popen(
            [sys.executable, '-c', cmd],
            stderr=subprocess.PIPE,
            stdout=subprocess.PIPE,
            env=env,
        )
        p.wait()
        out, err = p.communicate()
        assert p.returncode == 0, err
        assert out == b''
        if sys.version_info[:3] not in [(3, 8, 0), (3, 8, 1)]:
            # In early versions of Python 3.8, a reference leak
            # https://github.com/cloudpipe/cloudpickle/issues/327, holds
            # references to pickled objects, generating race condition during
            # cleanup finalizers of joblib and noisy resource_tracker outputs.
>           assert b'resource_tracker' not in err
E           assert b'resource_tracker' not in b'Exception ignored in: <function ResourceTracker.__del__ at 0x7f20df882660>\nTraceback (most recent call last):\n  Fi...13/multiprocessing/resource_tracker.py", line 116, in _stop_locked\nChildProcessError: [Errno 10] No child processes\n'

joblib/test/test_memmapping.py:633: AssertionError
=============================== warnings summary ===============================
joblib/testing.py:22
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/testing.py:22: PytestUnknownMarkWarning: Unknown pytest.mark.timeout - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    timeout = pytest.mark.timeout

joblib/test/test_parallel.py:1806
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py:1806: PytestUnknownMarkWarning: Unknown pytest.mark.no_cover - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.no_cover

joblib/executor.py:105
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/executor.py:105: PytestCollectionWarning: cannot collect test class '_TestingMemmappingExecutor' because it has a __init__ constructor (from: .pybuild/cpython3_3.13_joblib/build/joblib/test/test_memmapping.py)
    class _TestingMemmappingExecutor(MemmappingExecutor):

joblib/test/test_memory_async.py:27
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py:27: PytestUnknownMarkWarning: Unknown pytest.mark.asyncio - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.asyncio

joblib/test/test_memory_async.py:68
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py:68: PytestUnknownMarkWarning: Unknown pytest.mark.asyncio - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.asyncio

joblib/test/test_memory_async.py:86
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py:86: PytestUnknownMarkWarning: Unknown pytest.mark.asyncio - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.asyncio

joblib/test/test_memory_async.py:125
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py:125: PytestUnknownMarkWarning: Unknown pytest.mark.asyncio - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.asyncio

joblib/test/test_memory_async.py:152
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py:152: PytestUnknownMarkWarning: Unknown pytest.mark.asyncio - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/how-to/mark.html
    @pytest.mark.asyncio

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_func_inspect.py::test_filter_args_2
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_func_inspect.py:131: UserWarning: Cannot inspect object functools.partial(<function f at 0x7f2a7dbd37e0>, 1), ignore list will not work.
    assert filter_args(ff, ['y'], (1, )) == {'*': [1], '**': {}}

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_hashing.py: 2 warnings
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memmapping.py: 27 warnings
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py: 143 warnings
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_store_backends.py: 2 warnings
  /usr/lib/python3.13/multiprocessing/popen_fork.py:67: DeprecationWarning: This process (pid=72639) is multi-threaded, use of fork() may lead to deadlocks in the child.
    self.pid = os.fork()

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memmapping.py: 20 warnings
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py: 152 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/externals/loky/backend/fork_exec.py:38: DeprecationWarning: This process (pid=72639) is multi-threaded, use of fork() may lead to deadlocks in the child.
    pid = os.fork()

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory.py::test_memory_integration
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory.py:104: UserWarning: Compressed results cannot be memmapped
    memory = Memory(location=tmpdir.strpath, verbose=10,

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory.py::test_memory_integration
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/memory.py:128: UserWarning: Compressed items cannot be memmapped in a filesystem store. Option will be ignored.
    obj.configure(location, verbose=verbose,

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory.py::test_memory_integration
  /usr/lib/python3.13/contextlib.py:141: UserWarning: mmap_mode "r" is not compatible with compressed file /tmp/pytest-of-buildd/pytest-0/test_memory_integration0/joblib/joblib/test/test_memory/test_memory_integration/<locals>/f/b69f9d78d7bc537482721c40ce38db0a/output.pkl. "r" flag will be ignored.
    return next(self.gen)

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py::test_memory_integration_async
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py::test_no_memory_async
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py::test_memory_numpy_check_mmap_mode_async
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py::test_call_and_shelve_async
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_memory_async.py::test_memorized_func_call_async
  /usr/lib/python3/dist-packages/_pytest/python.py:148: PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
  You need to install a suitable plugin for your async framework, for example:
    - anyio
    - pytest-asyncio
    - pytest-tornasync
    - pytest-trio
    - pytest-twisted
    warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_numpy_pickle.py::test_joblib_compression_formats[lz4-1]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_numpy_pickle.py::test_joblib_compression_formats[lz4-3]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_numpy_pickle.py::test_joblib_compression_formats[lz4-6]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_numpy_pickle.py::test_lz4_compression
  /usr/lib/python3/dist-packages/_pytest/unraisableexception.py:85: PytestUnraisableExceptionWarning: Exception ignored in: <_io.BufferedReader>
  
  Traceback (most recent call last):
    File "/usr/lib/python3/dist-packages/lz4/frame/__init__.py", line 753, in flush
      self._fp.flush()
      ~~~~~~~~~~~~~~^^
  ValueError: I/O operation on closed file.
  
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_nested_loop[threading-multiprocessing]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/parallel.py:1359: UserWarning: Multiprocessing-backed parallel loops cannot be nested below threads, setting n_jobs=1
    n_jobs = self._backend.configure(n_jobs=self.n_jobs, parallel=self,

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_nested_loop[threading-loky]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_nested_loop[threading-back_compat_backend]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/parallel.py:1359: UserWarning: Loky-backed parallel loops cannot be nested below threads, setting n_jobs=1
    n_jobs = self._backend.configure(n_jobs=self.n_jobs, parallel=self,

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_parallel_unordered_generator_returns_fastest_first[threading-2]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_parallel_unordered_generator_returns_fastest_first[threading-4]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_parallel_unordered_generator_returns_fastest_first[loky-2]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_parallel_unordered_generator_returns_fastest_first[loky-4]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/parallel.py:1817: UserWarning: 1 tasks which were still being processed by the workers have been cancelled. You could benefit from adjusting the input task iterator to limit unnecessary computation time.
    warnings.warn(msg)

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_deadlock_with_generator[2-generator-loky]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_deadlock_with_generator[2-generator_unordered-loky]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_deadlock_with_generator[-1-generator-loky]
.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py::test_deadlock_with_generator[-1-generator_unordered-loky]
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/parallel.py:1817: UserWarning: 6 tasks which were still being processed by the workers have been cancelled. You could benefit from adjusting the input task iterator to limit unnecessary computation time.
    warnings.warn(msg)

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_parallel.py: 24 warnings
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/parallel.py:1817: UserWarning: 4 tasks which were still being processed by the workers have been cancelled. You could benefit from adjusting the input task iterator to limit unnecessary computation time.
    warnings.warn(msg)

.pybuild/cpython3_3.13_joblib/build/joblib/test/test_testing.py::test_check_subprocess_call_timeout
  /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build/joblib/testing.py:58: UserWarning: Timeout running ['/usr/bin/python3.13', '-c', 'import time\nimport sys\nprint("before sleep on stdout")\nsys.stdout.flush()\nsys.stderr.write("before sleep on stderr")\nsys.stderr.flush()\ntime.sleep(10)\nprint("process should have be killed before")\nsys.stdout.flush()']
    warnings.warn(f"Timeout running {cmd}")

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED joblib/test/test_memmapping.py::test_permission_error_windows_memmap_sent_to_parent[loky] - assert b'resource_tracker' not in b'[Parallel(n_jobs=2)]: Using backend Lok...
FAILED joblib/test/test_memmapping.py::test_multithreaded_parallel_termination_resource_tracker_silent - AssertionError: Exception ignored in: <function ResourceTracker.__del__ at ...
FAILED joblib/test/test_memmapping.py::test_many_parallel_calls_on_same_object[loky] - assert b'resource_tracker' not in b'Exception ignored in: <function Resourc...
= 3 failed, 1417 passed, 59 skipped, 4 deselected, 2 xfailed, 4 xpassed, 403 warnings in 63.29s (0:01:03) =
E: pybuild pybuild:389: test: plugin pyproject failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.13_joblib/build; python3.13 -m pytest -k "not test_nested_loop_error_in_grandchild_resource_tracker_silent and not test_resource_tracker_silent_when_reference_cycles and not test_parallel_with_interactively_defined_functions_default_backend and not test_joblib_pickle_across_python_versions" --confcutdir=.
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.13 returned exit code 13
make: *** [debian/rules:27: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess returned exit status 2
--------------------------------------------------------------------------------

The above is just how the build ends and is not necessarily the most relevant part.
If required, the full build log is available here:

https://people.debian.org/~sanvila/build-logs/202504/
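
In case it helps to narrow this down: all three failures are resource_tracker
assertions in joblib/test/test_memmapping.py, and in each case stderr contains
the "Exception ignored in: <function ResourceTracker.__del__ ...>" /
"ChildProcessError: [Errno 10] No child processes" output coming from
/usr/lib/python3.13/multiprocessing/resource_tracker.py. They should be
reproducible in isolation with something along these lines (only a sketch based
on the pytest command shown in the log; paths are the ones pybuild uses and may
differ on your side):

    # run only the three failing tests, from the already-built source tree
    cd .pybuild/cpython3_3.13_joblib/build
    python3.13 -m pytest joblib/test/test_memmapping.py \
        -k "test_permission_error_windows_memmap_sent_to_parent or \
            test_multithreaded_parallel_termination_resource_tracker_silent or \
            test_many_parallel_calls_on_same_object"

If this turns out to be harmless shutdown noise from Python 3.13 rather than a
real joblib regression, the -k deselection already used for the other
resource_tracker tests could presumably be extended to these three until the
issue is fixed upstream.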

About the archive rebuild: The build was made on virtual machines from AWS,
using sbuild and a reduced chroot with only build-essential packages.
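
If you want to recreate a similar environment locally, something like this
should be close (only a sketch: it assumes deb-src entries for unstable in
sources.list and an existing sbuild setup; adjust chroot/distribution names to
your system):

    # fetch the source currently in unstable and build it in a clean sid chroot
    apt-get source joblib=1.4.2-3
    sbuild -d unstable joblib_1.4.2-3.dsc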

If you could not reproduce the bug, please contact me privately, as I
am willing to provide ssh access to a virtual machine where the bug is
fully reproducible.

If this is really a bug in one of the build-depends, please reassign the
bug and add an "affects" on src:joblib, so that it remains visible on the
BTS web page for this package.
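
For reference, with the bts tool from devscripts that would be something like
(PACKAGE being whichever build dependency is actually at fault):

    bts reassign 1101852 PACKAGE
    bts affects 1101852 + src:joblib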

Thanks.


