Bug#1071722: adios4dolfinx: FTBFS: failing tests
Santiago Vila
sanvila at debian.org
Sat May 25 17:31:10 BST 2024
On 25/5/24 at 16:42, Drew Parsons wrote:
> Source: adios4dolfinx
> Followup-For: Bug #1071722
> Control: tags -1 ftbfs
>
> adios4dolfinx is building cleanly in reproducibility builds.
> Perhaps the problem was a temporary glitch on your test system?
No, this is unlikely to be a temporary glitch:
Status: successful adios4dolfinx_0.7.3-1_amd64-20240215T153414.310Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240504T105910.601Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240504T105911.782Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240504T105926.289Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240504T105927.312Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240508T222341.343Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240508T230025.295Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240520T162738.912Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T093906.909Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T093908.800Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T093910.225Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T095911.253Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T095911.177Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T100100.391Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240524T100158.350Z
Status: failed adios4dolfinx_0.8.1.post0-1_amd64-20240525T133854.167Z
My system has 2 CPUs, but apparently MPI counts them as "one engine"
and fails, because the code has hard-coded calls like this:

    ipp.Cluster(engines="mpi", n=2)
This bypasses whatever parallel=n value the user might set in
DEB_BUILD_OPTIONS.
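For reference, the failing pattern boils down to something like the
following (a minimal sketch, assuming ipyparallel 8.x; the hello
function is mine, for illustration only):

    import ipyparallel as ipp

    def hello():
        # Runs on each MPI engine once the cluster is up.
        return "engine up"

    # Unconditionally asks for 2 MPI engines. On a builder where the
    # MPI runtime only detects one available slot, engine startup
    # fails or times out instead of falling back to fewer engines.
    with ipp.Cluster(engines="mpi", n=2) as rc:
        print(rc[:].apply_sync(hello))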
BTW: This bug looks similar to #1057556, which the maintainer
misdiagnosed as "fails with a single cpu" (not true!).
In that bug I proposed running mpirun on /bin/true first and skipping
the tests when that fails.
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1057556#34
Maybe a similar idea would work here as well.
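Translated to this package, a conftest.py along these lines might do
it (just a sketch; the file placement, the 60-second timeout and the
2-process probe are my assumptions, not taken from the adios4dolfinx
packaging):

    import shutil
    import subprocess

    import pytest

    def mpi_works(n=2):
        # Probe whether mpirun can actually start n copies of 'true'.
        mpirun = shutil.which("mpirun")
        if mpirun is None:
            return False
        try:
            proc = subprocess.run([mpirun, "-n", str(n), "true"],
                                  timeout=60)
        except subprocess.TimeoutExpired:
            return False
        return proc.returncode == 0

    def pytest_collection_modifyitems(config, items):
        # Skip the whole test suite when MPI cannot launch 2
        # processes, instead of failing the build.
        if not mpi_works():
            skip = pytest.mark.skip(reason="mpirun -n 2 true failed")
            for item in items:
                item.add_marker(skip)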
Thanks.