Bug#968118: dolfin: 64-bit tests fail
Drew Parsons
dparsons at debian.org
Sun Aug 9 12:16:51 BST 2020
Source: dolfin
Version: 2019.2.0~git20200218.027d9cc-12
Severity: normal
Control: forwarded -1 https://bitbucket.org/fenics-project/dolfin/issues/1113/
The dolfin64 tests in debian/tests (autopkgtest) currently fail.
debci provides failure logs, e.g.
https://ci.debian.net/data/autopkgtest/unstable/amd64/d/dolfin/6522059/log.gz
There seem to be three classes of failure.
1) “Out of memory”, e.g. in the C++ unit tests:
Run 64-bit C++ unit tests (serial)
Test project /tmp/autopkgtest-lxc.k8ge4uns/downtmp/build.QiR/src/dolfin-unittests
Start 1: unittests
1/1 Test #1: unittests ........................***Failed 1.50 sec
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
unittests is a Catch v1.9.6 host application.
Run with -? for options
-------------------------------------------------------------------------------
Initialise PETSc
-------------------------------------------------------------------------------
/tmp/autopkgtest-lxc.k8ge4uns/downtmp/build.QiR/src/test/unit/cpp/common/SubSystemsManager.cpp:42
...............................................................................
/tmp/autopkgtest-lxc.k8ge4uns/downtmp/build.QiR/src/test/unit/cpp/common/SubSystemsManager.cpp:44: FAILED:
CHECK_NOTHROW( init_petsc() )
due to unexpected exception with message:
*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
*** fenics-support at googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error: Unable to successfully call PETSc function 'VecSetSizes'.
*** Reason: PETSc error code is: 55 (Out of memory).
*** Where: This error was encountered inside /build/dolfin-HRGZgk/dolfin-
2019.2.0~git20200218.027d9cc/dolfin/la/PETScVector.cpp.
*** Process: 0
***
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset: unknown
*** -------------------------------------------------------------------------
2) KSPSolve fails in some demos (others pass), e.g.
Start 13: demo_multimesh-stokes_mpi
7/48 Test #13: demo_multimesh-stokes_mpi .............. Passed 0.40 sec
Start 15: demo_sym-dirichlet-bc_mpi
8/48 Test #15: demo_sym-dirichlet-bc_mpi ..............***Failed 1.75 sec
Process 0: <Table of size 2 x 1>
Process 1: <Table of size 2 x 1>
Process 0: *** Warning: Using PETSc native LU solver. Consider specifying a more efficient LU solver (e.g.umfpack) if available.
Process 2: <Table of size 2 x 1>
Process 1: *** Warning: Using PETSc native LU solver. Consider specifying a more efficient LU solver (e.g.umfpack) if available.
Process 2: *** Warning: Using PETSc native LU solver. Consider specifying a more efficient LU solver (e.g.umfpack) if available.
terminate called after throwing an instance of 'std::runtime_error'
terminate called after throwing an instance of 'std::runtime_error'
terminate called after throwing an instance of 'std::runtime_error'
what(): what():
*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
*** fenics-support at googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error: Unable to successfully call PETSc function 'KSPSolve'.
*** Reason: PETSc error code is: 92 (See https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers).
*** Where: This error was encountered inside /build/dolfin-HRGZgk/dolfin-2019.2.0~git20200218.027d9cc/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 2
***
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset: unknown
*** -------------------------------------------------------------------------
[ci-215-65927b02:02396] *** Process received signal ***
[ci-215-65927b02:02396] Signal: Aborted (6)
[ci-215-65927b02:02396] Signal code: (-6)
Given the reference to “possible LU and Cholesky solvers” in the
KSPSolve error message, I suspect the second class might simply mean
that 64-bit PETSc doesn’t provide the linear solvers that dolfin would
otherwise invoke. superlu, for instance, is not available in the 64-bit
build. Would that explain the error?
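If that suspicion is right, it should be checkable from the Python
layer. A minimal sketch, assuming the legacy DOLFIN Python API
(lu_solver_methods() lists the LU backends the build was compiled
against; the fallback set here is purely illustrative, for when dolfin
itself is not importable):

```python
# Sketch: compare the LU backends this DOLFIN build exposes against the
# ones the demos may request. Which names appear depends on the build.
try:
    from dolfin import lu_solver_methods
    available = set(lu_solver_methods().keys())
except ImportError:
    # Illustrative fallback only; a 64-bit build missing superlu et al.
    # would look roughly like this.
    available = {"default", "petsc"}

wanted = {"superlu", "mumps", "umfpack"}
missing = sorted(wanted - available)
print("missing LU backends:", missing)
```

Running this inside the dolfin64 test environment would confirm whether
the backends named in the PETSc solver table are actually present.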
3) MPI_INIT problems, e.g.
18/49 Test #38: demo_curl-curl_serial .....................***Failed 0.02 sec
*** The MPI_Comm_rank() function was called before MPI_INIT was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[ci-215-65927b02:02093] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
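For reference, the MPI standard only permits MPI_Comm_rank() after
initialisation; the guard the failing demo apparently skips looks like
this (a sketch using mpi4py names, where MPI.Is_initialized() mirrors
the C call MPI_Initialized(); note mpi4py normally initialises MPI at
import time, so the explicit Init() branch is illustrative only):

```python
# Guard pattern: query the rank only after confirming MPI has been
# initialised, instead of calling MPI_Comm_rank unconditionally.
try:
    from mpi4py import MPI

    def safe_rank():
        # Mirrors the C-level MPI_Initialized(&flag) check
        if not MPI.Is_initialized():
            MPI.Init()
        return MPI.COMM_WORLD.Get_rank()
except ImportError:
    def safe_rank():
        return 0  # serial fallback when no MPI is available

print("rank:", safe_rank())
```

The C++ demo presumably needs the equivalent check (or an earlier
SubSystemsManager initialisation) on this code path.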
Reported upstream at
https://bitbucket.org/fenics-project/dolfin/issues/1113/
For now, the dolfin64 tests are disabled in debian/tests
(from 2019.2.0~git20200218.027d9cc-13).
The 64-bit build should be considered "experimental" for the time
being, hence these test failures should not hold up migration to testing.
-- System Information:
Debian Release: bullseye/sid
APT prefers unstable
APT policy: (500, 'unstable'), (1, 'experimental')
Architecture: amd64 (x86_64)
Foreign Architectures: i386
Kernel: Linux 5.7.0-2-amd64 (SMP w/8 CPU threads)
Locale: LANG=en_AU.UTF-8, LC_CTYPE=en_AU.UTF-8 (charmap=UTF-8), LANGUAGE=en_AU:en
Shell: /bin/sh linked to /usr/bin/dash
Init: systemd (via /run/systemd/system)
LSM: AppArmor: enabled