[Debichem-devel] Bug#981831: dbcsr FTBFS: dbcsr_perf:inputs/test_square_sparse_rma.perf (Failed)
Adrian Bunk
bunk at debian.org
Thu Feb 4 11:45:54 GMT 2021
Source: dbcsr
Version: 2.1.0-1
Severity: serious
Tags: ftbfs
https://buildd.debian.org/status/package.php?p=dbcsr
...
8/18 Test #10: dbcsr_perf:inputs/test_square_sparse_rma.perf .........***Failed 0.91 sec
DBCSR| CPU Multiplication driver                                            BLAS
DBCSR| Multrec recursion limit                                               512
DBCSR| Multiplication stack size                                            1000
DBCSR| Maximum elements for images                                     UNLIMITED
DBCSR| Multiplicative factor virtual images                                    1
DBCSR| Use multiplication densification                                        T
DBCSR| Multiplication size stacks                                              3
DBCSR| Use memory pool for CPU allocation                                      F
DBCSR| Number of 3D layers                                                SINGLE
DBCSR| Use MPI memory allocation                                               F
DBCSR| Use RMA algorithm                                                       T
DBCSR| Use Communication thread                                                T
DBCSR| Communication thread load                                             100
DBCSR| MPI: My node id                                                         0
DBCSR| MPI: Number of nodes                                                    2
DBCSR| OMP: Current number of threads                                          2
DBCSR| OMP: Max number of threads                                              2
DBCSR| Split modifier for TAS multiplication algorithm                   1.0E+00
numthreads 2
numnodes 2
matrix_sizes 1000 1000 1000
sparsities 0.90000000000000002 0.90000000000000002 0.90000000000000002
trans NN
symmetries NNN
type 3
alpha_in 1.0000000000000000 0.0000000000000000
beta_in 1.0000000000000000 0.0000000000000000
limits 1 1000 1 1000 1 1000
retain_sparsity F
nrep 10
bs_m 1 5
bs_n 1 5
bs_k 1 5
*******************************************************************************
*   ___                                                                       *
*  /   \                                                                      *
* [ABORT]                                                                     *
*  \___/      MPI error 53 in mpi_win_create @ mp_win_create_dv : MPI_ERR_WIN:*
*    |          invalid window                                                *
*  O/|                                                                        *
* /| |                                                                        *
* / \         dbcsr_mpiwrap.F:853                                             *
*******************************************************************************
===== Routine Calling Stack =====
7 mp_win_create_dv
6 win_setup
5 multiply_3D
4 dbcsr_multiply_generic
3 perf_multiply
2 dbcsr_perf_multiply_low
1 dbcsr_performance_driver
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
...
94% tests passed, 1 tests failed out of 18
Total Test time (real) = 606.45 sec
The following tests FAILED:
10 - dbcsr_perf:inputs/test_square_sparse_rma.perf (Failed)
Errors while running CTest
make[1]: *** [Makefile:129: test] Error 8
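
The failing routine is DBCSR's wrapper mp_win_create_dv (dbcsr_mpiwrap.F:853),
reached via win_setup when the RMA-based 3D multiplication algorithm is
selected ("Use RMA algorithm T" above); the underlying MPI_Win_create returns
error class MPI_ERR_WIN under this Open MPI build. Below is a minimal Fortran
sketch of this style of RMA window creation, with the error handler switched
to MPI_ERRORS_RETURN so the code can be decoded rather than aborting. The
program and buffer names are illustrative; this is not DBCSR's actual wrapper:

program win_create_sketch
  ! Minimal sketch of an MPI RMA window creation of the kind that
  ! aborts above; illustrative only, not DBCSR's mp_win_create_dv.
  use mpi
  implicit none
  integer, parameter :: n = 1000
  integer :: ierr, ierr2, win, rank, resultlen
  double precision :: buf(n)
  integer(kind=MPI_ADDRESS_KIND) :: winsize
  character(len=MPI_MAX_ERROR_STRING) :: msg

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

  ! Have errors returned as codes instead of aborting the job.
  call MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN, ierr)

  ! Expose buf as an RMA window (8-byte displacement unit for doubles).
  ! This is the call that fails with MPI_ERR_WIN in the log.
  winsize = 8_MPI_ADDRESS_KIND * n
  call MPI_Win_create(buf, winsize, 8, MPI_INFO_NULL, &
                      MPI_COMM_WORLD, win, ierr)

  if (ierr /= MPI_SUCCESS) then
     ! Decode the code, e.g. "MPI_ERR_WIN: invalid window".
     call MPI_Error_string(ierr, msg, resultlen, ierr2)
     print '(a,i0,2a)', 'rank ', rank, ': ', trim(msg)
  else
     call MPI_Win_free(win, ierr)
  end if

  call MPI_Finalize(ierr)
end program win_create_sketch

Built with mpif90 and run as mpirun -np 2 ./win_create_sketch (matching the
2-rank, 2-thread configuration in the log), this exercises the same
window-creation path; whether it reproduces the MPI_ERR_WIN failure depends
on the Open MPI version and its one-sided (osc) component, so treat it as a
probe rather than a guaranteed reproducer.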