Bug#909407: pybind11 breaks dolfin autopkgtest
Paul Gevers
elbrus at debian.org
Sun Sep 23 07:55:49 BST 2018
Source: pybind11, dolfin
Control: found -1 pybind11/2.2.4-1
Control: found -1 dolfin/2018.1.0.post1-10
X-Debbugs-CC: debian-ci at lists.debian.org
User: debian-ci at lists.debian.org
Usertags: breaks needs-update
Dear maintainers,
With a recent upload of pybind11, the autopkgtest of dolfin fails in
testing when that autopkgtest is run with the binary packages of
pybind11 from unstable. It passes when run with only packages from
testing. In tabular form:
                       pass            fail
pybind11               from testing    2.2.4-1
dolfin                 from testing    2018.1.0.post1-10
all others             from testing    from testing
I copied some of the output at the bottom of this report.
Looking at the changelog of dolfin 2018.1.0.post1-11, I fear that a
versioned Depends or a versioned Breaks is missing somewhere. Note that
the Debian migration software considers those to determine whether
packages need to be tested together from unstable. If dolfin can't
determine the upper version of pybind11 beforehand, a versioned Breaks
in pybind11 helps the migration software use the proper version of
dolfin during dolfin's autopkgtest run. If dolfin 2018.1.0.post1-11 can
migrate without pybind11 2.2.4-1, you could decide to ignore this bug:
once dolfin migrates, the test, which is retried daily, will be run
with that version.
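For illustration, such a versioned Breaks could look like the following
minimal sketch in pybind11's debian/control. This is only an assumed
example, not taken from the actual packaging: I am guessing at the
pybind11-dev stanza and at python3-dolfin as the affected binary
package; the version bound is the fixed dolfin revision mentioned
above.

  # Hypothetical stanza, for illustration only: declare that this
  # pybind11 upload breaks dolfin revisions predating the fix.
  Package: pybind11-dev
  Breaks: python3-dolfin (<< 2018.1.0.post1-11)

With a relation like that in place, britney (the migration software)
would pair pybind11 from unstable with the fixed dolfin from unstable
when scheduling dolfin's autopkgtest.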
Currently this regression is contributing to the delay of the migration
of pybind11 to testing [1]. Due to the nature of this issue, I filed
this bug report against both packages. Can you please investigate the
situation and reassign the bug to the right package? If needed, please
change the bug's severity.
More information about this bug and the reason for filing it can be found on
https://wiki.debian.org/ContinuousIntegration/RegressionEmailInformation
Paul
[1] https://qa.debian.org/excuses.php?package=pybind11
https://ci.debian.net/data/autopkgtest/testing/amd64/d/dolfin/1036561/log.gz
=================================== FAILURES ===================================
________________________ test_compile_extension_module _________________________

    @skip_if_not_PETSc
    def test_compile_extension_module():

        # This test should do basically the same as the docstring of the
        # compile_extension_module function in compilemodule.py. Remember
        # to update the docstring if the test is modified!

        from numpy import arange, exp
        code = """
        #include <pybind11/pybind11.h>

        #include <petscvec.h>
        #include <dolfin/la/PETScVector.h>

        void PETSc_exp(std::shared_ptr<dolfin::PETScVector> vec)
        {
          Vec x = vec->vec();
          assert(x);
          VecExp(x);
        }

        PYBIND11_MODULE(SIGNATURE, m)
        {
          m.def("PETSc_exp", &PETSc_exp);
        }
        """

        ext_module = compile_cpp_code(code)

        vec = PETScVector(MPI.comm_world, 10)
        np_vec = vec.get_local()
        np_vec[:] = arange(len(np_vec))
        vec.set_local(np_vec)
>       ext_module.PETSc_exp(vec)
E       TypeError: PETSc_exp(): incompatible function arguments. The following argument types are supported:
E           1. (arg0: dolfin::PETScVector) -> None
E
E       Invoked with: <dolfin.cpp.la.PETScVector object at 0x7fe71f9b0468>

python/test/unit/jit/test_jit.py:221: TypeError
__________________________ test_creation_and_marking ___________________________

    def test_creation_and_marking():

        class Left(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] < DOLFIN_EPS

        class LeftOnBoundary(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] < DOLFIN_EPS and on_boundary

        class Right(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] > 1.0 - DOLFIN_EPS

        class RightOnBoundary(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] > 1.0 - DOLFIN_EPS and on_boundary

        cpp_code = """
        #include<pybind11/pybind11.h>
        #include<pybind11/eigen.h>
        namespace py = pybind11;

        #include<Eigen/Dense>
        #include<dolfin/mesh/SubDomain.h>

        class Left : public dolfin::SubDomain
        {
        public:

          virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
          {
            return x[0] < DOLFIN_EPS;
          }
        };

        class LeftOnBoundary : public dolfin::SubDomain
        {
        public:

          virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
          {
            return x[0] < DOLFIN_EPS and on_boundary;
          }
        };

        class Right : public dolfin::SubDomain
        {
        public:

          virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
          {
            return x[0] > 1.0 - DOLFIN_EPS;
          }
        };

        class RightOnBoundary : public dolfin::SubDomain
        {
        public:

          virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
          {
            return x[0] > 1.0 - DOLFIN_EPS and on_boundary;
          }
        };

        PYBIND11_MODULE(SIGNATURE, m) {
          py::class_<Left, std::shared_ptr<Left>, dolfin::SubDomain>(m, "Left").def(py::init<>());
          py::class_<Right, std::shared_ptr<Right>, dolfin::SubDomain>(m, "Right").def(py::init<>());
          py::class_<LeftOnBoundary, std::shared_ptr<LeftOnBoundary>, dolfin::SubDomain>(m, "LeftOnBoundary").def(py::init<>());
          py::class_<RightOnBoundary, std::shared_ptr<RightOnBoundary>, dolfin::SubDomain>(m, "RightOnBoundary").def(py::init<>());
        }
        """

>       compiled_domain_module = compile_cpp_code(cpp_code)

python/test/unit/mesh/test_sub_domain.py:127:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/dist-packages/dolfin/jit/pybind11jit.py:87: in compile_cpp_code
    generate=jit_generate)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('\n        #include<pybind11/pybind11.h>\n        #include<pybind11/eigen.h>\n        namespace py = pybind11;\n\n        ...ache': {'cache_dir': None, 'comm_dir': 'comm', 'enable_build_log': True, 'fail_dir_root': None, ...}, 'generator': {}})
kwargs = {'generate': <function jit_generate at 0x7f489106f488>}
mpi_comm = <mpi4py.MPI.Intracomm object at 0x7f48898e0e50>, status = 1
root = True
error_msg = 'generic_type: type "Left" referenced unknown base type "dolfin::SubDomain"'
global_status = 1.0

    @wraps(local_jit)
    def mpi_jit(*args, **kwargs):

        # FIXME: should require mpi_comm to be explicit
        # and not default to comm_world?
        mpi_comm = kwargs.pop("mpi_comm", MPI.comm_world)

        # Just call JIT compiler when running in serial
        if MPI.size(mpi_comm) == 1:
            return local_jit(*args, **kwargs)

        # Default status (0 == ok, 1 == fail)
        status = 0

        # Compile first on process 0
        root = MPI.rank(mpi_comm) == 0
        if root:
            try:
                output = local_jit(*args, **kwargs)
            except Exception as e:
                status = 1
                error_msg = str(e)

        # TODO: This would have lower overhead if using the dijitso.jit
        # features to inject a waiting callback instead of waiting out here.
        # That approach allows all processes to first look in the cache,
        # introducing a barrier only on cache miss.
        # There's also a sketch in dijitso of how to make only one
        # process per physical cache directory do the compilation.

        # Wait for the compiling process to finish and get status
        # TODO: Would be better to broadcast the status from root but this works.
        global_status = MPI.max(mpi_comm, status)

        if global_status == 0:
            # Success, call jit on all other processes
            # (this should just read the cache)
            if not root:
                output = local_jit(*args, **kwargs)
        else:
            # Fail simultaneously on all processes,
            # to allow catching the error without deadlock
            if not root:
                error_msg = "Compilation failed on root node."
>           raise RuntimeError(error_msg)
E           RuntimeError: generic_type: type "Left" referenced unknown base type "dolfin::SubDomain"

/usr/lib/python3/dist-packages/dolfin/jit/jit.py:82: RuntimeError
=============================== warnings summary ===============================
test/unit/jit/test_jit.py::test_nasty_jit_caching_bug
  /usr/lib/python3/dist-packages/ffc/jitcompiler.py:234: QuadratureRepresentationDeprecationWarning:
  *** ===================================================== ***
  *** FFC: quadrature representation is deprecated! It will ***
  *** likely be removed in 2018.2.0 release. Use uflacs     ***
  *** representation instead.                               ***
  *** ===================================================== ***
    issue_deprecation_warning()

  /usr/lib/python3/dist-packages/ffc/jitcompiler.py:234: QuadratureRepresentationDeprecationWarning:
  *** ===================================================== ***
  *** FFC: quadrature representation is deprecated! It will ***
  *** likely be removed in 2018.2.0 release. Use uflacs     ***
  *** representation instead.                               ***
  *** ===================================================== ***
    issue_deprecation_warning()

-- Docs: http://doc.pytest.org/en/latest/warnings.html
= 2 failed, 796 passed, 437 skipped, 35 xfailed, 2 warnings in 671.96 seconds ==
-------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing
the job to be terminated. The first process to do so was:
Process name: [[60892,1],1]
Exit code: 1
--------------------------------------------------------------------------