Bug#909407: pybind11 breaks dolfin autopkgtest

Paul Gevers elbrus at debian.org
Sun Sep 23 07:55:49 BST 2018


Source: pybind11, dolfin
Control: found -1 pybind11/2.2.4-1
Control: found -1 dolfin/2018.1.0.post1-10
X-Debbugs-CC: debian-ci at lists.debian.org
User: debian-ci at lists.debian.org
Usertags: breaks needs-update

Dear maintainers,

With a recent upload of pybind11, the autopkgtest of dolfin fails in
testing when that autopkgtest is run with the binary packages of
pybind11 from unstable. It passes when run with only packages from
testing. In tabular form:
                       pass            fail
pybind11               from testing    2.2.4-1
dolfin                 from testing    2018.1.0.post1-10
all others             from testing    from testing

I copied some of the output at the bottom of this report.

Looking at the changelog of dolfin 2018.1.0.post1-11, I fear that a
versioned Depends or a versioned Breaks is missing somewhere. Note that
the Debian migration software considers those relations to determine
whether packages need to be tested together from unstable. If dolfin
cannot know the upper bound on the pybind11 version beforehand, a
versioned Breaks in pybind11 helps the migration software use the
proper version of dolfin during dolfin's autopkgtest run. If dolfin
2018.1.0.post1-11 can migrate without pybind11 2.2.4-1, you could
decide to ignore this bug, as once dolfin migrates, the test, which is
retried daily, will be run with that version.
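
For illustration only, such relations could look something like the
sketch below. The binary package names (python3-dolfin, pybind11-dev)
and the version bounds are assumptions for this example and are not
copied from either package's actual debian/control; please check what
fits the real packaging:

    # Hypothetical sketch, not the actual debian/control of either package.
    # In src:pybind11 (e.g. on the pybind11-dev binary package), declare that
    # dolfin binaries built against the old pybind11 are broken:
    Breaks: python3-dolfin (<< 2018.1.0.post1-11~)
    # Or, in src:dolfin (e.g. on the python3-dolfin binary package), require
    # at least the pybind11 version it was built against:
    Depends: pybind11-dev (>= 2.2.4)

Either relation tells the migration software that the new pybind11 and
the fixed dolfin belong together, so dolfin's autopkgtest is run
against the matching versions.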

Currently this regression is contributing to the delay of the migration
of pybind11 to testing [1]. Due to the nature of this issue, I filed
this bug report against both packages. Can you please investigate the
situation and reassign the bug to the right package? If needed, please
change the bug's severity.

More information about this bug and the reason for filing it can be found on
https://wiki.debian.org/ContinuousIntegration/RegressionEmailInformation

Paul

[1] https://qa.debian.org/excuses.php?package=pybind11

https://ci.debian.net/data/autopkgtest/testing/amd64/d/dolfin/1036561/log.gz

=================================== FAILURES ===================================
________________________ test_compile_extension_module _________________________

    @skip_if_not_PETSc
    def test_compile_extension_module():

        # This test should do basically the same as the docstring of the
        # compile_extension_module function in compilemodule.py. Remember
        # to update the docstring if the test is modified!

        from numpy import arange, exp
        code = """
          #include <pybind11/pybind11.h>

          #include <petscvec.h>
          #include <dolfin/la/PETScVector.h>

          void PETSc_exp(std::shared_ptr<dolfin::PETScVector> vec)
          {
            Vec x = vec->vec();
            assert(x);
            VecExp(x);
          }

        PYBIND11_MODULE(SIGNATURE, m)
        {
          m.def("PETSc_exp", &PETSc_exp);
        }
        """

        ext_module = compile_cpp_code(code)

        vec = PETScVector(MPI.comm_world, 10)
        np_vec = vec.get_local()
        np_vec[:] = arange(len(np_vec))
        vec.set_local(np_vec)
>       ext_module.PETSc_exp(vec)
E       TypeError: PETSc_exp(): incompatible function arguments. The following argument types are supported:
E           1. (arg0: dolfin::PETScVector) -> None
E       
E       Invoked with: <dolfin.cpp.la.PETScVector object at 0x7f488a5b7468>

python/test/unit/jit/test_jit.py:221: TypeError
__________________________ test_creation_and_marking ___________________________

    def test_creation_and_marking():

        class Left(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] < DOLFIN_EPS

        class LeftOnBoundary(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] < DOLFIN_EPS and on_boundary

        class Right(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] > 1.0 - DOLFIN_EPS

        class RightOnBoundary(SubDomain):
            def inside(self, x, on_boundary):
                return x[0] > 1.0 - DOLFIN_EPS and on_boundary

        cpp_code = """
            #include<pybind11/pybind11.h>
            #include<pybind11/eigen.h>
            namespace py = pybind11;

            #include<Eigen/Dense>
            #include<dolfin/mesh/SubDomain.h>

            class Left : public dolfin::SubDomain
            {
            public:

              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] < DOLFIN_EPS;
              }
            };

            class LeftOnBoundary : public dolfin::SubDomain
            {
            public:

              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] < DOLFIN_EPS and on_boundary;
              }
            };

            class Right : public dolfin::SubDomain
            {
            public:

              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] > 1.0 - DOLFIN_EPS;
              }
            };

            class RightOnBoundary : public dolfin::SubDomain
            {
            public:

              virtual bool inside(Eigen::Ref<const Eigen::VectorXd> x, bool on_boundary) const
              {
                return x[0] > 1.0 - DOLFIN_EPS and on_boundary;
              }
            };

        PYBIND11_MODULE(SIGNATURE, m) {
           py::class_<Left, std::shared_ptr<Left>, dolfin::SubDomain>(m, "Left").def(py::init<>());
           py::class_<Right, std::shared_ptr<Right>, dolfin::SubDomain>(m, "Right").def(py::init<>());
           py::class_<LeftOnBoundary, std::shared_ptr<LeftOnBoundary>, dolfin::SubDomain>(m, "LeftOnBoundary").def(py::init<>());
           py::class_<RightOnBoundary, std::shared_ptr<RightOnBoundary>, dolfin::SubDomain>(m, "RightOnBoundary").def(py::init<>());
        }
        """

>       compiled_domain_module = compile_cpp_code(cpp_code)

python/test/unit/mesh/test_sub_domain.py:127:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3/dist-packages/dolfin/jit/pybind11jit.py:87: in compile_cpp_code
    generate=jit_generate)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ('\n        #include<pybind11/pybind11.h>\n        #include<pybind11/eigen.h>\n        namespace py = pybind11;\n\n ...ache': {'cache_dir': None, 'comm_dir': 'comm', 'enable_build_log': True, 'fail_dir_root': None, ...}, 'generator': {}})
kwargs = {'generate': <function jit_generate at 0x7f489106f488>}
mpi_comm = <mpi4py.MPI.Intracomm object at 0x7f48898e0e50>, status = 1
root = True
error_msg = 'generic_type: type "Left" referenced unknown base type "dolfin::SubDomain"'
global_status = 1.0

    @wraps(local_jit)
    def mpi_jit(*args, **kwargs):

        # FIXME: should require mpi_comm to be explicit
        # and not default to comm_world?
        mpi_comm = kwargs.pop("mpi_comm", MPI.comm_world)

        # Just call JIT compiler when running in serial
        if MPI.size(mpi_comm) == 1:
            return local_jit(*args, **kwargs)

        # Default status (0 == ok, 1 == fail)
        status = 0

        # Compile first on process 0
        root = MPI.rank(mpi_comm) == 0
        if root:
            try:
                output = local_jit(*args, **kwargs)
            except Exception as e:
                status = 1
                error_msg = str(e)

        # TODO: This would have lower overhead if using the dijitso.jit
        # features to inject a waiting callback instead of waiting out here.
        # That approach allows all processes to first look in the cache,
        # introducing a barrier only on cache miss.
        # There's also a sketch in dijitso of how to make only one
        # process per physical cache directory do the compilation.

        # Wait for the compiling process to finish and get status
        # TODO: Would be better to broadcast the status from root but this works.
        global_status = MPI.max(mpi_comm, status)

        if global_status == 0:
            # Success, call jit on all other processes
            # (this should just read the cache)
            if not root:
                output = local_jit(*args, **kwargs)
        else:
            # Fail simultaneously on all processes,
            # to allow catching the error without deadlock
            if not root:
                error_msg = "Compilation failed on root node."
>           raise RuntimeError(error_msg)
E           RuntimeError: generic_type: type "Left" referenced unknown base type "dolfin::SubDomain"

/usr/lib/python3/dist-packages/dolfin/jit/jit.py:82: RuntimeError
=============================== warnings summary ===============================
test/unit/jit/test_jit.py::test_nasty_jit_caching_bug
  /usr/lib/python3/dist-packages/ffc/jitcompiler.py:234: QuadratureRepresentationDeprecationWarning:
  *** ===================================================== ***
  *** FFC: quadrature representation is deprecated! It will ***
  *** likely be removed in 2018.2.0 release. Use uflacs     ***
  *** representation instead.                               ***
  *** ===================================================== ***
    issue_deprecation_warning()

  /usr/lib/python3/dist-packages/ffc/jitcompiler.py:234: QuadratureRepresentationDeprecationWarning:
  *** ===================================================== ***
  *** FFC: quadrature representation is deprecated! It will ***
  *** likely be removed in 2018.2.0 release. Use uflacs     ***
  *** representation instead.                               ***
  *** ===================================================== ***
    issue_deprecation_warning()

-- Docs: http://doc.pytest.org/en/latest/warnings.html
= 2 failed, 796 passed, 437 skipped, 35 xfailed, 2 warnings in 672.01 seconds ==
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status,
thus causing the job to be terminated. The first process to do so was:

  Process name: [[60892,1],1]
  Exit code:    1
--------------------------------------------------------------------------
