updated fenics -> problem with assemble_system

Asked by Max Jensen

Hello,

I recently updated FEniCS and have since had problems with the 'assemble_system' command. The difficulties arise, for instance, with the following script:

====
from dolfin import *
import numpy

#parameters["linear_algebra_backend"] = "uBLAS"
#parameters["linear_algebra_backend"] = "Epetra"
parameters["linear_algebra_backend"] = "PETSc"

mesh = UnitCube(6, 6, 6)
U = FunctionSpace(mesh, 'CG', 1)
g = Expression('0.0')

class DirichletBoundaryU(SubDomain):
    def inside(self, x, on_boundary):
        return on_boundary
Du = DirichletBoundaryU()
BCu = DirichletBC(U, g, Du)

u = TrialFunction(U)
v = TestFunction(U)

Au = inner(grad(u), grad(v))*dx + u*v*dx
Lu = g*v*ds

#Mu = assemble(Au)
#Bu = assemble(Lu)
#BCu.apply(Mu, Bu)
Mu, Bu = assemble_system(Au, Lu, BCu)
===

Running the program with PETSc backend

=== BEGIN PETSC ===
Applying boundary conditions to linear system.
Assembling linear system and applying boundary conditions...
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 CDT 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Unknown Name on a linux-gnu named ubuntu-VirtualBox by ubuntu Mon Sep 12 16:14:36 2011
[0]PETSC ERROR: Libraries linked from /build/buildd/petsc-3.1.dfsg/linux-gnu-c-opt/lib
[0]PETSC ERROR: Configure run at Mon Mar 7 18:34:33 2011
[0]PETSC ERROR: Configure options --with-shared --with-debugging=0 --useThreads 0 --with-clanguage=C++ --with-c-support --with-fortran-interfaces=1 --with-mpi-dir=/usr/lib/openmpi --with-mpi-shared=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-blacs=1 --with-blacs-include=/usr/include --with-blacs-lib="[/usr/lib/libblacsCinit-openmpi.so,/usr/lib/libblacs-openmpi.so]" --with-scalapack=1 --with-scalapack-include=/usr/include --with-scalapack-lib=/usr/lib/libscalapack-openmpi.so --with-mumps=1 --with-mumps-include=/usr/include --with-mumps-lib="[/usr/lib/libdmumps.so,/usr/lib/libzmumps.so,/usr/lib/libsmumps.so,/usr/lib/libcmumps.so,/usr/lib/libmumps_common.so,/usr/lib/libpord.so]" --with-umfpack=1 --with-umfpack-include=/usr/include/suitesparse --with-umfpack-lib="[/usr/lib/libumfpack.so,/usr/lib/libamd.so]" --with-spooles=1 --with-spooles-include=/usr/include/spooles --with-spooles-lib=/usr/lib/libspooles.so --with-hypre=1 --with-hypre-dir=/usr --with-scotch=1 --with-scotch-include=/usr/include/scotch --with-scotch-lib=/usr/lib/libscotch.so --with-hdf5=1 --with-hdf5-dir=/usr
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
=== END PETSC ===

Similarly, the Epetra and uBLAS backends cause difficulties; their output is basically the same.

=== BEGIN EPETRA ===
Applying boundary conditions to linear system.
Assembling linear system and applying boundary conditions...
[ubuntu-VirtualBox:02380] *** Process received signal ***
[ubuntu-VirtualBox:02380] Signal: Segmentation fault (11)
[ubuntu-VirtualBox:02380] Signal code: Address not mapped (1)
[ubuntu-VirtualBox:02380] Failing at address: (nil)
[ubuntu-VirtualBox:02380] [ 0] [0xac140c]
[ubuntu-VirtualBox:02380] [ 1] /usr/lib/libdolfin.so.0(_ZN6dolfin15SystemAssembler8assembleERNS_13GenericMatrixERNS_13GenericVectorERKNS_4FormES7_RKSt6vectorIPKNS_11DirichletBCESaISB_EEPKNS_12MeshFunctionIjEESJ_SJ_PKS3_bb+0xe36) [0x36e3336]
[ubuntu-VirtualBox:02380] [ 2] /usr/lib/libdolfin.so.0(_ZN6dolfin15assemble_systemERNS_13GenericMatrixERNS_13GenericVectorERKNS_4FormES6_RKSt6vectorIPKNS_11DirichletBCESaISA_EEPKNS_12MeshFunctionIjEESI_SI_PKS2_bb+0x65) [0x36ea6d5]
[ubuntu-VirtualBox:02380] [ 3] /usr/lib/python2.7/dist-packages/dolfin/_cpp.so(+0x173ffb) [0xfa3ffb]
[ubuntu-VirtualBox:02380] [ 4] /usr/lib/python2.7/dist-packages/dolfin/_cpp.so(+0x175cf7) [0xfa5cf7]
[ubuntu-VirtualBox:02380] [ 5] python(PyEval_EvalFrameEx+0x4332) [0x80de822]
[ubuntu-VirtualBox:02380] [ 6] python(PyEval_EvalCodeEx+0x127) [0x80e11e7]
[ubuntu-VirtualBox:02380] [ 7] python(PyEval_EvalFrameEx+0x15e5) [0x80dbad5]
[ubuntu-VirtualBox:02380] [ 8] python(PyEval_EvalCodeEx+0x127) [0x80e11e7]
[ubuntu-VirtualBox:02380] [ 9] python(PyEval_EvalFrameEx+0x73a) [0x80dac2a]
[ubuntu-VirtualBox:02380] [10] python(PyEval_EvalCodeEx+0x127) [0x80e11e7]
[ubuntu-VirtualBox:02380] [11] python(PyEval_EvalCode+0x57) [0x812c477]
[ubuntu-VirtualBox:02380] [12] python() [0x813c010]
[ubuntu-VirtualBox:02380] [13] python(PyRun_FileExFlags+0x84) [0x80700b3]
[ubuntu-VirtualBox:02380] [14] python(PyRun_SimpleFileExFlags+0x2cb) [0x8070af9]
[ubuntu-VirtualBox:02380] [15] python(Py_Main+0xbc6) [0x805c069]
[ubuntu-VirtualBox:02380] [16] python(main+0x1b) [0x805b25b]
[ubuntu-VirtualBox:02380] [17] /lib/i386-linux-gnu/libc.so.6(__libc_start_main+0xe7) [0x44be37]
[ubuntu-VirtualBox:02380] [18] python() [0x81074ad]
[ubuntu-VirtualBox:02380] *** End of error message ***
=== END EPETRA ===

The above code runs smoothly with the last line removed and the three commented lines above it uncommented.

Can anyone help with this problem?

I would like to maintain the symmetric structure of the variational form -- it is part of a larger nonlinear system, for which the symmetry allows solutions to be computed much faster and more reliably.
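
For reference, the symmetry issue with the manual route can be seen directly: DirichletBC.apply only zeroes the boundary rows and sets ones on the diagonal, so the off-diagonal entries in the matching columns survive and the matrix is in general no longer symmetric. A sketch of the check (untested against this DOLFIN version; it assumes the uBLAS backend and that the assembled matrix exposes a dense copy via .array(), and DomainBoundary is used as shorthand for the whole-boundary SubDomain above):

```python
from dolfin import *
import numpy

parameters["linear_algebra_backend"] = "uBLAS"

mesh = UnitCube(6, 6, 6)
U = FunctionSpace(mesh, 'CG', 1)
g = Expression('0.0')
bc = DirichletBC(U, g, DomainBoundary())
u, v = TrialFunction(U), TestFunction(U)

a = inner(grad(u), grad(v))*dx + u*v*dx

# Manual route: apply() zeroes the boundary rows only, so the
# result of bc.apply(A, b) need not be symmetric.
A = assemble(a)
b = assemble(g*v*ds)
bc.apply(A, b)
M = A.array()
print numpy.allclose(M, M.T)
```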

Best,
Max

Question information

Language: English
Status: Solved
For: DOLFIN
Assignee: No assignee
Solved by: Max Jensen
Revision history for this message
Johan Hake (johan-hake) said :
#2

It looks like there is a logic bug in assemble_system. If you change Lu to a cell integral, it all works.
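
Based on that observation, one possible workaround (a sketch only, not verified against this DOLFIN version; DomainBoundary is used as shorthand for the whole-boundary SubDomain in the original script) is to add a zero cell term so the linear form is no longer a pure facet integral:

```python
from dolfin import *

mesh = UnitCube(6, 6, 6)
U = FunctionSpace(mesh, 'CG', 1)
g = Expression('0.0')
bc = DirichletBC(U, g, DomainBoundary())

u = TrialFunction(U)
v = TestFunction(U)

a = inner(grad(u), grad(v))*dx + u*v*dx
# The zero cell term means the right-hand side is no longer a pure
# facet integral, which is what appears to trigger the crash.
L = g*v*ds + Constant(0.0)*v*dx

A, b = assemble_system(a, L, bc)
```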

Can you register a bug for this? Something like:

   assemble_system fails for pure facet integrals

Johan


Revision history for this message
Max Jensen (m-p-j-jensen) said :
#3

Many thanks for picking up the question so quickly. The bug report is:

assemble_system fails (for pure facet integrals)
DOLFIN Bugs: Bug #848042