DOLFIN 0.9.8. Running elastodynamics demo. Error.

Asked by Frantisek Fridrich

Hello.
Please, could anybody help me?

I installed DOLFIN in Scientific Linux 5, gcc-4.4.4.
DOLFIN 0.9.7 works OK, but DOLFIN 0.9.8 gives errors when running the elastodynamics demo.

Initializing DOLFIN version 0.9.8.
terminate called after throwing an instance of 'std::runtime_error'
  what(): *** Error: Unable to determine if point is inside subdomain, function inside() not implemented by user.
[juniper:12297] *** Process received signal ***
[juniper:12297] Signal: Aborted (6)
[juniper:12297] Signal code: (-6)
[juniper:12297] [ 0] /lib64/libpthread.so.0 [0x337a80de80]
[juniper:12297] [ 1] /lib64/libc.so.6(gsignal+0x35) [0x3379c30155]
[juniper:12297] [ 2] /lib64/libc.so.6(abort+0x110) [0x3379c31bf0]
[juniper:12297] [ 3] /home/rose/local-gcc-4.4.4/lib64/libstdc++.so.6 [0x2af30a61846d]
[juniper:12297] [ 4] /home/rose/local-gcc-4.4.4/lib64/libstdc++.so.6 [0x2af30a6168e6]
[juniper:12297] [ 5] /home/rose/local-gcc-4.4.4/lib64/libstdc++.so.6 [0x2af30a616913]
[juniper:12297] [ 6] /home/rose/local-gcc-4.4.4/lib64/libstdc++.so.6 [0x2af30a6169fe]
[juniper:12297] [ 7] /home/rose/OpenMPI/openmpi-1.4.2/fenics/utils/dolfin-0.9.8/lib/libdolfin.so.0(_ZNK6dolfin6Logger5errorESs+0xd8) [0x2af305490ff8]
[juniper:12297] [ 8] /home/rose/OpenMPI/openmpi-1.4.2/fenics/utils/dolfin-0.9.8/lib/libdolfin.so.0(_ZN6dolfin5errorESsz+0x141) [0x2af3054999f1]
[juniper:12297] [ 9] /home/rose/OpenMPI/openmpi-1.4.2/fenics/utils/dolfin-0.9.8/lib/libdolfin.so.0(_ZNK6dolfin9SubDomain6insideERKNS_5ArrayIdEEb+0x28) [0x2af3054b6f88]
[juniper:12297] [10] /home/rose/OpenMPI/openmpi-1.4.2/fenics/utils/dolfin-0.9.8/lib/libdolfin.so.0(_ZNK6dolfin9SubDomain4markERNS_12MeshFunctionIjEEj+0x838) [0x2af3054b7838]
[juniper:12297] [11] ./demo(main+0x4db) [0x418bfb]
[juniper:12297] [12] /lib64/libc.so.6(__libc_start_main+0xf4) [0x3379c1d8b4]
[juniper:12297] [13] ./demo(_ZN6dolfin13GenericMatrix11ident_zerosEv+0x51) [0x418029]
[juniper:12297] *** End of error message ***

I also got errors when running other demos.
Thank you for your response.
Frantisek

Question information

Language:
English
Status:
Solved
For:
DOLFIN
Assignee:
No assignee
Solved by:
Frantisek Fridrich
Revision history for this message
Garth Wells (garth-wells) said :
#1

It works for me. Are you running the demo from the 0.9.8 release against the 0.9.8 library?

Revision history for this message
Frantisek Fridrich (frafridr) said :
#2

Hello Garth.

Thank you for response.
I ran the following demos. They work OK.
  elasticity, cpp
  stokes, stabilized, cpp
  stokes, TaylorHood, cpp
  lift-drag, cpp
  poisson, cpp
  hyperelasticity, cpp

The following demos raise an error.
  elastodynamics, python
  elasticity, python
  stokes, stabilized, python
  stokes, TaylorHood, python
  lift-drag, python
  poisson, python
  hyperelasticity, python

Calling FFC just-in-time (JIT) compiler, this may take some time.
Calling FFC just-in-time (JIT) compiler, this may take some time.
Lift: -14.742218
Drag: 57.550887
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 3, Fri Jun 4 15:34:52 CDT 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Unknown Name on a linux-gnu named juniper by fridrich Sat Jul 10 16:06:21 2010
[0]PETSC ERROR: Libraries linked from /home/rose/OpenMPI/openmpi-1.4.2/utils/petsc-3.1-p3/lib
[0]PETSC ERROR: Configure run at Thu Jul 8 19:49:50 2010
[0]PETSC ERROR: Configure options --prefix=/home/rose/OpenMPI/openmpi-1.4.2/utils/petsc-3.1-p3 --with-external-packages-dir=/home/rose/OpenMPI/openmpi-1.4.2/utilsRepo/petsc-3.1-p3/petscUtilsRepo --PETSC_ARCH=linux-gnu --PETSC_DIR=/home/rose/OpenMPI/openmpi-1.4.2/utilsRepo/petsc-3.1-p3/petsc-3.1-p3 --with-clanguage=c++ --with-c-support=yes --with-shared=yes --with-large-file-io=yes --CFLAGS= -Wall -march=opteron -m64 -O2 -fno-reorder-blocks -fno-reorder-functions -pipe -fPIC --CXXFLAGS= -Wall -march=opteron -m64 -O2 -fno-reorder-blocks -fno-reorder-functions -pipe -fPIC --FFLAGS= -Wall -march=opteron -m64 -O2 -fno-reorder-blocks -fno-reorder-functions -pipe -fPIC --with-ar=ar --AR_FLAGS=cr --with-ranlib=ranlib --COPTFLAGS= -O2 -fno-reorder-blocks -fno-reorder-functions --CXXOPTFLAGS= -O2 -fno-reorder-blocks -fno-reorder-functions --FOPTFLAGS= -O2 -fno-reorder-blocks -fno-reorder-functions --with-mpi-dir=/home/rose/OpenMPI/openmpi-1.4.2/install --with-mpi-shared=yes --with-spooles=yes --download-spooles=yes --with-blas-lapack-dir=/home/rose/OpenMPI/openmpi-1.4.2/utils/atlas-3.9.24/lib --with-blacs=yes --download-blacs=yes --with-parmetis=yes --with-parmetis-dir=/home/rose/OpenMPI/openmpi-1.4.2/utils/ParMetis-3.1.1 --with-scalapack=yes --download-scalapack=yes --with-mumps=yes --download-mumps=yes --with-hypre=yes --with-hypre-dir=/home/rose/OpenMPI/openmpi-1.4.2/utils/hypre-2.6.0b --with-umfpack=yes --download-umfpack=yes
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

The demo elastodynamics, cpp does not work in my installation.

Best regards.
Frantisek

Revision history for this message
Frantisek Fridrich (frafridr) said :
#3

Hello.

I modified elastodynamics, cpp:
in class RightBoundary and class LeftBoundary I changed the signature of inside() from
  bool inside(const double* x, bool on_boundary) const
to
  bool inside(const Array<double>& x, bool on_boundary) const

It seems to work, but not in parallel.

Best regards.
Frantisek

Revision history for this message
Frantisek Fridrich (frafridr) said :
#4

I think this question can be closed.