Updated FEniCS -> problem with assemble_system
Hello,
I recently updated FEniCS and have since had problems with the 'assemble_system' command. The difficulty arises, for instance, with the following script:
====
from dolfin import *
import numpy
#parameters["linear_algebra_backend"] = "uBLAS"
#parameters["linear_algebra_backend"] = "Epetra"
parameters["linear_algebra_backend"] = "PETSc"
mesh = UnitCube(6, 6, 6)
U = FunctionSpace(mesh, 'CG', 1)
g = Expression('0.0')
class DirichletBoundary(SubDomain):
    def inside(self, x, on_boundary):
        return on_boundary
Du = DirichletBoundary()
BCu = DirichletBC(U, g, Du)
u = TrialFunction(U)
v = TestFunction(U)
Au = inner(grad(u), grad(v))*dx + u*v*dx
Lu = g*v*ds
#Mu = assemble(Au)
#Bu = assemble(Lu)
#BCu.apply(Mu, Bu)
Mu, Bu = assemble_system(Au, Lu, BCu)
====
Running the program with the PETSc backend gives:
=== BEGIN PETSC ===
Applying boundary conditions to linear system.
Assembling linear system and applying boundary conditions...
[0]PETSC ERROR: -------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_
[0]PETSC ERROR: or see http://
[0]PETSC ERROR: configure using --with-
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: -------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: -------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 CDT 2010
[0]PETSC ERROR: See docs/changes/
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: -------
[0]PETSC ERROR: Unknown Name on a linux-gnu named ubuntu-VirtualBox by ubuntu Mon Sep 12 16:14:36 2011
[0]PETSC ERROR: Libraries linked from /build/
[0]PETSC ERROR: Configure run at Mon Mar 7 18:34:33 2011
[0]PETSC ERROR: Configure options --with-shared --with-debugging=0 --useThreads 0 --with-
[0]PETSC ERROR: -------
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
-------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
-------
=== END PETSC ===
Similarly, the Epetra and uBLAS backends cause difficulties; their output is essentially the same.
=== BEGIN EPETRA ===
Applying boundary conditions to linear system.
Assembling linear system and applying boundary conditions...
[ubuntu-VirtualBox:...] (repeated truncated error lines from the Epetra run elided)
===END EPETRA ===
The above code runs smoothly if the last line is removed and the three commented lines above it are uncommented (i.e. assembling matrix and vector separately and then calling BCu.apply).
Can anyone help with this problem?
I would like to maintain the symmetric structure of the variational form: this is part of a larger nonlinear system for which the symmetry allows solutions to be computed much faster and more reliably.
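For context on why the symmetric path matters: applying a Dirichlet condition after a plain assemble typically only zeroes the constrained row, which destroys symmetry, whereas a symmetric application also eliminates the corresponding column and moves the known values to the right-hand side. A minimal NumPy sketch of this symmetric elimination (illustrative only, not DOLFIN code; the helper name is my own):

```python
import numpy as np

def apply_bc_symmetric(A, b, dofs, values):
    """Eliminate Dirichlet dofs from A x = b while keeping A symmetric.

    For each constrained dof i with prescribed value g:
    move the column contribution A[:, i] * g to the right-hand side,
    zero row i and column i, then set A[i, i] = 1 and b[i] = g.
    """
    A = A.copy()
    b = b.copy()
    for i, g in zip(dofs, values):
        b -= A[:, i] * g        # shift known contribution to the RHS
        A[i, :] = 0.0           # zero the row (standard BC application)
        A[:, i] = 0.0           # also zero the column (preserves symmetry)
        A[i, i] = 1.0
        b[i] = g
    return A, b

# Small symmetric stiffness-like system with one constrained dof
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
b = np.array([1.0, 1.0, 1.0])
A_bc, b_bc = apply_bc_symmetric(A, b, dofs=[0], values=[0.5])

assert np.allclose(A_bc, A_bc.T)   # symmetry preserved
x = np.linalg.solve(A_bc, b_bc)
assert np.isclose(x[0], 0.5)       # prescribed value recovered
```

A symmetric system like this can then be handed to solvers (e.g. conjugate gradients) that rely on symmetry, which is exactly what is lost when the boundary condition is imposed row-only.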
Best,
Max
Question information
- Language: English
- Status: Solved
- For: DOLFIN
- Solved by: Max Jensen