Defining Dirichlet BCs in parallel

Asked by Douglas Brinkerhoff

I'm attempting to solve a set of three equations in parallel. The first is a horizontal approximation of the Stokes equations, the second is conservation of mass (used to obtain the vertical velocity), and the third is a heat equation. The first uses only natural boundary conditions; the second can use Dirichlet or Neumann conditions, depending on whether integration by parts is used; and the third requires a Dirichlet condition on one of its boundaries (the imposition of a specific temperature).

All three work well in serial, but upon parallelization I run into some issues. For the heat equation, there are erroneously high values along linear features in the solution, which seem to propagate out from the boundary; I assume these features lie along the divisions between mesh partitions. The conservation of mass equation exhibits the same issue when the Dirichlet BCs are used, but not when it is posed with natural boundary conditions. The first set of equations, which also uses only natural BCs, works fine as well. My code is too long to analyze here, so I won't post it, but the issue seems to stem from the use of a MeshFunction to define the Dirichlet BCs. I read a similar post from a year ago which indicated that there were issues with using mesh functions to define a boundary in parallel.

https://answers.launchpad.net/dolfin/+question/146875

However, I don't get any errors like those reported in that post.

Does anyone have any insight into whether this is still unsupported? And if so, are there any workarounds I might try?

Question information

Language: English
Status: Answered
For: DOLFIN
Assignee: No assignee
Douglas Brinkerhoff (douglas-brinkerhoff) said:
#1

Some additional information: rolling back from the development version to the stable version changes this behavior. The result is still incorrect, but rather than the BCs being applied in an unpredictable (and apparently non-deterministic) way, with the results changing on each model run, the stable build simply fails to apply the Dirichlet BC to the sections of the boundary that lie along the mesh partition.

Anders Logg (logg) said:
#2

On Thu, Mar 22, 2012 at 10:30:53PM -0000, Douglas Brinkerhoff wrote:
> New question #191441 on DOLFIN:
> https://answers.launchpad.net/dolfin/+question/191441
>
> [...]
>
> Does anyone have any insight into whether this is still unsupported?
> And if so, are there any workarounds I might try?

Specification of boundary data using plain MeshFunctions (where data
is stored based on facet indices) is not robust in parallel. Boundary
data should instead be specified using the MeshDomain class.
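
For illustration, here is a minimal sketch of what this might look like from the Python interface. It borrows the SubDomain.mark(mesh, dim, value) / DirichletBC(V, g, marker) pattern that appears in the script later in this thread; whether that is exactly what "the MeshDomain class" refers to is an assumption.

----------------------------
from dolfin import *

# Sketch: store boundary markers in the mesh itself (its mesh domains)
# rather than in a standalone MeshFunction, so the markers are distributed
# together with the mesh when it is partitioned.
mesh = UnitCube(8, 8, 8)
V = FunctionSpace(mesh, "CG", 1)

class LeftBoundary(SubDomain):
    def inside(self, x, on_boundary):
        return near(x[0], 0.0) and on_boundary

# Mark the facets of the x = 0 face with id 1 directly on the mesh.
LeftBoundary().mark(mesh, mesh.topology().dim() - 1, 1)

# The Dirichlet BC then refers to the marker id stored with the mesh.
bc = DirichletBC(V, Constant(0.0), 1)
----------------------------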

I'm surprised you don't get a warning when distributing the
MeshFunction.

--
Anders

Douglas Brinkerhoff (douglas-brinkerhoff) said:
#3

Is there any documentation for this MeshDomain class? I can't seem to find it in the Python interface.

Douglas Brinkerhoff (douglas-brinkerhoff) said:
#4

The problem only seems to occur when specifying the value of the boundary condition with a function, and does not occur when setting it with a constant. To observe this behavior, try running the following script in parallel:
----------------------------

from dolfin import *

mesh = UnitCube(10, 10, 10)

Q = FunctionSpace(mesh, "CG", 1)

u_0 = Function(Q)

# Subdomain for the x = 0 face of the cube.
class left(SubDomain):
    def inside(self, x, on_boundary):
        return near(x[0], 0) and on_boundary

# Subdomain for the rest of the boundary.
class other(SubDomain):
    def inside(self, x, on_boundary):
        return (not near(x[0], 0)) and on_boundary

Left = left()
Other = other()

# Mark the boundary facets directly on the mesh (ids 1 and 2).
Left.mark(mesh, mesh.topology().dim() - 1, 1)
Other.mark(mesh, mesh.topology().dim() - 1, 2)

phi = TestFunction(Q)
u = TrialFunction(Q)

# Boundary values given as Functions (this is what triggers the problem).
f_l = Function(Q)
f_l.vector()[:] = 0.0

f_o = Function(Q)
f_o.vector()[:] = 1.0

# Simple Laplace problem.
R = dot(grad(phi), grad(u))*dx

a = lhs(R)
L = rhs(R)

# Dirichlet BCs referring to the marker ids set above.
bc_0 = DirichletBC(Q, f_l, 1)
bc_1 = DirichletBC(Q, f_o, 2)

solve(a == L, u_0, [bc_0, bc_1])

u_file = File("u.pvd")
u_file << u_0

----------------------------------
The break along the mesh partition is obvious. Anders, is marking the mesh directly (as is done above) equivalent to using the MeshDomains class, or is there another way of doing this?

Note that if the lines

bc_0 = DirichletBC(Q,f_l,1)
bc_1 = DirichletBC(Q,f_o,2)

are changed to

bc_0 = DirichletBC(Q,0.0,1)
bc_1 = DirichletBC(Q,1.0,2)

the solve works properly. Is there a way to define boundaries such that Dirichlet BCs will interact properly with functions in parallel?

Joachim Haga (jobh) said:
#5

I think the problem in this example is not the DirichletBC but the vector initialisation (f_l.vector()[:] = 0.0, etc.). I think this currently sets only the local degrees of freedom, meaning that ghost dofs are never initialised. You should be able to fix this example by calling update() on the functions after they are initialised; whether this will also fix your "real" problem I do not know.
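
As an illustration, here is a minimal sketch of Joachim's suggestion applied to the boundary-value Functions from the script above. Function.update() is the call he names, and the sketch assumes it gathers the off-process (ghost) values in the DOLFIN version under discussion; the interpolate() variant in the comment is an additional assumption, not something proposed in the thread.

----------------------------
from dolfin import *

mesh = UnitCube(10, 10, 10)
Q = FunctionSpace(mesh, "CG", 1)

# Set the locally owned dofs, then refresh the ghost (off-process) values
# as Joachim suggests.
f_o = Function(Q)
f_o.vector()[:] = 1.0
f_o.update()

# Possible alternative (an assumption, not from the thread): build the
# boundary value by interpolating a Constant, which initialises the dofs
# on every process.
# f_o = interpolate(Constant(1.0), Q)
----------------------------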
