Access a vector of a subfunction

Asked by Jan Blechta

Hi,
I'm trying to access the vector of a subfunction so that I can pass it to normalize().

V = VectorFunctionSpace(mesh, "CG", 2)
Q = FunctionSpace(mesh, "CG", 1)
R = FunctionSpace(mesh, "CG", 1)
W = MixedFunctionSpace([V,Q,R])

w = Function(W)

normalize(w[2].vector(), 'average')
AttributeError: 'Indexed' object has no attribute 'vector'

normalize(w.vector()[W.sub(1).dofmap().dofs()], 'average')
TypeError: expected an int, slice, list or numpy array of integers
# W.sub(1).dofmap().dofs() is a SWIG pointer which can't be dereferenced from Python. Any work-around?

(u, p, theta) = w.split()
normalize(p.vector(), 'average')
*** Error: Unable to access vector of degrees of freedom from function.
*** Reason: Cannot access a non-const vector from a subfunction.
*** Where: This error was encountered inside Function.cpp.

I've found a few other ways. One could, for example, deep-copy p and then interpolate back to w. But that's an extra interpolation instead of a simple algebraic operation, and it could reduce the accuracy of the normalization.
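Just to be concrete, a minimal sketch of that deep-copy route (assuming split(deepcopy=True) behaves the way I expect):

(u, p, theta) = w.split(deepcopy=True)  # p now owns its own vector
normalize(p.vector(), 'average')        # works, but only on the copy
# the normalized values then still have to be interpolated back into w,
# which is exactly the extra step (and possible loss of accuracy) I mean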

Or it has been suggested here that one could get the indices of w.vector() belonging to the p-part from V.dim() and Q.dim(), or from num_vertices(). But that will not work in parallel, and the num_vertices() work-around also depends on the chosen FunctionSpaces.
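For completeness, the serial-only version of that offset idea would look something like this (assuming the mixed vector really is laid out as contiguous [V | Q | R] blocks, which is exactly what breaks in parallel):

import numpy
q_dofs = numpy.arange(V.dim(), V.dim() + Q.dim())  # dofs of the p-part, serial layout only
x = w.vector().array()                              # local copy as a numpy array
x[q_dofs] -= x[q_dofs].sum() / Q.dim()              # subtract the average of the p-part
w.vector()[:] = x                                   # write the whole vector back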

Is there any clean way to do it? Thanks
Jan Blechta

Question information

Language: English
Status: Answered
For: DOLFIN
Assignee: No assignee
Mikael Mortensen (mikael-mortensen) said: #1

Hi Jan,

I used to have the same problem and my solution can be found in the extended_normalize class that is part of CBC.PDESys. I have copied it here for your convenience. It works in parallel as well.

from dolfin import *

class extended_normalize:
    """Normalize part or whole of vector.

    V = FunctionSpace we normalize in

    u = Function where part is normalized

    part = The index of the part of the mixed function space
           that we want to normalize.

    For example, when solving for velocity and pressure coupled in the
    Navier-Stokes equations we sometimes (when there are only Neumann BCs
    on pressure) need to normalize the pressure.

    Example of use:
    mesh = UnitSquare(1, 1)
    V = VectorFunctionSpace(mesh, 'CG', 2)
    Q = FunctionSpace(mesh, 'CG', 1)
    VQ = V * Q
    up = Function(VQ)
    normalize_func = extended_normalize(VQ, 2)
    up.vector()[:] = 2.
    print 'before ', up.vector().array().astype('I')
    normalize_func(up.vector())
    print 'after ', up.vector().array().astype('I')

    results in:
        before [2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2]
        after [2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 0 0 0 0]
    """
    def __init__(self, V, part='entire vector'):
        self.part = part
        if isinstance(part, int):
            self.u = Function(V)
            v = TestFunction(V)
            self.c = assemble(Constant(1., cell=V.cell())*dx, mesh=V.mesh())
            self.pp = ['0']*self.u.value_size()
            self.pp[part] = '1'
            self.u0 = interpolate(Expression(self.pp, element=V.ufl_element()), V)
            self.x0 = self.u0.vector()
            self.C1 = assemble(v[self.part]*dx)

    def __call__(self, v):
        if isinstance(self.part, int):
            # c1 = integral of the 'part' component over the domain
            # (C1 holds the assembled test-function weights)
            c1 = self.C1.inner(v)
            if abs(c1) > 1.e-8:
                # x0 is 1 on the dofs of 'part' and 0 elsewhere, so scaling it by
                # c1/c (the mean value of that component) and subtracting removes
                # the average from the selected part only
                self.x0[:] = self.x0[:]*(c1/self.c)
                v.axpy(-1., self.x0)
                self.x0[:] = self.x0[:]*(self.c/c1)
        else:
            # normalize entire vector
            dummy = normalize(v)

mesh = UnitSquare(2, 2)
V = VectorFunctionSpace(mesh, "CG", 2)
Q = FunctionSpace(mesh, "CG", 1)
R = FunctionSpace(mesh, "CG", 1)
W = MixedFunctionSpace([V,Q,R])

w = Function(W)
w.vector()[:] = 2.
print 'before ', w.vector().array().astype('I')
normalize = extended_normalize(W, 3) ## Normalize part 3, i.e., R
normalize(w.vector()) ## Perform normalization
print 'after ', w.vector().array().astype('I')

Hope it does the trick for you as well,

Best regards

Mikael


Jan Blechta (blechta) said: #2

OK, thanks, this seems to be a solution. I'll try it in a few days.

But I think the problem is more general - there is no safe way to simply access and manipulate the dofs of a subfunction, is there?

Mikael Mortensen (mikael-mortensen) said: #3


I'm not sure it's the most efficient solution, but you can extract sub-dofmaps by doing something like:

from dolfin import *
from numpy import array
parameters["linear_algebra_backend"] = "Epetra"

# Create mesh and finite element
mesh = UnitSquare(2, 2)
VV = VectorFunctionSpace(mesh, "CG", 1)
V = FunctionSpace(mesh, "CG", 2)
Q = FunctionSpace(mesh, "CG", 1)
VQ = V * Q

# Create a function to normalize in
u = Function(VQ)
u.vector()[:] = 2.

# Get the subdofmap
sub_space = VQ.sub(1)
dofmap = sub_space.dofmap()

# Extract a list of dofs
dofs = dofmap.collapse(mesh)[1].values()

# Remove off process dofs
off_dofs = dofmap.off_process_owner()
for i in off_dofs.keys():
    dofs.remove(i)

# Sum values on local process
total_sum = u.vector()[dofs]
total_sum = total_sum.sum()
# The above segfaults for PETSc. The code below works in that case:
#total_sum = 0
#for i in dofs:
#    total_sum += u.vector()[i]

# Subtract from the original vector
v = Vector(u.vector())
v.zero()
v[dofs] = -MPI.sum(total_sum) / Q.dim()
print MPI.process_number(), "Before:", u.vector().array().astype("I")
u.vector().add_local(v.array())
print MPI.process_number()," After:", u.vector().array().astype("I")

For me this results in:
[mikael@ubuntu:~]$ mpirun -np 2 python test_dofs.py
Process 0: Number of global vertices: 9
Process 0: Number of global cells: 8
Process 1: Partitioned mesh, edge cut is 2.
Process 0: Partitioned mesh, edge cut is 2.
0 Before: [2 2 2 2 2 2 2 2 2 2 2 2 2 2]
1 Before: [2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2]
1 After: [2 0 0 0 2 0 2 2 2 2 2 2 2 2 2 2 2 2 0 0]
0 After: [2 2 2 2 2 0 2 0 2 2 0 2 2 2]
[mikael@ubuntu:~]$ mpirun -np 1 python test_dofs.py
0 Before: [2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2]
0 After: [2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 0 0 0 0 0 0 0 0 0]

Best regards

Mikael


Johan Hake (johan-hake) said: #4

On 08/12/2012 10:30 PM, Mikael Mortensen wrote:
> [...]
>
> # Extract a list of dofs
> dofs = dofmap.collapse(mesh)[1].values()
>
> # Remove off process dofs
> off_dofs = dofmap.off_process_owner()
> for i in off_dofs.keys():
>     dofs.remove(i)

For larger data sets it might be faster to use sets:

  dofs = set(dofmap.collapse(mesh)[1].values())
  dofs.difference_update(off_dofs.keys())
  dofs = list(dofs)

Johan

