Large Matrices Error

Asked by Oluwaseun Sharomi on 2013-02-15

When I try to write the stiffness matrix for a large mesh to file, I get the following error:

"
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/osharomi/Work/FEniCS/lib/python2.7/site-packages/dolfin/cpp.py", line 2980, in array
    A = zeros((m_range[1] - m_range[0], self.size(1)))
MemoryError
"
The error occurs specifically at A = A.array(). My guess is that NumPy cannot allocate a dense array that large because of the memory requirement.

Is there an easy way to get around this error for large matrices? My code is below.

from dolfin import *
import numpy
import scipy.io
# Create mesh and define function space
mesh = BoxMesh(0, 0, 0, 2, 0.7, 0.3, 400, 140, 60)
V = FunctionSpace(mesh, 'CG', 1)

u = TrialFunction(V)
v = TestFunction(V)

# Mass form and (negative) stiffness form
a1 = u*v*dx
a2 = -inner(nabla_grad(v), nabla_grad(u))*dx

A = assemble(a1)
B = assemble(a2)

# Convert to dense NumPy arrays; this is where the MemoryError is raised
A = A.array()
B = B.array()

scipy.io.savemat("abc.mat", {"A": A, "B": B})

Question information

Language: English
Status: Solved
For: DOLFIN
Assignee: No assignee
Solved by: Oluwaseun Sharomi
Solved: 2013-02-15
Last query: 2013-02-15
Oluwaseun Sharomi said :

I fixed it using a different backend.

Mandrake (felix-linder) said : #2

I've got the same problem, can you tell me which backend works?

Oluwaseun Sharomi said :

I used this:

parameters["linear_algebra_backend"] = "uBLAS"
parameters.reorder_dofs_serial = False

Then I created the function:

def dump_matrix(filename, AA):
    # uBLAS matrices expose their CSR data (row pointers, column indices, values)
    row_ptr, col_ind, val = AA.data()
    Acsr = csr_matrix((val, col_ind, row_ptr))
    Acsr.eliminate_zeros()
    mmio.mmwrite(filename, Acsr)

so that the matrices can be saved to a Matrix Market file with the following:

a1 = u*v*dx
A = assemble(a1)
dump_matrix("A",A)

The needed imports are:

from scipy.io import mmio
from scipy.sparse import csr_matrix
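
Putting the pieces together, a minimal self-contained sketch (the mesh is kept small here for illustration, and the dictionary-style parameter syntax is used in case the attribute form is not available in your DOLFIN version):

from dolfin import *
from scipy.io import mmio
from scipy.sparse import csr_matrix

parameters["linear_algebra_backend"] = "uBLAS"
parameters["reorder_dofs_serial"] = False

def dump_matrix(filename, AA):
    # uBLAS matrices expose their CSR data (row pointers, column indices, values)
    row_ptr, col_ind, val = AA.data()
    Acsr = csr_matrix((val, col_ind, row_ptr))
    Acsr.eliminate_zeros()
    mmio.mmwrite(filename, Acsr)

mesh = BoxMesh(0, 0, 0, 2, 0.7, 0.3, 40, 14, 6)   # reduced size for the example
V = FunctionSpace(mesh, 'CG', 1)
u = TrialFunction(V)
v = TestFunction(V)

A = assemble(u*v*dx)
B = assemble(-inner(nabla_grad(v), nabla_grad(u))*dx)

# Sparse Matrix Market files instead of a dense .mat file
dump_matrix("A.mtx", A)
dump_matrix("B.mtx", B)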

Mandrake (felix-linder) said : #5

Thank you, this gave me some useful hints, but in the end I wrote my own dump routine, as uBLAS doesn't support distributed vectors.
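
For reference, one possible way to dump a distributed matrix without uBLAS is to go through the PETSc backend and let a PETSc binary viewer write the file. This is only a sketch and assumes a DOLFIN build with petsc4py support, where as_backend_type(A).mat() returns the underlying petsc4py Mat:

from dolfin import *
from petsc4py import PETSc

# Keep the default (PETSc) linear algebra backend so assembly stays distributed
mesh = UnitCubeMesh(40, 14, 6)
V = FunctionSpace(mesh, 'CG', 1)
u, v = TrialFunction(V), TestFunction(V)
A = assemble(u*v*dx)

# Extract the petsc4py Mat and write it with a PETSc binary viewer,
# which also works when the matrix is spread over several processes
petsc_A = as_backend_type(A).mat()
viewer = PETSc.Viewer().createBinary("A.dat", "w")
petsc_A.view(viewer)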

Oluwaseun Sharomi said :

All glory to the Almighty God. I am pleased that it helped.
