HDF5 Python test failing in parallel on my laptop

Asked by Martin Sandve Alnæs

I've installed FEniCS with the quantal-hpc platform, and built and run the DOLFIN unit tests, getting the errors shown below when running

cd test/unit/io/python
mpirun -np 3 python HDF5.py

It works fine in serial. All other unit tests seem to pass.
The test in question creates a Vector(305); is that a proper test in parallel?
Is there something wrong with my setup?
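
For reference, the failing test boils down to roughly the following (a sketch reconstructed from the test file and the traceback below, so exact details may differ):

  from dolfin import Vector, HDF5File

  # Write a small distributed vector to HDF5; in parallel this is a
  # collective operation across all MPI processes
  x = Vector(305)
  x[:] = 1.2
  vector_file = HDF5File("x.h5", "w")
  vector_file.write(x, "my_vector")
  del vector_file  # close the file before reading it back

  # Read it back into a fresh vector; this is the call that fails
  y = Vector()
  vector_file = HDF5File("x.h5", "r")
  vector_file.read(y, "my_vector")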

Errors like these occur on MPI processes 0, 1, and 2:

ble-src/hdf5-1.8.10/src/H5D.c line 316 in H5Dopen2(): not a location
    major: Invalid arguments to routine
    minor: Inappropriate type
  #001: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5Gloc.c line 253 in H5G_loc(): invalid object ID
    major: Invalid arguments to routine
    minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 2:
  #000: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5A.c line 242 in H5Acreate2(): not a location
    major: Invalid arguments to routine
    minor: Inappropriate type
  #001: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5Gloc.c line 253 in H5G_loc(): invalid object ID
    major: Invalid arguments to routine
    minor: Bad value
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 2:
  #000: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5A.c line 927 in H5Awrite(): not an attribute
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 2:
  #000: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5A.c line 2286 in H5Aclose(): not an attribute
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 2:
  #000: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5D.c line 391 in H5Dclose(): not a dataset
    major: Invalid arguments to routine
    minor: Inappropriate type
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 2:
  #000: /home/martinal/opt/fenics/dorsal-unstable-src/hdf5-1.8.10/src/H5F.c line 2037 in H5Fclose(): not a file ID
    major: Invalid arguments to routine
    minor: Inappropriate type
.
======================================================================
ERROR: test_save_and_read_vector (__main__.HDF5_Vector)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "HDF5.py", line 47, in test_save_and_read_vector
    vector_file.read(y, "my_vector")
Exception: std::bad_alloc

Question information

Language: English
Status: Answered
For: DOLFIN
Assignee: No assignee

Johan Hake (johan-hake) said: #1

I have the exact same issue.

Johan

On 12/12/2012 10:50 AM, Martin Sandve Alnæs wrote:
> [snip]

Garth Wells (garth-wells) said: #2

Tests run fine for me.

Johan Hake (johan-hake) said: #3

How did you compile the HDF5 library?

Obviously you did not use Dorsal or the HPC platform file ;)

Maybe there is something we could do with the compile flags?

Johan

On 12/13/2012 09:35 AM, Garth Wells wrote:
> Question #216582 on DOLFIN changed:
> https://answers.launchpad.net/dolfin/+question/216582
>
> Garth Wells posted a new comment:
> Tests run fine for me.
>

Anders Logg (logg) said: #4

I'm getting the exact same error.

Here's how HDF5 is built with Dorsal:

  cmake -D CMAKE_INSTALL_PREFIX:PATH=${INSTALL_PATH} \
            -D BUILD_SHARED_LIBS:BOOL=ON \
            -D HDF5_ENABLE_PARALLEL:BOOL=ON \
            ..

  # Fix problem with error: /usr/bin/ld: unrecognized option '--export-dynamic;'
  sed -i 's/dynamic;/dynamic/' src/CMakeFiles/H5detect.dir/link.txt
  sed -i 's/dynamic;/dynamic/' src/CMakeFiles/H5make_libsettings.dir/link.txt

  # Build and install
  make ${MAKEOPTS} -j ${PROCS}
  make install

Garth, is there a particular reason you are not building with Dorsal?
It would help the rest of us if you did. If you are the only one who
knows the magic flags needed to build HDF5, it will break for everyone else.

--
Anders

On Wed, Dec 12, 2012 at 09:50:58AM -0000, Martin Sandve Alnæs wrote:
> [snip]

Simone Pezzuto (junki-gnu) said: #5

I'm also facing the same issue on Kubuntu, with the Dorsal HPC-specific build chain. On the other hand, on Arch Linux the issue is not present (hdf5-openmpi package compiled by myself).

I think I've tracked down the problem: it is related to the OpenMPI build, not HDF5.

More specifically, the "--enable-smp-locks" flag needs to be appended to the CONFOPTS of openmpi.package.
There is no need to recompile the entire toolchain.
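
For reference, the change amounts to something like the following in Dorsal's openmpi.package file (a sketch; the file's existing contents are an assumption):

  # openmpi.package: append the flag to whatever CONFOPTS already holds
  CONFOPTS="${CONFOPTS} --enable-smp-locks"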

Let me know if this works for you as well.

 Simone

P.S. (to Anders) The trunk version of DOLFIN doesn't compile at the moment, because you recently removed Poisson{1,2,3}D.h from the ale directory. Also, SWIG fails because docstrings are missing.

Johan Hake (johan-hake) said: #6

On 03/22/2013 02:35 PM, Simone Pezzuto wrote:
> I'm also facing the same issue on Kubuntu, with dorsal hpc-specific
> buildchain. On the other hand, on archlinux, the issue is not present
> (hdf5-openmpi package compiled by myself).
>
> I think that I've tracked down the problem: it is related to openmpi
> build, not hdf5.

So I have the same setup as you (Kubuntu and quantal-hpc platform). I
tried updating the openmpi package file with your suggestion and
recompiled OpenMPI, HDF5 and DOLFIN, but I still get the same error
when running the HDF5 test in parallel.

Process 0: Number of global vertices: 441
Process 0: Number of global cells: 800
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 1:
  #000: /home/hake/local/src/hdf5-1.8.10/src/H5F.c line 1500 in
H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: /home/hake/local/src/hdf5-1.8.10/src/H5F.c line 1271 in
H5F_open(): unable to open file: time = Fri Mar 22 15:26:54 2013
, name = 'mesh.h5', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: /home/hake/local/src/hdf5-1.8.10/src/H5FD.c line 987 in
H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: /home/hake/local/src/hdf5-1.8.10/src/H5FDmpio.c line 1052 in
H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: /home/hake/local/src/hdf5-1.8.10/src/H5FDmpio.c line 1052 in
H5FD_mpio_open(): MPI_ERR_OTHER: known error not in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String
HDF5-DIAG: Error detected in HDF5 (1.8.10) MPI-process 2:
  #000: /home/hake/local/src/hdf5-1.8.10/src/H5F.c line 1500 in
H5Fcreate(): unable to create file
    major: File accessability
    minor: Unable to open file
  #001: /home/hake/local/src/hdf5-1.8.10/src/H5F.c line 1271 in
H5F_open(): unable to open file: time = Fri Mar 22 15:26:54 2013
, name = 'mesh.h5', tent_flags = 13
    major: File accessability
    minor: Unable to open file
  #002: /home/hake/local/src/hdf5-1.8.10/src/H5FD.c line 987 in
H5FD_open(): open failed
    major: Virtual File Layer
    minor: Unable to initialize object
  #003: /home/hake/local/src/hdf5-1.8.10/src/H5FDmpio.c line 1052 in
H5FD_mpio_open(): MPI_File_open failed
    major: Internal error (too specific to document in detail)
    minor: Some MPI function failed
  #004: /home/hake/local/src/hdf5-1.8.10/src/H5FDmpio.c line 1052 in
H5FD_mpio_open(): MPI_ERR_OTHER: known error not in list
    major: Internal error (too specific to document in detail)
    minor: MPI Error String

[snip]
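
Since it is MPI_File_open itself that fails, it may be worth exercising MPI-IO directly, with HDF5 and DOLFIN out of the loop. Here is a minimal sketch using mpi4py (an assumption; mpi4py is not otherwise part of this setup):

  from mpi4py import MPI

  comm = MPI.COMM_WORLD

  # Each rank writes one fixed-size record at its own offset
  fh = MPI.File.Open(comm, "check.dat",
                     MPI.MODE_CREATE | MPI.MODE_WRONLY)
  buf = bytearray("rank %03d\n" % comm.rank, "ascii")
  fh.Write_at(comm.rank * len(buf), buf)
  fh.Close()
  print("MPI-IO ok on rank %d" % comm.rank)

If this also fails with MPI_ERR_OTHER, the problem is in the OpenMPI build rather than in HDF5.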

> More specifically, the "--enable-smp-locks" needs to be appended to CONFOPTS of openmpi.package.
> There is no need to recompile the entire toolchain.
>
> Let me know if this works also for you.
>
> Simone
>
> P.S. (to Anders) The trunk version of DOLFIN doesn't compile at the moment, because
> you recently removed Poisson{1,2,3}D.h from ale directory. Also SWIG fails, because
> docstrings are missing.

Have a look at the FEniCS email list at Launchpad. We are saying goodbye
to bzr and Launchpad and hello to git and Bitbucket ;) The error you see
is caused by the repo being in a state of flux.

Johan

Simone Pezzuto (junki-gnu) said: #7

Try my .package files (see attachments); actually I've changed more
than that, so what I suggested earlier is probably not enough.

Also recompile DOLFIN, just to be sure.

2013/3/22 Johan Hake <email address hidden>

> [snip]

Johan Hake (johan-hake) said: #8

Attachments are not allowed on email responses to questions and answers
on Launchpad.

Johan

On 03/22/2013 03:45 PM, Simone Pezzuto wrote:
> [snip]
