ESyS-Particle on an 8-core SMP - #1

Asked by Michele Griffa

Hello everybody

I just want to share with the other users and with the developers my experience installing and first running ESyS-Particle on a small Symmetric Multi-Processor (SMP) workstation (an 8-core machine). It may help other users, and it may interest the developers to see how ESyS-Particle behaves on a different architecture/machine.

I installed ESyS-Particle on an HP xw8600 workstation, which is a two-socket SMP with one quad-core Intel Xeon CPU per socket.

The OS is Ubuntu 9.04.

OpenMPI and Python 2.6 are installed, along with all the other required software (Boost libraries, POVray, VTK, CppUnit, epydoc, docutils).

I followed the instructions for installation under Ubuntu and all was fine.

I ran all the simulations of the "bingle" set.
They seem to run but there are some issues.

1) As mentioned in the post "", when I run the "" simulation, control does not return to the shell when the program has finished.

2) When I run any "bingle" simulation, these messages appear:

"CSubLatticeControler::initMPI()". I'm not sure whether this relates just to my system or to ESyS-Particle.

followed by:

"/usr/lib/python2.6/dist-packages/esys/lsm/util/ DeprecationWarning: The popen2 module is deprecated. Use the subprocess module.
import popen2"

This message seems clearly related to an updated version of ESyS-Particle, doesn't it? In any case, it should not hurt.

So far so good, apart from the deprecation message and the fact that control does not return to the shell (already addressed in other posts by Thomas and Dion).

Hope this helps in improving ESyS-Particle.



Question information

Solved by: Dion Weatherley
Revision history for this message
Dion Weatherley (d-weatherley) said :

Hi Michele,

Thanks for sharing your experiences with installing ESyS-Particle. This is certainly very helpful to improve the code in future versions.

Regarding the issues you raised:

1) The issue with not returning to the shell appears to be related to which MPI library ESyS-Particle is compiled with. I have yet to ascertain the cause of this admittedly annoying problem. Strangely, not all simulations suffer the same problem. Thanks for highlighting this on the forum though.

2) The DeprecationWarning is relatively new and appears to be due to changes in Python version 2.6. I'll flag this for a bugfix in the next stable release, which is tentatively scheduled for June 2009. With this kind of warning I think it's prudent to wait a while before making the fix as it will probably make ESyS-Particle incompatible with earlier versions of Python. Thankfully it is only a warning at this stage...
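For what it's worth, the popen2-to-subprocess migration that this warning asks for is fairly mechanical. A minimal sketch of the general pattern (not ESyS-Particle's actual code) is:

```python
import subprocess

# Deprecated pattern (warned about since Python 2.6):
#   import popen2
#   child_out, child_in = popen2.popen2("echo hello")
# The subprocess equivalent: one Popen object holds both pipes.
proc = subprocess.Popen(
    ["echo", "hello"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
)
out, _ = proc.communicate()
print(out.decode().strip())  # -> hello
```

The subprocess module also avoids shell-quoting pitfalls, since the command is passed as a list rather than a single string.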

I'm glad you have succeeded with installing ESyS-Particle and hope you enjoy the code!



Michele Griffa (michele-griffa) said :

Hello everybody again.
There is one additional problem I did not mention in my previous post.
It's related more to the latest Ubuntu release (9.04) than to ESyS-Particle.
On a multi-core workstation/laptop, if you install OpenMPI under Ubuntu 9.04 simply using the packages that come along with it, you get this problem:

the OS asks for your password every time you start a run.

This behaviour has already been reported on the OpenMPI users' mailing list:

It seems related to some kind of incompatibility between OpenMPI 1.3.1 and Ubuntu.
It has been fixed in the latest OpenMPI release (1.3.2), as mentioned in some of the answers to the initial thread on the OpenMPI mailing list.

Ubuntu 9.04 seems to come with OpenMPI 1.3.1.

So, if you plan to use ESyS-Particle under Ubuntu 9.04, DO NOT INSTALL the provided OpenMPI packages.
Download the OpenMPI 1.3.2 source archive and build it yourself.

Install it into a directory such as /opt/openmpi-1.3.2, so that you can still install the OpenMPI version that ships with Ubuntu alongside it. If you do so, do not forget to set the environment variables so that the correct OpenMPI shared libraries and executables are found.
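A from-source build along these lines might look as follows; the prefix matches the directory suggested above, but the exact configure options and the source archive location are assumptions, not taken from this thread:

```shell
# Build OpenMPI 1.3.2 from a source tarball into /opt
tar xzf openmpi-1.3.2.tar.gz
cd openmpi-1.3.2
./configure --prefix=/opt/openmpi-1.3.2
make -j 8
sudo make install

# Point the shell at this OpenMPI instead of the Ubuntu-packaged one
# (e.g. add these lines to ~/.bashrc):
export PATH=/opt/openmpi-1.3.2/bin:$PATH
export LD_LIBRARY_PATH=/opt/openmpi-1.3.2/lib:$LD_LIBRARY_PATH
```

With the environment variables set this way, `mpirun` and the shared libraries from /opt/openmpi-1.3.2 take precedence over any system-packaged OpenMPI.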




I followed these instructions. Then I ran the "bingle" series of ESyS-Particle test simulations.
They seem to run properly, apart from some annoying messages.
For example, each one produces this message:

/usr/local/bin/mpipython: Symbol `ompi_mpi_comm_world' has different size in shared object, consider re-linking

Question for the developers: is this serious?

When you run it, you get this additional message:

/usr/lib/python2.6/dist-packages/esys/lsm/vis/povray/ RuntimeWarning: tempnam is a potential security risk to your program
  return os.tempnam()

but the output files (snapshots at different time steps) are OK, so there does not seem to be any real problem.

At least, this way, you avoid having to enter your password every time you start an ESyS-Particle simulation.

Again, all of this is system-specific:

a multi-core (SMP) machine with the latest version of Ubuntu (9.04) and OpenMPI as the MPI stack.


Joel Fenwick (j-fenwick1) said :

Hello Michele,
    We've just installed particle on an Ubuntu 9.04 machine and hit an apparently similar problem.
In our case the solution was a bit simpler. It turned out to be a DNS problem: adding

127.0.0.1 localhost
w.x.y.z myhost.mydomain myhost

at the top of /etc/hosts worked for us (where w.x.y.z is your IP address and myhost, mydomain are adjusted to fit).
Your mileage may vary.


Best Dion Weatherley (d-weatherley) said :

Hi Michele,

Thanks for the update, and thanks to Joel for the update on working around the password problem. Just to clarify: we found that by editing the /etc/hosts file according to Joel's instructions, the problem with openMPI v.1.3.1 could be circumvented, i.e. installing openMPI v.1.3.2 from source is not necessary. This hasn't been extensively tested, so if this does or doesn't work for anyone, please report it here!

Regarding your other questions Michele, I'm a little concerned about the openMPI warning ("Symbol `ompi_mpi_comm_world' has different size in shared object"). I suggest, just to be safe, you change directory to your ESyS-Particle source code directory then do the following:

1/ "sudo make uninstall",
2/ "make distclean",
3/ execute your "./configure ..." command again
4/ "make -j 8", then
5/ "sudo make install" again.
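Run from the top of the ESyS-Particle source tree, the sequence above looks like this (the source path and the actual ./configure flags are site-specific and left as placeholders):

```shell
cd /path/to/esys-particle-source   # your ESyS-Particle source directory
sudo make uninstall                # remove the old (mixed-linkage) install
make distclean                     # wipe all build products and config caches
./configure ...                    # repeat your original configure command
make -j 8                          # parallel rebuild, using all 8 cores
sudo make install                  # reinstall, now linked against one OpenMPI
```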

I suspect you currently have an installation that links partially against the old openMPI library and partially against the new one. Let me know if the warning goes away when you do this.

The "RuntimeWarning: tempnam is a potential security risk to your program" is a *feature* of ESyS-Particle now rather than a bug (feel free to ignore the warning). Once again, this warning stems from changes to Python rather than ESyS-Particle. You will get this warning every time you use the povray inline rendering facilities.

The new stable release of ESyS-Particle scheduled for later this year will probably drop the inline rendering in favour of a tool to post-process checkpoint files to render POVray snapshots. After some discussion, the developers are favouring post-processing for future releases. This decision is largely because saving checkpoint files regularly during your simulations offers many more options for "unforeseen" data analysis after-the-fact than deciding up-front only to save snapshots. We are also discussing a new library for checkpoint file I/O that will (hopefully!) facilitate rapid development of new post-analysis tools.
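For reference, the standard replacement for os.tempnam() in user code is the tempfile module, which creates the file atomically instead of merely predicting a free name. This is the general Python fix, not necessarily how ESyS-Particle itself will resolve the warning:

```python
import os
import tempfile

# os.tempnam() only *predicted* an unused filename (race-prone, hence the
# RuntimeWarning); tempfile.mkstemp() atomically creates and opens the file.
fd, path = tempfile.mkstemp(suffix=".pov")
try:
    with os.fdopen(fd, "w") as f:
        f.write("// POV-Ray scene placeholder\n")
    print(os.path.exists(path))  # -> True: the file really exists on disk
finally:
    os.remove(path)
```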

All things considered Michele, it sounds like you've now managed to successfully install ESyS-Particle on your workstation. Congratulations! Thank you most sincerely for sharing your experiences as this feedback is invaluable for planning the new ESyS-Particle stable release later this year.



Michele Griffa (michele-griffa) said :

Thanks a lot Joel and Dion

I made the correction to the /etc/hosts file, re-installed the OpenMPI 1.3.1 libraries (actually just a change in the environment variables) and rebuilt ESyS-Particle from scratch against that version of OpenMPI.

It works fine. I am no longer asked for the password.

The message

"Symbol `ompi_mpi_comm_world' has different size in shared object"

does not appear any more.

I'll let you all know if I meet problems in running simulations requiring more than one process.

Michele Griffa (michele-griffa) said :

Thanks Dion Weatherley, that solved my question.

Dion Weatherley (d-weatherley) said :

As a follow-up, the problem with the script not terminating can be resolved by adding "" to the end of the script. This problem will be addressed in a new version of the Tutorial, due for release soon.