parallel computation

Asked by Shen Weigang

Hi all,

I am running a simulation on a desktop computer (8 cores; Intel(R) Core(TM) i7-2600 CPU @ 3.40 GHz). There are about 1,000,000 particles in my model. I'd like to use parallel computation to speed up my simulation; however, it does not seem to speed things up significantly.
So I am wondering whether parallel computation still works here. Can anyone give me some suggestions?

Thanks!

Question information

Language: English
Status: Solved
For: ESyS-Particle
Assignee: No assignee
Solved by: Dion Weatherley
Dion Weatherley (d-weatherley) said:
#1

Hi Shen Weigang,

Yes, parallel computation definitely still works for ESyS-Particle; however, you are unlikely to observe a significant speedup for a 1M-particle simulation using only 8 cores.

ESyS-Particle exhibits so-called "weak scalability" rather than "strong scalability". Benchmarking of ESyS-Particle on up to 32,000 cores shows that the best speedup is obtained when the number of particles per worker (or core) is around 10,000. Consequently, I would look to use approximately 100 cores for a 1M-particle simulation. This ensures the simulation takes approximately the same amount of time as a 10,000-particle simulation using a single worker process.
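As a rough back-of-the-envelope sketch (the 10,000-particles-per-worker figure is the benchmark result above; the helper function name is mine, not part of ESyS-Particle):

```python
# Rule-of-thumb sizing based on ESyS-Particle's weak scaling: aim for
# roughly 10,000 particles per worker process (the benchmark sweet spot
# mentioned above).
TARGET_PER_WORKER = 10000

def suggested_workers(num_particles, target=TARGET_PER_WORKER):
    """Approximate worker (core) count for a given particle count."""
    return max(1, round(num_particles / target))

print(suggested_workers(1000000))  # -> 100 workers for a 1M-particle model
print(suggested_workers(100000))   # -> 10 workers
```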

I would estimate that running your 1M-particle simulation on 8 cores is roughly equivalent to running a 125,000-particle simulation on a single core. This will be quite slow for most DEM problems, likely requiring a week or more of execution time depending on the number of timesteps to be simulated.

My suggestion would therefore be to either port your simulation to a supercomputer where more cores are available, or reduce your problem size to approximately 100,000 particles (around 12,500 particles per worker for 8 cores).
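For instance, assuming a cluster with MPI and the `esysparticle` launcher available (the script name `mysim.py` is a placeholder), a 1M-particle run sized at about 100 workers might be launched as:

```shell
# 100 worker processes plus 1 master process = 101 MPI processes in total.
mpirun -np 101 esysparticle mysim.py
```

If your script constructs its `LsmMpi` instance with a `numWorkerProcesses` argument, that value needs to match the worker count implied by the `-np` argument (workers + 1 master).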

I hope this helps. Have fun!

Cheers,

Dion

Shen Weigang (wgshen) said:
#2

Thanks Dion Weatherley, that solved my question.