How can I run MadGraph on a GPU?
I'm currently running MadGraph on my CPU and have obtained results for a simple process:
generate e+ e- > Z > e+ e-
However, I'd like to speed up event generation by using a GPU. How can I configure MadGraph to run on a GPU and leverage its massive parallelism to improve performance? I've been searching for tutorials or resources that explain how to do this, but so far I haven't found any detailed information on the topic.
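For context, here is the full command sequence I use for the CPU run (the output directory name `zee` is just an example):

```
generate e+ e- > Z > e+ e-
output zee
launch zee
```

This is the standard MG5_aMC interactive workflow: `generate` builds the process, `output` writes the matrix-element code, and `launch` runs the event generation. My question is what, if anything, changes in these steps to target a GPU instead.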
In addition, I'm curious whether it's also possible to run MadGraph on an FPGA. I'm intrigued by the parallelization and optimization possibilities an FPGA could offer for this kind of data-intensive scientific computation. My idea would be to export or translate the code to HLS (high-level synthesis) and then execute it on the FPGA.
I appreciate any guidance or resources you can provide on these topics.
Regards,
Hector
Question information
- Language: English
- Status: Solved
- Solved by: Olivier Mattelaer