Reweighted samples for aQGC model do not agree with generated

Asked by Yannis Maznas

Dear experts,

I'm using the aQGC models available here: http://feynrules.irmp.ucl.ac.be/wiki/AnomalousGaugeCoupling
After generating a sample of 30k events at one FT0 coupling value, I tried reweighting it to a lower value of that coupling.
Using the same run and param cards (apart from the coupling value, and with 10k events), I generated a second sample directly at the smaller coupling value in order to compare the two, but the samples do not seem to agree in either cross section or kinematics.

generated xs = 5.8884E-04 ± 1.17552E-06
reweighted xs = 5.9154E-04 ± 2.36962E-06
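The compatibility of two cross-section numbers with independent statistical errors is commonly quantified with a simple pull (difference in units of the combined error); a minimal sketch using the two values above:

```python
import math

# Cross sections (pb) and statistical errors quoted above
xs_gen,  err_gen  = 5.8884e-04, 1.17552e-06
xs_rwgt, err_rwgt = 5.9154e-04, 2.36962e-06

# Pull: difference divided by the combined statistical error
pull = abs(xs_rwgt - xs_gen) / math.sqrt(err_gen**2 + err_rwgt**2)
print(f"pull = {pull:.2f} sigma")  # about 1.0 sigma for these two numbers
```

Note that agreement of the total cross section alone does not guarantee agreement of the kinematic distributions.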

Do you have any idea what might be the problem?

Best regards,
Yannis

=============================================
process card
------------------------------------------------------------------------------------
set group_subprocesses Auto
set ignore_six_quark_processes False
set loop_color_flows False
set gauge unitary
set complex_mass_scheme False
set max_npoint_for_channel 0
import model /cvmfs/atlas.cern.ch/repo/sw/Generators/madgraph/models/latest/SM_LT0
define vl = ve vm vt
define vl~ = ve~ vm~ vt~
define p = g u c d b s u~ c~ d~ s~ b~
define j = g u c d b s u~ c~ d~ s~ b~
define l+ = e+ mu+
define l- = e- mu-
define vl = vt
define vl~ = vt~
generate p p > j j z z QCD=0 QED=4 NP=1, (z > l+ l-)
output VBS_ZZ_aQGC_FT0
------------------------------------------------------------------------------------

=============================================
reweight card
------------------------------------------------------------------------------------
change output 2.0

launch --rwgt_name=FT0_-0.3
  set anoinputs 12 -3.00e-13
  set anoinputs 13 00.00e-13
  set anoinputs 14 00.00e-13
  set anoinputs 15 00.00e-13
  set anoinputs 16 00.00e-13
  set anoinputs 17 00.00e-13
  set anoinputs 18 00.00e-13
  set anoinputs 19 00.00e-13
  set anoinputs 20 00.00e-13
  set anoinputs 21 00.00e-13
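For context, the reweighting module rescales each event weight by the ratio of squared matrix elements evaluated at the new and original coupling values, so the reweighted cross section is the original one scaled by the mean ratio. A toy sketch of the idea (the ratios below are made-up illustrative numbers, not taken from any real sample):

```python
# Toy per-event ratios r_i = |M_new|^2 / |M_old|^2 (illustrative values only)
ratios = [0.94, 1.02, 0.88, 0.97, 1.05]

# Original cross section (pb) of the generated sample
sigma_orig = 5.8884e-04

# Reweighted cross section = original cross section times the mean ratio
sigma_rwgt = sigma_orig * sum(ratios) / len(ratios)
print(f"reweighted xs = {sigma_rwgt:.4e} pb")
```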

Question information

Language: English
Status: Answered
For: MadGraph5_aMC@NLO
Assignee: No assignee

This question was reopened

Revision history for this message
Yannis Maznas (imaznas) said :
#1

You can find some plots from madanalysis here:
https://cernbox.cern.ch/index.php/s/rkPfQN3ZjQDEBUk

Revision history for this message
Yannis Maznas (imaznas) said :
#2

Hello again,

Any updates on this?

Along the same lines, here is a plot of cross section vs. coupling value for the reweighted and generated event samples. All samples contain 20k events.
https://prnt.sc/kax98w
Apparently the difference is more than 2 sigma.

Have you seen this behavior before?

Thanks,
Yannis

Revision history for this message
Olivier Mattelaer (olivier-mattelaer) said :
#3

It looks like my answer did not go through.

Did you try switching helicity-by-helicity reweighting off? (Obviously you can only do that if you are not going to use the helicity information at any later stage.)

Cheers,

Olivier

Revision history for this message
Yannis Maznas (imaznas) said :
#4

Hi Olivier,

Yes, I tried that, but it had no effect; in fact it made things slightly worse.
The "mother" sample consists of several 1k sub-samples that I merge to obtain the big one.

During merging I get the message "WARNING Cross sections do not agree with a 5% precision!" for several of the samples (roughly 1/3 of them).
Is it possible that this affects the cross section of the mother sample and thereby also influences the cross sections of the reweighted samples?

Cheers,
Yannis

Revision history for this message
Olivier Mattelaer (olivier-mattelaer) said :
#5

> During merging I get the message "WARNING Cross sections do not agree with a 5% precision!" for several of the samples (roughly 1/3 of them).
> Is it possible that this affects the cross section of the mother sample and thereby also influences the cross sections of the reweighted samples?

Yes, this is worrisome. Please generate larger sub-samples to obtain good accuracy.

Cheers,

Olivier


Revision history for this message
Yannis Maznas (imaznas) said :
#6

Actually, it's impossible to generate larger samples, since MadGraph kills most of the events when I ask for more than 1k.
I think there are open tickets about this, but in any case you have given me a clue as to what may have gone wrong here.

Thanks a lot!

Cheers,
Yannis

Revision history for this message
Yannis Maznas (imaznas) said :
#7

Thanks Olivier Mattelaer, that solved my question.

Revision history for this message
Yannis Maznas (imaznas) said :
#8

It appears that the warning message mentioned above is not tied to specific files, or at least not only to them.
I have a set of ~40 1k LHE files, and during merging the message appears only for a specific ~15 of them.
After excluding those files and merging the remaining ones anew, the message reappears for some of the previously "healthy" LHE files.

Is there a way to mitigate this behavior while keeping the 1k sub-sample size fixed? I cannot really pinpoint where the problem lies.
Should I expect the situation to improve if I generate more 1k samples to merge?

Cheers,
Yannis

Revision history for this message
Olivier Mattelaer (olivier-mattelaer) said :
#9

Hi,

If the cross section is that unstable, there is a huge issue with this process.
In that case, the merging of the events needs to be done in a weighted way, not in an unweighted way.

Are you sure that you do not have a singularity (even an integrable one)?
In any case, I would not put too much trust in your dedicated samples here.

Cheers,

Olivier
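One common way to combine sub-sample cross sections "in a weighted way" is an inverse-variance average, where each sub-sample counts in proportion to its statistical precision. A minimal sketch with made-up numbers (this combines the cross-section estimates only; the event records themselves would additionally need their weights rescaled):

```python
import math

# (cross section, error) per 1k sub-sample, in pb -- illustrative numbers
samples = [(5.85e-04, 3.0e-06), (5.95e-04, 4.0e-06), (5.70e-04, 8.0e-06)]

# Inverse-variance weights: more precise sub-samples count more
weights = [1.0 / err**2 for _, err in samples]

# Weighted mean and its combined statistical error
sigma = sum(w * xs for w, (xs, _) in zip(weights, samples)) / sum(weights)
error = math.sqrt(1.0 / sum(weights))
print(f"combined xs = {sigma:.4e} +- {error:.2e} pb")
```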


Revision history for this message
Yannis Maznas (imaznas) said :
#10

Hello,

How can I merge the LHE files in a weighted way? Who defines the weight in this case?

About the singularity, I'm not really sure how to check for that.
Along with the process card above, I'm attaching the run and param cards in case you can spot something funny in them or come up with any suggestion.

https://www.dropbox.com/s/7zbixgya2qqakx6/run_card.dat.txt?dl=0
https://www.dropbox.com/s/nhaim9kvjn9f7vm/param_card.dat.txt?dl=0

Cheers,
Yannis

Revision history for this message
Yannis Maznas (imaznas) said :
#11

Hello again,

Thinking it over raised a few more questions.

If I have a single LHE file (not merged), is there a way to check the 5% precision on it? Does that make sense, or should it be part of a larger sample, i.e. be a sub-sample?

Suppose I ask for a 30k-event sample (not merged) and most of the events get killed, so that I end up with only ~5k events. Can I trust the cross-section estimate in that case, or does the killing of events affect the estimate as well?

Cheers,
Yannis

Revision history for this message
Olivier Mattelaer (olivier-mattelaer) said :
#12

Hi,

> If I have a single LHE file (not merged), is there a way to check the 5% precision on it? Does that make sense, or should it be part of a larger sample, i.e. be a sub-sample?

It depends on which type of error you are looking for. For the process that you describe and the type of error that you point to, I do not think that you actually have 5% precision. Moreover, I think that your 5% error cannot be estimated from the LHE file in your case.

> Suppose I ask for a 30k-event sample (not merged) and most of the events get killed, so that I end up with only ~5k events. Can I trust the cross-section estimate in that case, or does the killing of events affect the estimate as well?

That should trigger a lot of warnings. You should then test permuting the order of the particles in the final state, and compare with other codes or theoretical results.

Cheers,

Olivier
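As an aside on inspecting a single file: an LHE file does store the generator's own cross-section estimate and integration error in its <init> block (one XSECUP XERRUP XMAXUP LPRUP line per subprocess, per the Les Houches accord). A minimal parsing sketch, assuming a plain uncompressed file; as noted above, this quoted error does not capture the merging instability discussed in this thread:

```python
def read_lhe_xsec(path):
    """Return a list of (xsec, err) pairs, one per subprocess,
    read from the <init> block of an LHE file."""
    with open(path) as fh:
        lines = iter(fh)
        # Skip ahead to the <init> block
        for line in lines:
            if line.strip().startswith("<init>"):
                break
        # First init line: beam/PDF info; the last field is NPRUP,
        # the number of subprocess lines that follow
        nprup = int(next(lines).split()[-1])
        out = []
        for _ in range(nprup):
            xsecup, xerrup, _xmaxup, _lprup = next(lines).split()[:4]
            out.append((float(xsecup), float(xerrup)))
        return out
```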
