no error in LO cross section when using bias

Asked by Benjamin Grinstein

I get slightly different cross sections when running with a bias versus without. The process is p p > ta- vt~ (I don't think the specific process matters) and I get
with bias: 3578 ± 0
without bias: 3616 ± 6.9
While the difference is small, it is much larger than the quoted errors. I am particularly concerned about the zero uncertainty in the biased run. For completeness, the bias is set with
/ptl_bias = bias_module ! Bias type of bias, [None, ptj_bias, -custom_folder-]
 {'ptl_bias_target_ptl': 3000.0, 'ptl_bias_enhancement_power': 4.0} = bias_parameters
where the ptl_bias module is just the ptj_bias module with is_a_l substituted for is_a_j.
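For readers not familiar with the bias modules, here is a minimal Python sketch of what such a lepton-pT bias weight presumably computes, inferred only from the parameter names quoted above (the real module is Fortran source shipped with the process directory, so the exact formula should be checked there):

    def ptl_bias_weight(ptl, target_ptl=3000.0, enhancement_power=4.0):
        """Multiplicative bias applied to the event weight during integration.

        Events with large lepton pT are sampled more often; each generated
        event is later reweighted by 1/bias so the physics is unchanged.
        The (pT/target)**power form is an assumption based on the run_card
        parameters, not the actual MadGraph implementation.
        """
        return (ptl / target_ptl) ** enhancement_power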

Incidentally, the bias works: the histograms of pT of ta- (and of MET) are very smooth at large pT (and MET) for the biased case, but not for the unbiased one (with a 10000-event run).

The question is then: what happened to the error in the biased run? (I understand that the difference is less than the correction that would come from NLO, but still....)
Thanks
Ben

Olivier Mattelaer (olivier-mattelaer) said :
#1

Hi,

When you use a bias, we no longer estimate the "real" cross section directly, but rather the integral of another function (the biased one).
We then use re-weighting to obtain an estimator of the original cross section (which is what we print on screen).

We do have the statistical uncertainty for the function that we actually integrated (the one that includes the bias), but
we do not have an efficient/reliable way to estimate the statistical uncertainty associated with the original integral.
This is why we set it to zero. What is clear is that the bias degrades the precision of the total cross section, so the shift in cross section that you observe is not surprising.
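To make this concrete, here is a small self-contained toy (plain Python/NumPy, not MadGraph internals) that integrates a steeply falling function once by flat sampling and once with a tail-enhancing bias followed by re-weighting. Both estimators agree on the central value, but the re-weighted one carries the larger statistical error, which is the degradation described above:

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000
    f = lambda x: np.exp(-x)   # stand-in for a steeply falling dsigma/dpT

    # Unbiased: sample x uniformly on [0, 5]; weight = f / (density 1/5)
    x1 = rng.uniform(0.0, 5.0, N)
    w1 = f(x1) * 5.0
    print("unbiased:          ", w1.mean(), "+-", w1.std(ddof=1) / np.sqrt(N))

    # Biased: oversample the tail with density g(x) = (1 + x) / 17.5 on [0, 5]
    # (sampled via the inverse CDF), then re-weight by f/g so the integral
    # estimate is unchanged.
    u = rng.uniform(0.0, 1.0, N)
    x2 = -1.0 + np.sqrt(1.0 + 35.0 * u)
    g = (1.0 + x2) / 17.5
    w2 = f(x2) / g
    print("biased+reweighted: ", w2.mean(), "+-", w2.std(ddof=1) / np.sqrt(N))

    # Both central values agree with 1 - exp(-5) ~ 0.9933, but the re-weighted
    # estimator has the larger spread: the bias buys tail statistics at the
    # cost of precision on the total.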

Cheers,

Olivier

