MadWeight: Number of data points smaller than number of parameters
Dear all,
I am trying to run the semileptonic decay of ttbar, but the process takes very long, so I reduced the number of events to 100 in order to get some result. The problem is that I now get an error message saying "Number of data points smaller than number of parameters". Note: I get this at step -9 of MadWeight.
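For what it may be worth, this kind of error typically comes from a least-squares fit that has fewer data points than free parameters. A minimal sketch of the arithmetic (the mass and weight values below are hypothetical, purely to illustrate the constraint):

```python
import numpy as np

# A quadratic likelihood fit has 3 free coefficients, so it needs at
# least 3 (mass, weight) points. Scanning a single mass value cannot
# satisfy this, which is consistent with the error message above.
mass_points = [170.0, 175.0, 180.0]    # hypothetical scan values
neg_log_weights = [12.4, 11.8, 12.1]   # hypothetical -log(weight) values

assert len(mass_points) >= 3           # points must be >= parameters
coeffs = np.polyfit(mass_points, neg_log_weights, deg=2)
best_mass = -coeffs[1] / (2 * coeffs[0])  # vertex of the fitted parabola
print(best_mass)
```

With only one mass value in the card, the fit is underdetermined and fails with exactly this message.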
The cards that I am using are something like this:
pp>(t~>(w->l- vl~)b~)(t>(w+>jj)b) @1 # First Process
QCD=99 # Max QCD couplings
QED=99 # Max QED couplings
end_coup # End the couplings input
_______
_______
# TAG VALUE UTILITY
name fermi # name for the run
cluster 0 # 0:single machine, 1: condor, 2: SGE
queue 'madgraph==True' # queue condition (usage depend of the cluster)
nb_exp_events 10 # number of experimental events to consider
write_log F # avoid writing a lot of log files
normalize T # normalizes weight (1/sigma prefactor)
ME_int_points 10000 # number of points in MadEvent integration
MW_int_points 10000 # number of points in MadWeight integration
use_cut F # use the cut defined in run_card.dat
bw_cut F # use the BW cut
#******
## define the different param_card's ##
#******
Block MW_parameter
# TAG VALUE UTILITY
mode 1 # type of input
#
# # first parameter #
11 mass # Block of the parameter to change
12 6 # id of the parameter to change
13 180 # here you can enter the different values:
#
# # second parameter #
#
# use same syntax for parameters 3,4,...
#******
## Permutations ##
#******
Block MW_perm
# TAG VALUE UTILITY
permutation T # make permutation
bjet_is_jet T # consider permutation between b-jets and light jets
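If it helps, the card above defines only a single value (180) for the top mass, which would give the fit only one data point. A sketch of what a multi-value scan might look like follows; the exact tag convention (whether successive values go on tags 13, 14, 15 or are repeated on tag 13) is a guess here and should be checked against the MadWeight card documentation:

```
Block MW_parameter
# TAG VALUE UTILITY
mode 1 # type of input
#
# # first parameter #
11 mass # Block of the parameter to change
12 6 # id of the parameter to change
13 170 # hypothetical: first scanned value
14 175 # hypothetical: second scanned value
15 180 # hypothetical: third scanned value
```

Several mass points would give the likelihood fit enough data to determine its parameters.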
Could you please tell me what I can do?
Thanks in advance,
JP
Question information
- Language: English
- Status: Solved
- Solved by: Juan Pablo Gomez