
Normalisation factors for Monte Carlo backgrounds

Asked by uitty400 on March 31, 2021

I am currently starting to get involved in particle physics. I often hear of normalization factors for Monte Carlo (MC) backgrounds, denoted by $\mu$. For example, if someone quotes a normalization factor for their MC $t\bar{t}$ background of $\mu_{t\bar{t}} = 2.22 \pm 1.12$, what do they mean by it? Is it necessary to "scale up" the MC to the data so that they match in the number of events? Where does the uncertainty come from, and what does it mean? And what would it mean if $\mu_{t\bar{t}}$ were smaller than 1?

One Answer

In experimental analyses such as searches or cross-section measurements, one has to know the size of the background contamination. There are basically two ways to do this:

  1. measuring the amount of background directly from data (data-driven methods)
  2. estimating the background using theoretical predictions (MC-based methods)

Both methods have pros and cons: data-driven methods are independent of theoretical assumptions, but at some point they become limited by statistical and/or systematic uncertainties, while theory-based estimates are limited by the uncertainties of the prediction itself. A typical example of the latter are perturbative QCD predictions, whose accuracy is limited by missing higher-order corrections.

One can combine the two approaches: take the shape of a distribution from the MC prediction, but constrain the absolute normalisation using data. The normalisation factor $\mu$ is then the ratio of the background yield measured in data to the one predicted by theory. If $\mu$ is greater than unity, the number of MC events has to be scaled up to match the data; if it is smaller than unity, the MC overestimates the background and has to be scaled down. The uncertainty on $\mu$ reflects the statistical and systematic uncertainties of the measurement used to extract it.
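Below is a minimal Python sketch of the simplest ("cut-and-count") version of this procedure, where $\mu$ is extracted from a background-enriched control region and then applied as a per-event weight to the MC. All yields, uncertainties, and variable names here are invented for illustration (the numbers are chosen so that $\mu$ comes out at the value quoted in the question); in a real analysis, $\mu$ and its uncertainty would typically come from a maximum-likelihood fit that also includes systematic uncertainties.

    import numpy as np

    # Hypothetical control-region yields (illustrative numbers only):
    n_data = 250.0       # events observed in a ttbar-enriched control region
    n_other_mc = 30.0    # predicted contribution from all other backgrounds
    n_ttbar_mc = 99.0    # ttbar yield predicted by MC at the theoretical cross section

    # Uncertainties: Poisson on data, assumed values for the MC predictions
    sigma_data = np.sqrt(n_data)
    sigma_other = 5.0
    sigma_ttbar_mc = 4.0

    # Normalisation factor: ttbar yield seen in data (after subtracting
    # the other backgrounds) divided by the MC prediction.
    mu = (n_data - n_other_mc) / n_ttbar_mc

    # First-order Gaussian error propagation for the ratio
    sigma_mu = mu * np.sqrt(
        (sigma_data**2 + sigma_other**2) / (n_data - n_other_mc) ** 2
        + (sigma_ttbar_mc / n_ttbar_mc) ** 2
    )

    print(f"mu_ttbar = {mu:.2f} +/- {sigma_mu:.2f}")  # -> mu_ttbar = 2.22 +/- 0.19

    # Applying the factor: every ttbar MC event weight is multiplied by mu,
    # so the scaled prediction matches the data in the control region.
    mc_event_weights = np.full(1000, n_ttbar_mc / 1000)  # toy per-event weights
    scaled_weights = mu * mc_event_weights

In a realistic analysis the same idea is typically implemented as a profile-likelihood fit over several regions at once, which is what produces normalisation factors with uncertainties like the $\pm 1.12$ quoted in the question.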

Answered by dlont on March 31, 2021
