Tell me the truth!

As you know, on HiggsHunters we have been giving you not just real data images from the ATLAS detector. We have also been showing simulated images of what ATLAS detector events would look like if the Standard Model were extended with various new particles and interactions that would give displaced vertices. So what are these simulations of exactly, and why are they important for the science at HiggsHunters?

The simulations are of Z plus Higgs events, where the Z decays to a pair of muons (see last post!) and the Higgs boson decays to a pair of new, long-lived, neutral particles (called “a” bosons). Such a model is predicted by various theories which are possible extensions of the SM. These “a” bosons can then themselves decay, after some lifetime, back to SM particles. The parameters of this model are:

  • the mass of the “a” boson
  • the average lifetime of the “a” boson
  • what SM particles the “a” boson decays to

Since this is unknown new physics, these parameters are not known! But we can make some educated and motivated guesses based on theories of SM extensions and exclude values that would have allowed the “a” boson to be seen at previous experiments. For our simulations on HiggsHunters, we chose the following values:

  • mass of either 8, 20, or 50 GeV. Recall the Higgs boson mass is 125 GeV, so we need the “a” boson mass to be less than half the Higgs boson mass if the Higgs boson is to decay to a pair of “a” bosons.
  • average lifetime (times the speed of light) of 1, 10, or 100 mm. We’d like to have a chance of seeing it decay at a displaced position, yet still within the detector. Note that this does not mean the displaced vertex will always be 1, 10, or 100 mm away from the collision point, any more than dogs that live an average of 11 years all die on their 11th birthday! The probability that a particle with average lifetime L survives for at least a time t before decaying is proportional to e^(-t/L). Dogs have a more complicated distribution! (See the sketch just after this list.)
  • “a” boson decays to either a pair of bottom quarks (for an “a” mass of 20 or 50 GeV) or a pair of tau leptons (for an “a” mass of 8 GeV). This is based on the expectation that Higgs-like bosons decay more often to pairs of heavy SM particles, since they couple to mass.
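
To make the lifetime point concrete, here is a minimal sketch (purely illustrative, not HiggsHunters analysis code, and ignoring the boost of the “a” boson, which stretches its decay distance in the lab frame) of how decay distances spread around an average of 10 mm:

    import numpy as np

    # Illustrative only: draw decay distances for an "a" boson whose average
    # decay length (lifetime times the speed of light) is 10 mm. Individual
    # decays can happen much closer to, or much farther from, the collision
    # point than the average, following an exponential distribution.
    rng = np.random.default_rng(seed=1)
    mean_decay_length_mm = 10.0
    distances = rng.exponential(scale=mean_decay_length_mm, size=100_000)

    print(f"average distance:      {distances.mean():.1f} mm")      # ~10 mm
    print(f"fraction within 1 mm:  {np.mean(distances < 1):.2f}")   # ~0.10
    print(f"fraction beyond 30 mm: {np.mean(distances > 30):.2f}")  # ~0.05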

This gives us 3×2+3×1 = 9 possible parameter combinations to simulate. We present roughly equal numbers of all 9 on HiggsHunters. Below is an image of an event simulated with the parameters: “a” mass = 20 GeV, “a” decay to a pair of bottom quarks, and “a” lifetime = 100 mm. Since this is a simulation, we know where the decays took place – they are drawn on the image as pink dots.

[Image: truth_atlantisXY_110202953_004966 – simulated event display (XY view) with the true decay positions drawn as pink dots]

Here are a bunch more of these (XY, XYzoom, RZzoom), for your viewing pleasure! (These have “a” mass = 20 GeV, “a” decays to bottom quarks, and “a” lifetime = 10 mm.)

But when we give them to you on the site, you don’t get to see the pink dots! Wouldn’t that make life easier for you?! Well, yes, but the point is that by testing you all with these simulations we can measure your ability (efficiency) to find such displaced vertices if they exist in the real data. This is very important for us to draw scientific conclusions from the results of the HiggsHunters project! (We also can’t show you the pink dots right after you classify an image, since each image gets classified by ~20 people. We can’t reveal the truth until everyone is done classifying the image! We could make the truth images available on Talk at some point in the far future.)

If you all manage to find evidence for some new particle giving displaced decays, we need to know how often this process is occurring in the LHC pp collisions. (Well, we would also need to celebrate with some Champagne first!) How often it happens is given by how often you see it happening divided by your efficiency to see it happen. It is this efficiency which we can measure using the simulations, for each set of model parameters.
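
As a purely hypothetical worked example (these numbers are made up for illustration, not HiggsHunters results), the arithmetic would look like this:

    # Hypothetical numbers, for illustration only.
    n_spotted = 4       # displaced-vertex events you flag in the real data
    efficiency = 0.5    # fraction of simulated signal events you flag
    n_produced = n_spotted / efficiency
    print(n_produced)   # 8.0: roughly how many such events actually occurred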

If instead you manage to go through all the data and find no new particles, we have still learned something very interesting! (No Champagne needed though, unfortunately.) We will have learned that these new particles are not being created at a rate large enough for you to have seen them. This rate (for each particular set of model parameters) is very interesting to theorists who work on possible extensions to the SM, and is given by about 3 divided by your efficiency to see it happen. So here again, these efficiencies measured with the simulations are essential. (In case you’re curious, the “3” in the formula above comes from the fact that, to have at least a 95% chance of at least 1 event occurring, you need to expect about 3 events on average. You can always just get a little unlucky and the events don’t occur, even if on average you would have expected them to! For more info, read about the Poisson distribution. I also love this Poisson calculator – try typing in x=1 and avg. rate of success=3, and you should see P(X>=1)=0.95!)
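
For the curious, here is another small sketch (again with a made-up efficiency, just to illustrate the formula) checking where the “3” comes from and how it turns into a limit on the number of events produced:

    import math

    # The expected count for which the chance of at least one event occurring
    # is 95% solves 1 - exp(-lam) = 0.95, i.e. lam = -ln(0.05), about 3.
    lam = -math.log(0.05)
    print(f"lam = {lam:.2f}")                                   # ~3.0
    print(f"P(at least 1 event) = {1 - math.exp(-lam):.3f}")    # 0.950

    # If no signal is seen, the number of signal events produced was most
    # likely below about 3 divided by the efficiency (hypothetical value).
    efficiency = 0.5
    print(f"limit on events produced: {lam / efficiency:.1f}")  # ~6.0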

That’s it! Be proud that your abilities have been calibrated, and you are now ready to tackle the larger sets of real data we plan to upload to the site soon!

About drandyhaas

Assistant Professor of Physics, NYU

17 responses to “Tell me the truth!”

  1. brownfox1 says :

    Andy

    I think that there’s a typo: “to be at least mH/2” should be “to be no more than mH/2”

    Some questions

    When are you going to get your first results of the simulation? [Either in time or in number of simulations marked] And what are you planning to make public? Will there be a sensitivity score or scores? I would think that the sensitivity would be lower for particles with shorter lifetimes which would have a vertex close to the centre and would therefore be harder for us to differentiate, so I am guessing that you’ll need a sensitivity for each scenario, so nine in total. Are you going to try to combine these into a general formula for all energies / lifetimes?

    I can think of two scenarios where you might get ‘false positives’ – either non-vertices which are marked in error or vertices which are correctly marked but are due to known particles. You could probably use the simulations to try to address the first type by finding the right balance between false positives and false negatives for known decays to get the cleanest results. Are you doing this? And how are you addressing the issue of vertices from known particles?

    Many thanks for your patience
    Steve

    • drandyhaas says :

      Thanks, I’ve fixed the typo.
      We will be doing the calibration soon (next month?) once all simulated events have been classified enough times. I imagine we would share the results of the calibration in a future blog post. There will certainly be a separate sensitivity for each set of parameters. You’re right that very short lifetimes should have smaller sensitivity, but the sensitivity should also drop at large lifetimes, once the decays are no longer well measured in the tracker. There is likely a sweet spot at moderate lifetimes. We’ll definitely interpolate / extrapolate to estimate the efficiency as a continuous function of the parameters, but the range of parameters will be limited by those of the simulated events.
      As for false positives, yes, we are also going to use the simulations to check how often you identify vertices where there is no true decay. We also have simulated Z decays to muon pairs, with no accompanying Higgs boson – these simulated events will be used purely to study such false positives. But we will likely not be able to distinguish different reasons for false positives, such as known SM particles, interactions with detector material (future blog post?), fake track / random combinations, etc. And it’s not so important to. All we need to know is the rate of false positives, which will be checked against the rate of data events with only one (large) displaced vertex found.
      Thanks for the great questions!

  2. randi says :

    I’m just curious about your comment:
    “3×2+3×1=9 possible parameter combinations to simulate.”

    If I understand correctly, there are 3 choices for the mass of the “a” boson and 3 choices for the average lifetime of the “a” boson, but what SM particles the “a” boson decays to is determined by the mass.
    I would have thought that there would be 3×3=9 possible parameter combinations to simulate. The same answer, but a different equation.

    Statistics is not my strong point, but I am curious where the “3×2+3×1” comes from.

    Thanks,
    Randi

    • drandyhaas says :

      Right, 3×2+3×1 = 3×3. 🙂 I had just broken it down differently.
      I was trying to show more explicitly that there are 3 lifetime choices times (2 choices for the “a” mass if it is decaying to bottom quarks and 1 choice for the “a” mass if it is decaying to tau leptons). So perhaps I should have written 3(2+1), which is also 9. Math works! 🙂
      Sorry for not being more clear initially.

  3. foggyglasses says :

    Can you post a Feynman diagram for what you describe? Is it p + p -> H + Z or p + p̄ -> H + Z? Or is there an intermediate virtual Z?

    thanks

    fg

    • drandyhaas says :

      Yes, there’s a virtual Z in the middle, which “radiates” a Higgs.
      [Image: “zh diagram” – Feynman diagram of ZH production, q q̄ annihilating to a virtual Z which radiates a Higgs]
      Note that the q and qbar come from inside the protons…

      • foggyglasses says :

        Andy: thanks!

        So we’re actually Higgstrahlung Hunters? This is going to make an awesome tee shirt.

        um….don’t protons consist of two ups and a down? Don’t you need an antiproton for that antiquark?

        I’m just sayin’….

        fg

  4. brownfox1 says :

    Actually, the proton is a bit more complex than that. If you were able to take a snapshot of the quarks at a moment in time, you might find two ups (and a down), or you might find five ups and three anti-ups, or four and two, etc.
    So when two protons collide there is a chance that an up could interact with an anti-up quark.

    • foggyglasses says :

      To Brownfox1: thanks! So you’re relying on vacuum polarization to provide anti-quarks. That’s a perturbative effect in electroweak theory. How good is the math for the strong interaction?

      I guess the LHC designers had a lot of faith in that math, since they went to the trouble of building colliding beams of protons and protons, not protons and antiprotons.

      That’s quite a €10B bet. I’m impressed.

      fg

  5. foggyglasses says :

    To brownfox1: good comments. They imply also that the building of the proton-antiproton colliders assumed that some antiprotons could be created and stored long enough to build up a big bunch for injecting into the collider. Which brings me to what may actually be a point: how, if at all, does this project depend upon knowing the frequency of quark-antiquark collisions in total? And relative to what?

    Please allow me to add that one can read all the hep books out there, but nothing is as enlightening as gossip like this. Many thanks to all involved.

    fg

    • drandyhaas says :

      Good questions!
      Actually the antiquark distributions of the proton are quite well known from many previous collider experiments and from the LHC itself.
      The Z boson is most directly produced by q qbar annihilation at the LHC. The rate is predicted to about 3% accuracy, and is measured to about 2% accuracy by ATLAS. They agree within these uncertainties, and the differential distributions, as a function of the angle from the beam and of the transverse momentum of the Z, also agree!

      • foggyglasses says :

        Andy: I’m familiar with the SM at the level of one reading of Quigg 2nd ed. Would you please suggest some reading on your latest reply, especially on the ‘antiquark distribution of the proton’? It would be very much appreciated. I do understand how to get the amplitude for weak annihilation of a q qbar pair into the Z.

        Best

        fg

  6. drandyhaas says :

    Fg,
    have a look at:
    Quarks and Leptons: An Introductory Course in Modern Particle Physics https://www.amazon.com/dp/0471887412/ref=cm_sw_r_awd_T8sLub1JVZZ2F

    • foggyglasses says :

      Thanks! I’ve borrowed a copy of Q&L. It’s dated (1984: no top yet) but does not seem wrong, and fits nicely between Griffiths and Quigg. Anyway, the upshot, I gather, is that the anti-up luminosity is very roughly about the same as the proton luminosity. I also looked at Dittmaier et al. but could not find separate cross sections for Higgstrahlung and gluon fusion. Fascinating stuff, anyway.

      fg
