
Physicists observe entangled quarks for the first time

Nature: https://www.nature.com/articles/d41586-024-02973-7

Observation of entanglement between top quarks with the ATLAS detector at the LHC, confirmed by CMS on arXiv

Physicists working on the ATLAS detector analysed about one million pairs of top quarks, the heaviest known fundamental particles, and their antimatter counterparts to see how entangled they are. The paper describes in detail the evidence that the collaboration first announced it was investigating last year. Physicists working on the LHC’s other main detector, CMS, also confirmed the entanglement observation in a report posted to the preprint server arXiv in June.

Owing to their short lifetime, top quarks cannot be detected directly in experiments. Each top quark decays into a bottom quark and a W boson, and the W boson in turn decays into a pair of lighter quarks or into a charged lepton and a neutrino. In this measurement, only W bosons decaying into leptons are considered, because charged leptons, especially electrons and muons, are readily detected with high precision at collider experiments. To a good approximation, the degree to which the leptons carry the spin information of their parent top quarks is 100%, owing to the maximally parity-violating nature of the electroweak charged current. The normalized differential cross section of the process may be written as a function of the directions of the charged leptons in the rest frames of their parent top quark and antitop quark:

$$\frac{1}{\sigma }\frac{{\rm{d}}\sigma }{{\rm{d}}{\varOmega }_{+}\,{\rm{d}}{\varOmega }_{-}}=\frac{1+{{\bf{B}}}^{+}\cdot {\widehat{{\bf{q}}}}_{+}-{{\bf{B}}}^{-}\cdot {\widehat{{\bf{q}}}}_{-}-{\widehat{{\bf{q}}}}_{+}\cdot C\cdot {\widehat{{\bf{q}}}}_{-}}{{(4{\rm{\pi }})}^{2}},$$
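The normalized differential cross section above is straightforward to evaluate numerically once the polarization vectors and spin-correlation matrix are given. A minimal sketch in Python (numpy assumed; all input values hypothetical, not from the ATLAS result):

```python
import numpy as np

def diff_xsec(q_plus, q_minus, B_plus, B_minus, C):
    """Normalized differential cross section (1/sigma) dsigma/(dOmega+ dOmega-),
    evaluated for unit direction vectors q_plus, q_minus of the two charged
    leptons, polarization vectors B+ and B-, and spin-correlation matrix C."""
    q_plus = np.asarray(q_plus, dtype=float)
    q_minus = np.asarray(q_minus, dtype=float)
    numerator = (1.0
                 + np.dot(B_plus, q_plus)
                 - np.dot(B_minus, q_minus)
                 - q_plus @ np.asarray(C, dtype=float) @ q_minus)
    return numerator / (4.0 * np.pi) ** 2
```

For vanishing B and C the distribution is isotropic, equal to 1/(4π)² in every direction, so it integrates to unity over both solid angles.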

$$\rho =\frac{1}{4}\left[{I}_{4}+\sum _{i}\left({B}_{i}^{+}{\sigma }^{i}\otimes {I}_{2}+{B}_{i}^{-}{I}_{2}\otimes {\sigma }^{i}\right)+\sum _{i,j}{C}_{ij}{\sigma }^{i}\otimes {\sigma }^{j}\right].$$
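The spin density matrix of the top-quark pair can be built explicitly from the Pauli matrices via Kronecker products. A minimal Python sketch (numpy assumed; inputs hypothetical):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
I4 = np.eye(4, dtype=complex)
# Pauli matrices sigma^1, sigma^2, sigma^3
sigma = [np.array([[0, 1], [1, 0]], dtype=complex),
         np.array([[0, -1j], [1j, 0]], dtype=complex),
         np.array([[1, 0], [0, -1]], dtype=complex)]

def spin_density_matrix(B_plus, B_minus, C):
    """4x4 spin density matrix of the t-tbar pair from the polarization
    vectors B+ and B- and the 3x3 spin-correlation matrix C."""
    rho = I4.copy()
    for i in range(3):
        rho = rho + B_plus[i] * np.kron(sigma[i], I2)
        rho = rho + B_minus[i] * np.kron(I2, sigma[i])
        for j in range(3):
            rho = rho + C[i][j] * np.kron(sigma[i], sigma[j])
    return rho / 4.0
```

With B± = 0 and C = 0 this reduces to the maximally mixed state I₄/4; with B± = 0 and C = −I it gives the spin-singlet projector, a maximally entangled state.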

Top-Quark Decays at Next-to-Leading Order in HERWIG and POWHEG + PYTHIA

in the validation regions of (380 < {m}_{t\bar{t}} < 500\,{\rm{GeV}}) and ({m}_{t\bar{t}} > 500\,{\rm{GeV}}), respectively. The expected values are those predicted by POWHEG + PYTHIA. The calibration curve for the signal region and a summary of the results in all regions are presented in Fig. 2.

Monte Carlo event simulations are used to model the (t\bar{t}) signal and the expected standard model background processes. The production of (t\bar{t}) events was modelled using a heavy-quark generator (refs. 42,43,44,45) at next-to-leading order (NLO) precision in QCD, and the events were interfaced to either PYTHIA 8.230 (ref. 46) or HERWIG 7.2.1 (refs. 47,48) to model the parton shower and hadronization. The decays of the top quarks, including their spin correlations, were modelled at leading-order (LO) precision in QCD. An additional sample of (t\bar{t}) events, generated using POWHEG BOX RES and interfaced to PYTHIA, was also used. The setup and tuning of these generators are detailed in the section ‘Monte Carlo simulation’. The relevant difference between the two parton-shower programs is that PYTHIA uses a pT-ordered shower, whereas HERWIG uses an angular-ordered shower. One important caveat is that the full information on the spin density matrix cannot be passed on to the parton-shower programs.

Systematic uncertainties in the calibration curve for D: the impact of systematic uncertainties on the detector-level signal and the detector-level background

The table shows the different sources of uncertainty and their impact on the result. The size of each systematic uncertainty depends on the value of D and is given in Table 1 for the standard model prediction, calculated with POWHEG + PYTHIA. The systematic uncertainties considered in the analysis are described in detail in the section ‘Systematic uncertainties’.

For all of the detector-related uncertainties, the particle-level quantity is not affected and only detector-level values change. For signal modelling uncertainties, the effects at the particle level propagate to the detector level, resulting in shifts in both. Uncertainties in modelling the background processes affect how much background is subtracted from the expected or observed data and can, therefore, cause changes in the calibration curve. These uncertainties are treated as fully correlated between the signal and background (that is, if a source of systematic uncertainty is expected to affect both the signal and background processes, this is estimated simultaneously and not separately).

The pairs of ({D}_{\rm{detector}}) and ({D}_{\rm{particle}}) values are plotted in Fig. 2a. A straight line interpolates between the points. With this calibration curve, any measured value of ({D}_{\rm{detector}}) can be calibrated to the particle level.
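The calibration step amounts to a one-dimensional linear interpolation between simulated points. A minimal sketch in Python (numpy assumed; the (D_detector, D_particle) pairs below are hypothetical, not the ATLAS values):

```python
import numpy as np

# Hypothetical calibration points; the real analysis obtains them from
# simulated samples with varied spin correlations.
d_detector = np.array([-0.30, -0.20, -0.10, 0.00])
d_particle = np.array([-0.55, -0.38, -0.20, -0.02])

def calibrate(d_det):
    """Map a measured detector-level D to the particle level by linear
    interpolation between the simulated calibration points."""
    return float(np.interp(d_det, d_detector, d_particle))
```

A measured detector-level value that falls between two simulated points is mapped onto the straight line joining their particle-level values, exactly as the calibration curve in Fig. 2 does.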

The standard model background processes that contribute to the analysis are the production of a single top quark with a W boson (tW), pair production of top quarks with an additional boson ((t\bar{t}+X), with X = H, W, Z) and the production of dileptonic events from either one or two massive gauge bosons (W and Z bosons). The generators for the hard-scatter processes and the showering are listed in the section ‘Monte Carlo simulation’. The same procedures are used to identify and reconstruct detector-level objects in these samples.

The background contribution of events with reconstructed objects that are misidentified as leptons, referred to as the ‘fake-lepton’ background, is estimated using a combination of Monte Carlo prediction and correction based on data. This data-driven correction is obtained from a control region dominated by fake leptons. It is defined by using the same selection criteria as above, except that the two leptons must have the same-sign electric charges. The scale factor applied to the fake-lepton events in the signal region takes the difference between observed events and predicted events into account.
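The data-driven correction described above reduces to a simple ratio: the simulated fake-lepton yield is scaled so that the total prediction matches the data in the same-sign control region. A sketch with hypothetical yields (the real numbers come from ATLAS data and simulation):

```python
# Hypothetical event counts in the same-sign control region
n_observed = 240.0    # data events
n_prompt_mc = 60.0    # predicted contribution from genuine (prompt) leptons
n_fake_mc = 150.0     # predicted fake-lepton contribution

# Scale factor correcting the simulated fake-lepton yield to match data
scale_factor = (n_observed - n_prompt_mc) / n_fake_mc

# Applied to the fake-lepton prediction in the (opposite-sign) signal region
n_fake_signal_mc = 35.0
n_fake_signal = scale_factor * n_fake_signal_mc
```

The same-sign requirement makes the control region dominated by fakes, so the difference between data and the prompt-lepton prediction can be attributed to the fake-lepton component.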

Events must have been recorded during stable-beam conditions with all relevant components of the detector functioning. To be selected, events must have exactly one electron and one muon with opposite-sign electric charges. A minimum of two jets is required, and at least one of them must be identified as coming from a b-hadron.
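The selection criteria above can be expressed as a simple event filter. A Python sketch on a hypothetical event dictionary (the real analysis operates on ATLAS reconstruction objects):

```python
def passes_selection(event):
    """Sketch of the dilepton event selection described above."""
    electrons = event["electrons"]
    muons = event["muons"]
    jets = event["jets"]
    # Exactly one electron and one muon, with opposite-sign electric charges
    if len(electrons) != 1 or len(muons) != 1:
        return False
    if electrons[0]["charge"] * muons[0]["charge"] >= 0:
        return False
    # At least two jets, at least one identified as coming from a b-hadron
    return len(jets) >= 2 and any(jet["btag"] for jet in jets)
```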

The ATLAS experiment37,38,39 at the LHC is a multipurpose particle detector with a forward–backward symmetric cylindrical geometry and a solid-angle coverage of almost 4π. It records the energies and positions of the particles produced at the collider. The coordinate system is defined in the section ‘Object identification in the ATLAS detector’. An inner tracking detector, immersed in a magnetic field, is surrounded by calorimeters and a muon spectrometer. The muon spectrometer surrounds the calorimeters and is based on three large superconducting air-core toroidal magnets with eight coils each, providing a field integral of between 2.0 T m and 6.0 T m across the detector. An extensive software suite40 is used in the simulation, reconstruction and analysis of real and simulated data, in detector operations, and in the trigger and data-acquisition systems of the experiment. The complete dataset of pp collision events with a centre-of-mass energy of √s = 13 TeV collected with the ATLAS experiment during 2015–2018 is used, corresponding to an integrated luminosity of 140 fb−1. The data sample was recorded using single-electron or single-muon triggers41.

In both of the validation regions, where no entanglement signal is expected, the measurements are found to agree with the predictions from different Monte Carlo setups within the uncertainties. This consistency check validates the method used for the measurement.

In the signal region, the POWHEG + PYTHIA and POWHEG + HERWIG generators yield different predictions. The size of the observed difference is consistent with changing the method of shower ordering and is discussed in detail in the section ‘Parton shower and hadronization effects’.

Entanglement of top quark pairs at the LHC and hadron colliders – a perspective from James Howarth

Scientists have had no doubt that top-quark pairs can be entangled. The standard model of particle physics — the current best theory of fundamental particles and the forces through which they interact — is built atop quantum mechanics, which describes entanglement. But the latest measurement is nonetheless valuable, researchers say.

But the fact that entanglement hasn’t been rigorously explored at high energies is justification enough for Afik and the phenomenon’s other aficionados. He says that people have realized that they can now use hadron colliders for these tests.

The top quarks have a short lifetime. “You could never do this with lighter quarks,” says James Howarth, an experimental physicist at the University of Glasgow, UK, who was part of the ATLAS analysis along with Afik and Muñoz de Nova. Lighter quarks, which dislike being separated, combine into hadrons such as protons and neutrons after a mere 10⁻²⁴ seconds. But a top quark decays quickly enough that it doesn’t have time to ‘hadronize’ and lose its spin information through mixing, Howarth says. Instead, all of that information “gets transferred to its decay particles”, he adds. This meant that the researchers could measure the properties of the decay products to work backwards and infer the properties, including spin, of the parent top quarks.

Top and anti-top quarks created in the aftermath of a particle collision decay into lighter particles.

Source: Quantum feat: physicists observe entangled quarks for first time

What can a particle physicist at the Institute of Theoretical Physics say about entanglement at high energies?

“You don’t really expect to break quantum mechanics, right?”, says Juan Aguilar-Saavedra, a theoretical physicist at the Institute of Theoretical Physics in Madrid. But an expected result does not make an important measurement any less worth doing.

Giulia Negro, a particle physicist at Purdue University in West Lafayette, Indiana, who worked on the CMS analysis, said that it was the first time entanglement could be studied at the highest possible energies.
