HSE Researchers Receive a Grant to Search for New Physics
A team of researchers from the HSE Laboratory of Methods for Big Data Analysis (LAMBDA) has won a contest held by the Presidential Research Funding Programme. Researchers at the laboratory are developing a system of algorithms that will help physicists search for new particles at the Large Hadron Collider.
Below, Senior Researcher at LAMBDA Denis Derkach discusses how the system helped HSE scientists who are part of the LHCb collaboration find a new particle, the double-charm baryon. He also describes some of the exciting work that lies ahead.
What Are New Particles?
One of the most successful theories in modern physics is the Standard Model of particle physics, which can be used to describe the majority of data obtained in experiments. Aside from particles, the model encompasses three fundamental forces of nature: the electromagnetic, strong, and weak interactions. The electromagnetic interaction keeps electrons inside an atom and atoms inside a molecule; its carrier particle is the photon. The strong interaction keeps protons and neutrons inside the atomic nucleus, and quarks inside protons, neutrons, and other particles. Its carriers are gluons, named after the word 'glue.' The weak interaction causes certain decays, such as that of a neutron into a proton, an electron, and an electron antineutrino; its carrier particles are the W and Z bosons.
Despite its many successes, the Standard Model is not the final description of the world around us. One reason is that it does not describe gravitational interaction. To complete the picture, it is necessary to search for discrepancies and inconsistencies in experiments conducted at high and ultra-high energies. This is one of the objectives of scientists working at the Large Hadron Collider: finding so-called 'new physics.'
How to Search for New Particles
There are currently two ways to search for new physics: direct and indirect. The direct method mainly relies on general-purpose experiments such as ATLAS and CMS, whose primary objective is to search for particles that do not fit into the Standard Model. Finding such particles would prove that the existing model needs to be extended.
The indirect method, which is applied in the LHCb experiment, measures the characteristics of various particles, such as their lifetimes and decay probabilities. These quantities are particularly sensitive to additional phenomena occurring against the background of core Standard Model processes. If a measured result differs significantly from the Standard Model prediction, it may point to a place where new physics can be looked for.
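The logic of the indirect method can be illustrated with a toy significance calculation. All numbers and names below are invented for illustration; they are not real LHCb measurements or predictions.

```python
# Toy illustration of an indirect search: compare a measured value of some
# observable (e.g. a decay probability) with its Standard Model prediction.
# All numbers here are made up for illustration.

def pull(measured, sm_prediction, sigma_meas, sigma_sm):
    """Deviation of the measurement from the SM prediction,
    in units of the combined uncertainty."""
    combined_sigma = (sigma_meas**2 + sigma_sm**2) ** 0.5
    return (measured - sm_prediction) / combined_sigma

# Hypothetical observable consistent with the SM (deviation of ~1 sigma):
print(round(pull(1.05, 1.00, 0.04, 0.03), 2))  # prints 1.0

# Hypothetical observable with a 5-sigma deviation, the kind of
# discrepancy that would justify a closer look:
print(round(pull(1.25, 1.00, 0.04, 0.03), 2))  # prints 5.0
```

In practice, deciding whether a deviation is significant involves far more careful statistical treatment, but the basic idea is the same: large, well-measured departures from Standard Model predictions mark where to search.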
‘Experiments at the LHC offer broad opportunities to use each of these methods to search for new physics,’ Mr Derkach explains. ‘As participants in the LHCb collaboration, we are going to develop a system of algorithms tailored above all to this experiment. We are focused on the indirect search method, which depends heavily on how precisely and efficiently we can determine exactly which type of particle left which type of track.’
According to Derkach, it is important to build a system of algorithms that can process different types of data arriving simultaneously from all parts of the detector. Examples include low-level ‘pixel’ data, high-level momentum characteristics of particles, and the multiplicity of decay vertices. An important advantage of the system is its processing speed, since the volume of data is expected to be massive. The same algorithms can also be used to detect anomalies in the detector’s operation. Because information is collected from different parts of the detector, it will be possible to assess how correctly each subsystem is working, which will allow the quality of the collected data to be evaluated even more effectively.
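The idea of merging heterogeneous detector information can be caricatured in a few lines: low-level and high-level inputs are flattened into a single feature vector and scored. Everything below (feature names, weights, threshold) is invented for illustration; a real LHCb pipeline would use trained machine-learning models, not hand-picked weights.

```python
# Sketch: combining low-level and high-level detector information into one
# feature vector for particle identification. All names and numbers here
# are hypothetical.

def build_features(pixel_hits, momentum, vertex_multiplicity):
    """Flatten heterogeneous inputs into a single feature vector."""
    return [
        sum(pixel_hits) / max(len(pixel_hits), 1),  # mean pixel response
        momentum,                                   # track momentum
        float(vertex_multiplicity),                 # decay-vertex count
    ]

def score(features, weights=(0.5, 0.02, 0.3), bias=-1.0):
    """Toy linear score; a real system would use a trained model."""
    return bias + sum(w * f for w, f in zip(weights, features))

def is_candidate(features, threshold=0.0):
    return score(features) > threshold

feats = build_features(pixel_hits=[1, 0, 1, 1], momentum=35.0,
                       vertex_multiplicity=2)
print(is_candidate(feats))  # prints True
```

The point of the sketch is only the structure: once disparate subdetector signals share one representation, the same algorithm can both classify particles and, by watching for unusual feature patterns, flag anomalies in a subsystem's behaviour.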
‘Particle physics now uses advanced methods in data science. The main problem, however, is that each experiment is unique, and the algorithms that we’ve developed are very much adapted for specific experiments. They are exclusive. In the future, we are of course hoping to get closer to developing a more or less unified system,’ Derkach adds.
How the Double-charm Baryon Was Found Twice
Existing theories predicted the particle that has been detected, and scientists even knew its approximate mass, but it took many years to be able to register the particle.
‘When protons collide at the LHC, many particles are produced, and there is a lot of noise that complicates the analysis of specific particles. Only modern machine-learning methods have made it possible for the new double-charm baryon to be discovered twice,’ Mr Derkach comments.
The data used for the discovery passed through a quality certification system developed by HSE staff. This complex system provides a web interface for tracking what is happening. Built into the experiment’s software, it receives data from a stream and then analyses the data for quality. If a flaw is detected (for example, if part of the detector has lit up), the corresponding stream needs to be shut off. This algorithm makes researchers’ work much easier and allows them to trust the data being collected. It is partly thanks to this system that the double-charm baryon was found.
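The workflow described above can be sketched as a stream-based quality check: each incoming chunk of detector data is scored, and chunks that fail are flagged for exclusion. The quality criterion here (fraction of saturated channels) and the data layout are invented for illustration; the real certification system is far more sophisticated.

```python
# Sketch of a stream-based data-quality check. Chunks whose fraction of
# saturated channels exceeds a limit are flagged so they can be excluded
# from analysis. The criterion and threshold are hypothetical.

SATURATION_LIMIT = 0.05  # hypothetical: flag if >5% of channels saturate

def chunk_is_good(channel_values, saturation_value=255):
    saturated = sum(1 for v in channel_values if v >= saturation_value)
    return saturated / len(channel_values) <= SATURATION_LIMIT

def certify(stream):
    """Return (good_chunks, flagged_chunk_indices) for one data stream."""
    good, flagged = [], []
    for i, chunk in enumerate(stream):
        if chunk_is_good(chunk):
            good.append(chunk)
        else:
            flagged.append(i)
    return good, flagged

stream = [
    [10, 20, 30, 40],     # healthy chunk
    [255, 255, 255, 12],  # a 'lit-up' detector region: should be flagged
    [15, 25, 35, 45],     # healthy chunk
]
good, flagged = certify(stream)
print(len(good), flagged)  # prints: 2 [1]
```

The value of such a system is exactly what Derkach describes: bad data is caught automatically as it arrives, so researchers can trust what remains without inspecting every stream by hand.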
Future Plans: Replace Humans with Machines
In particle physics, scientists always deal with large volumes of data. Until the widespread introduction of computers, researchers themselves processed the photographs of events (particle collisions) obtained by first-generation detectors. Since then, the simplest processing of detector photographs has been handed over to a computer that carries out billions of operations each second, yet it still performs essentially the same steps a researcher once did with a ruler and protractor. Scientists are now able to focus on higher-level tasks. The objective of HSE staff is to help the algorithms reach a new level of accuracy and free up researchers’ valuable time for building a physical picture of the world. In this sense, developers are gradually replacing scientists’ routine actions with computer intelligence.
Searching for rare decays of well-known particles is one of the main focuses of the LHCb experiment. The experience amassed at LAMBDA will allow such physics research to be conducted at the highest level. If such decays are found with good statistical significance, the measurements will allow scientists to place constraints on extensions of the Standard Model that would otherwise be unavailable. In addition, as a result of the project, LHCb will obtain an innovative particle identification system based on deep analysis of primary subdetector data. This identification system has been tried and tested in new analyses and could become a fundamental component of the LHCb software stack after the upgrade.