Probing the nature of the universe with machine learning at the ATLAS experiment
Steven Schramm - University of Geneva
Monday 02/12/2019, 11:00-12:00
Building 141, André Berthelot room, CEA Paris-Saclay

The Large Hadron Collider is the most powerful particle accelerator ever built, allowing us to study the conditions that existed at the beginning of the universe with unprecedented precision. This accelerator has already discovered the highly anticipated Higgs boson, providing an explanation for the origin of mass, and has now turned its sights to the search for new physics and to measurements of the fundamental properties of the universe.

In order to search for new physics, or to measure the properties of rare particles, it is necessary to sift through an enormous dataset. Processing up to 40 million collisions per second delivered by the Large Hadron Collider, the ATLAS experiment currently records approximately ten petabytes of new data each year. On top of this large dataset, the experiment creates advanced simulations of numerous known and predicted physical processes, resulting in even larger simulated datasets.
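The quoted numbers can be checked with a short back-of-the-envelope sketch. The 40 MHz collision rate and the ~10 PB/year figure come from the text above; the ~1 kHz recorded-event rate, ~1.5 MB average event size, and ~7 million seconds of data-taking per year are typical assumptions, not values stated in the abstract.

```python
# Back-of-the-envelope check of the data volumes quoted above.
# Values marked "assumed" are typical orders of magnitude, not
# figures taken from the abstract itself.

COLLISION_RATE_HZ = 40e6       # LHC bunch-crossing rate (from the text)
RECORDED_RATE_HZ = 1e3         # assumed post-trigger recording rate
EVENT_SIZE_BYTES = 1.5e6       # assumed average raw event size
LIVE_SECONDS_PER_YEAR = 7e6    # assumed data-taking time per year

# The trigger system must reject all but a tiny fraction of collisions.
rejection = COLLISION_RATE_HZ / RECORDED_RATE_HZ

# Recorded rate x event size x live time gives the yearly data volume.
petabytes_per_year = (RECORDED_RATE_HZ * EVENT_SIZE_BYTES
                      * LIVE_SECONDS_PER_YEAR) / 1e15

print(f"Trigger keeps roughly 1 in {rejection:,.0f} collisions")
print(f"Estimated raw data volume: ~{petabytes_per_year:.1f} PB/year")
```

Under these assumptions the estimate lands near the ten petabytes per year mentioned above, which illustrates why aggressive online event selection and efficient offline analysis are both essential.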

Analyzing all of this data is a massive task, and a natural place to exploit the latest advances in machine learning. The use of such techniques within ATLAS is growing rapidly, and machine learning has found a home in many corners of the experiment. We will examine some of the ways in which machine learning techniques are already driving the sensitivity to rare physical processes, and discuss some of the more recent developments that are expected to become critical in the years ahead.


Contact: Fabrice BALLI