Artificial Intelligence

Deep learning and beyond

The High-Luminosity LHC (HL-LHC), set to come online in 2029, will require roughly ten times the computing capacity available at CERN today. Data-storage needs will likewise exceed, by several factors, what a constant investment budget can deliver. Even accounting for the expected evolution of technology, there will be a substantial shortage of IT resources. CERN openlab is therefore exploring new and innovative solutions to help physicists bridge this resource gap, which might otherwise affect the HL-LHC experimental programme.

Members of CERN’s research community devote significant effort to extracting the most value from the data produced by the LHC experiments. They seek to maximise the potential for discovery and employ new techniques to help ensure that nothing is missed. At the same time, it is important to optimise the use of resources (tape, disk, and CPU) in both the online and offline environments.

Modern machine-learning technologies, in particular deep-learning solutions, offer a promising research path to achieving these goals. Deep-learning techniques give the LHC experiments the potential to improve performance in each of the following areas: particle detection, identification of interesting events, modelling of detector response in simulations, monitoring of experimental apparatus during data taking, and management of computing resources.
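
As a rough illustration of just one of these areas, the sketch below trains a small fully connected network to separate "interesting" (signal-like) events from background using a handful of reconstructed features. Everything here is an assumption for the sake of illustration: the choice of framework (PyTorch), the feature interpretation, the network size, and the synthetic data. It is not any LHC experiment's actual selection pipeline.

```python
# Minimal, hypothetical sketch: a deep-learning event classifier.
# Feature names and network size are illustrative assumptions only.
import torch
import torch.nn as nn

class EventClassifier(nn.Module):
    def __init__(self, n_features: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 32),
            nn.ReLU(),
            nn.Linear(32, 1),  # one logit: signal vs. background
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Toy data standing in for reconstructed event features,
# e.g. (missing ET, leading-jet pT, dijet mass, lepton pT).
X = torch.randn(1024, 4)
y = (X[:, 0] + 0.5 * X[:, 2] > 0).float().unsqueeze(1)  # synthetic labels

model = EventClassifier()
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(f"final training loss: {loss.item():.3f}")
```

In practice, the experiments train far larger networks on carefully validated simulated collisions, but the basic structure is the same: reconstructed features go in, and a score indicating how signal-like the event is comes out.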