CERN openlab began its sixth three-year phase at the start of 2018. Information about each of our projects — organised into four R&D topics — can be found below.
In preparation for our new phase, CERN openlab recently published a white paper on future ICT challenges in scientific research. This identified 16 key challenge areas divided into four overarching R&D topics.
R&D Topic 1: Data-centre technologies and infrastructures
Designing and operating distributed data infrastructures and computing centres poses challenges in areas such as networking, architecture, storage, databases, and cloud. These challenges are amplified, and new ones arise, when operating at the extremely large scales required by major scientific endeavours. CERN is evaluating different models for increasing computing and data-storage capacity to accommodate the growing needs of the LHC experiments over the next decade. Each model presents different technological challenges. In addition to increasing the capacity of the systems used for traditional types of data processing and storage, a number of alternative architectures and specialised capabilities are being explored. These will add heterogeneity and flexibility to the data centres, and should enable advances in resource optimisation.
R&D Topic 2: Computing performance and software
Modernising code plays a vital role in preparing for future upgrades to the LHC and the experiments. It is essential that software performance is continually improved through modern coding techniques and tools, such as software-optimising compilers. It is also important to ensure that software fully exploits the features offered by modern hardware architectures, such as many-core GPU platforms, acceleration coprocessors, and innovative hybrid combinations of CPUs and FPGAs. At the same time, it is of paramount importance that physics performance is not compromised in the drive for maximum computing efficiency.
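As a generic illustration of this kind of code modernisation (a sketch, not code from CERN or the experiments), the example below contrasts a slow Python-level loop with a vectorised NumPy equivalent that hands the work to optimised, SIMD-friendly kernels. The transverse-momentum-style calculation and the function names are illustrative assumptions.

```python
import numpy as np

def pt_loop(px, py):
    """Legacy style: iterate over particles one at a time in Python."""
    out = []
    for x, y in zip(px, py):
        out.append((x * x + y * y) ** 0.5)
    return out

def pt_vectorised(px, py):
    """Modernised style: one array operation, dispatched to compiled code."""
    return np.hypot(px, py)

px = np.array([3.0, 5.0, 8.0])
py = np.array([4.0, 12.0, 15.0])

# Both forms compute the same quantity; the vectorised one scales far
# better to the millions of particles per dataset seen in practice.
assert np.allclose(pt_loop(px, py), pt_vectorised(px, py))
```

The same principle, replacing interpreted per-element work with batched operations that compilers and libraries can optimise, applies whether the target is a CPU's vector units, a GPU, or an FPGA-based accelerator.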
R&D Topic 3: Machine learning and data analytics
Members of CERN’s research community devote significant effort to understanding how to extract the most value from the data produced by the LHC experiments. They seek to maximise the potential for discovery and employ new techniques to help ensure that nothing is missed. At the same time, it is important to optimise the usage of resources (tape, disk, and CPU) in both the online and offline environments. Modern machine-learning technologies, in particular deep-learning solutions applied to raw data, offer a promising research path towards these goals. Deep-learning techniques offer the LHC experiments the potential to improve performance in each of the following areas: particle detection, identification of interesting events, modelling detector response in simulations, monitoring experimental apparatus during data taking, and managing computing resources.
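To make the event-classification idea concrete, the minimal sketch below trains a simple classifier to separate synthetic “signal” from “background” events. It is a stand-in under stated assumptions: the two features, the linear class boundary, and the logistic-regression model are all illustrative simplifications of the much larger deep-learning models applied to real detector data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "events": two features per event (stand-ins for kinematic
# quantities); the true boundary is linear, so a single-layer model suffices.
n = 1000
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # 1 = "signal", 0 = "background"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic-regression classifier trained by batch gradient descent.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(200):
    p = sigmoid(X @ w + b)          # predicted signal probability per event
    w -= lr * (X.T @ (p - y)) / n   # gradient of the cross-entropy loss
    b -= lr * np.mean(p - y)

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

In practice the experiments face far harder versions of this problem: high-dimensional raw detector data, heavily imbalanced classes, and strict latency budgets in the online environment, which is what motivates the deep-learning research described above.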
R&D Topic 4: Applications in other disciplines
By working with communities beyond high-energy physics, we are able to ensure that CERN openlab’s work remains relevant, and to learn from and share tools and best practices across scientific fields. Today, more and more research fields are driven by large quantities of data, and thus face ICT challenges comparable to those at CERN. CERN openlab’s mission rests on three pillars: technological investigation, education, and dissemination. Collaborating with research communities and laboratories outside high-energy physics brings all three together. Challenges related to the life sciences, medicine, astrophysics, and urban/environmental planning are all covered in this section, as are scientific platforms designed to foster open collaboration.
- Infrastructure monitoring and automation of resource deployment
- Extreme Flow Optimizer
- Oracle WebLogic on Kubernetes
- Oracle Management Cloud
- Code modernisation: fast simulation
- High-performance cloud caching technologies
- Quantum computing for high-energy physics
- Smart platforms for science