The High-Luminosity LHC (HL-LHC) will increase the volume of data taken by a factor of ten or more, exceeding by several factors, in both storage and computing capacity, what is expected to be achievable with a constant investment budget. Even accounting for the evolution of technology, this will leave a substantial shortage of computing resources.

In keeping with its mandate, CERN openlab is exploring new and innovative solutions to help physicists bridge this resource gap, which may otherwise affect the HL-LHC experimental programme.

Following a successful workshop on quantum computing held at CERN in 2018, CERN openlab has started a number of quantum-computing projects, which are at different stages of realisation.

 

Quantum graph neural networks

Project goal

The goal of this project is to explore the feasibility of using quantum algorithms to help track the particles produced by collisions in the LHC more efficiently. This is particularly important as the rate of collisions is set to increase dramatically in the coming years.

R&D topic
Quantum technologies
Project coordinator(s)
Sofia Vallecorsa
Team members
CERN openlab: Federico Carminati, Fabio Fracas │ METU: Cenk Tüysüz, Bilge Demirköz │ gluoNNet: Daniel Dobos, Kristiane Novotny │ Caltech: Jean-Roch Vlimant │ University of Oxford: Karolos Potamianos

Collaborators

Project background

The Large Hadron Collider (LHC) at CERN produces collisions at unprecedented collider energies. The hundreds of particles created in each collision are recorded by large detectors composed of several sub-detectors. At the centre of these detectors there is usually a tracker, which precisely records the signals left by charged particles as they pass through thin layers of active material. The trajectories of the particles are bent by a magnetic field, allowing their momenta to be measured. The number of tracks produced per bunch crossing is expected to increase tenfold after the high-luminosity upgrade of the LHC.

Classical algorithms for reconstructing the trajectories of charged particles rely on the Kalman-filter formalism; although quite accurate, they scale worse than quadratically with the number of tracks. Several approaches to mitigating the increase in computing needs are being explored, such as new detector layouts, deep learning and code parallelisation. Quantum computing has been shown to provide speed-ups for certain problems, and several R&D initiatives are exploring how quantum tracking algorithms could leverage such capabilities. We are developing a quantum-based track-finding algorithm that aims to reduce the combinatorial background during the initial seeding stage for the Kalman filters. For this work, we are using the publicly available data set designed for the recent Kaggle ‘TrackML’ challenge.

Recent progress

We have established a consortium of parties interested in addressing this challenge. Members of the following organisations are now working together on this project: the Middle East Technical University (METU) in Ankara, Turkey; the University of Oxford, England; the California Institute of Technology (Caltech) in Pasadena, US; and gluoNNet, a humanitarian-focused big-data analysis non-profit association based in Geneva, Switzerland. Quantum graph neural networks (QGNNs) can be implemented to represent quantum processes that have a graph structure. In the summer of 2019, we began work to develop a prototype QGNN algorithm for tracking the particles produced by collision events.
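At its core, the classical GNN baseline for this task scores candidate edges, i.e. pairs of hits on adjacent detector layers, and keeps only high-probability edges as track segments; the quantum variant replaces the edge network with a variational circuit. The following minimal numpy sketch is purely illustrative (toy hits, untrained random weights), showing only the doublet-building and edge-scoring structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hits: (layer, phi, r) on three detector layers.
hits = np.array([[0, 0.10, 1.0], [0, 1.50, 1.0],
                 [1, 0.12, 2.0], [1, 1.55, 2.0],
                 [2, 0.15, 3.0], [2, 1.58, 3.0]])

# Candidate edges: all hit pairs on adjacent layers (the seeding combinatorics).
edges = [(i, j) for i in range(len(hits)) for j in range(len(hits))
         if hits[j, 0] == hits[i, 0] + 1]

def edge_features(i, j):
    """Differences in phi and r between the two hits of a doublet."""
    return np.array([hits[j, 1] - hits[i, 1], hits[j, 2] - hits[i, 2]])

# A tiny MLP edge scorer (random weights here; in practice trained end-to-end).
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def score(i, j):
    h = np.tanh(W1 @ edge_features(i, j) + b1)
    z = (W2 @ h + b2).item()          # scalar logit
    return 1 / (1 + np.exp(-z))       # sigmoid -> edge probability

probs = {e: score(*e) for e in edges}
```

In the real pipeline, the edge network is trained end-to-end on TrackML data, and node features are aggregated across the graph between scoring passes.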

Next steps

Initial results are very encouraging. We have implemented a simplified GNN architecture and trained it to convergence, achieving results very similar to those obtained by its classical counterpart. Work is ongoing to better understand the performance and to optimise the training process.

 

 

Publications

    C. Tüysüz, F. Carminati, B. Demirköz, D. Dobos, F. Fracas, K. Novotny, K. Potamianos, S. Vallecorsa, J.-R. Vlimant, A Quantum Graph Neural Network Approach to Particle Track Reconstruction. arXiv:2007.06868 [cs.DC], 2020. cern.ch/go/6H88

Presentations

    F. Carminati, Particle Track Reconstruction with Quantum Algorithms (7 November). Presented at Conference on Computing in High Energy & Nuclear Physics, Adelaide, 2019. cern.ch/go/7Ddm
    D. Dobos, HEP Graph Machine Learning for Industrial & Humanitarian Applications (26 November). Presented at Conference on HEPTECH AIME19 AI & ML, Budapest, 2019. cern.ch/go/9Rvl
    C. Tüysüz, A Quantum Graph Neural Network Approach to Particle Track Reconstruction (22 January). Presented at the CERN openlab Technical Workshop, CERN, 2020. cern.ch/go/6TMH
    C. Tüysüz, A Quantum Graph Neural Network Approach to Particle Track Reconstruction (20 April). Presented at the 6th International Workshop “Connecting the dots”, Princeton University, 2020. cern.ch/go/9Tdt
    C. Tüysüz, Quantum Graph Neural Networks for Track Reconstruction in Particle Physics and Beyond (22 October). Presented at 4th Inter-experiment Machine Learning Workshop, 2020. cern.ch/go/6QQT
    C. Tüysüz, Performance of Particle Tracking using a Quantum Graph Neural Network (8 October). Presented at BAŞARIM 2020 Konferansı, 2020. cern.ch/go/kn9v
    K. S. Novotny, Quantum Track Reconstruction Algorithms for non-HEP applications (29 July). Presented at the 40th International Conference on High Energy Physics, Prague, 2020. cern.ch/go/8HFG
    K. S. Novotny, Exploring (Quantum) Track Reconstruction Algorithms for non-HEP applications (21 April). Presented at the 6th International Workshop “Connecting the dots”, Princeton University, 2020. cern.ch/go/bMM

Quantum support vector machines for Higgs boson classification

Project goal

This project is investigating the use of quantum support vector machines (QSVMs) for the classification of particle collision events that produce a certain type of decay for the Higgs boson. Specifically, such machines are being used to identify instances where a Higgs boson fluctuates for a very short time into a top quark and a top anti-quark, before decaying into two photons. Understanding this process — known by physicists as ttH production — is challenging, as it is rare: only 1% of Higgs bosons are produced in association with two top quarks and, in addition, the Higgs and the top quarks decay into other particles in many complex ways, or modes.

R&D topic
Quantum technologies
Project coordinator(s)
Sau-Lan Wu (University of Wisconsin), Ivano Tavernelli (IBM Zurich), Sofia Vallecorsa (CERN openlab)
Team members
University of Wisconsin: Chen Zhou, Shaujun San, Wen Guan │ IBM Zurich: Panagiotis Barkoutsos, Jennifer Glick │ CERN openlab: Federico Carminati

Collaborators

Project background

QSVMs are among the most promising machine-learning algorithms for quantum computers. Initial quantum implementations have already shown performance comparable to that of their classical counterparts. QSVMs are considered suitable algorithms for early adoption on noisy, near-term quantum-computing devices. Several initiatives are studying and optimising input-data representation and training strategies.

We are testing IBM’s QSVM algorithm within the ATLAS experiment. Today, identifying ttH-production events relies on classical support vector machines, as well as another machine-learning technique known as ‘boosted decision trees’. Classically, these methods are used to improve event selection and background rejection by analysing 47 high-level characteristic features.
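The key ingredient of a QSVM is a kernel built from the overlap of quantum feature-map states, K(x, y) = |⟨φ(x)|φ(y)⟩|². The following sketch is a simplified classical simulation of this idea, using a toy angle-encoded product-state feature map; it is not the 47-feature ATLAS analysis, nor IBM's actual feature-map circuits:

```python
import numpy as np

def feature_state(x):
    """Encode a feature vector as a product state of single-qubit rotations:
    each feature x_i maps to cos(x_i)|0> + sin(x_i)|1> on its own qubit."""
    state = np.array([1.0])
    for xi in x:
        qubit = np.array([np.cos(xi), np.sin(xi)])
        state = np.kron(state, qubit)
    return state

def quantum_kernel(X):
    """Kernel matrix K[a, b] = |<phi(x_a)|phi(x_b)>|^2."""
    states = np.array([feature_state(x) for x in X])
    overlaps = states @ states.T
    return overlaps ** 2

rng = np.random.default_rng(1)
X = rng.uniform(0, np.pi / 2, size=(6, 3))  # 6 toy events, 3 features each
K = quantum_kernel(X)
```

The resulting matrix K is symmetric and positive semi-definite, so it can be handed to any classical SVM trainer, for example scikit-learn's SVC(kernel='precomputed'); on real hardware the overlaps are estimated from measurement counts rather than computed exactly.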

Recent progress

We are working to compare the QSVM with the classical approach in terms of classification accuracy. We are also working to ascertain the level of resources needed to train the model (time to convergence and training data-set size), and studying how different types of noise affect the final performance. To do this, we are making use of IBM’s quantum simulator, with support from the company’s expert team. Preliminary results, obtained using the quantum simulator, show that the QSVM can achieve accuracy comparable to that of its classical counterpart, while being much faster. We are now simulating noise in different ways, in order to understand performance on real hardware.

Next steps

Testing the algorithm on real hardware is one of the primary challenges. At the same time, we continue to work on the optimisation of the QSVM accuracy and we are studying the robustness of the algorithm against noise.

 

 


Presentations

    W. Guan, Application on LHC High Energy Physics data analysis with IBM Quantum Computing (March). Presented at 19th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT), Saas-Fee, 2019. cern.ch/go/6DnG

Quantum optimisation for grid computing

Project goal

The goal of this project is to develop quantum algorithms to help optimise how data is distributed for storage in the Worldwide LHC Computing Grid (WLCG), which consists of 170 computing centres, spread across 42 countries. Initial work focuses on the specific case of the ALICE experiment. We are trying to determine the optimal storage, movement, and access patterns for the data produced by this experiment in quasi-real-time. This would improve resource allocation and usage, thus leading to increased efficiency in the broader data-handling workflow.

R&D topic
Quantum technologies
Project coordinator(s)
Sofia Vallecorsa
Team members
Politehnica University of Bucharest: Mircea-Marian Popa, Mihai Carabas, Popescu George Pantelimon │ Institut Polytechnique de Grenoble: Jacques Demongeot │ CERN openlab: Federico Carminati, Fabio Fracas │ CERN, ALICE: Costin Grigoras, Latchezar Betev

Collaborators

Project background

The WLCG has been essential to the success of the LHC’s scientific programme. It is used to store and analyse the data produced by the LHC experiments. Optimal usage of the grid’s resources is a major challenge: with the foreseen increase in the data produced by the LHC experiments, workflow optimisation — particularly for the data-placement strategy — becomes extremely important.

Simulating this complex and highly non-linear environment is very difficult; the complexity of the task goes beyond the capability of the computing hardware available today. Quantum computing could offer the possibility to address this. Our project, a collaboration with Google, the Polytechnic Institute of Grenoble and the Polytechnic University of Bucharest, will develop quantum algorithms to optimise the storage distribution.

Recent progress

In May, this project was awarded one year of funding under the European Union’s ATTRACT initiative. ATTRACT provides initial funding to 170 disruptive projects, each aiming to develop sensing and imaging technologies that will enable breakthrough innovations. This project, which has the full title ‘Quantum optimisation of Worldwide LHC Computing Grid data placement’, is one of 19 funded projects in which CERN is involved. One of the major challenges faced by this project is the difficulty of defining a suitable description of the data set extracted from MonALISA, the monitoring and scheduling tool used by the ALICE experiment for grid operations. We have now defined the problem in terms of reinforcement learning, one of the three paradigms of machine learning (alongside supervised and unsupervised learning). We have also started implementing the key components of the reinforcement-learning framework (the environment and agent networks) to be used.
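In highly simplified form, the data-placement problem can be phrased as a reinforcement-learning loop: an agent chooses a storage site for each incoming dataset and is rewarded for balanced placement. The sketch below uses tabular Q-learning on a toy three-site environment; the actual project uses deep neural networks for the environment and agent, trained against MonALISA data, none of which appear here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy environment: place each incoming dataset on one of 3 storage sites.
# State = index of the currently fullest site; action = site chosen next.
N_SITES, EPISODES, STEPS = 3, 200, 20
Q = np.zeros((N_SITES, N_SITES))  # Q[state, action]
alpha, gamma, eps = 0.2, 0.9, 0.1

for _ in range(EPISODES):
    load = np.zeros(N_SITES)
    state = 0
    for _ in range(STEPS):
        # epsilon-greedy action selection
        if rng.random() < eps:
            action = int(rng.integers(N_SITES))
        else:
            action = int(np.argmax(Q[state]))
        load[action] += 1.0
        reward = -np.std(load)        # reward balanced placement
        next_state = int(np.argmax(load))
        Q[state, action] += alpha * (reward + gamma * Q[next_state].max()
                                     - Q[state, action])
        state = next_state
```

In the project itself, the Q-table is replaced by agent and environment networks, and at a later stage those components would be re-expressed as variational quantum circuits.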

Next steps

Currently, the deep neural networks describing the environment and the agent behaviours are being implemented as classical networks. Our first goal is to show that this strategy can reproduce the expected MonALISA behaviour. At a later stage, we will implement these components as quantum circuits using Cirq and work on optimising the training process. Quantum computing and grid/cloud computing are game-changing technologies with the potential to have a very large impact in the future. Our results will provide an excellent initial prototype: the work could then be extended and integrated with other existing initiatives at the scale of the whole Worldwide LHC Computing Grid. Eventually, the benefits in terms of efficient network usage, reduced computing time, optimised storage, and therefore costs, would be significant.

 

 


Presentations

    F. Carminati, Quantum Optimization of Worldwide LHC Computing Grid data placement (7 November). Presented at Conference on Computing in High Energy & Nuclear Physics (CHEP), Adelaide, 2019. cern.ch/go/nP9M

Quantum machine learning for SuperSymmetry searches

Project goal

The goal of this project is to develop quantum machine-learning algorithms for the analysis of LHC collision data. The particular example chosen is the classification of SuperSymmetry signals from Standard-Model background.

R&D topic
Quantum technologies
Project coordinator(s)
Koji Terashi (University of Tokyo)
Team members
University of Tokyo: Michiru Kaneda, Tomoe Kishimoto, Masahiko Saito, Ryu Sawada, Junichi Tanaka │ CERN openlab: Federico Carminati, Sofia Vallecorsa, Fabio Fracas

Collaborators

Project background

The analysis of LHC data for the detection of effects beyond the Standard Model requires increasing levels of precision. Various machine-learning techniques are now part of the standard analysis toolbox for high-energy physics. Deep-learning algorithms are increasingly demonstrating their usefulness in various areas of analysis, thanks to their ability to explore much higher-dimensional feature spaces.

This seems to be an almost ideal area of application for quantum computing, which offers a potentially enormous parameter space and a correspondingly high level of computational parallelism. Moreover, the quasi-optimal Gibbs-sampling features of quantum computers may enhance the training of such deep-learning networks.

Recent progress

During this year, the Tokyo group has studied the performance of different quantum variational models for the classification of SuperSymmetry signals against the Standard-Model background, where the signal is h → χ̃±χ̃∓ → WW(→ lνlν) + χ̃₁⁰χ̃₁⁰ and the background comes from WW(→ lνlν). In particular, the efforts have focused on two approaches: quantum circuit learning (QCL) and quantum variational classifiers (QVC).

The SuperSymmetry data set in the UCI machine-learning repository has been used for this study. A QCL model with a set of seven variables and a depth of three has been implemented. The results of 5000 iterations with COBYLA (constrained optimisation by linear approximation) minimisation have been compared with classical deep neural networks and boosted decision trees, with promising results. An initial implementation of a QVC, with a depth of two and a set of three input variables, is also being studied.
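On a minimal scale, the QCL approach can be illustrated with a single-qubit variational classifier: the input feature is angle-encoded by a rotation Ry(x), a trainable rotation Ry(θ) follows, and the probability of measuring |1⟩ serves as the classification score. The sketch below simulates this directly in numpy; a coarse grid search stands in for the COBYLA minimisation used in the study, and the data set is a toy one rather than the UCI SuperSymmetry sample:

```python
import numpy as np

# Toy angle-encoded data: one feature per event, binary label.
rng = np.random.default_rng(3)
x = rng.uniform(0, np.pi, 40)
y = (x > np.pi / 2).astype(float)

def prob_one(theta, x):
    """P(|1>) after Ry(theta) Ry(x) |0>: the two rotations compose,
    so the outcome probability is sin^2((x + theta) / 2)."""
    return np.sin((x + theta) / 2) ** 2

def loss(theta):
    """Mean squared error between measured probabilities and labels."""
    return np.mean((prob_one(theta, x) - y) ** 2)

# The study used COBYLA; a grid search keeps this sketch dependency-free.
thetas = np.linspace(-np.pi, np.pi, 401)
best_theta = thetas[np.argmin([loss(t) for t in thetas])]
accuracy = ((prob_one(best_theta, x) > 0.5) == (y == 1)).mean()
```

The seven-variable, depth-three model of the actual study follows the same pattern, with entangling layers between the encoding and trainable rotations.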

Next steps

The current results are encouraging. The next steps of this work will be to test the results on real IBM Quantum Experience hardware. Initially, three variables will be used.

 

 

Possible projects in quantum computing

Project goal

We present here a number of interesting projects in quantum computing that we have discussed with different partners, but for which there is not yet a formal collaboration. The objective is to attract interest from the user community and to create a critical mass that could allow us to establish collaborations with users and vendors.

R&D topic
Quantum technologies
Project coordinator(s)
Sofia Vallecorsa
Team members
CERN openlab: Federico Carminati, Fabio Fracas │ EPFL: Su Yeon Chang

Collaborators

Project background

OpenQKD: Testbed for quantum key distribution

OpenQKD is a European Horizon 2020 project to test quantum key distribution (QKD) technology and to prepare a pan-European QKD deployment to protect European citizens and economies against the potential security threat posed by a quantum computer. The three-year project started in September 2019. With a budget of €18 million, 38 European partners are developing fibre-based and free-space QKD and deploying over 30 use cases at 16 sites across Europe. Among these, several use cases are planned in Geneva. One of them is the so-called Quantum Vault, which aims to protect digital assets against failures and attacks. As a proof of concept, the Quantum Vault is being realised in six Genevan data centres and telecom nodes. CERN openlab and the Poznan Supercomputing and Networking Center in Poland are supporting the Quantum Vault by hosting one node at the CERN Data Centre. It is planned to involve CERN openlab actively by running a dedicated use case that takes advantage of the Quantum Vault infrastructure.

 

Quantum generative adversarial networks

Generative adversarial networks (GANs) are among the most interesting models in classical machine learning. GANs are an example of generative models, i.e. models that learn a hidden distribution from the training data set and can then sample new synthetic data. At CERN openlab, we have been investigating their use as an alternative to Monte Carlo simulation, obtaining remarkable results. Much faster than standard Monte Carlo algorithms, GANs can generate realistic synthetic data, while retaining a high level of accuracy (see our fast-simulation project). Quantum GANs could have more representational power than classical GANs, making them better able to learn more complex distributions from smaller training data sets. We are now training a quantum GAN to generate images of a few pixels, and we are investigating two possible approaches: a hybrid scheme with a quantum generator learning the target PDF, using either a classical network or a variational quantum circuit as the discriminator (variational quantum generator), as well as a fully quantum adversarial implementation (quGAN).
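The idea of a variational quantum generator can be illustrated on a tiny scale: a parameterised circuit defines a probability distribution over measurement outcomes (the 'pixels'), and its parameters are tuned until that distribution matches a target PDF. The sketch below uses a two-qubit product-state generator and a grid search in place of adversarial training; all numbers are toy assumptions:

```python
import numpy as np

# Target PDF over 4 "pixel" outcomes (normalised).
target = np.array([0.1, 0.2, 0.3, 0.4])

def generator_probs(theta):
    """Two-qubit product-state generator: outcome probabilities
    from two independent Ry rotation angles."""
    p0 = np.array([np.cos(theta[0] / 2) ** 2, np.sin(theta[0] / 2) ** 2])
    p1 = np.array([np.cos(theta[1] / 2) ** 2, np.sin(theta[1] / 2) ** 2])
    return np.kron(p0, p1)

def kl(p, q):
    """KL divergence KL(p || q), used here as the training objective."""
    return float(np.sum(p * np.log(p / np.clip(q, 1e-12, None))))

# Coarse grid search over the two angles (stand-in for adversarial training).
grid = np.linspace(0.01, np.pi - 0.01, 60)
best = min((kl(target, generator_probs([a, b])), a, b)
           for a in grid for b in grid)
probs = generator_probs([best[1], best[2]])
```

A product state cannot represent every correlated target exactly; entangling gates in the generator circuit, and a discriminator in place of the explicit KL objective, are what the actual quGAN work adds on top of this picture.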

 

Track seeding optimisation

The Kalman filter is widely used in high-energy physics for track fitting of particle trajectories. It runs after an initial pattern-recognition step where detector ‘hits’ are clustered into subsets belonging to the same particle. Currently, several pattern-recognition approaches exist. Quantum computing can be used to reduce the initial combinatorial search using the ‘quantum approximate optimisation algorithm’ developed by researchers at the Massachusetts Institute of Technology (MIT) in Cambridge, US. We are now studying the application of what is known as the ‘variational-quantum-eigensolver algorithm’ and implementing it on Intel’s quantum simulator.
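For reference, the Kalman filter underlying the fitting stage alternates a prediction step (propagating the track state to the next detector layer) with an update step (folding in the measured hit). A minimal numpy sketch for a straight-line toy track, with all parameters being illustrative assumptions:

```python
import numpy as np

# Toy track: straight line y = a + b*z crossing layers at z = 1..5,
# with Gaussian-smeared hit positions.
rng = np.random.default_rng(4)
a_true, b_true, noise = 0.5, 0.8, 0.05
zs = np.arange(1.0, 6.0)
hits = a_true + b_true * zs + rng.normal(0, noise, zs.size)

x = np.zeros(2)                  # state: [position at current z, slope]
P = np.eye(2) * 10.0             # large initial uncertainty
R = noise ** 2                   # measurement variance
H = np.array([[1.0, 0.0]])       # we measure position only

z_prev = 0.0
for z, y in zip(zs, hits):
    dz = z - z_prev
    F = np.array([[1.0, dz], [0.0, 1.0]])   # move position by slope*dz
    x, P = F @ x, F @ P @ F.T               # predict
    S = H @ P @ H.T + R                     # innovation variance
    K = P @ H.T / S                         # Kalman gain
    x = x + (K * (y - H @ x)).ravel()       # update state with the hit
    P = P - K @ H @ P                       # update covariance
    z_prev = z

fitted_slope = x[1]
```

Real track fitting adds magnetic-field curvature and material effects to the propagation; the seeding work described above targets the combinatorial choice of which hits to feed this filter in the first place.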

 

Quantum homomorphic encryption

The latest advances in machine learning and data analytics offer great potential for gaining new insights from medical data. However, data privacy is of paramount concern. Anonymisation via the removal of personal information is not an option, since medical records carry information that may allow the identification of the owner far more reliably than a name or a date of birth. One possibility being studied is to encrypt sensitive information in such a way that data analytics remains possible without decryption. This is called homomorphic encryption. It is important to find an encryption strategy that is secure, while also ensuring that a large family of analytic algorithms can still be applied to the data. While such encryption algorithms do exist, they require high-quality random numbers and tend to be very demanding in terms of computing resources. This makes it a promising field of investigation for the use of quantum computing.
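Additively homomorphic encryption can be illustrated with the textbook Paillier cryptosystem, in which multiplying two ciphertexts yields an encryption of the sum of the plaintexts. The sketch below uses deliberately small demo primes and is in no way a secure, production, or quantum implementation:

```python
import math
import random

# Textbook Paillier cryptosystem with toy parameters (NOT secure key sizes).
p, q = 104729, 104723              # small demo primes
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1                          # standard simple choice of generator
mu = pow(lam, -1, n)               # modular inverse of lambda mod n

def encrypt(m):
    r = random.randrange(1, n)     # fresh randomness per encryption
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n, then multiply by mu mod n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic addition: the product of ciphertexts decrypts to the sum.
c1, c2 = encrypt(42), encrypt(58)
total = decrypt((c1 * c2) % n2)    # recovers 42 + 58 without decrypting either
```

Schemes of this additive kind support only limited analytics; fully homomorphic and quantum homomorphic schemes, as targeted by this project, aim to support richer machine-learning workloads on encrypted records.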

The aims of the project are multiple: to transfer anonymised medical records protected by quantum keys, to develop a quantum homomorphic encryption (QHE) algorithm to apply to them, and to analyse the data with QHE-friendly analysis tools (techniques based on machine learning or deep learning). The main project consists of four different parts, each realised in collaboration with different partners, both European and Korean: ID Quantique, Innosuisse (the Swiss Innovation Agency), the Korea Institute of Science and Technology Information (KISTI), and the Seoul National University Bundang Hospital (SNUBH).

 

Quantum random number generator

We have recently established a collaboration with Cambridge Quantum Computing to test the performance of a new quantum random-number generator and to study its integration within the simulation software used in high-energy physics.

 

RandomPower: evaluating the impact of a low-cost, robust ‘true random number generator’

Researchers at the University of Insubria in Italy have invented a true random number generator (TRNG). It is based on local analysis of the time series of endogenous self-amplified pulses in a specific silicon device. The principle has been validated with lab equipment, and a low-cost, small-form-factor board has been developed and commissioned with the support of an ATTRACT project. The board can deliver a stream of unpredictable bits at frequencies of currently up to 1 Mbps per generator, with the possibility of scaling up. Randomness has been qualified using a test suite from the US National Institute of Standards and Technology, as well as further tests beyond this. Together with CERN openlab, the University of Insubria intends to evaluate the impact of the TRNG’s availability in the following use cases:

  • Modifying the Linux OS to replace its embedded random-number generation with the Random Power stream, in order to facilitate adoption.
  • Comparing the outcomes of generative adversarial networks trained using pseudo-random number generators (PRNGs) versus the Random Power TRNG.
  • Identifying classes of Monte Carlo simulations in high-energy physics where the use of PRNGs could be particularly critical.

Moreover, the availability of a low-cost platform for high-quality random numbers may open up new possibilities in the use of homomorphic encryption, relevant for privacy-preserving data analysis; this will be thoroughly evaluated.
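The simplest member of the NIST statistical suite mentioned above is the frequency (monobit) test, which checks whether ones and zeros are equally likely in a bit stream. A small illustrative sketch, applied to a PRNG stream and to a deliberately biased one:

```python
import math
import random

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test: p-value for the hypothesis
    that ones and zeros are equally likely in the bit stream."""
    n = len(bits)
    s = abs(sum(1 if b else -1 for b in bits))  # excess of ones over zeros
    return math.erfc(s / math.sqrt(2 * n))

random.seed(0)
good = [random.getrandbits(1) for _ in range(10000)]   # PRNG stream
biased = [1] * 6000 + [0] * 4000                       # clearly biased stream

p_good = monobit_pvalue(good)       # should be well above 0.01
p_biased = monobit_pvalue(biased)   # vanishingly small
```

The full NIST suite applies many such tests (runs, block frequency, spectral and others); a stream passing all of them, whether from a PRNG or the Random Power TRNG, gives the statistical quality the use cases above depend on.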

Recent progress

-

Next steps

In 2020, we will continue to assess the merits of each of these lines of investigation. We will also continue our discussions with a range of companies and research institutes to identify areas for mutually beneficial collaboration. Where appropriate, we will work to formalise the investigations into full stand-alone projects.

 

 

More information:

Der große Traum vom Superhirn (in German; ‘The great dream of the super-brain’) in Süddeutsche Zeitung

The past, present and future of computing in high-energy physics in Physics World

Wisconsin Quantum Institute Awarded Grant to Advance Quantum Computing Machine Learning in HPCwire

CERN, IBM Collaborate on Quantum Computing in IBM Research Blog

Inside the High-Stakes Race to Make Quantum Computers Work in Wired

Exploring quantum computing for high-energy physics in CERN Courier

Quantum thinking required in CERN Courier