Industrial control and monitoring
This project aims to make the control systems used for the LHC both more efficient and ‘smarter’. To achieve this, we are working to enhance the functionality of WinCC OA (a SCADA tool used widely at CERN) and to apply data-analytics techniques to the recorded monitoring data, in order to detect anomalies and systematic issues that may affect system operation and maintenance.
The HL-LHC project aims to increase the integrated luminosity — and hence the rate of particle collisions — by a factor of ten beyond the LHC’s design value. Monitoring and control systems will therefore become increasingly complex, with unprecedented data throughputs. Consequently, it is vital to further improve the performance of these systems, and to make use of data-analytics algorithms to detect anomalies and to anticipate future behaviour. This project aims to achieve these objectives through three closely related areas of work:
- Developing a modular and future-proof archiving system (NextGen Archiver) that supports different SQL and NoSQL technologies to enable data analytics. It is important that this can be scaled up to meet our requirements beyond 2020.
- Developing a data-analytics platform that combines the benefits of cloud and edge computing.
- Developing a reporting system to feed the results of such data analysis directly into the operators’ control consoles.
By applying data-analytics techniques in this manner, our goal is to improve operation and diagnostics, with preventive maintenance leading to greater efficiency and reduced costs.
We developed a first prototype for the front end of the NextGen Archiver, and successfully carried out a series of functional and performance tests. We are also currently developing and testing back-end modules for Apache Kudu (a column-oriented data storage system) and Oracle databases.
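The value of keeping the archiver modular is that back ends such as Kudu and Oracle can be developed and tested independently behind one common interface. The sketch below illustrates that plugin pattern in Python; the class and method names are hypothetical and an in-memory store stands in for a real database module.

```python
from abc import ABC, abstractmethod

class ArchiverBackend(ABC):
    """Hypothetical interface a pluggable archiver back-end might expose."""
    @abstractmethod
    def write(self, signal, timestamp, value):
        ...
    @abstractmethod
    def query(self, signal, start, end):
        ...

class InMemoryBackend(ArchiverBackend):
    """Stand-in for a real Kudu or Oracle module, for illustration only."""
    def __init__(self):
        self._store = {}

    def write(self, signal, timestamp, value):
        # Append a (timestamp, value) sample for the given signal.
        self._store.setdefault(signal, []).append((timestamp, value))

    def query(self, signal, start, end):
        # Return all samples of the signal within [start, end].
        return [(t, v) for t, v in self._store.get(signal, []) if start <= t <= end]

backend = InMemoryBackend()
backend.write("cryo.temp", 1.0, 4.2)
backend.write("cryo.temp", 2.0, 4.3)
print(backend.query("cryo.temp", 0.0, 1.5))  # [(1.0, 4.2)]
```

Because the front end talks only to the abstract interface, swapping a column-oriented store for a relational one is a configuration choice rather than a code change.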
A number of machine-learning algorithms were developed to detect faulty measurements in the cryogenics systems, as well as in other cooling and ventilation systems. We also implemented the control model’s estimation of what is known as ‘the electron-cloud effect’ (a phenomenon that occurs in particle accelerators and reduces the quality of the particle beam) as a job distributed over multiple computing nodes, using Apache Spark, a cluster-computing framework. This reduced the computational time required by a factor of 100.
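The actual algorithms are model-based, but the underlying idea of flagging faulty measurements can be illustrated with a much simpler technique: comparing each new reading against recent history. The example below is a minimal rolling z-score detector, not the project’s method; the window and threshold values are illustrative.

```python
import statistics

def flag_faulty(readings, window=20, threshold=4.0):
    """Flag indices whose reading deviates by more than `threshold`
    standard deviations from the preceding `window` samples."""
    flagged = []
    for i in range(window, len(readings)):
        ref = readings[i - window:i]
        mu = statistics.fmean(ref)
        sigma = statistics.stdev(ref)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Synthetic sensor trace with one injected spike at index 30.
data = [4.2 + 0.01 * (i % 3) for i in range(50)]
data[30] = 7.5
print(flag_faulty(data))  # [30]
```

A production detector would, among other things, use a model of the physical process rather than raw statistics, which is what makes the machine-learning approach more robust against slow drifts.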
In 2017, two of Siemens’s cloud-based solutions for big-data analytics, ELVis and WatchCAT, were integrated to create a single platform combining cloud and edge computing (using ‘Internet of things’ (IoT) devices). A domain-specific language (DSL) was implemented to develop user scripts that interact with the data.
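To give a flavour of what such a DSL does, the sketch below implements a toy rule language in Python: a rule fires when a signal satisfies a comparison a given number of consecutive times. The syntax and semantics here are invented for illustration and are not those of the actual WatchCAT DSL.

```python
import operator

# Supported comparison operators for the toy rule syntax.
OPS = {">": operator.gt, "<": operator.lt, ">=": operator.ge, "<=": operator.le}

def parse_rule(text):
    """Parse a rule of the form '<signal> <op> <value> for <n>':
    fire when `signal` satisfies the comparison `n` consecutive times."""
    signal, op, value, _for, n = text.split()
    return signal, OPS[op], float(value), int(n)

def evaluate(rule, samples):
    """Scan (time, signal, reading) samples and yield firing times."""
    signal, op, value, n = rule
    streak = 0
    for t, name, reading in samples:
        if name != signal:
            continue
        streak = streak + 1 if op(reading, value) else 0
        if streak >= n:
            yield t

rule = parse_rule("pump.vibration > 0.8 for 3")
samples = [(t, "pump.vibration", v)
           for t, v in enumerate([0.5, 0.9, 0.95, 0.97, 0.6, 0.85])]
print(list(evaluate(rule, samples)))  # [3]
```

In the real platform, rules like this are compiled and pushed to analytics processes running on the IoT devices, so that event detection happens at the edge rather than in the cloud.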
Finally, a prototype reporting system based on Apache Impala (a massively parallel processing SQL query engine) was developed so that data-analytics results stored in the CERN Hadoop cluster could be injected into WinCC OA applications running on the operators’ control consoles.
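The reporting flow amounts to querying analytics results from a SQL store and reshaping them for display on a console. The sketch below uses Python’s built-in sqlite3 as a stand-in for Impala (which is likewise reached through SQL via a driver in practice); the table and column names are hypothetical.

```python
import sqlite3

# In-memory database standing in for the Hadoop/Impala analytics store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE anomaly_report (signal TEXT, ts REAL, score REAL)")
conn.executemany(
    "INSERT INTO anomaly_report VALUES (?, ?, ?)",
    [("cryo.valve.17", 1.0, 0.92),
     ("cryo.valve.17", 2.0, 0.31),
     ("vent.fan.03", 1.5, 0.88)])

def latest_alerts(conn, min_score=0.8):
    """Return the rows an operator console would display, highest score first."""
    cur = conn.execute(
        "SELECT signal, ts, score FROM anomaly_report "
        "WHERE score >= ? ORDER BY score DESC", (min_score,))
    return cur.fetchall()

print(latest_alerts(conn))
# [('cryo.valve.17', 1.0, 0.92), ('vent.fan.03', 1.5, 0.88)]
```

The prototype’s contribution is the last mile: injecting rows like these into WinCC OA applications so that operators see analytics results alongside live process data.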
Although these technologies are still at an early stage, their deployment at CERN has already enhanced control systems and led to significant process optimisation. We have also been able to use them to extend the operational life of some devices.
In the first quarter of 2018, we will carry out a large-scale testing campaign to help optimise the performance of the NextGen Archiver modules. These are due to be officially released toward the end of 2018.
Regarding the data-analytics framework, the tighter integration between ELVis and WatchCAT will have a number of benefits. Through a single user interface, it will be possible to define complex event-processing rules, configure the WatchCAT infrastructure (i.e. push the rules to the analytics processes running on the IoT devices), and monitor the execution of the analyses.
A new version of the DSL will also be implemented to integrate event and signal processing, as well as to extend its semantics to allow for more powerful and flexible rules. On top of this, we will develop new algorithms to cover other CERN use cases, thus extending our current portfolio.
- F. Tilaro, M. Gonzalez, B. Bradu, M. Roshchin, An expert knowledge based methodology for online detection of signal oscillations (26 June), Presented at International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA 2017), Annecy, 2017. cern.ch/go/r7MS
- U. Puri, Simplified frontend for data generation and testing purposes in WinCC OA NextGen Archiver project (11 August), Presented at CERN openlab summer students’ lightning talks, Geneva, 2017. cern.ch/go/7DDN
- L. M. Sainio, Web reporting framework for control data analysis (11 August), Presented at CERN openlab summer students’ lightning talks, Geneva, 2017. cern.ch/go/fWz7
- B. Schofield, F. Varela Rodriguez, F. M. Tilaro, J. Guzik, P. Golonka, P. J. Seweryn, Siemens Industrial Control and Monitoring (21 September), Presented at CERN openlab Open Day, Geneva, 2017. cern.ch/go/8Lfv
- B. Bradu, E. Blanco, F. Tilaro, R. Marti, Automatic PID Performance Monitoring Applied to LHC Cryogenics (8 October), Presented at International Conference on Accelerator and Large Experimental Control Systems (ICALEPCS 2017), Barcelona, 2017. cern.ch/go/9cZx
- P. Golonka, M. Gonzalez, J. Guzik, R. Kulaga, Future Archiver for CERN SCADA Systems (8 October), Presented at International Conference on Accelerator and Large Experimental Control Systems (ICALEPCS 2017), Barcelona, 2017. cern.ch/go/jG8C
- P. J. Seweryn, M. Gonzalez-Berges, J. B. Schofield, F. M. Tilaro, Data Analytics Reporting Tool for CERN SCADA Systems (8 October), Presented at International Conference on Accelerator and Large Experimental Control Systems (ICALEPCS 2017), Barcelona, 2017. cern.ch/go/76pR
- F. Tilaro, B. Bradu, F. Varela, M. Roshchin, Model Learning Algorithms for Faulty Sensors Detection in CERN Control Systems (8 October), Presented at International Conference on Accelerator and Large Experimental Control Systems (ICALEPCS 2017), Barcelona, 2017. cern.ch/go/8QVF