Control systems are crucial to the running of CERN facilities. These include accelerators (such as the Large Hadron Collider), experiments (such as ATLAS, CMS, ALICE and LHCb) and technical infrastructure (such as the electrical network and the cooling and ventilation systems). The long-term collaboration between CERN and Siemens has played an important role in the successful running of all these facilities. Within CERN openlab, the Automation and Controls Competence Centre (ACCC) team is looking at the next generation of control systems for large installations. In 2013, the ACCC team focused especially on data analytics, while continuing its other ongoing projects (such as data archiving, large-scale deployment, and industrial security).
The Large Hadron Collider (LHC) has one of the largest and most complex industrial control systems ever built, and the volume of data generated is growing year after year. Analysing this data is vital for better understanding the entire control system, and for developing data-driven strategies that can improve its efficiency, functionality and predictability. In order to achieve this, innovative software solutions need to be developed to overcome the challenges surrounding the capturing, storing and processing of huge amounts of data within a tolerable period of time.
The main goal of the CERN openlab collaboration between Siemens and CERN is to design and implement a software framework that can be used as a common solution to match the different data analytics requirements of the various control subsystems (cryogenics, gas, vacuum, machine protection, etc.). CERN groups could potentially use this software framework to perform custom analysis based on the knowledge of the system experts. The research activities include both online analysis (analysis of current values retrieved directly from the control system under analysis) and offline analysis (analysis of the historical data stored in several repositories). An internal software tool provided by Siemens Corporate Technology has been used to perform root-cause analysis. This has enabled the ACCC team to identify the causes of distinct control faults and issues through the analysis of large volumes of data (mainly control alarms and machine logs) that otherwise could not be analysed. Emphasis was placed on the gas systems of the LHC experiments, in order to understand the failure modes of each of their components.
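As a simple illustration of the offline side, the sketch below counts which alarms most often precede a given fault within a fixed time window, a rudimentary form of the root-cause analysis described above. The log records, alarm identifiers, and time window are invented for the example; the actual Siemens tool and the CERN log schemas are far richer.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical alarm-log records (timestamp, alarm_id). The identifiers
# below are illustrative, not the actual CERN alarm naming scheme.
log = [
    (datetime(2013, 5, 1, 10, 0, 0), "PUMP_PRESSURE_LOW"),
    (datetime(2013, 5, 1, 10, 0, 40), "FLOW_RATE_DROP"),
    (datetime(2013, 5, 1, 10, 1, 0), "GAS_SUPPLY_FAULT"),
    (datetime(2013, 5, 2, 14, 30, 0), "PUMP_PRESSURE_LOW"),
    (datetime(2013, 5, 2, 14, 30, 50), "GAS_SUPPLY_FAULT"),
]

def precursor_counts(log, fault_id, window):
    """Count alarms occurring within `window` before each `fault_id` event."""
    counts = Counter()
    fault_times = [t for t, a in log if a == fault_id]
    for ft in fault_times:
        for t, a in log:
            if a != fault_id and ft - window <= t < ft:
                counts[a] += 1
    return counts

counts = precursor_counts(log, "GAS_SUPPLY_FAULT", timedelta(minutes=2))
print(counts.most_common(1))  # → [('PUMP_PRESSURE_LOW', 2)]
```

Even this toy version surfaces the recurring precursor; the real analysis applies similar correlation logic at the scale of millions of alarm records.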
In addition, another Siemens data analytics framework has been successfully integrated with the current CERN control systems; this was achieved through OPC-based intercommunication (see OPC Foundation) with the SCADA system WinCC Open Architecture (OA). Finally, several research activities have been started to investigate and develop new data-mining methodologies and predictive models that can be used to extract valuable information from the huge amount of control data stored in the CERN repositories. Although the project is still at a relatively early stage, we are confident that this fruitful and exciting collaboration between Siemens and CERN will lead towards new data analytics solutions aimed at improving the efficiency and predictability of control systems. Work on data analytics will continue throughout 2014 with the aim of offering a common framework to facilitate the tasks of many CERN engineers.
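To make the idea of online analysis concrete, the sketch below flags readings that deviate strongly from a sliding window of recent samples. The detector, threshold, and simulated stream are assumptions for the example; a real deployment would read live values over OPC from WinCC OA rather than from a hard-coded list.

```python
from collections import deque
from statistics import mean, stdev

class OnlineAnomalyDetector:
    """Flag readings far outside a sliding window of recent values --
    a minimal stand-in for online analysis of live control data."""

    def __init__(self, window=20, threshold=4.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value):
        anomalous = False
        if len(self.window) >= 5:  # wait for a minimal history
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.window.append(value)
        return anomalous

# Simulated sensor stream with one obvious spike.
detector = OnlineAnomalyDetector()
stream = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 9.0, 1.0]
flags = [detector.update(v) for v in stream]
print(flags)  # only the spike (the 9.0 reading) is flagged
```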
Relational database archiver
Significant progress was made in 2013 in the activities related to the Oracle database logging system for WinCC OA, both in the current production version (3.11 SP1) and in the future version of SCADA developed by Siemens. For the next-generation SCADA system, an important milestone was achieved: the new Oracle plugin developed at CERN was integrated into a full vertical-slice test setup and recorded data into a database. This was actually the first time the new version of SCADA was run at CERN. In addition, some initial performance measurements of the logging subsystem (with the Oracle plugin) were completed by looking at the maximum data-write rate and then comparing this to the performance of the 3.11 SP1 version, as shown in the ‘Archiving performance comparison’ graph. The results were gathered in a report that was sent to Siemens for further analysis.
For the current version, there was a large improvement in performance. This was necessary to sustain the unprecedented data rates of the upgraded LHC magnet protection system (Quench Protection System, QPS). This improvement is crucial, as the upcoming LHC run will double the beam energy compared to the previous run. The data retrieved from the sensors needs to be stored in a database at a constant rate of 150,000 values per second and then queried to perform online analysis. The overall data volume is about 13 billion rows generated daily. Numerous optimisations reduced the space required for a single data row, halving the required disk throughput. Combined with careful tuning of the data readout process, this enabled the required performance level to be achieved. The modifications prototyped and tested at CERN will now be implemented in the mainstream version of WinCC OA to benefit all Siemens customers.
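The quoted figures are mutually consistent, as the short check below shows. The per-row byte sizes in the second part are purely illustrative (the actual row layout is not given here); they simply show how shrinking a row directly shrinks the disk throughput the archiver must sustain.

```python
RATE_PER_SECOND = 150_000   # values written per second (from the text)
SECONDS_PER_DAY = 86_400

rows_per_day = RATE_PER_SECOND * SECONDS_PER_DAY
print(f"{rows_per_day:,} rows/day")  # 12,960,000,000 -- roughly 13 billion

# Illustrative (not actual) row sizes: halving the bytes stored per row
# halves the sustained write throughput required from the disks.
for row_bytes in (64, 32):
    mb_per_s = RATE_PER_SECOND * row_bytes / 1e6
    print(f"{row_bytes} B/row -> {mb_per_s:.1f} MB/s sustained")
```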
The Siemens WinCC OA SCADA package is especially well-suited to building very large and highly distributed control systems, such as those of the LHC experiments and of many accelerator services. The unprecedented size of these control systems necessitates the use of tools to efficiently manage their evolution over the lifetime of these applications (over 20 years). In the framework of CERN openlab, CERN is collaborating with Siemens ETM on the development of the Central Deployment Tool (CDT) for a new version of the SCADA system. This should ease the initial setup of new controls applications and provide a powerful way to push upgrades to software components onto multiple sets of computers in a centralised fashion. The CDT will make use of the so-called WinCC OA ASCII Manager to import/export the run-time database of a project from/to ASCII files. However, the internal changes to WinCC OA envisaged following the adoption of the next generation of the SCADA platform require the development of a completely new ASCII Manager. The new ASCII Manager will need to handle the two significantly different data models exposed at different layers of the new SCADA (CHROM and OA), and the support medium for the new ASCII Manager will be XML files.
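The import/export role of the ASCII Manager can be sketched as a round trip between in-memory datapoint records and XML. The element and attribute names below are hypothetical, invented for this sketch only; the real XML schema is being defined by the project itself.

```python
import xml.etree.ElementTree as ET

# Hypothetical datapoints; names, types and values are illustrative and
# do not reflect the actual WinCC OA run-time database structure.
datapoints = [
    {"name": "Valve01.position", "type": "float", "value": "0.75"},
    {"name": "Pump02.state", "type": "int", "value": "1"},
]

def export_datapoints(datapoints):
    """Serialise datapoint records to an XML string."""
    root = ET.Element("project")
    for dp in datapoints:
        el = ET.SubElement(root, "datapoint", name=dp["name"], type=dp["type"])
        el.text = dp["value"]
    return ET.tostring(root, encoding="unicode")

def import_datapoints(xml_text):
    """Parse the XML back into the same record structure."""
    root = ET.fromstring(xml_text)
    return [{"name": el.get("name"), "type": el.get("type"), "value": el.text}
            for el in root.findall("datapoint")]

xml_text = export_datapoints(datapoints)
assert import_datapoints(xml_text) == datapoints  # round trip preserves data
```

The essential property being illustrated is that export followed by import is lossless; the real tool must guarantee this across the two data models (CHROM and OA).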
In this initial phase of the project, the work has focused on defining the requirements and functionality of the new ASCII Manager. A comparison between the two data models implemented in the next generation SCADA has also been performed. This study has identified significant differences that have a major impact on existing applications and frameworks developed at CERN. Further discussion is therefore needed. The work performed so far has also covered the definition of the XML schema used for import/export, as well as the integration of various XML parsers into the WinCC OA development workspace and their benchmarking. This point is of special relevance as the possible use of SCADA on mobile devices calls for the minimisation of the CPU and memory footprints of the processes. The work is now moving towards an architectural design and the validation of the model proposed through prototypes.
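Benchmarking a parser for this purpose boils down to timing repeated parses of a representative document and checking that the result is correct. The sketch below does this for Python's built-in ElementTree parser on a synthetic document; it illustrates the approach only, as the actual evaluation integrated various parsers into the WinCC OA development workspace.

```python
import timeit
import xml.etree.ElementTree as ET

# Build a synthetic document of N datapoint-like elements to exercise
# the parser; element names are invented for the benchmark.
N = 1000
doc = "<project>" + "".join(f'<dp name="p{i}">1</dp>' for i in range(N)) + "</project>"

def parse():
    return len(ET.fromstring(doc))  # number of child elements parsed

elapsed = timeit.timeit(parse, number=50)
print(f"etree: {elapsed * 1000:.1f} ms for 50 parses of {N} elements")
```

A fuller comparison would also record peak memory per parse, since both CPU and memory footprints matter for the mobile-device use case mentioned above.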
Within the CERN openlab collaboration, CERN and Siemens have combined their efforts to design and implement a cybersecurity model for evaluating the robustness of the protocol implementations defined in the IEC-61850 standards. These protocols are widely deployed in advanced electrical power systems, called ’smart grids’, which use digitised information and communication technology to drive industrial process operations on the basis of consumers’ needs. The replacement of proprietary industrial control networks with open and standardised TCP/IP-based Ethernet networks enables an easier and more cost-effective integration of all industrial control system levels. However, this also exposes the entire infrastructure to internal and external cyber-attacks, which could destabilise the grid in unpredictable ways. This requires a proper design which takes into account not only the typical functional aspects, but also the cybersecurity requirements. The developed testing methodology, based on fuzzing and grammar-based techniques, has already proven to be effective at detecting communication weaknesses and at improving the overall robustness of those systems.
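Grammar-based fuzzing can be sketched in a few lines: generate structurally valid messages from a grammar, then mutate them to probe a parser's robustness. The toy grammar below is emphatically not the IEC-61850 encoding; the service names and object references are invented purely to illustrate the technique.

```python
import random

# Toy grammar for a simplified request message -- NOT the actual
# IEC-61850 wire format; it only shows grammar-driven generation.
GRAMMAR = {
    "<request>": [["<service>", " ", "<object>"]],
    "<service>": [["read"], ["write"], ["report"]],
    "<object>": [["LD0/MMXU1.TotW"], ["LD0/XCBR1.Pos"]],
}

def generate(symbol, rng):
    """Recursively expand a grammar symbol into a concrete message."""
    if symbol not in GRAMMAR:
        return symbol
    return "".join(generate(s, rng) for s in rng.choice(GRAMMAR[symbol]))

def mutate(message, rng):
    """Flip one random byte to turn a valid message into a malformed one."""
    data = bytearray(message, "ascii")
    data[rng.randrange(len(data))] ^= 0xFF
    return bytes(data)

rng = random.Random(42)  # deterministic for reproducible test cases
valid = generate("<request>", rng)
fuzzed = mutate(valid, rng)
print(valid, fuzzed)
```

Feeding a device a large stream of such near-valid messages is what exposes parsing weaknesses; a real campaign drives these inputs over TCP/IP against the device under test and monitors it for crashes or hangs.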
Previous activities of the Automation & Controls Competence Centre covering 2012 are available here. Further content can be found archived on the previous phases' website here.