Project Goal
The Interdisciplinary Digital Twin (interTwin) project is an ambitious initiative aimed at revolutionizing digital twin technology. At its core, interTwin seeks to co-design and implement a prototype of an open-source Digital Twin Engine (DTE), built upon open standards and facilitating seamless integration with application-specific Digital Twins (DTs). Rooted in a co-designed interoperability framework and in the conceptual model of a DT for research, known as the DTE blueprint architecture, the platform aims to simplify and accelerate the development of complex application-specific DTs. By extending the technical capabilities of the European Open Science Cloud with integrated modelling and simulation tools, interTwin fosters trust and reproducibility in science and showcases the potential of data fusion combined with advanced modelling and prediction technologies. With a focus on ensuring the quality, reliability, and verifiability of DT outputs, while simplifying application development through AI workflow management and reinforcing open science practices, interTwin stands as a pioneering endeavor at the forefront of interdisciplinary innovation.
Background
interTwin develops and implements an open-source DTE that offers generic and customized software components for modelling and simulation, promoting interdisciplinary collaboration. The DTE blueprint architecture, guided by open standards, aims to create a common approach applicable across scientific disciplines. Use cases span high-energy physics, radio astronomy, climate research, and environmental monitoring. The project leverages expertise from European research infrastructures, fostering the validation of technology across facilities and enhancing accessibility. interTwin aligns with initiatives such as Destination Earth, EOSC, EuroGEO, and the EU data spaces for continuous development and collaboration.
Progress in 2024
The itwinai toolkit saw a stable release this year, supporting AI workflows on cloud and HPC through the integration of distributed ML training frameworks, ML tracking systems, and modular workflows. Extensive documentation and tutorials were developed, covering distributed machine learning, ML workflows, and scaling benchmarks. We also completed infrastructure integration with interLink and OSCAR for remote ML inference on HPC, and enabled interactive model development with JupyterLab on the Vega supercomputer. Significant progress was made in integrating CERN and other use cases into the Digital Twin Engine, covering both the physics and climate research domains.
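The modular-workflow idea mentioned above can be pictured as a chain of composable steps, each consuming the previous step's output. The snippet below is a minimal, generic sketch of that pattern; the class and step names are hypothetical and do not reflect the actual itwinai API.

```python
from typing import Any, Callable, List


class Step:
    """One stage of a workflow (e.g. data loading, preprocessing, training)."""

    def __init__(self, name: str, fn: Callable[[Any], Any]):
        self.name = name
        self.fn = fn

    def execute(self, data: Any) -> Any:
        return self.fn(data)


class Pipeline:
    """Runs steps in order, feeding each step the previous step's output."""

    def __init__(self, steps: List[Step]):
        self.steps = steps

    def run(self, data: Any = None) -> Any:
        for step in self.steps:
            data = step.execute(data)
        return data


# Hypothetical three-stage workflow: load -> normalize -> "train" (a stub).
pipeline = Pipeline([
    Step("load", lambda _: [2.0, 4.0, 6.0]),
    Step("normalize", lambda xs: [x / max(xs) for x in xs]),
    Step("train", lambda xs: sum(xs) / len(xs)),  # stand-in for model fitting
])
result = pipeline.run()
```

Because each stage is a self-contained unit, stages can be swapped (e.g. a different training backend) without changing the rest of the workflow, which is the property that makes such toolkits portable across cloud and HPC.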
In 2024, significant progress was made on CERN's use-case tasks (T4.2/T7.7) within the interTwin project. Efforts focused on refining and updating the analysis and validation frameworks for the generative model (3DGAN). Key activities included updating the codebase to ensure compatibility with ML framework requirements, particularly regarding tensor operations and data types. Model validation scripts were developed and refined, and integrated training experiments were conducted using the itwinai framework on JSC resources. Contributions included status inputs for project deliverables (Deliverables 4.4 and 7.6), presentations at the 4th Technical Meeting, summer student lectures, and the 2nd EC project review.
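The tensor-dtype compatibility work described above typically amounts to casting inputs once, at the framework boundary, to the precision the framework expects. A minimal NumPy illustration of that pattern follows; the helper name is hypothetical and not taken from the project codebase.

```python
import numpy as np


def to_framework_dtype(arr: np.ndarray, dtype=np.float32) -> np.ndarray:
    """Cast an array to the dtype an ML framework expects (many default
    to float32), avoiding an unnecessary copy when it already matches."""
    return arr if arr.dtype == dtype else arr.astype(dtype)


batch = np.arange(4, dtype=np.float64)  # e.g. data loaded as float64
batch32 = to_framework_dtype(batch)     # cast once at the boundary
```

Centralizing the cast in one place keeps the rest of the pipeline dtype-agnostic and avoids silent precision mismatches during training.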
Several integration activities of the fast detector simulation use case with the DTE modules have been completed: the DTE now enables a Particle Detector DT by providing specific services and platforms that streamline data and AI workflow management. Data were added to, and are accessible in, the project's data lake; the AI-based model workflow is enabled by the itwinai framework; and a model-inference paradigm was enabled through the OSCAR and interLink frameworks.
The use case was disseminated through a public webinar, a poster presentation at the EuCAIF conference, and a collaborative paper with the University of Trento for the SC24 conference. For T4.2, most major activities were completed, with ongoing exploration of advanced validation techniques for model outputs. Both T4.2 and T7.7 contributed extensively to deliverables and public engagement, highlighting CERN's pivotal role in advancing fast simulation technologies within the project.
Next Steps
Next, we will consolidate itwinai for containerized workflows on cloud and HPC to complete the integration with the workflow manager of the Digital Twin Engine, and fully implement the hyper-parameter optimization module. Additional integrations are planned for radio astronomy, lattice quantum chromodynamics, EURAC environmental data, and climate modeling.
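The hyper-parameter optimization module mentioned above is not specified here; as a generic illustration of what such a module automates, the sketch below runs an exhaustive grid search over a small search space and keeps the best configuration. All names are hypothetical and do not represent the itwinai interface.

```python
from itertools import product

# Hypothetical search space for a training run.
space = {
    "lr": [1e-2, 1e-3, 1e-4],
    "batch_size": [32, 64, 128],
}


def objective(lr: float, batch_size: int) -> float:
    """Stand-in for a validation loss; a real module would launch a
    training run and report the measured metric."""
    return (lr - 1e-3) ** 2 + abs(batch_size - 64) / 1000.0


# Grid search: evaluate every combination, keep the lowest-loss config.
best = min(
    (
        {"lr": lr, "batch_size": bs}
        for lr, bs in product(space["lr"], space["batch_size"])
    ),
    key=lambda cfg: objective(**cfg),
)
```

In practice, production HPO modules replace the exhaustive loop with smarter strategies (random search, Bayesian optimization, early stopping) and distribute trials across workers, but the contract is the same: a search space, an objective, and a best configuration out.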
The next steps for CERN's use case within the interTwin project involve several targeted actions that build on the progress achieved in 2024. A primary focus will be finalizing the performance validation framework and exploring integration with further DTE core modules that extend the DT's capabilities. Additionally, the team plans to integrate a model from the CaloChallenge into the itwinai framework under the detector simulation use case, enabling more robust and scalable simulations. This integration will be accompanied by continued refinement of the model validation scripts used to validate and benchmark model performance. Further dissemination activities are also anticipated, including contributions to technical meetings and collaborative publications.
Project Coordinator: EGI Foundation
Technical Team: Matteo Bunino, Anna Elisa Lappe, Xavier Espinal, Enrique Garcia, Maria Girone, Jarl Saether, Kalliopi Tsolaki, Sofia Vallecorsa
Collaboration Liaisons: Donatello Elia, Sandro Fiore, Vera Maiboroda, David Rousseau
In partnership with: EGI Foundation, CNRS
This is an EC-funded project; discover more at intertwin.eu