Europhysics News | Volume 56, Number 1, 2025 | AI for Physics
Section: Features | Pages 15-19
DOI: https://doi.org/10.1051/epn/2025106 | Published online: 24 March 2025
Artificial intelligence for advancing particle accelerators
1 CNRS; 2 GANIL - Univ. Paris-Saclay; 3 CEA; 4 DESY; 5 GSI Helmholtzzentrum für Schwerionenforschung GmbH, Planckstr. 1, 64291 Darmstadt (Germany); 6 Univ. of Oxford; 7 European Spallation Source; 8 Univ. of Malta; 9 Univ. of Liverpool
Artificial intelligence is increasingly shaping the way we think, design, build and operate present and future particle accelerators for science discovery and society.
© European Physical Society, EDP Sciences, 2025
Particle accelerators are vital for advancing knowledge across diverse fields. With over 50,000 devices worldwide, they enable applications such as pharmaceutical production, disease treatment, food preservation, environmental monitoring, material strengthening, and fundamental physics research.
Despite their versatility, accelerators face performance and reliability challenges affecting devices in use, under construction, or in design. To tackle these issues, the accelerator physics community is increasingly leveraging machine learning (ML), a branch of AI that has revolutionized many scientific fields. The transformative impact of ML was highlighted by the 2024 Nobel Prize in Physics awarded to Hopfield and Hinton.
However, ML applications in accelerators remain limited: the field is still in its infancy and underfunded. This article provides an overview of this emerging area and examines its challenges, progress, and opportunities. It first discusses the technical challenges faced by accelerators, then explores current AI applications and potential opportunities, and finally proposes solutions to address the obstacles currently hindering the application of AI to accelerators. For a more in-depth review, readers are encouraged to refer to [1].
Current opportunities and goals in Particle Accelerators
Particle accelerators face common difficulties and opportunities in their operation and modelling.
Operational reliability is a key concern due to the complexity of interconnected components such as magnets, accelerating cavities, power amplifiers and diagnostic tools. This complexity increases the risk of malfunctions, leading to reduced performance and downtime.
The early detection of anomalies is crucial for predictive maintenance and minimization of failures. Due to the diverse requirements of systems and subsystems, a holistic understanding of anomalies is essential to optimize efficiency and maximize beam availability.
Even without major anomalies, effective operation demands precise control of interdependent systems. Traditional methods relying on human expertise and specialized algorithms face challenges like system drifts, external vibrations, and limited beam measurements, which degrade performance and require frequent retuning. Nonlinear, nonstationary behaviors and component uncertainties further complicate optimization.
While detailed models offer initial settings and virtual diagnostics, their limited accuracy and speed pose challenges. High-demand scientific and industrial facilities face scheduling pressures, which hinder further tuning. For medical and industrial accelerators, operational autonomy is crucial to ensure safety and reliability while reducing dependence on constant human oversight.
Modelling also plays a critical role in optimizing the performance of operating accelerators and in designing future ones. However, rising performance demands increase simulation complexity and threaten to limit modelling efficiency. Higher beam intensities require computationally intensive simulations to account for complex interactions, while increased beam phase-space density necessitates more accurate modelling. Comprehensive end-to-end simulations are essential for system-wide optimization but have become harder to develop due to their complexity. Additionally, multi-objective optimization demands fast and precise methods to balance competing goals such as beam intensity and emittance control.
AI developments in Particle Accelerators
In the field of particle accelerators, AI-related developments began as early as the 1990s [2]. However, the techniques explored at the time were hindered by limited computing power, which restricted funding and development opportunities. Starting in 2016, project-based funding, mostly in the U.S., enabled AI development in accelerators to resume in earnest. In Europe, the absence of dedicated funding pushed a network of 33 institutes and accelerator infrastructure facilities to collaborate in a year-long consultation in preparation for a future European project. Although this effort did not result in development funds, it identified key challenges and opportunities for AI in accelerators and related experiments, thereby establishing clearly defined priorities. A plethora of use cases was identified, each addressing specific challenges, and these were matched with AI techniques that could potentially address the identified issues. This work led to a categorization that clearly highlights the core priorities of the community (see Figure 2).
Optimisation
Optimisation problems dominate current needs, both in running existing accelerators and in designing future ones. In operation, this requirement is particularly critical for circular accelerators. Advanced optimisation methods, such as Bayesian techniques, have been successfully applied to several accelerators, including LCLS [3], SwissFEL [4], and LEIR at CERN [5]. The community behind these advancements has released specialized tools, such as Badger/Xopt and the Machine Learning Platform (MLP) at CERN, which integrate directly with command-and-control systems at accelerator facilities. Other promising optimisation methods, such as reinforcement learning, which are expected to be better suited to machines with evolving parameters, have been successfully applied at the CERN [6] and DESY [7] facilities. Model Predictive Control has also been explored and demonstrated effectively at the ALS [8] and at MAX IV [9].
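To make the approach concrete, the following is a minimal sketch of the kind of Bayesian optimisation loop that tools like Badger/Xopt automate: a Gaussian-process model of an objective is refined measurement by measurement, with an expected-improvement rule choosing the next machine setting. The objective function, parameter names, and ranges are illustrative assumptions, not any facility's real interface.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical stand-in for a real machine interface: returns a noisy
# beam-loss figure of merit for two quadrupole settings (to be minimized).
def measure_loss(k):
    return (k[0] - 0.3) ** 2 + (k[1] + 0.5) ** 2 + 0.01 * np.random.randn()

bounds = np.array([[-1.0, 1.0], [-1.0, 1.0]])  # allowed setting ranges
rng = np.random.default_rng(0)

# Start from a handful of random machine settings.
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
y = np.array([measure_loss(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(25):
    gp.fit(X, y)
    # Expected improvement evaluated over a dense random candidate set.
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = cand[np.argmax(ei)]             # most promising next setting
    X = np.vstack([X, x_next])
    y = np.append(y, measure_loss(x_next))   # one machine measurement

print("best settings:", X[np.argmin(y)], "loss:", y.min())
```

The appeal for accelerator tuning is sample efficiency: each "measurement" costs real beam time, and the surrogate model concentrates those measurements where they are most informative.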
Anomaly Detection and Diagnostics
For linear accelerators, the primary focus is operational reliability. Classification techniques, particularly those which target anomaly detection, have been identified as highly promising. Recurrent Neural Network-based classifiers, well suited for time-series data, have shown effectiveness in diagnostic readings, radio-frequency stations and superconducting cavities [10].
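As an illustration of this approach, here is a minimal PyTorch sketch of an LSTM-based classifier for pulse-by-pulse time series; the channel count, sequence length, and synthetic data are assumptions standing in for real labelled waveforms from a machine archive.

```python
import torch
import torch.nn as nn

# Illustrative shapes: RF-station waveforms 200 samples long with 4 channels
# (e.g. amplitude/phase signals); labels are 0 = normal, 1 = fault.
class RFAnomalyClassifier(nn.Module):
    def __init__(self, n_channels=4, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)    # normal vs. anomalous

    def forward(self, x):                   # x: (batch, time, channels)
        _, (h, _) = self.lstm(x)            # final hidden state summarizes the series
        return self.head(h[-1])             # logits: (batch, 2)

model = RFAnomalyClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic stand-in for archived pulses.
x = torch.randn(64, 200, 4)
labels = torch.randint(0, 2, (64,))

for _ in range(10):
    opt.zero_grad()
    loss = loss_fn(model(x), labels)
    loss.backward()
    opt.step()
```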
Reliable operation, whether through optimized tuning or anomaly detection, also depends on efficient diagnostics. AI/ML not only enables cleaner signals for existing diagnostics [11] but also facilitates the creation of complete state observers based, for example, on surrogate models. Virtual diagnostics add diagnostic capabilities, for instance providing non-destructive beam emittance estimates where direct measurement is otherwise impossible [12].
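The idea behind such a virtual diagnostic can be sketched in a few lines: train a regressor on pairs of routinely available signals and (simulated or occasionally measured) emittance values, then use it as a non-destructive estimator. Everything below, from the toy ground truth to the network size, is an illustrative assumption rather than any published model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Hypothetical training set: rows of routinely logged machine settings
# paired with emittance values from simulation or destructive measurement.
rng = np.random.default_rng(1)
settings = rng.uniform(-1, 1, size=(2000, 6))
emittance = (1.0 + 0.3 * settings[:, 0] ** 2 + 0.1 * settings[:, 3]
             + 0.02 * rng.standard_normal(2000))   # toy ground truth

X_train, X_test, y_train, y_test = train_test_split(
    settings, emittance, random_state=0)

# Once trained, the regressor acts as a virtual diagnostic: it estimates
# emittance from available signals without intercepting the beam.
virtual_diag = MLPRegressor(hidden_layer_sizes=(64, 64),
                            max_iter=2000, random_state=0)
virtual_diag.fit(X_train, y_train)
print("R^2 on held-out data:", virtual_diag.score(X_test, y_test))
```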
Precise and Efficient Physics Simulations
Modelling efforts are now focused on future infrastructures requiring precise multi-physics simulations in two key research areas: (1) future high-energy and high-intensity colliders using classical accelerating technologies; (2) innovative future accelerator concepts and technologies. This is illustrated by the increase of the maximum beam energy over time and its expected evolution during the coming decades for different acceleration techniques (see Figure 1).
Figure 1: Livingston plot representing the evolution of particle accelerator energy over time.
For classical accelerators, several initiatives align with strategic visions in particle physics, such as the latest U.S. prioritization process ("P5"), China's CEPC project, and the upcoming update to the European Strategy for Particle Physics. Key projects include colliders such as FCC-ee/hh, ILC, CLIC, CEPC-SppC and C3. These projects aim to operate at high energy and intensity, where stability depends on accurate and computationally demanding simulations.
Emerging acceleration technologies, such as Laser-driven Plasma Acceleration (LWFA or LPA), beam-driven Plasma Wake-Field Acceleration (PWFA), Structure-based Wake-Field Accelerators (SWFA), Dielectric Laser Acceleration (DLA), laser ion acceleration, and novel plasma devices (e.g., laser-ionized plasma columns), are redefining accelerator technology. AI techniques have already delivered highly promising initial results in optimizing laser-plasma accelerators and improving their beam quality [13].
AI-driven techniques provide promising solutions for faster and more efficient modelling. These include active learning as well as neural-network-based surrogate models that replicate multi-physics phenomena. For example, the Cheetah simulation tool [7] integrates AI to address more complex collective dynamics such as space charge. Methods inspired by computer vision (e.g., PointNet) and physics-informed neural networks (PINNs) show promise for modelling other collective effects relevant to next-generation energy- and intensity-frontier accelerators, potentially reducing the need for computationally expensive particle-in-cell simulations. Facilities such as HiRES and LCLS-II have already demonstrated the benefits of some of these methods [1, 7].
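As a hedged illustration of the active-learning idea mentioned above, the sketch below lets a Gaussian-process surrogate decide which expensive simulation to run next by querying where its own predictive uncertainty is largest. The one-parameter "simulation" is a stand-in, not a real beam-dynamics code.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-in for an expensive particle-in-cell or space-charge simulation:
# maps a single bunch parameter to a beam-quality figure of merit.
def expensive_simulation(q):
    return np.sin(3 * q) + 0.5 * q

rng = np.random.default_rng(2)
pool = np.linspace(0, 2, 400).reshape(-1, 1)   # candidate simulation inputs

# Seed the surrogate with a few simulations, then let uncertainty drive
# which simulation to run next (active learning).
X = rng.uniform(0, 2, size=(3, 1))
y = expensive_simulation(X).ravel()
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.3),
                                     normalize_y=True)

for _ in range(15):
    surrogate.fit(X, y)
    _, std = surrogate.predict(pool, return_std=True)
    x_next = pool[np.argmax(std)].reshape(1, -1)   # most informative point
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_simulation(x_next).ravel())

# The trained surrogate now answers queries at a tiny fraction of the
# cost of running the full simulation each time.
```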
Prospects of AI in accelerator physics and technologies
Integrating AI into particle accelerators offers unprecedented opportunities but also raises critical concerns. Ethical implementation requires controlled data use, protection of confidential information, and system accessibility—particularly for medical accelerators and radionuclide production. Key issues include explainability, traceability, transparency, and ensuring reliability, robustness, and resilience for systems operating under human control. Addressing these concerns is vital to maintaining ethical AI practices.
Another significant hurdle in AI adoption lies in the fragmented and domain-specific nature of accelerator data and control systems. High-quality, standardized datasets are crucial for training effective AI models, yet this remains a challenge for data produced in routine operation. The establishment of a unified, accessible, and AI-enabled data platform could transform this landscape. An open data and feature-store portal, aligned with the Findable, Accessible, Interoperable, and Reusable (F.A.I.R.) principles, would enable data and AI models to be shared and processed in an intuitive manner. A federated data model, which keeps data distributed yet accessible, offers an effective way to balance accessibility and security while fostering collaboration and transparency.
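To give a flavour of what a F.A.I.R.-aligned record in such a portal might contain, here is a purely illustrative sketch; the schema, field names, and identifiers are assumptions, not an existing standard or facility interface.

```python
# Illustrative F.A.I.R.-style metadata record for an accelerator dataset;
# every field name and value below is hypothetical.
dataset_record = {
    "identifier": "doi:10.xxxx/example-dataset",    # Findable: persistent ID
    "title": "RF cavity waveforms, routine operation",
    "access_url": "https://data.example-facility.eu/rf-waveforms",  # Accessible
    "format": "HDF5",                               # Interoperable: open format
    "license": "CC-BY-4.0",                         # Reusable: clear terms
    "provenance": {
        "facility": "example-linac",
        "acquisition_period": "2024-01/2024-06",
        "calibration_version": "v2.3",
    },
    "features": ["cavity_voltage", "forward_power", "fault_label"],
}
```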
However, achieving this vision demands robust international cooperation, sustained funding, and a shared long-term commitment. Such a platform would benefit greatly from integration into the EOSC (European Open Science Cloud) data landscape and the EuroHPC European supercomputing ecosystem. Although this aligns with the 2024 European Rolling Plan for ICT (Information and Communication Technologies) standardization and with EU data standardization efforts, the field of particle accelerators still requires a well-funded and community-driven implementation.
While a centralized AI-enabled data platform is important, its portability and decentralized implementation, as close as possible to the data, are equally important. The same is true for command-and-control software frameworks and their associated hardware, which today may not be compatible with modern AI algorithms. Large accelerator infrastructures such as those at CERN, ESS, GSI, FAIR and DESY will certainly push future standards, but mid-term projects such as the International Fusion Materials Irradiation Facility (IFMIF), which will need to conceive AI-enabled control systems, will contribute equally to this effort. These future systems will have to keep a "human in the loop": accelerators rely on skilled operators to diagnose faults and manage rare, unforeseen events that no AI algorithm can fully anticipate. While AI automates and optimizes many tasks, humans must be equipped with new skills to collaborate effectively with these systems. Training programs that combine physics and AI techniques, supported by tools such as twin-model-based virtual accelerators or virtual assistants based on a domain-specific foundation model, can prepare operators to thrive in this new era.
Equally important is the training of the next generation of scientists and engineers. Cross-disciplinary education, blending data science, machine learning (ML), and accelerator physics, is fundamental for future advancements. Graduate courses, summer schools, data challenges and workshops will play a pivotal role in equipping young researchers with expertise to harness AI’s potential in accelerator systems.
Conclusion
While particle accelerators are pivotal in advancing science and technology, their increasing complexity presents significant challenges in terms of reliability, optimisation, modelling, and design. To address these issues, artificial intelligence (AI) has emerged as a transformative solution with demonstrated success in areas such as anomaly detection, operational optimisation, and advanced simulation. AI has also opened new pathways for addressing the design and operation requirements of next-generation accelerators, including energy and intensity frontier machines, and novel acceleration concepts. However, beneath these advances lie challenges that hinder the widespread practical application of AI in accelerator infrastructures. The key obstacles include data scarcity, quality, and F.A.I.R.-ness. Additionally, limited funding opportunities, social barriers, and the lack of clear local, national, and European development roadmaps further constrain progress. In this context, the creation of federated and decentralized systems, combined with robust funding and community-driven efforts, will be crucial to realize the full potential of AI in particle accelerators.
About the Authors
Adnan Ghribi, an experimental physicist/engineer at CNRS, leads AI development for GANIL’s Accelerator Division and is a Visiting Scientist with CERN’s Accelerator and Beam Physics group. He co-founded the French network for machine learning in particle accelerators and plays a key role in establishing its European counterpart.
Kevin Cassou is a senior research engineer at CNRS/IJCLab and the national coordinator of the PALLAS project. His work integrates artificial intelligence, data analysis, and simulation for advancing laser plasma accelerator technology.
Barbara Dalena is an experimental nuclear, particle and accelerator physicist at CEA/IRFU/DACM. Her work focuses on the design and performance optimization of future accelerators.
Annika Eichler, a senior data and control scientist, is a professor at Hamburg University and leads the Intelligent Process Control Group at DESY. Her research focuses on distributed, data-driven, and robust control, as well as the optimization and reliability of particle accelerators.
Hayg Guler is a senior accelerator physicist at CNRS/IJCLab. His work focuses on applications of machine learning and computational methods to particle accelerator beam dynamics, with an emphasis on surrogate models.

Andrew Kishor Mistry is the Coordinator for Research Data Management at GSI and FAIR, Darmstadt, Germany. He holds a PhD in Nuclear Physics from the University of Liverpool.
Adrian Oeftiger is an Associate Professor of Physics at the John Adams Institute, Oxford University, and a Research Fellow at Linacre College, Oxford. He specializes in collective beam dynamics for high-energy hadron accelerators and modeling with high-performance computing techniques.
Thomas Shea serves as the Beam Diagnostics Section leader at the European Spallation Source. He has over 30 years of experience in accelerator technology design and the management of particle accelerator facilities, with an emphasis on data acquisition and control.
Gianluca Valentino is an Associate Professor at the University of Malta and a Visiting Scientist with CERN’s Accelerator and Beam Physics group. His research includes machine learning and AI in particle accelerators, Earth Observation, and aerospace.
Carsten Welsch is a leading researcher at the University of Liverpool, specializing in beam diagnostics, accelerator optimization, and advanced R&D for novel accelerators and applications. He leads the QUASAR Group and has coordinated multiple European projects focused on accelerator research and training the next generation of researchers.
References
[1] Edelen et al., Annu. Rev. Nucl. Part. Sci. 74, 557 (2024).
[2] E. Bozoki et al., AIP Conf. Proc. 315, 103 (1994).
[3] J. Duris et al., Phys. Rev. Lett. 124 (2020).
[4] J. Kirschner et al., Phys. Rev. Accel. Beams 25 (2022).
[5] P. Arpaia et al., Nucl. Instrum. Methods Phys. Res. A 985 (2021).
[6] V. Kain et al., Phys. Rev. Accel. Beams 23 (2020).
[7] J. Kaiser et al., Phys. Rev. Accel. Beams 27 (2024).
[8] S. C. Leemann et al., Phys. Rev. Lett. 123 (2019).
[9] Takahashi et al., Proceedings of ICALEPCS2023, 116 (2023).
[10] C. Tennant et al., Phys. Rev. Accel. Beams (2020).
[11] X. Ren et al., Phys. Rev. Accel. Beams 23 (2020).
[12] Hanuka et al., Sci. Rep. 11 (2021).
[13] Döpp et al., High Power Laser Sci. Eng. 11 (2023).