CD Technical Meeting (ML8): Using Conformal Prediction for Uncertainty Quantification of a Machine-Learning Based Fast Particle Pressure Emulator for Neutral Beam Injection in Tokamaks
Description
Machine Learning, Uncertainty Quantification and Data Science
Machine learning (ML) and related techniques have become a popular solution for implementing surrogate models and emulators that approximate difficult-to-compute features of physical systems [1]. In tokamaks heated by neutral beam injection, the pressure contributed by the injected fast particles, on top of the bulk thermal pressure, must be calculated for an accurate equilibrium reconstruction. The NUBEAM module of TRANSP (developed by PPPL) is a mature Monte Carlo code commonly used to evaluate these fast particle pressures. However, running such simulations is a significant computational bottleneck in the equilibrium reconstruction process.
To address this bottleneck, we are developing an ML-based emulator to stand in for TRANSP, allowing fast particle pressures to be evaluated on timescales suitable for rapid, iterative equilibrium reconstruction. For the emulator to be practically useful, we need a reliable quantification of its emulation error for any future prediction it makes. Conformal prediction has emerged as a robust method for obtaining mathematically rigorous confidence bounds on the predictions of generic ML models [2-4]. In this talk, we will discuss the methodology behind the various conformal prediction schemes and why they are useful. We will then demonstrate how conformal prediction was used to obtain error bounds on our fast particle pressure emulator, and how this could fit into the equilibrium reconstruction pipeline in the future.
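As a pointer to the flavour of method discussed in the talk, the minimal NumPy sketch below illustrates the basic split conformal prediction recipe for regression: absolute residuals on a held-out calibration set give a finite-sample-corrected quantile that turns any point predictor into prediction intervals with a marginal coverage guarantee (assuming exchangeability). The function and variable names, the toy polynomial "emulator", and the choice of nonconformity score are illustrative assumptions and not the specific scheme applied to the NUBEAM emulator.

```python
import numpy as np


def split_conformal_interval(predict, X_cal, y_cal, X_test, alpha=0.1):
    """Turn a fitted point predictor into one that returns intervals.

    Under exchangeability of the calibration and test data, the interval
    contains the true response with probability >= 1 - alpha (marginally).
    """
    # Nonconformity score: absolute residual on the held-out calibration set.
    scores = np.abs(y_cal - predict(X_cal))

    # Finite-sample-corrected (1 - alpha) quantile of the scores.
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, q_level, method="higher")

    # Symmetric interval of half-width q_hat around each point prediction.
    y_pred = predict(X_test)
    return y_pred - q_hat, y_pred + q_hat


# Toy usage: a crude polynomial "emulator" of a noisy 1D function.
rng = np.random.default_rng(0)
X_train, X_cal, X_test = (rng.uniform(0, 1, n) for n in (500, 200, 5))
noisy = lambda x: np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.shape)
coeffs = np.polyfit(X_train, noisy(X_train), deg=5)
predict = lambda x: np.polyval(coeffs, x)

lo, hi = split_conformal_interval(predict, X_cal, noisy(X_cal), X_test, alpha=0.1)
print(np.column_stack([predict(X_test), lo, hi]))  # prediction, lower, upper
```

The guarantee is distribution-free and model-agnostic, which is why the same recipe can wrap an emulator regardless of its architecture; the price is that coverage holds on average over inputs rather than pointwise.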
[1] R. M. Churchill, M. D. Boyer, and S. C. Cowley, "Accelerating fusion energy with AI," in Artificial Intelligence for Science: A Deep Learning Revolution (World Scientific, 2023), pp. 271–284.
[2] A. Gammerman, V. Vovk, and V. Vapnik, "Learning by transduction," in Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence (UAI'98) (Morgan Kaufmann, San Francisco, CA, 1998), pp. 148–155.
[3] A. N. Angelopoulos and S. Bates, "Conformal Prediction: A Gentle Introduction," Foundations and Trends® in Machine Learning 16(4), 494–591 (2023).
[4] V. Gopakumar, A. Gray, L. Zanisi, T. Nunn, D. Giles, M. J. Kusner, S. Pamela, and M. P. Deisenroth, "Calibrated physics-informed uncertainty quantification," arXiv:2502.04406 (2025).