C2D3 ECR Conference 2024

2024-11-23

- Climate Modelling
- Weather Forecasting
- Molecular Simulation
- Fluid Dynamics
- Engineering Simulation
- …

Derecho by NCAR

Dawn by Joe Bishop, with permission

We typically think of Deep Learning as an end-to-end process;

a black box with an input and an output^{1}. Typically conducted in Python.

Who’s that Pokémon?

\[\begin{bmatrix}\vdots\\a_{23}\\a_{24}\\a_{25}\\a_{26}\\a_{27}\\\vdots\\\end{bmatrix}=\begin{bmatrix}\vdots\\0\\0\\1\\0\\0\\\vdots\\\end{bmatrix}\] It’s Pikachu!

Neural Net by 3Blue1Brown under *fair dealing*.

Pikachu © *The Pokemon Company*, used under *fair dealing*.

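The one-hot output above can be sketched in a few lines: a classifier emits a vector of scores, and taking the argmax yields the one-hot prediction. A minimal illustration (the scores are made up for this example):

```python
# Hypothetical network output scores, one per class.
scores = [0.01, 0.03, 0.9, 0.04, 0.02]

# The predicted class is the index of the largest score (argmax).
predicted = max(range(len(scores)), key=lambda i: scores[i])

# Convert to a one-hot vector like the target in the equation above.
one_hot = [1 if i == predicted else 0 for i in range(len(scores))]
print(one_hot)  # [0, 0, 1, 0, 0]
```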

Many large scientific models are written in Fortran (or C, or C++).

Much machine learning is conducted in Python.

Mathematical Bridge by cmglee used under CC BY-SA 3.0

PyTorch, the PyTorch logo and any related marks are trademarks of The Linux Foundation.

TensorFlow, the TensorFlow logo and any related marks are trademarks of Google Inc.

- Fortran bindings for PyTorch via the C++ libtorch library
- Load and run models saved from PyTorch
- No re-implementation of the network: feature parity, future support, and reproducibility
- Make use of the Torch backends for GPU offload
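On the Python side, models are saved as TorchScript before being loaded from Fortran (the FTorch repository includes a helper script for this step). A minimal sketch, with a toy network and filename standing in for a real model:

```python
import torch

# Toy network standing in for a real parameterization (illustrative only).
net = torch.nn.Sequential(torch.nn.Linear(5, 5))
net.eval()

# Convert to TorchScript so libtorch (and hence FTorch) can load it
# without a Python interpreter.
scripted = torch.jit.script(net)
scripted.save("saved_model.pt")  # pass this path to torch_model_load in Fortran
```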

- Open source project on GitHub
- Easy to download and build (CMake)
- Examples suite for new users
- Full API documentation online at cambridge-iccs.github.io/FTorch

Find it on GitHub.
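Consuming FTorch from a CMake build might look like the following sketch; the package and target names here are assumptions based on the project's CMake packaging, so check the documentation linked above:

```cmake
cmake_minimum_required(VERSION 3.15)
project(ml_coupled_model LANGUAGES Fortran)

# Locate an installed FTorch (FTorch_DIR may need to point at the install tree).
find_package(FTorch REQUIRED)

add_executable(run_model main.f90)
target_link_libraries(run_model PRIVATE FTorch::ftorch)
```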

```
use ftorch
implicit none
real, dimension(5), target :: in_data, out_data ! Fortran data structures
type(torch_tensor), dimension(1) :: input_tensors, output_tensors ! Set up Torch data structures
type(torch_model) :: torch_net
integer, dimension(1) :: tensor_layout = [1]
in_data = ... ! Prepare data in Fortran
! Create Torch input/output tensors from the Fortran arrays
call torch_tensor_from_array(input_tensors(1), in_data, tensor_layout, torch_kCPU)
call torch_tensor_from_array(output_tensors(1), out_data, tensor_layout, torch_kCPU)
call torch_model_load(torch_net, 'path/to/saved/model.pt') ! Load ML model
call torch_model_forward(torch_net, input_tensors, output_tensors) ! Infer
call further_code(out_data) ! Use output data in Fortran immediately
! Cleanup
call torch_delete(torch_net)
call torch_delete(input_tensors)
call torch_delete(output_tensors)
```

*Uncertainty Quantification of a Machine Learning Subgrid-Scale Parameterization for Atmospheric Gravity Waves* (Mansfield and Sheshadri 2024): "identical" offline networks have very different behaviours when deployed online.

*Interpretable Multiscale Machine Learning-Based Parameterizations of Convection for ICON* (Heuer et al. 2023): online stability improved when non-causal relations (assessed using SHAP values) were eliminated from the network.

- Model bias correction (Will Chapman, work in progress)
- Learning model bias from data assimilation to perform real-time corrections

- Plus other users in Civil Engineering, Biophysics, CFD, and more.

Jack Atkinson

RSE Unicorn by Cristin Merritt used with permission

Heuer, Helge, Mierk Schwabe, Pierre Gentine, Marco A Giorgetta, and Veronika Eyring. 2023. “Interpretable Multiscale Machine Learning-Based Parameterizations of Convection for ICON.” *arXiv Preprint arXiv:2311.03251*.

Mansfield, Laura A, and Aditi Sheshadri. 2024. “Uncertainty Quantification of a Machine Learning Subgrid-Scale Parameterization for Atmospheric Gravity Waves.” *Authorea Preprints*.