HAMLET: Graph Transformer Neural Operator for Partial Differential Equations

Andrey Bryutkin*1
Jiahao Huang*2
Zhongying Deng3
Guang Yang2
Carola-Bibiane Schönlieb3
Angelica I. Aviles-Rivero3

1 MIT        2 Imperial College London        3 University of Cambridge
*Equal contribution

[Paper]
Project website for HAMLET: Graph Transformer Neural Operator for Partial Differential Equations.

Abstract

We present HAMLET, a novel graph transformer framework designed to address the challenges of solving partial differential equations (PDEs) with neural networks. The framework uses graph transformers with modular input encoders to incorporate differential equation information directly into the solution process. This modularity enhances parameter correspondence control, making HAMLET adaptable to PDEs of arbitrary geometries and varied input formats. Notably, HAMLET scales effectively with increasing data complexity and noise, showcasing its robustness. HAMLET is not tailored to a single type of physical simulation; it can be applied across a range of domains. Moreover, it boosts model resilience and performance, especially in scenarios with limited data. We demonstrate through extensive experiments that our framework outperforms current techniques for solving PDEs.
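To give a feel for the graph-based attention at the heart of such an operator, the sketch below implements a single attention update restricted to mesh neighbourhoods, in plain Python. This is an illustrative toy, not HAMLET's actual architecture: the node features, adjacency, and scaled dot-product weighting are assumptions chosen for clarity.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def graph_attention_step(features, edges):
    """One self-attention update restricted to graph neighbourhoods.

    features: dict node -> feature vector (list of floats)
    edges:    dict node -> list of neighbour nodes (including self)
    Returns each node's attention-weighted average of neighbour features.
    """
    updated = {}
    for i, nbrs in edges.items():
        # Scaled dot-product attention scores against each neighbour.
        scale = math.sqrt(len(features[i]))
        scores = [dot(features[i], features[j]) / scale for j in nbrs]
        weights = softmax(scores)
        dim = len(features[i])
        # Convex combination of neighbour features.
        updated[i] = [sum(w * features[j][d] for w, j in zip(weights, nbrs))
                      for d in range(dim)]
    return updated

# Toy mesh: three nodes on a line, 2-d features, self-loops included.
feats = {0: [1.0, 0.0], 1: [0.5, 0.5], 2: [0.0, 1.0]}
adj = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}
out = graph_attention_step(feats, adj)
```

Restricting attention to the mesh graph is what lets a transformer-style operator handle arbitrary geometries: the update depends only on local connectivity, not on a fixed grid.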


Network


Experiments

Paper and Supplementary Material


Andrey Bryutkin*, Jiahao Huang*, Zhongying Deng, Guang Yang, Carola-Bibiane Schönlieb, Angelica I. Aviles-Rivero.
HAMLET: Graph Transformer Neural Operator for Partial Differential Equations
(hosted on ArXiv)


[Bibtex]


Acknowledgements

AB was supported in part by the Akamai Presidential Fellowship and the Hans-Messer Foundation. JH and GY were supported in part by the ERC IMI (101005122), the H2020 (952172), the MRC (MC/PC/21013), the Royal Society (IEC NSFC211235), the NVIDIA Academic Hardware Grant Program, the SABER project supported by Boehringer Ingelheim Ltd, Wellcome Leap Dynamic Resilience, and the UKRI Future Leaders Fellowship (MR/V023799/1). CBS acknowledges support from the Philip Leverhulme Prize, the Royal Society Wolfson Fellowship, the EPSRC advanced career fellowship EP/V029428/1, EPSRC grants EP/S026045/1, EP/T003553/1, EP/N014588/1, and EP/T017961/1, the Wellcome Innovator Awards 215733/Z/19/Z and 221633/Z/20/Z, CCMI, and the Alan Turing Institute. AAR gratefully acknowledges funding from the Cambridge Centre for Data-Driven Discovery and Accelerate Programme for Scientific Discovery, made possible by a donation from Schmidt Futures, the EPSRC Digital Core Capability Award, and CMIH and CCIMI, University of Cambridge.