License: Creative Commons Attribution 4.0 International, PDF - published version (1MB)
License: Creative Commons Attribution 4.0 International, XML - published version (182kB)
- URN for citing this document:
- urn:nbn:de:bvb:355-epub-554432
- DOI for citing this document:
- 10.5283/epub.55443
Abstract
We demonstrate that a state-of-the-art multigrid preconditioner can be learned efficiently by gauge-equivariant neural networks. We show that the models require minimal retraining on different gauge configurations of the same gauge ensemble and to a large extent remain efficient under modest modifications of ensemble parameters. We also demonstrate that important paradigms such as communication avoidance are straightforward to implement in this framework.
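The key property named in the abstract, gauge equivariance, can be illustrated with a toy example. The sketch below is not the paper's model: it is a minimal, assumed parallel-transport convolution on a 1-D periodic lattice with a U(1) gauge group, where the layer mixes a field psi(x) with its neighbour psi(x+1) transported through the link variable U(x). The names `pt_conv`, `w0`, and `w1` are hypothetical. The check at the end verifies that gauge-transforming the inputs is the same as gauge-transforming the output, which is what makes such layers reusable across gauge configurations.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 8  # 1-D periodic lattice; U(1) gauge group chosen for simplicity

# Random U(1) link variables U(x) on the bonds x -> x+1,
# and a complex fermion-like field psi(x).
U = np.exp(1j * rng.uniform(0, 2 * np.pi, L))
psi = rng.normal(size=L) + 1j * rng.normal(size=L)

# Trainable scalar weights (hypothetical; a real network stacks many layers).
w0, w1 = 0.7, 0.3

def pt_conv(U, psi):
    """One parallel-transport convolution step: combine psi(x) with its
    gauge-transported neighbour U(x) * psi(x+1)."""
    return w0 * psi + w1 * U * np.roll(psi, -1)

# A local gauge transformation g(x) acts as
#   psi(x) -> g(x) psi(x),   U(x) -> g(x) U(x) conj(g(x+1)).
g = np.exp(1j * rng.uniform(0, 2 * np.pi, L))
psi_t = g * psi
U_t = g * U * np.conj(np.roll(g, -1))

# Equivariance check: transforming the inputs equals transforming the output.
lhs = pt_conv(U_t, psi_t)
rhs = g * pt_conv(U, psi)
print(np.allclose(lhs, rhs))  # True
```

Because the layer output transforms covariantly (it picks up the same local phase g(x) as the field itself), the learned weights carry no gauge-dependent information, which is one plausible reason such models need little retraining on new configurations of the same ensemble.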