Carl-Zeiss Stiftung - Model-Based AI
Summary: The research project Model-based AI (CZS-MBAI) was granted by the Carl Zeiss Foundation to a larger group of PIs to advance machine learning in the broader context of physical models and deep learning for imaging and cancer treatment. Within this large collaboration, our group addresses the research question of “hardware-aware deep learning for green IT”.
The overall project is comprehensive: it combines mathematical modeling with a rapidly growing range of state-of-the-art applications, at a level where these applications drive and contribute to highly relevant AI research. It centers on model-based AI and on how combining domain-specific scientific knowledge with deep learning can solve challenging problems up to the level of AI-related breakthroughs. Overall, four main research questions are considered: (1) how model-based deep learning enables trustworthy and resilient AI; (2) how simulation-based deep learning can be used for scalable annotation; (3) how hardware awareness of model-based AI meets the needs of green IT; and (4) how reliable deep learning enables innovative treatment of oligometastatic cancer.
In more detail, within this project we are focusing on two major research directions:
- Model compression for training (a minimal sketch follows this list)
- Model-based scheduling in heterogeneous environments
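To make the first direction concrete, here is a minimal sketch of activation compression for training, assuming PyTorch. The forward pass computes an exact ReLU but saves only a block-pruned copy of its input for backpropagation, trading gradient fidelity for activation memory. The class name `CompressedReLU`, the block size, and the keep ratio are illustrative assumptions, not the method from the publications listed under Dissemination.

```python
import torch

class CompressedReLU(torch.autograd.Function):
    """Hypothetical sketch: ReLU that stores a block-pruned copy of its
    input for the backward pass instead of the dense activation tensor."""

    @staticmethod
    def forward(ctx, x, keep_ratio=0.5, block=16):
        # Score contiguous blocks by L1 magnitude (assumes x.numel() is a
        # multiple of `block`; a real implementation would pad).
        flat = x.reshape(-1, block)
        scores = flat.abs().sum(dim=1)
        k = max(1, int(keep_ratio * scores.numel()))
        keep = torch.zeros_like(scores, dtype=torch.bool)
        keep[scores.topk(k).indices] = True
        mask = keep.unsqueeze(1).expand_as(flat).reshape(x.shape)
        # Only the pruned activation is kept for backward. In practice it
        # would be stored in a block-sparse format to actually save memory;
        # a dense masked tensor is used here for clarity.
        ctx.save_for_backward(x * mask)
        return torch.relu(x)  # the forward output stays exact

    @staticmethod
    def backward(ctx, grad_out):
        (x_pruned,) = ctx.saved_tensors
        # Approximate ReLU gradient from the compressed activation: gradients
        # falling into pruned blocks are dropped (the accuracy/memory trade-off).
        return grad_out * (x_pruned > 0).to(grad_out.dtype), None, None

# Usage: drop-in replacement for ReLU inside a training step.
x = torch.randn(4, 64, requires_grad=True)
y = CompressedReLU.apply(x)
y.sum().backward()
print(x.grad.count_nonzero().item(), "of", x.grad.numel(), "gradient entries kept")
```

Keeping the forward output exact while compressing only the saved activation confines the approximation error to the backward pass, which is why this style of compression targets training-time memory rather than inference.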
Current people
- Holger Fröning (co-PI)
- Daniel Barley (PhD student)
- Bálint Soproni (master student)
- Christian Simonides (master student)
- Constantin Nicolai (master student)
Collaborators
- various
Dissemination
2024
- Less Memory Means smaller GPUs: Backpropagation with Compressed Activations. CoRR, abs/2409.11902, 2024
@article{barley2024,
  author  = {Barley, Daniel and Fr{\"o}ning, Holger},
  title   = {Less Memory Means smaller GPUs: Backpropagation with Compressed Activations},
  journal = {CoRR},
  volume  = {abs/2409.11902},
  year    = {2024},
  url     = {https://arxiv.org/abs/2409.11902},
}
2023
- Compressing the Backward Pass of Large-Scale Neural Architectures by Structured Activation Pruning. CoRR, abs/2311.16883, 2023
@article{DBLP:journals/corr/abs-2311-16883,
  author  = {Barley, Daniel and Fr{\"o}ning, Holger},
  title   = {Compressing the Backward Pass of Large-Scale Neural Architectures by Structured Activation Pruning},
  journal = {CoRR},
  volume  = {abs/2311.16883},
  year    = {2023},
  url     = {https://arxiv.org/abs/2311.16883},
  doi     = {10.48550/ARXIV.2311.16883},
}