Neural Bayesian Machines
Summary: The research project Neural Bayesian Machines (NBM) is concerned with the acceleration of Bayesian Neural Networks (BNNs) using specialized hardware architectures.
BNNs are a promising approach to address the overconfidence of traditional deep neural architectures, which often make confident predictions on out-of-distribution inputs or on noisy, corrupted, or otherwise malformed data. Essentially, BNNs offer a mathematically sound way to reason about epistemic and aleatoric uncertainty. However, realizing BNNs on conventional hardware is extremely resource-hungry, which is why they are rarely used in practice for problems at scale.
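The split into epistemic and aleatoric uncertainty can be made concrete with the standard entropy-based decomposition of a BNN's Monte Carlo predictive samples. The following is a minimal NumPy sketch, not code from the project; function names and the toy inputs are illustrative only:

```python
import numpy as np

def entropy(p, axis=-1):
    """Shannon entropy of categorical distributions (in nats)."""
    return -np.sum(p * np.log(np.clip(p, 1e-12, 1.0)), axis=axis)

def uncertainty_decomposition(probs):
    """Decompose predictive uncertainty from S Monte Carlo samples.

    probs: array of shape (S, C) -- softmax outputs produced by S sampled
    weight configurations of a BNN for a single input.
    Returns (total, aleatoric, epistemic), with total = aleatoric + epistemic.
    """
    mean_p = probs.mean(axis=0)
    total = entropy(mean_p)            # entropy of the averaged prediction
    aleatoric = entropy(probs).mean()  # average entropy of each sample
    epistemic = total - aleatoric      # mutual information between y and w
    return total, aleatoric, epistemic

# Weight samples that agree -> low epistemic uncertainty
agree = np.array([[0.90, 0.10], [0.88, 0.12], [0.92, 0.08]])
# Weight samples that disagree -> high epistemic uncertainty
disagree = np.array([[0.95, 0.05], [0.50, 0.50], [0.05, 0.95]])

t1, a1, e1 = uncertainty_decomposition(agree)
t2, a2, e2 = uncertainty_decomposition(disagree)
```

Here the disagreeing samples yield a markedly larger epistemic term than the agreeing ones, even though each individual sample can be quite confident.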
The idea of Neural Bayesian Machines (NBM) is to find new approaches to reduce the massive computational cost of BNNs. In particular, two research directions are pursued:
- Extending deep neural architectures, including MLPs, CNNs, Transformers and others, with uncertainty estimates, thereby maintaining compatibility with the prevailing GPU architecture.
- Addressing the tremendous cost of BNNs based on methods such as stochastic variational inference (SVI) or Markov chain Monte Carlo (MCMC) with dedicated hardware architectures. Of particular interest in this regard are noisy hardware architectures, for which we aim to exploit the inherent noise as a source of stochasticity, thereby turning a notable disadvantage of analog forms of computing into an advantage.
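The second direction can be illustrated with a toy model: if each analog operation perturbs its result with Gaussian noise, repeated forward passes already behave like Monte Carlo samples of a predictive distribution, with no explicit random number generation. The sketch below (plain NumPy; `noisy_matmul` and the noise model are hypothetical stand-ins for an analog substrate, not the project's actual hardware model) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_matmul(x, w, sigma=0.1, rng=rng):
    """Matrix multiply whose output is perturbed by additive Gaussian
    noise, mimicking an analog compute substrate."""
    y = x @ w
    return y + rng.normal(0.0, sigma, size=y.shape)

def predictive_samples(x, w, n_samples=100, sigma=0.1):
    """Treat the hardware noise as the sampler: repeated forward passes
    yield samples from an implicit predictive distribution."""
    return np.stack([noisy_matmul(x, w, sigma) for _ in range(n_samples)])

x = np.array([[1.0, 2.0]])
w = np.array([[0.5], [0.25]])  # noiseless output would be exactly 1.0

samples = predictive_samples(x, w)
mean, std = samples.mean(axis=0), samples.std(axis=0)
```

The sample mean recovers the noiseless result while the sample spread provides a free uncertainty estimate; the research question is how to shape such inherent noise so that it corresponds to a meaningful posterior rather than an arbitrary perturbation.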
Current people
- Holger Fröning (co-PI)
- Bernhard Klein (PhD student)
- Hendrik Borras (PhD student)
- Congcong Xu (master student)
- Prakriti Jain (master student)
- Xiao Wang (master student)
Collaborators
- Franz Pernkopf (co-PI)
- Sophie Steger (PhD student)
Dissemination
2024
- Walking Noise: On Layer-Specific Robustness of Neural Architectures against Noisy Computations and Associated Characteristic Learning Dynamics. European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2024
@inproceedings{borras2024, title = {Walking Noise: On Layer-Specific Robustness of Neural Architectures against Noisy Computations and Associated Characteristic Learning Dynamics}, author = {Borras, Hendrik and Klein, Bernhard and Fr{\"{o}}ning, Holger}, booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases}, year = {2024}, series = {ECML-PKDD}, url = {https://doi.org/10.1007/978-3-031-70359-1_3}, }
- Function Space Diversity for Uncertainty Prediction via Repulsive Last-Layer Ensembles. ICML 2024 Workshop on Structured Probabilistic Inference & Generative Modeling, 2024
@inproceedings{steger2024function, title = {Function Space Diversity for Uncertainty Prediction via Repulsive Last-Layer Ensembles}, author = {Steger, Sophie and Knoll, Christian and Klein, Bernhard and Fr{\"o}ning, Holger and Pernkopf, Franz}, booktitle = {ICML 2024 Workshop on Structured Probabilistic Inference {\&} Generative Modeling}, year = {2024}, url = {https://openreview.net/forum?id=FbMN9HjgHI}, }
- Probabilistic Photonic Computing with Chaotic Light. CoRR, abs/2401.17915, 2024
@article{brckerhoffplckelmann2024probabilistic, title = {Probabilistic Photonic Computing with Chaotic Light}, author = {Brückerhoff-Plückelmann, Frank and Borras, Hendrik and Klein, Bernhard and Varri, Akhil and Becker, Marlon and Dijkstra, Jelle and Brückerhoff, Martin and Wright, C. David and Salinga, Martin and Bhaskaran, Harish and Risse, Benjamin and Fr{\"o}ning, Holger and Pernice, Wolfram}, year = {2024}, volume = {abs/2401.17915}, journal = {CoRR}, url = {https://arxiv.org/abs/2401.17915}, }