Monitizer: Automating Design and Evaluation of Neural Network Monitors

Authors

AZEEM Muqsit, KANAV Sudeep, KŘETÍNSKÝ Jan, MOHR Stefanie, RIEDER Sabine

Year of publication 2024
Type Article in Proceedings
Conference Computer Aided Verification
MU Faculty or unit

Faculty of Informatics

Citation
DOI http://dx.doi.org/10.1007/978-3-031-65630-9_14
Keywords Neural Networks; Monitoring; Hyperparameter Tuning
Description The behavior of neural networks (NNs) on previously unseen types of data (out-of-distribution or OOD) is typically unpredictable. This can be dangerous if the network's output is used for decision making in a safety-critical system. Hence, detecting that an input is OOD is crucial for the safe application of the NN. Verification approaches do not scale to realistic NNs, making runtime monitoring more appealing in practice. While various monitors have been suggested recently, optimizing them for a given problem, comparing them with each other, and reproducing their results remain challenging.
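To illustrate the kind of runtime monitor the description refers to, the following is a minimal sketch of one common baseline: flagging an input as OOD when the network's maximum softmax probability falls below a threshold. This is a generic illustration, not Monitizer's actual API; the function names and the threshold value are assumptions chosen for the example.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_monitor(logits, threshold=0.9):
    """Maximum-softmax-probability monitor (illustrative baseline).

    Flags an input as OOD when the network's top-class confidence
    falls below `threshold` (the 0.9 default is an arbitrary choice;
    tuning such hyperparameters is exactly what tools like Monitizer
    aim to automate).
    """
    confidence = softmax(np.asarray(logits, dtype=float)).max(axis=-1)
    return confidence < threshold

# A confident (in-distribution-looking) logit vector vs. a flat
# (OOD-looking) one.
print(bool(msp_monitor([8.0, 0.5, 0.1])))  # False: high confidence
print(bool(msp_monitor([1.0, 0.9, 1.1])))  # True: near-uniform output
```

In practice the threshold (and the choice of monitor altogether) must be tuned per network and per dataset, which is the optimization and comparison problem the paper addresses.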