Monitizer: Automating Design and Evaluation of Neural Network Monitors
Year of publication | 2024
---|---
Type | Article in Proceedings
Conference | Computer Aided Verification
DOI | http://dx.doi.org/10.1007/978-3-031-65630-9_14
Keywords | Neural Networks; Monitoring; Hyperparameter Tuning
Description | The behavior of neural networks (NNs) on previously unseen types of data (out-of-distribution, or OOD) is typically unpredictable. This can be dangerous if the network's output is used for decision-making in a safety-critical system. Hence, detecting that an input is OOD is crucial for the safe application of the NN. Verification approaches do not scale to NNs of practical size, which makes runtime monitoring more appealing in practice. While various monitors have been suggested recently, optimizing them for a given problem, comparing them with each other, and reproducing their results remain challenging. |
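To make the idea of a runtime monitor concrete, the sketch below shows one common baseline: flagging an input as OOD when the network's maximum softmax probability falls below a threshold. This is an illustrative example only, not Monitizer's actual API; the class and function names are hypothetical, and the threshold is exactly the kind of hyperparameter such a framework would tune.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class ConfidenceMonitor:
    """Baseline OOD monitor (hypothetical names, not Monitizer's API):
    an input is flagged as OOD when the maximum softmax probability
    of the network's output is below a tunable threshold."""

    def __init__(self, threshold=0.9):
        # The threshold is the monitor's hyperparameter; a framework
        # like Monitizer would optimize it for a given NN and dataset.
        self.threshold = threshold

    def is_ood(self, logits):
        probs = softmax(np.asarray(logits, dtype=float))
        return float(np.max(probs)) < self.threshold

monitor = ConfidenceMonitor(threshold=0.9)
print(monitor.is_ood([5.0, 0.1, 0.2]))  # confident prediction -> in-distribution
print(monitor.is_ood([1.0, 0.9, 1.1]))  # near-uniform output -> flagged as OOD
```

In a deployed system, the monitor would sit between the network and the downstream decision logic, routing flagged inputs to a fallback (e.g., a human operator) instead of acting on the network's prediction.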