Keep Your Distance: Determining Sampling and Distance Thresholds in Machine Learning Monitoring
Farhad, Al Harith; Sorokos, Ioannis; Schmidt, Andreas; Akram, Mohammed Naveed; Aslansefat, Koorosh; Schneider, Daniel
Authors
- Ioannis Sorokos
- Andreas Schmidt
- Mohammed Naveed Akram
- Dr Koorosh Aslansefat (K.Aslansefat@hull.ac.uk), Lecturer/Assistant Professor
- Daniel Schneider

Contributors
- Christel Seguin (Editor)
- Marc Zeller (Editor)
- Tatiana Prosvirnova (Editor)
Abstract
Machine Learning (ML) has provided promising results in recent years across different applications and domains. However, in many cases, qualities such as reliability or even safety need to be ensured. To this end, one important aspect is to determine whether ML components are deployed in situations that are appropriate for their application scope. For components whose environments are open and variable, for instance those found in autonomous vehicles, it is therefore important to monitor their operational situation in order to determine its distance from the ML components' trained scope. If that distance is deemed too great, the application may choose to consider the ML component's outcome unreliable and switch to alternatives, e.g., using human operator input instead. SafeML is a model-agnostic approach for performing such monitoring, using distance measures based on statistical testing of the training and operational datasets. A key limitation in setting SafeML up properly is the lack of a systematic approach for determining, for a given application, how many operational samples are needed to yield reliable distance information, as well as for setting an appropriate distance threshold. In this work, we address these limitations by providing a practical approach and demonstrate its use on a well-known traffic sign recognition problem and on an example using the CARLA open-source automotive simulator.
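To illustrate the style of monitoring the abstract describes, the sketch below compares an operational batch of a single feature against training data using two statistical distance measures available in SciPy (a two-sample Kolmogorov-Smirnov test and the Wasserstein distance). The function name `safeml_style_check`, the synthetic data, and the threshold value are illustrative assumptions, not the paper's implementation; choosing the sample size and the threshold systematically is precisely the problem the paper addresses.

```python
import numpy as np
from scipy import stats

def safeml_style_check(train_features, op_features, threshold, alpha=0.05):
    """Flag an operational batch as out-of-scope when its statistical
    distance from the training data exceeds a given threshold.
    (Hypothetical helper for illustration; threshold and sample size
    would need to be calibrated per application.)"""
    # ECDF-based distance: two-sample Kolmogorov-Smirnov statistic
    ks_stat, p_value = stats.ks_2samp(train_features, op_features)
    # Complementary measure: Wasserstein (earth mover's) distance
    w_dist = stats.wasserstein_distance(train_features, op_features)
    out_of_scope = ks_stat > threshold and p_value < alpha
    return out_of_scope, ks_stat, w_dist

rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, 2000)    # feature distribution seen in training
shifted = rng.normal(3.0, 1.0, 500)   # drifted operational batch
flag, ks, w = safeml_style_check(train, shifted, threshold=0.2)
print(flag)  # the drifted batch exceeds the distance threshold
```

In a deployed monitor, a `True` flag would trigger the fallback the abstract mentions, such as deferring to a human operator instead of trusting the ML component's output.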
Citation
Farhad, A. H., Sorokos, I., Schmidt, A., Akram, M. N., Aslansefat, K., & Schneider, D. (2022, September). Keep Your Distance: Determining Sampling and Distance Thresholds in Machine Learning Monitoring. Presented at Model-Based Safety and Assessment, 8th International Symposium, IMBSA 2022, Munich, Germany.
| Presentation Conference Type | Conference Paper (published) |
|---|---|
| Conference Name | Model-Based Safety and Assessment, 8th International Symposium, IMBSA 2022 |
| Start Date | Sep 5, 2022 |
| End Date | Sep 7, 2022 |
| Acceptance Date | Sep 1, 2022 |
| Online Publication Date | Sep 9, 2022 |
| Publication Date | Sep 9, 2022 |
| Deposit Date | Aug 3, 2024 |
| Publicly Available Date | Apr 16, 2025 |
| Print ISSN | 0302-9743 |
| Publisher | Springer (part of Springer Nature) |
| Peer Reviewed | Peer Reviewed |
| Volume | 13525 |
| Pages | 219-234 |
| Series Title | Lecture Notes in Computer Science |
| Series Number | 13525 |
| Series ISSN | 0302-9743; 1611-3349 |
| Book Title | Model-Based Safety and Assessment: 8th International Symposium, IMBSA 2022, Proceedings. Lecture Notes in Computer Science (LNCS, volume 13525) |
| ISBN | 978-3-031-15841-4 |
| DOI | https://doi.org/10.1007/978-3-031-15842-1_16 |
| Public URL | https://hull-repository.worktribe.com/output/4783347 |
Files
- Accepted manuscript (PDF, 517 KB)
Copyright Statement
© The Authors. This version of the article has been accepted for publication and is subject to Springer Nature’s AM terms of use:
https://www.springernature.com/gp/open-science/policies/accepted-manuscript-terms