Lookup NU author(s): Dr Sergiy Bogomolov, Dr Kostiantyn Potomkin, Dr Sadegh Soudjani, Dr Paolo Zuliani
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND).
© 2024 The Authors. This is an open access article under the CC BY-NC-ND license.

Abstract: We present a novel technique for online safety verification of autonomous systems, which performs reachability analysis efficiently for both bounded and unbounded horizons by employing neural barrier certificates. Our approach uses barrier certificates given by parameterized neural networks that depend on a given initial set, unsafe sets, and time horizon. Such networks are trained efficiently offline using system simulations sampled from regions of the state space. We then employ a meta-neural network to generalize the barrier certificates to state-space regions that are outside the training set. These certificates are generated and validated online as sound over-approximations of the reachable states, thus either ensuring system safety or activating appropriate alternative actions in unsafe scenarios. We demonstrate our technique on case studies ranging from linear models to nonlinear control-dependent models for online autonomous driving scenarios.
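To make the barrier-certificate conditions from the abstract concrete, the following minimal sketch checks a hand-picked candidate barrier B for a toy 1D linear system x' = -x: B must be non-positive on the initial set, positive on the unsafe set, and non-increasing along simulated trajectories. This is an illustrative stand-in, not the authors' method: the system, the sets, the quadratic barrier, and the sample-based `validate` check (which, unlike the paper's sound online validation, only tests finitely many points) are all hypothetical.

```python
# Hypothetical 1D system x' = -x, integrated with an explicit Euler step.
def step(x, dt=0.01):
    return x + dt * (-x)

# Candidate barrier certificate (a stand-in for a trained neural network):
# B(x) = x^2 - 1, with initial set |x| <= 0.5 and unsafe set x >= 2.
def barrier(x):
    return x * x - 1.0

def validate(B, n_samples=100, horizon=200):
    """Sample-based check of the three barrier conditions (not a sound proof)."""
    # Condition 1: B <= 0 on the initial set [-0.5, 0.5].
    init = [-0.5 + i / n_samples for i in range(n_samples + 1)]
    if any(B(x) > 0 for x in init):
        return False
    # Condition 2: B > 0 on (samples of) the unsafe set [2, 3].
    unsafe = [2.0 + i / n_samples for i in range(n_samples + 1)]
    if any(B(x) <= 0 for x in unsafe):
        return False
    # Condition 3: B is non-increasing along simulated trajectories
    # started from the sampled initial states.
    for x0 in init:
        x = x0
        for _ in range(horizon):
            x_next = step(x)
            if B(x_next) > B(x) + 1e-9:
                return False
            x = x_next
    return True

print(validate(barrier))  # True: the candidate passes all sampled checks
```

In the paper's setting the candidate B comes from a parameterized network (generalized across regions by the meta-network) and the online validation step must be sound rather than sample-based; this sketch only shows which conditions that validation enforces.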
Author(s): Abate A, Bogomolov S, Edwards A, Potomkin K, Soudjani S, Zuliani P
Publication type: Conference Proceedings (inc. Abstract)
Publication status: Published
Conference Name: 8th IFAC Conference on Analysis and Design of Hybrid Systems (ADHS 2024)
Year of Conference: 2024
Pages: 107-114
Online publication date: 23/08/2024
Acceptance date: 02/04/2024
Date deposited: 16/09/2024
ISSN: 2405-8963
Publisher: Elsevier Ltd
URL: https://doi.org/10.1016/j.ifacol.2024.07.433
DOI: 10.1016/j.ifacol.2024.07.433
Series Title: IFAC-PapersOnLine