Analysis and Detectability of Offline Data Poisoning Attacks on Linear Systems

11/16/2022
by Alessio Russo, et al.

A recent body of literature has investigated the effect of data poisoning attacks on data-driven control methods. Data poisoning attacks are well known to the Machine Learning community; classical analyses, however, rely on assumptions, such as cross-sample independence, that in general do not hold for dynamical systems. Consequently, both attacks and detection methods operate differently than in the i.i.d. setting studied in classical supervised problems. In particular, data poisoning attacks against data-driven control methods can fundamentally be seen as changing the behavior of the dynamical system described by the data. In this work, we study this phenomenon through the lens of statistical testing and verify the detectability of different attacks on a linear dynamical system. Building on the arguments presented here, we propose a stealthy data poisoning attack that can escape classical detection tests, and we conclude by demonstrating the effectiveness of the proposed attack.
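The core idea can be illustrated with a minimal sketch. This is not the paper's construction: the scalar system, the constant-bias "naive" attack, the fabricated-trajectory "stealthy" attack, and the residual-energy test below are all illustrative choices made here. The sketch shows how a crude perturbation inflates the least-squares residuals (and is thus detectable), while data fabricated to be consistent with a *different* linear system biases the identified model without raising the test statistic.

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true, sigma = 0.9, 0.5, 0.1  # illustrative scalar system x_{t+1} = a x_t + b u_t + w_t
T = 200

def simulate(a, b, u, rng):
    """Roll out a scalar linear system driven by inputs u with Gaussian noise."""
    x = np.zeros(len(u) + 1)
    for t, ut in enumerate(u):
        x[t + 1] = a * x[t] + b * ut + sigma * rng.normal()
    return x

def fit(x, u):
    """Least-squares fit of x_{t+1} ~ a x_t + b u_t and a chi-squared-style
    residual-energy statistic; for clean data it concentrates near T."""
    Phi = np.column_stack([x[:-1], u])
    theta, *_ = np.linalg.lstsq(Phi, x[1:], rcond=None)
    res = x[1:] - Phi @ theta
    return theta, float(np.sum(res**2) / sigma**2)

u = rng.normal(size=T)
x = simulate(a_true, b_true, u, rng)
theta_clean, stat_clean = fit(x, u)

# Naive attack: shift all recorded states by a constant. The model class has
# no intercept, so the bias leaks into the residuals and inflates the statistic.
theta_naive, stat_naive = fit(x + 1.0, u)

# Stealthy attack (in spirit): replace the data with a trajectory generated by
# a different linear system with the same noise level. The residuals then look
# like fresh noise, so the test passes, yet the identified model is wrong.
x_stealthy = simulate(0.5, 1.0, u, rng)
theta_stealthy, stat_stealthy = fit(x_stealthy, u)

print("clean   :", theta_clean, stat_clean)
print("naive   :", theta_naive, stat_naive)
print("stealthy:", theta_stealthy, stat_stealthy)
```

The point of the last case is the one made in the abstract: because the poisoned data still obeys *some* linear dynamics with plausible noise statistics, a residual-based test has nothing to flag, even though the behavior of the system described by the data has changed.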
