Verification of Binarized Neural Networks
We study the problem of formal verification of Binarized Neural Networks (BNNs), which have recently been proposed as a power-efficient alternative to more traditional learning networks. More precisely, given a trained BNN and a relation between possible inputs and outputs of this BNN, we develop verification procedures for establishing that the BNN indeed meets this specification for all possible inputs. For solving the BNN verification problem we build on well-known methods from hardware verification: the problem is first encoded as a combinational miter, which is then transformed into a corresponding propositional satisfiability (SAT) problem. The main contributions of this paper are a number of essential optimizations for making this approach to BNN verification scalable. First, we provide a transformation on fully connected BNNs that reduces the number of bitwise operations in each layer of the BNN from quadratic to linear. Second, we identify redundant computations in a BNN based on optimal factoring techniques, and we provide transformations on BNNs for avoiding these redundant computations. We prove that the problem of optimal factoring is NP-hard, and we design efficient search procedures for generating approximate solutions to it. Third, we design a compositional verification procedure that analyzes each layer of a BNN separately and iteratively combines and refines the local verification results. We experimentally demonstrate the scalability of our verification techniques to moderately sized BNNs for embedded applications with thousands of neurons and inputs.
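As a minimal illustration of the SAT-based verification idea on a single binarized neuron (not the paper's actual encoding or tooling), the sketch below checks whether y = sign(sum_i w_i * x_i + b) equals +1 for every input vector x in {-1,+1}^n by asserting the negation of that property as a cardinality constraint and testing unsatisfiability. It assumes the PySAT library; the helper name `neuron_always_fires` and the example weights are illustrative only.

```python
# A sketch of SAT-based verification for one binarized neuron, assuming PySAT
# (pip install python-sat). Property: the neuron outputs +1 for every input.
# We assert the negation of the property; UNSAT means the property holds.
from math import ceil
from pysat.card import CardEnc, EncType
from pysat.formula import IDPool
from pysat.solvers import Glucose3

def neuron_always_fires(weights, bias):
    """Return True iff sign(sum_i w_i*x_i + bias) = +1 for all inputs x_i in {-1,+1}."""
    n = len(weights)
    pool = IDPool()
    x = [pool.id(f"x{i}") for i in range(n)]  # Boolean var: x_i is +1
    # Literal l_i is true iff w_i * x_i = +1.
    lits = [x[i] if w > 0 else -x[i] for i, w in enumerate(weights)]
    # sum_i w_i*x_i = 2*t - n, with t = number of true literals, so the neuron
    # fires iff t >= ceil((n - bias) / 2).
    k = ceil((n - bias) / 2)
    if k <= 0:
        return True  # threshold trivially met for every input
    # Negated property: the neuron outputs -1, i.e. fewer than k literals are true.
    cnf = CardEnc.atmost(lits=lits, bound=k - 1, vpool=pool,
                         encoding=EncType.seqcounter)
    with Glucose3(bootstrap_with=cnf.clauses) as solver:
        return not solver.solve()  # UNSAT <=> no counterexample input exists

# Hypothetical examples: with all weights +1 and bias 0 the neuron fires only
# when at least half of the inputs are +1, so a counterexample exists; with
# bias 4 it fires for every input.
print(neuron_always_fires([+1] * 4, 0))  # False
print(neuron_always_fires([+1] * 4, 4))  # True
```

A full BNN verifier in this spirit would chain such per-layer constraints and conjoin them with the (negated) input-output specification, which is where the paper's miter construction and the optimizations above come into play.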