Distributed Online Learning with Multiple Kernels
In Internet-of-Things (IoT) systems, a massive number of devices (e.g., sensors) generate plentiful informative data. Learning a function from such data is of great interest in machine learning tasks for IoT systems. Focusing on streaming (or sequential) data, we present a privacy-preserving distributed online learning framework with multiple kernels (named DOMKL). The proposed DOMKL is devised by leveraging the principles of the online alternating direction method of multipliers (OADMM) and a distributed Hedge algorithm. We theoretically prove that DOMKL over T time slots achieves an optimal sublinear regret, implying that every learned function attains the performance of the best function in hindsight, as in the state-of-the-art centralized online learning method. Moreover, we ensure that the learned functions of any two neighboring learners differ negligibly as T grows, i.e., the so-called consensus constraints hold. Through experiments on various real datasets, we verify the effectiveness of the proposed DOMKL on regression and time-series prediction tasks.
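To make the multiple-kernel ingredient concrete, the following is a minimal single-learner sketch of Hedge-weighted online kernel regression with random Fourier features. It is illustrative only: DOMKL additionally runs OADMM consensus steps across networked learners, which are omitted here, and the kernel bandwidths, feature dimension, and learning rates are arbitrary choices for the demo, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_rf(x, W, b):
    """Random Fourier features approximating a Gaussian kernel."""
    return np.sqrt(2.0 / len(b)) * np.cos(W @ x + b)

D = 50                      # random features per kernel (demo choice)
sigmas = [0.5, 1.0, 2.0]    # candidate kernel bandwidths (demo choice)
d = 3                       # input dimension
feats = [(rng.normal(scale=1.0 / s, size=(D, d)),
          rng.uniform(0, 2 * np.pi, D)) for s in sigmas]

theta = [np.zeros(D) for _ in sigmas]   # per-kernel parameter vectors
w = np.ones(len(sigmas))                # Hedge weights over kernels
eta_grad, eta_hedge = 0.1, 0.5

losses = []
for t in range(500):
    x = rng.normal(size=d)
    y = np.sin(x.sum())                 # synthetic streaming target
    z = [gaussian_rf(x, W, b) for W, b in feats]
    preds = np.array([th @ zi for th, zi in zip(theta, z)])
    p = w / w.sum()
    yhat = p @ preds                    # Hedge-combined prediction
    losses.append((yhat - y) ** 2)
    # per-kernel online gradient step on the squared loss
    for i in range(len(sigmas)):
        theta[i] -= eta_grad * 2 * (preds[i] - y) * z[i]
    # multiplicative Hedge update from per-kernel instantaneous losses
    w *= np.exp(-eta_hedge * (preds - y) ** 2)
```

In this sketch the Hedge weights concentrate on the kernels whose online predictors incur small cumulative loss, which is the mechanism behind the "best function in hindsight" guarantee; the distributed version constrains neighboring learners' parameters to agree via OADMM.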