Restricted Boltzmann machine to determine the input weights for extreme learning machines

08/17/2017
by Andre Pacheco, et al.

The Extreme Learning Machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks (SLFNs) that trains effectively and quickly. In the ELM training phase, the input weights and biases are assigned randomly and are never updated. Although the network works well, these random input weights can make the algorithm less effective and hurt its performance. Therefore, we propose a new approach that determines the input weights and biases for the ELM using the restricted Boltzmann machine (RBM), which we call RBM-ELM. We compare our approach with a well-known method for improving the ELM and with a state-of-the-art algorithm for selecting the ELM's weights. The results show that RBM-ELM outperforms both methodologies and achieves better performance than the standard ELM.
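The abstract describes replacing the ELM's randomly assigned input weights and biases with values learned by an RBM. Below is a minimal sketch of that idea using only NumPy; the CD-1 RBM training loop, the sigmoid activation, the toy data, and all hyperparameters (hidden size, learning rate, epochs) are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch of RBM-ELM: train an RBM on the inputs, then reuse its weights and
# hidden biases as the ELM's input layer instead of random values.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(X, n_hidden, lr=0.05, epochs=50, seed=None):
    """Train a Bernoulli RBM with one-step contrastive divergence (CD-1).
    Returns the weight matrix W (n_features x n_hidden) and the hidden bias."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W = rng.normal(0, 0.01, size=(n_features, n_hidden))
    b_hidden = np.zeros(n_hidden)
    b_visible = np.zeros(n_features)
    for _ in range(epochs):
        # Positive phase: hidden activations given the data.
        h_prob = sigmoid(X @ W + b_hidden)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: reconstruct visible units, then hidden units again.
        v_prob = sigmoid(h_sample @ W.T + b_visible)
        h_prob_neg = sigmoid(v_prob @ W + b_hidden)
        # CD-1 parameter updates.
        W += lr * (X.T @ h_prob - v_prob.T @ h_prob_neg) / n_samples
        b_hidden += lr * (h_prob.mean(axis=0) - h_prob_neg.mean(axis=0))
        b_visible += lr * (X.mean(axis=0) - v_prob.mean(axis=0))
    return W, b_hidden

def elm_fit(X, T, W_in, b_in):
    """ELM training: compute hidden output H, solve output weights by pseudoinverse."""
    H = sigmoid(X @ W_in + b_in)
    return np.linalg.pinv(H) @ T

def elm_predict(X, W_in, b_in, beta):
    return sigmoid(X @ W_in + b_in) @ beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.random((200, 10))                     # toy inputs in [0, 1]
    T = np.eye(3)[rng.integers(0, 3, size=200)]   # toy one-hot targets

    # Standard ELM: random input weights and biases.
    W_rand = rng.normal(size=(10, 30))
    b_rand = rng.normal(size=30)
    beta_rand = elm_fit(X, T, W_rand, b_rand)

    # RBM-ELM: input weights and biases taken from a trained RBM.
    W_rbm, b_rbm = train_rbm(X, n_hidden=30, seed=1)
    beta_rbm = elm_fit(X, T, W_rbm, b_rbm)

    for name, (W, b, beta) in {"ELM": (W_rand, b_rand, beta_rand),
                               "RBM-ELM": (W_rbm, b_rbm, beta_rbm)}.items():
        acc = (elm_predict(X, W, b, beta).argmax(1) == T.argmax(1)).mean()
        print(f"{name} training accuracy: {acc:.3f}")
```

In practice the RBM would be trained on the same (normalized) inputs that feed the ELM, so the learned features take the place of the purely random projection; the ELM's closed-form output-weight solution is unchanged.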
