Optimizing method for Neural Network based on Genetic Random Weight Change Learning Algorithm

06/05/2019
by Mohammad Ibrahim Sarker, et al.

The random weight change (RWC) algorithm is compact and robust, making it well suited to hardware implementations of neural networks. RWC and the genetic algorithm (GA) are both well-known methods for training and optimizing neural networks (NNs). Each has its own strengths, weaknesses, and objectives, and researchers have recently begun combining the two for better learning and optimization. In this paper, we propose a method that combines RWC and GA, called Genetic Random Weight Change (GRWC), and demonstrate a novel way to reduce network complexity by removing the weak weights that GRWC identifies. In contrast to RWC or GA alone, GRWC provides an effective optimization procedure that explores large, complex search spaces by exploiting the synergy between the two algorithms. The learning behavior of the proposed algorithm was tested on the MNIST dataset, where it demonstrated its performance.
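For readers unfamiliar with the underlying method, the sketch below illustrates the classic RWC update rule that GRWC builds on, together with a simple magnitude-based pruning step mirroring the removal of weak weights described above. The toy loss function, step size, and pruning threshold are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the Random Weight Change (RWC) update rule plus a
# weak-weight pruning step. All hyperparameters here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def rwc_step(weights, loss_fn, prev_loss, delta, step=0.01):
    """One RWC iteration: keep the previous random perturbation if the
    loss improved, otherwise draw a fresh random perturbation."""
    candidate = weights + delta
    loss = loss_fn(candidate)
    if loss < prev_loss:
        # Improvement: apply the same small random change again next time.
        return candidate, loss, delta
    # No improvement: sample a new random change of fixed magnitude.
    new_delta = step * rng.choice([-1.0, 1.0], size=weights.shape)
    return weights, prev_loss, new_delta

def prune_weak_weights(weights, threshold=1e-3):
    """Zero out 'weak' weights whose magnitude falls below a threshold,
    reflecting the complexity-reduction step described in the abstract."""
    return np.where(np.abs(weights) < threshold, 0.0, weights)

# Toy usage: minimize a quadratic "loss" over a small weight vector.
w = rng.normal(size=5)
loss_fn = lambda v: float(np.sum(v ** 2))
loss = loss_fn(w)
delta = 0.01 * rng.choice([-1.0, 1.0], size=w.shape)
for _ in range(2000):
    w, loss, delta = rwc_step(w, loss_fn, loss, delta)
w = prune_weak_weights(w)
```

In the full GRWC method, a GA would additionally maintain a population of such weight vectors and apply selection and crossover between RWC updates; the sketch above shows only the per-individual RWC step and pruning.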
