An Efficient Transfer Learning Technique by Using Final Fully-Connected Layer Output Features of Deep Networks

11/19/2018
by   Tasfia Shermin, et al.

In this paper, we propose a computationally efficient transfer learning approach that uses the output vector of the final fully-connected layer of a deep convolutional neural network as features for classification. The proposed technique trains a single-layer perceptron classifier whose hyper-parameters are chosen to improve computational efficiency without degrading classification performance relative to the baseline technique. Our investigations show that the technique converges much faster than the baseline while yielding very competitive classification results. We conduct thorough experiments to understand how the similarity between the pre-trained and new classes, the similarity among the new classes, and the number of training samples affect classification performance when transfer learning is applied to the final fully-connected layer's output features.
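The core idea can be illustrated with a minimal sketch: treat the pretrained network's final fully-connected layer outputs as fixed feature vectors and fit a single-layer softmax perceptron on top of them. The sketch below is an assumption-laden illustration, not the authors' implementation; feature extraction from an actual pretrained CNN is omitted, and the hyper-parameters (`lr`, `epochs`) are placeholders.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over rows.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class SingleLayerPerceptron:
    """Single-layer softmax classifier trained on fixed feature vectors,
    e.g. the final fully-connected layer output of a pretrained CNN.
    Hyper-parameter values here are illustrative, not from the paper."""

    def __init__(self, n_features, n_classes, lr=0.1, epochs=100, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.01, size=(n_features, n_classes))
        self.b = np.zeros(n_classes)
        self.lr, self.epochs = lr, epochs

    def fit(self, X, y):
        n = len(X)
        Y = np.eye(self.W.shape[1])[y]        # one-hot targets
        for _ in range(self.epochs):
            P = softmax(X @ self.W + self.b)  # class probabilities
            grad = (P - Y) / n                # cross-entropy gradient
            self.W -= self.lr * (X.T @ grad)
            self.b -= self.lr * grad.sum(axis=0)
        return self

    def predict(self, X):
        return np.argmax(X @ self.W + self.b, axis=1)
```

Because only this single layer is trained while the deep network's weights stay frozen, each training step is a pair of matrix multiplications over low-dimensional feature vectors, which is what makes the approach converge quickly compared to fine-tuning the full network.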
