Stochastic Backpropagation: A Memory Efficient Strategy for Training Video Models

03/31/2022
by Feng Cheng, et al.

We propose a memory efficient method, named Stochastic Backpropagation (SBP), for training deep neural networks on videos. It is based on the finding that gradients from incomplete execution of backpropagation can still effectively train the models with minimal accuracy loss, which is attributable to the high redundancy of video. SBP keeps all forward paths but randomly and independently removes the backward paths for each network layer in each training step. It reduces GPU memory cost by eliminating the need to cache activation values corresponding to the dropped backward paths, the amount of which is controlled by an adjustable keep-ratio. Experiments show that SBP can be applied to a wide range of models for video tasks, leading to up to 80.0% GPU memory saving and a 10% training speedup, with less than 1% accuracy drop on action recognition and temporal action detection.
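To make the mechanism concrete, here is a minimal PyTorch sketch of the idea, not the authors' implementation: the wrapper class, its name, and the assumption that the wrapped layer acts frame-wise are ours. All frames pass through the forward computation, but only a randomly sampled `keep_ratio` fraction of frames retains backward paths; the rest run under `torch.no_grad()`, so their activations are never cached for backpropagation.

```python
import torch
import torch.nn as nn


class SBPWrapper(nn.Module):
    """Illustrative sketch of Stochastic Backpropagation for a layer that
    acts independently per frame (hypothetical helper, not from the paper's
    code). All frames are computed in the forward pass, but activations are
    cached only for a random `keep_ratio` fraction of them."""

    def __init__(self, layer: nn.Module, keep_ratio: float = 0.5):
        super().__init__()
        self.layer = layer
        self.keep_ratio = keep_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, ...); assumes `layer` does not mix frames.
        if not self.training or self.keep_ratio >= 1.0:
            return self.layer(x)
        t = x.shape[1]
        n_keep = max(1, int(round(t * self.keep_ratio)))
        perm = torch.randperm(t, device=x.device)
        keep_idx, drop_idx = perm[:n_keep], perm[n_keep:]
        # Kept frames: normal forward; activations are cached for backward.
        kept = self.layer(x[:, keep_idx])
        # Dropped frames: forward only; nothing is stored for backward.
        with torch.no_grad():
            dropped = self.layer(x[:, drop_idx])
        # Reassemble frames in their original temporal order; gradients
        # flow back only through the kept frames.
        out = torch.cat([kept, dropped], dim=1)
        return out[:, torch.argsort(perm)]


# Usage: only roughly half of the 16 frames contribute backward paths.
sbp = SBPWrapper(nn.Linear(768, 768), keep_ratio=0.5)
x = torch.randn(2, 16, 768)   # (batch, frames, features)
sbp(x).sum().backward()
```

Because each wrapped layer samples its own permutation in every forward call, wrapping several layers this way drops backward paths randomly and independently per layer per training step, matching the behavior described in the abstract.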
