A Probabilistic Interpretation of Transformers

04/28/2022
by Alexander Shim, et al.

We propose a probabilistic interpretation of the exponential dot-product attention of transformers and of contrastive learning, based on exponential families. The attention sublayer of a transformer is equivalent to a gradient ascent step on the log normalizer, which is the log-sum-exp term in the Hopfield theory of attention. This ascent step induces a parallel expansion of points, which is counterbalanced by a contraction from layer normalization. We also state theoretical limitations of our interpretation and of the Hopfield theory, and suggest directions for resolution.
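The link between attention and the log normalizer can be illustrated numerically: the softmax-weighted average of keys produced by dot-product attention equals the gradient, with respect to the query, of the log-sum-exp of the key-query scores, so an attention read-out is an ascent direction for that log normalizer. The following sketch is not from the paper's code; the variable names (q, K, beta) and the single-query setup are illustrative assumptions.

```python
# Sketch: dot-product attention output = gradient of the log normalizer
# (log-sum-exp of scaled key-query scores) with respect to the query.
import numpy as np

rng = np.random.default_rng(0)
d, n = 8, 5                      # embedding dimension, number of keys
q = rng.normal(size=d)           # a single query vector
K = rng.normal(size=(n, d))      # key matrix, one key per row
beta = 1.0 / np.sqrt(d)          # temperature, as in scaled dot-product attention

def log_normalizer(q):
    """(1/beta) * log sum_i exp(beta * k_i . q)."""
    return (1.0 / beta) * np.log(np.sum(np.exp(beta * K @ q)))

# Attention output: softmax-weighted average of the keys.
scores = beta * K @ q
weights = np.exp(scores - scores.max())
weights /= weights.sum()
attention_out = K.T @ weights

# Finite-difference gradient of the log normalizer at q.
eps = 1e-6
grad = np.array([
    (log_normalizer(q + eps * e) - log_normalizer(q - eps * e)) / (2 * eps)
    for e in np.eye(d)
])

# The two agree up to finite-difference error, so one gradient ascent step
# on the log normalizer moves the query in the direction of the attention output.
print(np.max(np.abs(attention_out - grad)))   # ~1e-8 or smaller
```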
