Multi-Modality Distillation via Learning the Teacher's Modality-Level Gram Matrix

12/21/2021
by   Peng Liu, et al.

In multi-modality knowledge distillation research, existing methods focus mainly on having the student learn only the teacher's final output. As a result, deep differences remain between the teacher network and the student network, and it is necessary to force the student network to learn the teacher network's modality-relationship information. To transfer knowledge from teacher to student effectively, a novel modality-relation distillation paradigm is adopted that models the relationship information among different modalities: the student learns the teacher's modality-level Gram matrix.
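The abstract does not give implementation details, but the core idea can be sketched: stack one embedding per modality, form the pairwise-similarity Gram matrix for teacher and student, and penalize the difference. The function names, cosine normalization, and MSE penalty below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def modality_gram_matrix(features):
    """Given per-modality embeddings stacked as an (M, D) array
    (M modalities, D-dimensional features), return the M x M Gram
    matrix of pairwise inner products between modalities."""
    # L2-normalize each modality embedding so Gram entries are
    # cosine similarities (an assumed design choice, not from the paper)
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    normed = features / np.clip(norms, 1e-12, None)
    return normed @ normed.T

def gram_distillation_loss(teacher_feats, student_feats):
    """Mean squared error between teacher and student
    modality-level Gram matrices (illustrative loss choice)."""
    g_teacher = modality_gram_matrix(teacher_feats)
    g_student = modality_gram_matrix(student_feats)
    return float(np.mean((g_teacher - g_student) ** 2))

# Toy example: 3 modalities (e.g. RGB, depth, audio), 4-dim embeddings
rng = np.random.default_rng(0)
teacher = rng.normal(size=(3, 4))
student = teacher + 0.1 * rng.normal(size=(3, 4))
loss = gram_distillation_loss(teacher, student)
```

Minimizing this loss pushes the student's inter-modality relationships toward the teacher's, rather than only matching the teacher's final predictions.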
