Abstract: Existing studies on knowledge distillation typically focus on teacher-centered methods, in which the teacher network is trained according to its own standards before transferring the learned ...
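For context, a minimal sketch of the conventional teacher-centered setup this abstract refers to: a student is trained against softened teacher logits plus the hard labels (Hinton-style distillation). This is a generic illustration, not the paper's own method; the temperature, weighting, and tensor shapes are illustrative assumptions.

```python
# Generic teacher-centered distillation loss (illustrative sketch only).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-label KL toward the teacher."""
    # Soft targets: match the teacher's temperature-softened distribution.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy on ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Usage with random tensors standing in for real model outputs.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```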
Abstract: Generalized zero-shot learning (GZSL) aims to train a model for classifying data samples under the condition that some output classes are unknown during supervised learning. To address this ...
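As a rough illustration of the GZSL setting described above, the sketch below fits a linear map from visual features to a class-embedding (attribute) space using seen classes only, then scores test samples against the embeddings of all classes, seen and unseen. Everything here (synthetic data, ridge solution, class counts) is an assumption for demonstration, not the approach of the paper.

```python
# Attribute-based generalized zero-shot classification (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n_feat, n_attr = 64, 16
seen_classes = list(range(0, 8))      # classes available during training
unseen_classes = list(range(8, 12))   # classes never seen during training

# Per-class semantic embeddings (e.g., attribute vectors) for every class.
class_embed = rng.normal(size=(12, n_attr))

# Synthetic features: a hidden attribute-to-feature mixing plus noise.
M = rng.normal(size=(n_attr, n_feat))
y_train = rng.choice(seen_classes, size=500)
X_train = class_embed[y_train] @ M + 0.1 * rng.normal(size=(500, n_feat))

# Ridge-regression map W from visual features into the semantic space.
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_feat),
                    X_train.T @ class_embed[y_train])

def predict(X):
    """Score projected features against every class embedding (seen + unseen)."""
    scores = (X @ W) @ class_embed.T
    return scores.argmax(axis=1)

# A test sample drawn from an unseen class can still receive its label,
# because prediction searches over the full label set.
x_test = class_embed[10] @ M + 0.1 * rng.normal(size=n_feat)
print(predict(x_test[None, :]))
```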