Paper Reading
CONFIDENCE-AWARE MULTI-TEACHER KNOWLEDGE DISTILLATION
NC-WAMKD: Neighborhood Correction Weight-adaptive Multi-teacher Knowledge Distillation For Graph-based Semi-supervised Node Classification
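To make the shared idea behind these two titles concrete, here is a minimal, generic sketch of confidence-weighted multi-teacher distillation. It is not the exact CA-MKD or NC-WAMKD algorithm (those differ in how confidence is defined and in the graph-specific corrections); it only assumes the common pattern of weighting each teacher's soft targets by a per-sample confidence score. The function name and the confidence measure (probability assigned to the true label) are illustrative choices, not taken from the papers.

```python
# Generic sketch: confidence-weighted multi-teacher knowledge distillation.
# NOT the papers' exact methods; the confidence measure here (teacher's
# probability on the true label) is an assumption for illustration only.
import torch
import torch.nn.functional as F

def confidence_weighted_kd_loss(student_logits, teacher_logits_list, labels, temperature=4.0):
    """student_logits: (N, C); teacher_logits_list: list of (N, C); labels: (N,)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)

    # Per-sample confidence of each teacher = probability it assigns to the true class.
    confidences = torch.stack([
        F.softmax(t, dim=1).gather(1, labels.unsqueeze(1)).squeeze(1)
        for t in teacher_logits_list
    ], dim=0)                                # (num_teachers, N)
    weights = F.softmax(confidences, dim=0)  # normalize across teachers per sample

    loss = 0.0
    for w, t_logits in zip(weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits / temperature, dim=1)                    # (N, C)
        kl = F.kl_div(log_p_student, p_teacher, reduction='none').sum(dim=1)    # (N,)
        loss = loss + (w * kl).mean()
    return loss * temperature ** 2

if __name__ == "__main__":
    torch.manual_seed(0)
    student = torch.randn(8, 5)
    teachers = [torch.randn(8, 5) for _ in range(3)]
    y = torch.randint(0, 5, (8,))
    print(confidence_weighted_kd_loss(student, teachers, y).item())
```

The design point both papers build on is visible here: a more reliable teacher gets a larger per-sample weight in the distillation loss, instead of averaging all teachers uniformly.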
There are two possible directions for the next step of research: