Self-distillation topic
pytorch-be-your-own-teacher
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" (https://arxiv.org/abs/1905.08094).
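
For orientation, a minimal sketch of a BYOT-style training loss, assuming a network that exposes logits and features from several intermediate heads with the deepest head last; the hyperparameter values and function name are illustrative, not the repository's exact API:

```python
import torch
import torch.nn.functional as F

def byot_loss(logits_list, features_list, targets,
              alpha=0.3, beta=0.03, temperature=3.0):
    """Sketch of the 'Be Your Own Teacher' loss: the deepest classifier
    teaches the shallower heads. alpha/beta/temperature are illustrative."""
    teacher_logits = logits_list[-1].detach()    # deepest head acts as teacher
    teacher_feat = features_list[-1].detach()
    loss = F.cross_entropy(logits_list[-1], targets)  # supervise deepest head

    for logits, feat in zip(logits_list[:-1], features_list[:-1]):
        # (1) hard-label supervision for each shallow head
        loss += (1 - alpha) * F.cross_entropy(logits, targets)
        # (2) soften both distributions and distill from the deepest head
        loss += alpha * temperature ** 2 * F.kl_div(
            F.log_softmax(logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        )
        # (3) L2 hint loss pulling shallow features toward the deepest ones
        loss += beta * F.mse_loss(feat, teacher_feat)
    return loss
```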
Deep-Hash-Distillation
Deep Hash Distillation for Image Retrieval (ECCV 2022).
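
The self-distilled hashing idea behind this line of work can be sketched as below; the `model` that maps images to continuous hash logits and the weak/strong augmentation split are assumptions about the setup, not the repository's exact interface:

```python
import torch
import torch.nn.functional as F

def self_distilled_hashing_loss(model, x_weak, x_strong):
    """Sketch: hash codes from a weakly augmented view teach the
    strongly augmented view of the same images."""
    with torch.no_grad():
        h_teacher = torch.tanh(model(x_weak))   # weak view acts as the teacher
    h_student = torch.tanh(model(x_strong))     # strong view is the student
    # maximize cosine similarity between the two views' hash codes
    return (1 - F.cosine_similarity(h_student, h_teacher, dim=1)).mean()
```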
self_distillation
Self-distillation with weighted ground-truth targets, with experiments on ResNet classifiers and kernel ridge regression.
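
A minimal sketch of the weighted-target idea, assuming the teacher is an earlier generation of the same model and that `weight` is an illustrative mixing coefficient:

```python
import torch
import torch.nn.functional as F

def weighted_target_loss(student_logits, teacher_logits, targets, weight=0.5):
    """Sketch: the training target is a convex mix of the one-hot label
    and the previous-generation model's softened prediction."""
    num_classes = student_logits.size(1)
    one_hot = F.one_hot(targets, num_classes).float()
    soft = F.softmax(teacher_logits.detach(), dim=1)
    mixed = (1 - weight) * one_hot + weight * soft  # weighted target distribution
    return -(mixed * F.log_softmax(student_logits, dim=1)).sum(dim=1).mean()
```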
Awesome-Knowledge-Distillation-of-LLMs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
sdft
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
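
A minimal sketch of the SDFT recipe, assuming a Hugging Face causal LM; the model name, prompt template, and `distill_example` helper are illustrative assumptions, not the official codebase's API:

```python
# Sketch: before fine-tuning, let the seed model rewrite each target
# response in its own words, then train on the rewritten data so the
# targets lie closer to the model's own output distribution.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-chat-hf"  # assumed seed model
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

def distill_example(instruction: str, reference: str) -> str:
    """Ask the seed model to restate the reference answer (illustrative
    prompt, not the paper's exact template)."""
    prompt = (
        f"Instruction: {instruction}\n"
        f"Reference answer: {reference}\n"
        "Rewrite the reference answer in your own words:\n"
    )
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    # strip the prompt tokens, keep only the generated continuation
    return tok.decode(out[0][inputs["input_ids"].shape[1]:],
                      skip_special_tokens=True)

# The rewritten (distilled) responses then replace the originals for
# standard supervised fine-tuning.
```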