feedback

Here are 951 public repositories matching this topic...

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". The survey breaks KD down into Knowledge Elicitation and Distillation Algorithms, and further explores Skill Distillation and Vertical Distillation of LLMs.

  • Updated Oct 22, 2024
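As a brief illustration of the core idea that survey covers, here is a minimal sketch of a standard white-box distillation loss (KL divergence between temperature-scaled teacher and student logits blended with a hard-label term). This is a generic example, not the survey's specific method; the temperature, mixing weight, and tensor shapes are illustrative assumptions.

```python
# Minimal sketch of a standard knowledge-distillation loss (illustrative,
# not taken from the survey): the student is trained to match the teacher's
# softened output distribution via KL divergence, plus the usual
# cross-entropy term on the ground-truth labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend soft-label KL loss (teacher -> student) with hard-label CE.
    `temperature` and `alpha` are illustrative hyperparameters."""
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradients keep a comparable magnitude across temperatures.
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example with random tensors standing in for teacher/student outputs.
if __name__ == "__main__":
    batch, vocab = 4, 10
    student = torch.randn(batch, vocab, requires_grad=True)
    teacher = torch.randn(batch, vocab)
    labels = torch.randint(0, vocab, (batch,))
    print(distillation_loss(student, teacher, labels))
```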
