Nicolas-BZRD's Collections

LLMs Distillation
- Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs (Paper, arXiv 2402.12030)
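The collection pairs this paper with teacher and student models whose tokenizers (and hence vocabularies) differ. As a rough illustration of the cross-tokenizer idea, here is a hedged sketch, not the paper's reference implementation: compare the *sorted* probability distributions of teacher and student, so vocabularies of different sizes and orderings can still be matched. The function name `uld_loss` and the zero-padding choice are assumptions for illustration.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def uld_loss(teacher_logits, student_logits):
    """Sketch of a cross-tokenizer logit distillation loss (assumed form).

    Sort each probability distribution in descending order, zero-pad the
    shorter one so vocabulary sizes match, and take the mean absolute
    difference between the sorted distributions. Sorting removes the need
    for the two vocabularies to share token identities or ordering.
    """
    t = sorted(softmax(teacher_logits), reverse=True)
    s = sorted(softmax(student_logits), reverse=True)
    n = max(len(t), len(s))
    t += [0.0] * (n - len(t))  # pad smaller vocabulary with zero mass
    s += [0.0] * (n - len(s))
    return sum(abs(a - b) for a, b in zip(t, s)) / n
```

With identical logits the loss is zero, and mismatched vocabulary sizes are still comparable because only the sorted probability mass is compared.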
- mistralai/Mistral-7B-Instruct-v0.2 (Text Generation · 3.28M downloads · 2.7k likes)
- meta-llama/Llama-2-7b-chat-hf (Text Generation · 1.28M downloads · 4.35k likes)
- EleutherAI/pythia-160m-deduped (Text Generation · 76.6k downloads · 3 likes)
- EleutherAI/pythia-410m-deduped (Text Generation · 98.3k downloads · 20 likes)
- EleutherAI/pythia-1b-deduped (Text Generation · 21.4k downloads · 19 likes)
- bigscience/bloomz-560m (Text Generation · 195k downloads · 122 likes)
- bigscience/mt0-base (Text2Text Generation · 2.71k downloads · 30 likes)
- facebook/opt-350m (Text Generation · 259k downloads · 141 likes)
- google-research-datasets/qed (196 downloads · 3 likes)