Falcon3 Collection: The Falcon3 family of open foundation models is a set of pretrained and instruct LLMs ranging from 1B to 10B parameters. (40 items)
Granite 3.1 Language Models Collection: A series of language models with 128K context length, trained by IBM and licensed under Apache 2.0. (8 items)
Post: Tulu 3 SFT Mixture by AllenAI is a massive, high-quality, multilingual dataset for fine-tuning language models. Unfortunately, it was missing the "language" column, so I added it using the good old fastText. Check out the dataset here 👉 anakin87/tulu-3-sft-mixture-with-language