Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance Paper • 2507.22448 • Published 1 day ago • 39
Seed-X Collection A powerful open-source multilingual translation language model series, including instruction and reasoning models. • 6 items • Updated 3 days ago • 60
💧 LFM2 Collection LFM2 is a new generation of hybrid models, designed for on-device deployment. • 15 items • Updated 3 days ago • 81
Falcon-H1 Collection Falcon-H1 Family of Hybrid-Head Language Models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B (pretrained & instruction-tuned). • 38 items • Updated about 15 hours ago • 48
🧠 SmolLM3 Collection Smol, multilingual, long-context reasoner • 12 items • Updated 7 days ago • 66
Article SmolLM3: smol, multilingual, long-context reasoner By loubnabnl and 22 others • 24 days ago • 600
Jamba 1.7 Collection The AI21 Jamba family is a set of hybrid SSM-Transformer foundation models, blending speed, efficient long-context processing, and accuracy. • 4 items • Updated 29 days ago • 10
Article Welcome aMUSEd: Efficient Text-to-Image Generation By Isamu136 and 3 others • Jan 4, 2024 • 12
ERNIE 4.5 Collection A collection of ERNIE 4.5 models. "-Paddle" models use PaddlePaddle weights, while "-PT" models use Transformer-style PyTorch weights. • 25 items • Updated 20 days ago • 157
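The -Paddle/-PT split matters in practice: only the "-PT" checkpoints load directly with the Hugging Face transformers library. A minimal sketch below, assuming a "-PT" repo id such as `baidu/ERNIE-4.5-0.3B-PT` (the specific id is an assumption for illustration; substitute any "-PT" model from the collection):

```python
# Minimal sketch: loading a "-PT" (PyTorch-weight) ERNIE 4.5 checkpoint with transformers.
# "-Paddle" variants ship PaddlePaddle weights and would require PaddlePaddle tooling instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-0.3B-PT"  # assumed repo id for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```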
MiniMax-M1 Collection MiniMax-M1, the world's first open-weight, large-scale hybrid-attention reasoning model. • 6 items • Updated 28 days ago • 110