Kimi-K2 is now available on the Hub 🔥🚀
This is a trillion-parameter MoE model focused on long context, code, reasoning, and agentic behavior.
moonshotai/kimi-k2-6871243b990f2af5ba60617d
✨ Base & Instruct
✨ 1T total / 32B active params - Modified MIT License
✨ 128K context length
✨ Muon optimizer for stable trillion-scale training