Models
November 17, 2025

Kimi’s K2 open-source model

Kimi has released K2, a massive open-source Mixture-of-Experts LLM: 32B active parameters out of 1 trillion total. It is trained with a new MuonClip optimizer and excels at agentic tasks.

Kimi announced K2, a next-generation open-source Mixture-of-Experts (MoE) model that activates 32 billion parameters out of a 1 trillion-parameter pool.
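Sparse activation is what makes a 1-trillion-parameter pool usable: a router picks a few experts per token, so only a small slice of the weights runs on any forward pass. The sketch below shows generic top-k MoE routing; the shapes, gate, and expert count are illustrative assumptions, not K2's actual architecture.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Minimal top-k Mixture-of-Experts routing sketch.

    Only k of the experts run per token -- this sparsity is how a
    huge parameter pool (e.g. 1T total) can activate only a small
    fraction (e.g. 32B) per forward pass. All names and shapes here
    are illustrative, not taken from K2.
    """
    scores = x @ gate_w                     # router scores, one per expert
    top = np.argsort(scores)[-k:]           # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                # softmax over the selected experts
    # Only the selected experts' weight matrices are ever touched.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))
```

With `k=2` out of, say, 64 experts, roughly 1/32 of the expert parameters participate per token, which is the same ratio-of-active-to-total idea behind K2's 32B-of-1T figure.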

It is trained with a novel optimizer called MuonClip, which uses a QK-clip technique to keep training stable while maintaining token efficiency. During post-training, K2 leverages a large-scale agentic data-synthesis pipeline and reinforcement learning, improving through interaction with environments.
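The intuition behind QK-clip is that training instability often shows up as exploding attention logits, and since a logit is a query-key dot product, shrinking the query and key projections caps it directly. The sketch below illustrates that idea only; the threshold, the split of the shrink factor between the two projections, and the check itself are assumptions for illustration, not Kimi's published implementation.

```python
import numpy as np

def qk_clip(w_q, w_k, x, tau=100.0):
    """Illustrative QK-clip-style rescaling (assumptions, not K2's code).

    If the largest attention logit seen on a batch exceeds the
    threshold tau, rescale the query and key projections so the
    maximum is pulled back down to tau. Because a logit is bilinear
    in (w_q, w_k), scaling each by sqrt(gamma) scales every logit
    by exactly gamma.
    """
    q = x @ w_q                                  # (tokens, d)
    k = x @ w_k
    logits = q @ k.T / np.sqrt(q.shape[-1])      # scaled dot-product logits
    s_max = np.abs(logits).max()
    if s_max > tau:
        gamma = tau / s_max                      # shrink factor in (0, 1)
        w_q = w_q * np.sqrt(gamma)               # split the shrink evenly
        w_k = w_k * np.sqrt(gamma)               # between the two projections
    return w_q, w_k
```

The appeal of clipping at the weight level, rather than clipping gradients, is that it bounds the quantity that actually blows up (the logits) without distorting the update direction.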

In benchmarks, it outperforms many open- and closed-source models in coding, mathematics, reasoning, and agentic tasks. The model checkpoint is being open-sourced to further research.

#Kimi
