Models
November 17, 2025

Kimi’s K2 open-source model

Kimi has released K2, a massive open-source Mixture-of-Experts LLM with 32B active parameters out of 1 trillion total. It is trained with the new MuonClip optimizer and excels at agentic tasks.

Kimi announced K2, a next-generation open-source Mixture-of-Experts (MoE) model that activates 32 billion parameters per token out of a 1-trillion-parameter pool.
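
The announcement doesn't spell out K2's routing details, but the sparse-activation idea behind that 32B-of-1T figure is that a learned router sends each token to only a few experts, so most parameters sit idle on any given forward pass. A minimal PyTorch sketch of top-k expert routing (the expert count, top-k value, and dimensions here are illustrative, not K2's actual configuration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal sparse MoE layer: a router scores experts per token and
    only the top-k experts run, so active parameters are a small
    fraction of the total parameter pool."""

    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Pick top-k experts per token and mix their
        # outputs with softmax-normalized router weights.
        scores = self.router(x)                          # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # (tokens, k)
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e
                if mask.any():
                    out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
        return out
```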

It’s trained with a novel optimizer called MuonClip, whose QK-clip technique keeps attention logits stable while preserving token efficiency. During post-training, K2 combines a large-scale agentic data synthesis pipeline with reinforcement learning, improving through interaction with environments.
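
The post doesn't describe QK-clip's mechanics, but the general idea of logit clipping is to cap exploding attention scores: when a head's largest query-key logit exceeds a threshold, the query and key projection weights are rescaled downward. A minimal sketch of that step, assuming a fixed threshold and a balanced square-root split of the correction across both projections (the specifics are assumptions, not K2's published recipe):

```python
import torch

def qk_clip_(w_q: torch.Tensor, w_k: torch.Tensor,
             max_logit: float, tau: float = 100.0) -> None:
    """Sketch of a QK-clip-style step (assumed details): if the largest
    pre-softmax attention logit observed for this head exceeds tau,
    rescale the query and key projection weights in place so the logit
    is pulled back toward tau. Applying sqrt(tau / max_logit) to each
    of W_q and W_k multiplies their product's logits by tau / max_logit.
    """
    if max_logit > tau:
        gamma = (tau / max_logit) ** 0.5
        w_q.mul_(gamma)  # shrink query projection
        w_k.mul_(gamma)  # shrink key projection by the same factor
```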

In benchmarks, it outperforms many open- and closed-source models on coding, mathematics, reasoning, and agentic tasks. The model checkpoint is being open-sourced to support further research.

#Kimi
