Models
November 17, 2025

Kimi’s K2 open-source model

Kimi has released K2, a massive open-source Mixture-of-Experts LLM: 32B active parameters, 1 trillion total. It was trained with a new MuonClip optimizer and excels at agentic tasks.

Kimi announced K2, a next-generation open-source Mixture-of-Experts (MoE) model that activates 32 billion parameters per token out of a total pool of 1 trillion.
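To make the sparse activation concrete, below is a minimal sketch of top-k MoE routing in PyTorch. It is a generic illustration, not K2's actual router: the function names, the gating scheme, and the value of `k` are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def moe_forward(x, router, experts, k=2):
    """Route each token to its top-k experts; only those experts run,
    so the active parameter count is a small fraction of the total.

    x       : (tokens, d_model) input activations
    router  : linear layer producing one score per expert
    experts : list of expert FFN modules
    k       : number of experts activated per token (assumed value)
    """
    scores = router(x)                      # (tokens, n_experts)
    weights, idx = scores.topk(k, dim=-1)   # select top-k experts per token
    weights = F.softmax(weights, dim=-1)    # normalize the selected gates
    out = torch.zeros_like(x)
    for e, expert in enumerate(experts):
        token_ids, slot = (idx == e).nonzero(as_tuple=True)
        if token_ids.numel():               # run expert e only on its tokens
            out[token_ids] += weights[token_ids, slot, None] * expert(x[token_ids])
    return out
```

Because only the selected experts run for each token, compute scales with the active parameters rather than the full expert pool, which is how a 1-trillion-parameter model can operate with a 32-billion-parameter footprint per token.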

It is trained with a novel optimizer called MuonClip, which uses a QK-clip technique to ensure training stability while maintaining token efficiency. During post-training, K2 leverages a large-scale agentic data-synthesis pipeline and reinforcement learning to improve through interaction with environments.
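As a rough illustration of the QK-clip idea, the sketch below rescales a head's query/key projection weights whenever the largest observed attention logit exceeds a threshold, bounding the pre-softmax logits on subsequent steps. This is an assumption-laden reconstruction from the description above, not Kimi's published implementation; the threshold `tau`, the even split of the correction between Q and K, and the function signature are all hypothetical.

```python
import torch

def qk_clip_(w_q: torch.Tensor, w_k: torch.Tensor,
             max_logit: float, tau: float = 100.0) -> None:
    """Rescale query/key projection weights in place when the largest
    observed attention logit exceeds the threshold tau.

    w_q, w_k  : per-head query/key projection weight matrices
    max_logit : largest pre-softmax attention logit seen for this head
                during the last training step
    tau       : clipping threshold (hypothetical default)
    """
    if max_logit > tau:
        # Split the correction evenly between Q and K so the q·k logit
        # is scaled by exactly tau / max_logit.
        gamma = tau / max_logit
        w_q.mul_(gamma ** 0.5)
        w_k.mul_(gamma ** 0.5)
```

Applied per attention head after each optimizer update, a rescaling like this caps attention-logit growth, the instability the clip appears intended to prevent, without altering the optimizer's update direction.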

On benchmarks, it outperforms many open- and closed-source models in coding, mathematics, reasoning, and agentic tasks. The model checkpoint is being open-sourced to further research.

#Kimi
