
The Arm AGI CPU for agentic AI infrastructure just launched

Deveshi Dabbawala

March 31, 2026

On March 24, 2026, Arm Holdings shipped its own production chip, a first in the company's 35-year history. Not a design for other companies to license. A real, finished processor called the Arm AGI CPU, built specifically for agentic AI infrastructure.

What is the Arm AGI CPU?

For more than thirty years, Arm's business was to design chip architectures, license them to companies like Apple, NVIDIA, Google, and AWS, and collect royalties. Those companies made the chips; Arm never did. That changes with the AGI CPU: Arm's first chip made in-house, developed with Meta and built on Arm's Neoverse family of CPU IP cores.

It is built on TSMC's 3nm process with up to 136 Neoverse V3 cores running at up to 3.2 GHz all-core and 3.7 GHz boost. Its 300W TDP is well below the 500W of top AMD EPYC and Intel Xeon processors, and with liquid cooling it scales to more than 45,000 cores per rack.

At one-gigawatt AI factory scale, Arm says that density translates into $10 billion in capital expenditure savings compared to x86.
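The rack-density and power figures above can be sanity-checked with some back-of-the-envelope arithmetic. This is a rough sketch, not Arm's math: the socket count per rack is derived here, and it assumes CPU TDP alone (ignoring memory, networking, and cooling overhead).

```python
import math

CORES_PER_SOCKET = 136   # Neoverse V3 cores per AGI CPU (from the spec above)
TDP_WATTS = 300          # per-socket TDP
CORES_PER_RACK = 45_000  # Arm's liquid-cooled rack figure

# Sockets needed to hit the quoted rack density (derived, not an Arm figure)
sockets_per_rack = math.ceil(CORES_PER_RACK / CORES_PER_SOCKET)

# CPU power per rack, ignoring everything except the sockets themselves
cpu_power_kw = sockets_per_rack * TDP_WATTS / 1000

print(f"{sockets_per_rack} sockets/rack, ~{cpu_power_kw:.1f} kW of CPU power")
# → 331 sockets/rack, ~99.3 kW of CPU power
```

Roughly 100 kW of CPU per rack is plausible only with liquid cooling, which is consistent with the article's framing.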

Meta is the first official customer, with seven other committed customers including OpenAI, Cloudflare and SAP. Arm's CEO Rene Haas projects the AGI CPU alone will generate $15 billion in revenue by 2031.  

"AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that change." - Rene Haas, CEO, Arm

Why agentic AI needed a chip built from scratch

For a long time, GPUs and model training dominated the conversation about AI infrastructure. That's changing quickly. The real growth now is in agentic AI systems that don't just answer questions: they reason, plan, use tools, and run on their own around the clock. Think of automated compliance pipelines, multi-step clinical workflows, or agents that monitor your finances in real time, 24/7.

These workloads are parallel, persistent, and CPU-intensive in ways GPUs were never designed to handle. Arm's CEO has said that new AI inference workloads will quadruple the need for CPUs in the near future. The AGI CPU is Arm's direct answer to that gap, and its most important contribution to modern agentic AI infrastructure so far.

What the AGI CPU architecture delivers

Most conversations about this chip stop at the specs. But the more important story is what the architecture enables that wasn't possible before.

  • Traditional CPUs were built for sequential, human-paced tasks. The AGI CPU is designed for parallel, always-on agentic workloads.  
  • 12 channels of DDR5 memory at up to 8800 MT/s deliver over 800 GB/s bandwidth with sub-100 nanosecond latency, giving AI agents near-instant access to context.  
  • This low-latency, high-bandwidth setup is critical for multi-agent systems where several agents reason and act at the same time.  
  • 96 lanes of PCIe Gen 6 and CXL 3.0 enable faster communication with GPUs and custom AI chips, reducing delays in multi-agent pipelines.  
  • This allows infrastructure to support more agents, tighter coordination, and continuous execution without becoming a bottleneck.  
  • Arm claims 2x performance per watt compared to x86 racks, meaning higher output in the same power and space.  
  • For enterprises, this directly impacts cost, efficiency, and how AI infrastructure is planned at scale.
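The memory-bandwidth figure in the list above checks out with simple arithmetic. A quick sketch: the 8 bytes per transfer per channel is a standard DDR5 bus-width assumption, not a number from Arm.

```python
CHANNELS = 12               # DDR5 memory channels (from the spec above)
TRANSFERS_PER_SEC = 8800e6  # DDR5-8800: 8800 mega-transfers per second
BYTES_PER_TRANSFER = 8      # 64-bit data bus per channel (assumed)

peak_gb_per_sec = CHANNELS * TRANSFERS_PER_SEC * BYTES_PER_TRANSFER / 1e9
print(f"Peak theoretical bandwidth: {peak_gb_per_sec:.1f} GB/s")
# → Peak theoretical bandwidth: 844.8 GB/s
```

That theoretical peak of ~845 GB/s is comfortably above the "over 800 GB/s" the article quotes, so the claim is internally consistent.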

What's next

Production silicon is ready to order now, with volume shipments expected by the end of 2026 and material revenue impact projected from 2028 onward.

The Arm AGI CPU is a clear signal that the industry is treating agentic AI infrastructure as the dominant investment of the next decade.

At GoML, our AI Matic framework is already built for exactly this kind of deployment: fast, production-ready, and designed to scale with the agentic AI infrastructure beneath it. We'll be tracking Arm's roadmap closely. Stay tuned to learn more.