Ecosystem
May 28, 2025

AWS Neuron NxD Inference enters general availability for optimized model serving

AWS Neuron 2.23 brings NxD Inference to general availability, with enhanced ML performance, better developer tooling, and tighter framework integration for accelerated generative AI workloads on AWS Inferentia chips.

AWS released Neuron 2.23, moving NxD Inference to general availability alongside new training and inference capabilities and upgraded developer tools. NxD Inference provides high-performance, low-latency model inference on AWS Inferentia hardware.

This update improves model performance for LLMs and other generative AI applications. With tighter framework integration, including PyTorch and TensorFlow, developers also benefit from improved compilation and profiling tools.
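For teams serving models on Inferentia through the PyTorch path, one common entry point is vLLM's Neuron backend, which can delegate to NxD Inference for execution. The sketch below is illustrative only, based on vLLM's documented Neuron integration rather than the Neuron 2.23 release notes; the model name, parallelism degree, and sequence limits are assumptions.

```python
# Minimal sketch: serving a causal LM on AWS Inferentia via vLLM's Neuron backend.
# Model checkpoint, tensor_parallel_size, and sequence limits are illustrative
# assumptions, not values taken from the Neuron 2.23 release.
from vllm import LLM, SamplingParams

llm = LLM(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # any supported Hugging Face checkpoint
    device="neuron",            # route execution to NeuronCores on Inferentia
    tensor_parallel_size=2,     # shard the model across NeuronCores
    max_num_seqs=8,             # concurrent sequences to compile the serving graph for
    max_model_len=2048,         # maximum context length to compile for
)

params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=128)
outputs = llm.generate(["Summarize the Neuron 2.23 release in one sentence."], params)
print(outputs[0].outputs[0].text)
```

Because Neuron compiles the model graph ahead of time, settings such as `max_model_len` and `max_num_seqs` are fixed at load time rather than adjusted per request.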

These improvements streamline AI/ML workloads on AWS, reinforcing AWS's commitment to optimizing GenAI infrastructure and performance at scale.
