OpenAI’s latest open-weight models are now available on Amazon Web Services (AWS) through Amazon Bedrock and Amazon SageMaker, marking a major step in democratizing access to high-performance AI capabilities.
Starting today, AWS customers can integrate OpenAI’s new gpt-oss-120b and gpt-oss-20b models directly into their workflows. These open-weight models are optimized for reasoning tasks and can be deployed securely at scale on AWS infrastructure.
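As a minimal sketch of what that integration could look like, the snippet below calls one of the models through the Bedrock Converse API with boto3. The model ID ("openai.gpt-oss-120b-1:0"), region, and inference settings are assumptions for illustration; verify the exact ID and regional availability in the Bedrock console.

```python
def build_messages(prompt: str) -> list:
    """Build the Converse API message list for a single user turn."""
    return [{"role": "user", "content": [{"text": prompt}]}]


def ask_gpt_oss(prompt: str, region: str = "us-west-2") -> str:
    """Send a prompt to gpt-oss-120b on Bedrock and return the reply text.

    Assumes boto3 is installed and AWS credentials are configured; the
    model ID below is illustrative -- confirm it for your account/region.
    """
    import boto3  # third-party dependency: pip install boto3

    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.converse(
        modelId="openai.gpt-oss-120b-1:0",  # assumed ID, check your catalog
        messages=build_messages(prompt),
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    # The Converse API returns the assistant message under output.message
    return response["output"]["message"]["content"][0]["text"]
```

Using the unified Converse API rather than a model-specific `invoke_model` payload means the same calling code can be pointed at gpt-oss-20b (or any other Bedrock model) by swapping the model ID.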
According to AWS, the new OpenAI models offer substantial price-performance advantages:
- 3x more price-performant than Gemini 1.5 Pro
- 5x more price-performant than DeepSeek R1
- 2x more price-performant than OpenAI’s own o4 on most enterprise workloads
This partnership empowers enterprises with greater model choice and flexibility, aligning with the growing need for tailored AI solutions across industries. It also strengthens AWS's position as a comprehensive platform for building, deploying, and scaling AI applications.
The announcement highlights a new chapter in enterprise AI: open, customizable, and cost-effective foundation models deployed on trusted cloud infrastructure.
The GoML POV
The availability of these models through Amazon Bedrock and Amazon SageMaker marks a major step in bringing OpenAI into the AWS generative AI ecosystem.
It is unclear whether this opens the door for all OpenAI models to eventually be available on Bedrock.
The real milestone would be general availability of OpenAI's full model lineup within AWS, which is unlikely at the moment given the OpenAI-Azure partnership. For now, though, this move strengthens Bedrock's position as a comprehensive foundation layer for building, deploying, and scaling AI applications.