OpenAI taps Amazon cloud to scale AI agents as Microsoft ties loosen
OpenAI is expanding access to its generative AI models by bringing them onto Amazon Web Services, a move that comes on the heels of a revised agreement with Microsoft loosening earlier exclusivity around cloud usage.

Summary

  • OpenAI brings its latest models and Codex agent to Amazon Web Services via Amazon Bedrock, following a revised agreement with Microsoft that enables multi-cloud deployment.
  • New Amazon Bedrock Managed Agents, powered by OpenAI, allow enterprises to build AI agents with memory and multi-step task capabilities within AWS environments.
  • Amazon expands AI push with deeper OpenAI ties and up to $25 billion investment in Anthropic, as demand for large-scale AI infrastructure accelerates.

According to reports, the revised agreement allows OpenAI to deploy its products across multiple cloud providers. Within a day of that change, the company confirmed that its models will now be offered through AWS, giving enterprise customers another channel to access its latest systems.

Developers using AWS will be able to test OpenAI models alongside its Codex coding agent via Amazon Bedrock, according to a joint announcement on Tuesday. Wider availability is expected in the coming weeks.

AWS CEO Matt Garman said at a San Francisco event that demand for such integration has been consistent, noting, “This is what our customers have been asking us for for a really long time.”

Until now, AWS users could access only OpenAI's open-weight models, introduced in August. The latest rollout expands that offering to include more advanced systems through Bedrock's unified APIs and enterprise controls.

A key part of the launch is Amazon Bedrock Managed Agents, powered by OpenAI, which is designed to help businesses build AI agents capable of handling multi-step tasks with memory of prior interactions. The system combines OpenAI’s models with AWS infrastructure, allowing companies to deploy production-ready agents within their existing environments.

OpenAI's ties with Microsoft remain significant: Microsoft has supplied computing capacity since before the 2022 debut of ChatGPT. Still, internal messaging suggests the arrangement had constraints. Revenue chief Denise Dresser told staff the partnership "has been critical" but "has also limited our ability to meet enterprises where they are — for many that's Bedrock."

The revised agreement, announced earlier in the week, allows OpenAI to cap revenue-sharing commitments with Microsoft and serve customers across different cloud platforms. Amazon CEO Andy Jassy described the development as "very interesting" in a post on X, hinting at further updates.

Expanding AWS partnership

OpenAI’s collaboration with Amazon has been building over recent months.

In November, the company outlined a $38 billion commitment tied to AWS, shortly after indicating that Microsoft Azure would remain the sole cloud provider for certain API services involving third parties.

Roughly three months later, Amazon deepened the relationship, announcing plans to invest $50 billion in OpenAI. The AI firm also said it would rely on AWS infrastructure, including up to two gigawatts of Trainium chip capacity, to train its models.

The announcement came amid scrutiny following a report by The Wall Street Journal suggesting OpenAI had missed internal targets related to user growth and revenue. The report also raised questions about spending plans, triggering declines in shares of chipmakers such as Nvidia and Broadcom.

OpenAI leadership pushed back strongly. CEO Sam Altman and CFO Sarah Friar said in a joint statement, “This is ridiculous,” adding that the company remains “totally aligned on buying as much compute as we can.”

Amazon deepens AI infrastructure push

Amazon has been stepping up its investments across the AI ecosystem alongside its work with OpenAI.

Just a week earlier, Amazon confirmed a fresh $5 billion investment in Anthropic, the developer behind the Claude family of AI models, as competition for computing capacity intensifies.

The deal includes provisions for up to $20 billion in additional funding tied to performance milestones, bringing the total potential investment to $25 billion.

As part of the arrangement, Anthropic has committed to spending more than $100 billion over the next decade on AWS infrastructure to support model training and deployment. Anthropic has also secured access to up to 5 gigawatts of computing power, with about 1 gigawatt expected to come online on Trainium2 and Trainium3 chips by the end of the year.

The series of moves signals Amazon’s intent to position AWS as a central platform for advanced AI workloads, while OpenAI’s latest shift points to a more flexible, multi-cloud approach for delivering its technology to enterprises.
