Amazon wants to become a global marketplace for AI

Amazon Web Services isn't betting on one large language model (LLM) winning the artificial intelligence race. Instead, it's offering customers a buffet of models to choose from.

AWS, the cloud computing arm of Amazon (AMZN), aims to become the go-to infrastructure layer for the AI economy, regardless of which model prevails. By making customer choice a defining principle, AWS hopes to outmaneuver rivals that have aligned closely with specific LLM providers, most notably Microsoft (MSFT), which partnered with ChatGPT creator OpenAI (OPAI.PVT).

“We don't think that there's going to be one model to rule them all,” Dave Brown, vice president of compute and networking at AWS, told Yahoo Finance.

The model-neutral approach is embedded into Amazon Bedrock, a service that allows AWS customers to build their own applications using a wide range of models, with more than 100 to choose from.

Brown added that after Chinese startup DeepSeek surprised the world, AWS had a fully managed version of the disruptive model available on Bedrock within a week.
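From a customer's perspective, that model-neutral design means switching providers is little more than changing a model identifier in the same API call. Below is a minimal sketch using boto3 (the AWS SDK for Python) and Bedrock's Converse API; the model IDs shown are illustrative examples and depend on region and account access, not details from this article.

```python
# Minimal sketch: calling two different models through the same Bedrock
# Converse API via boto3. Model IDs are illustrative and vary by region
# and account entitlements.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask(model_id: str, prompt: str) -> str:
    """Send a single-turn prompt to the given Bedrock model and return its reply."""
    response = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]

# Swapping providers is a one-line change to the model ID.
print(ask("anthropic.claude-3-haiku-20240307-v1:0", "Summarize what Amazon Bedrock does."))
print(ask("deepseek.r1-v1:0", "Summarize what Amazon Bedrock does."))  # illustrative ID
```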

Two years after its launch, Bedrock is now the fastest-growing service offered by AWS, a unit that accounted for over 18% of Amazon’s total revenue in the first quarter.

It's why Amazon CEO Andy Jassy sees Bedrock as a core part of the company’s AI growth strategy. But to understand the competitive advantage AWS hopes to offer with Bedrock, you have to go back to its origin story.

Inside the Bedrock bet

Bedrock dates back to a six-page internal memo that Atul Deo, AWS’s director of product management, wrote in 2020.

Before OpenAI's ChatGPT launched in 2022 and made “generative AI” a household term, Deo pitched a service that could generate code from plain English prompts using large language models.

But Jassy, the head of AWS at the time, didn’t buy it.

“His initial reaction was, 'This seems almost like a pipe dream,'” Deo said. He added that while a tool that makes coding easy sounds obvious now, the technology was “still not quite there.”

When that project, initially known as CodeWhisperer, launched in 2023, the team realized it could offer the service for a broader set of use cases, giving customers a choice of different models with “generic capabilities” that “could be used as a foundation to build a lot of interesting applications,” according to Deo.

Deo noted that the team steered away from doubling down on AWS’s own model after recognizing a pattern of customers wanting choice in other AWS services. That decision, he said, made AWS the first provider to offer customers a range of different models.