Exclusive: Amazon CEO Andy Jassy reveals AWS’ strategy for building the enterprise AI platform

Amazon.com Inc. Chief Executive Andy Jassy made an unscheduled appearance at the AWS re:Invent cloud conference Tuesday to announce the company’s bid to reshape the world of artificial intelligence models. Today, in an exclusive interview with SiliconANGLE’s theCUBE, Jassy doubled down on just how important he considers his company’s AI initiatives.

“This is the biggest change since the cloud and possibly the internet,” said Jassy (pictured, far right, with theCUBE Research’s John Furrier, left, and Dave Vellante). “I think every single customer experience we know of is going to be reinvented with AI.”

Jassy’s announcement onstage this week of Amazon Nova, a set of six models designed to process text, images and video across a range of multimodal tasks, was among the many new offerings Amazon Web Services Inc. unveiled in Las Vegas. A key part of the company’s message at re:Invent has been that building the right infrastructure is essential to implementing a company’s AI strategy successfully.

To that end, AWS also announced major enhancements for its SageMaker and Bedrock offerings, to support building and scaling up generative AI applications with high-performing foundation models.

“You have to build the right set of primitives and that’s what we’ve been doing with SageMaker and Bedrock,” Jassy told theCUBE. “You have to have your infrastructure modernized for AI.”
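For a concrete sense of what building on those primitives can look like, the following is a minimal, illustrative sketch of invoking a foundation model through Bedrock’s Converse API with boto3; the Nova model identifier and region shown are assumptions and should be verified against current AWS documentation.

```python
# Illustrative sketch only: invoke a foundation model via Amazon Bedrock's
# Converse API using boto3. The model ID and region are assumptions; confirm
# the identifiers and regions available to your account in AWS documentation.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="amazon.nova-lite-v1:0",  # assumed Nova model ID, for illustration
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize this week's re:Invent AI announcements in two sentences."}],
        }
    ],
    inferenceConfig={"maxTokens": 300, "temperature": 0.2},
)

# The Converse API returns the assistant's reply under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```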

Defining AI’s role in the cloud

Jassy’s point feeds into a central theme to emerge from the re:Invent gathering this week: Customers are going to need the cloud to realize their AI vision fully. “Try doing AI from a mainframe,” Jassy said. “It’s nearly impossible.”

Despite the mainframe’s perceived limitations, enterprises have been slow to migrate to major cloud providers for AI’s heavy lifting, a source of some frustration to Amazon’s chief executive.

“As fast as the cloud has been growing IT spend, it is still 85% to 90% on-premises, which is insane,” Jassy said. But that’s also why he sees even more opportunity ahead for AWS.

“People are asking us, should we modernize our infrastructure or should we do generative AI? And of course, the answer is yes,” he said. “But if you don’t take the low-hanging fruit of modernizing your infrastructure, you’re actually not in a position to take advantage of AI.”

One of the biggest inhibitors to generative AI applications being used broadly, he noted, is that the costs of compute, the chips behind that compute and other technologies need to fall. “And we’ve done a lot of inventing there,” he said. “We are building very large-scale generative AI applications inside Amazon because we’re building about a thousand of those apps already, as well as running them for large customers.”

The steady drumbeat of AI-related announcements from AWS this week also reflected the company’s interest in meeting developer needs. On Wednesday, AWS released new tools for Q Developer designed to connect machine learning expertise with business needs.

“Those with the competence are going to do their own model building, so we’re going to try to make that as easy an experience as possible,” Jassy said. “Every application is going to have some generative AI and inference infused in it. It’s very much a building block.”

Jassy’s comments highlight how methodically AWS has approached building its role in the AI universe. The company has taken some heat for not moving faster to capture a prominent market position in AI, but signals from company executives this week underscore a philosophy that AWS will build that position the same way it became the largest cloud provider in the world.

“It took us about eight or nine years to build an AWS annual run rate of about $4 billion,” Jassy noted, and it’s only now that Amazon’s years of AI research and building are becoming apparent.

In fact, he noted, “it’s not just the model. I think people very often trick themselves with a good model that they think they’re there, but they’re really only about 70% of the way there. In applications, they don’t really work well for users if there’s 30% error rates or wonkiness. And so the UI really matters. The fluency, the messaging really matters, the latency really matters, the cost efficiency really matters.”

Another upside of AI is that as the company has tried out and implemented the technology, it has recharged its innovation efforts. “It opens up an explosion of new ideas,” he said.

And new services for the rest of us, he added: “There’s going to be so many experiences that are going to be so different for us and for our kids.”

Featured photo: Don Klein/SiliconANGLE; Robert Hof/SiliconANGLE
