AWS Returns To Its Roots, While Embracing The AI Future
It was a busy week for me personally. I attended AWS re:Invent and also launched the FastForward blog and this newsletter.
If you’re reading this today, thank you for subscribing!
I made a few key observations as I listened to new-ish AWS CEO Matt Garman’s first re:Invent keynote since taking over from Adam Selipsky in May.
For starters, he didn’t jump right into AI like just about every tech executive has done at every tech conference over the last 18 months. Instead, he emphasized the basics, the cloud infrastructure pieces that he so aptly called the “building blocks.” Those include storage, compute, databases and Amazon’s plethora of custom silicon.
The chips are not only designed to run more efficiently on Amazon hardware than third-party silicon; they also help lower the cost of… wait for it… processing AI workloads and building AI-fueled applications.
You knew that AI would come up sooner or later. Garman spent the first 75 minutes of the two-and-a-half-hour address talking about those crucial infrastructure elements, then spent the next hour, along with his boss Andy Jassy and assorted guests, putting competitors on notice that Amazon wasn’t ceding AI to Microsoft, Google or anyone else.
You may recall that last year, Microsoft was riding high. Its partnership with OpenAI seemed like pure strategic genius, and Amazon frankly felt like it was running a few steps behind. As Scott Raney, a partner at Redpoint Ventures, told me last year, AWS was in the unusual position of playing from behind:
“This might be the first time where people looked and said that Amazon isn’t in the pole position to capitalize on this massive opportunity. What Microsoft’s done around Copilot and the fact Q comes out [this week] means that in reality, they’re absolutely 100% playing catch-up,” Raney said.
But perhaps the biggest announcement of last year’s show, though it wasn’t entirely clear at the time, was Amazon Bedrock, a model management platform that Amazon promised would support just about any model (except OpenAI directly). Microsoft was all in on OpenAI, but Amazon wanted to let enterprise IT manage every other model under the sun, even future ones that didn’t exist yet.
It remains in many ways a bolder vision than Microsoft’s, which is concentrating mainly on OpenAI in word and deed, while supporting other models through the Azure platform. Yet Microsoft was riding a wave of hype from mainstream media and Wall Street alike.
This year, Amazon went as far as to announce a set of new Nova LLMs created in-house, giving the company a crucial missing piece, while not forcing customers to use Nova if they have other preferences.
Let’s be clear: there are still lots of open questions, including whether these in-house models are any good. But one thing did come into focus this week. Garman reminded people that AWS is first and foremost an infrastructure platform, with the AI pieces layered on top of it. While Google all but ignored the infrastructure piece at GCN in favor of all AI, all the time, Garman reminded us that Amazon has a deep set of technical capabilities, and in a world where AI is taking center stage, that could be more important than ever.
-Ron
Photo by Ron Miller