Computing is shifting from the cloud to the edge

In a prescient presentation in July 2017, Peter Levine, a general partner at Andreessen Horowitz, predicted a future where, out of necessity, computing would shift from the cloud to the edge, closer to where the workloads lived. Remember that in 2017, cloud computing was only beginning to gain real traction in the enterprise, and most mobile computing was also being processed in the cloud. So this was a particularly bold statement, and he recognized that.
Yet he foresaw a world where complex devices like self-driving cars would be generating so much data that it had to be processed on the device, because any latency, no matter how brief, would be too much. He used the example of a self-driving car approaching a stop sign: it requires real-time processing to understand that it needs to stop, and by the time the cloud returned an answer, the car would have blown through the sign, not exactly the optimal result.

Levine saw that the devices themselves were becoming like self-contained data centers, processing information in real time, and then sharing information with other similar devices. “You think about a self-driving car, it's effectively a data center on wheels, and a drone is a data center with wings, and a robot is a data center with arms and legs, and a boat is a floating data center and on and on it goes,” Levine said.
The future is now
Fast forward almost eight years, and Levine's prediction has proven surprisingly accurate. While he didn't get every detail right in that 2017 presentation, he did an amazing job of outlining where we would be in 2025, using a time frame of five to ten years. While we are still awaiting more mainstream adoption for most of these use cases, his basic premise still applies.
As workloads shift to AI and the Internet of Things continues to expand, we are continually hearing about the shift to the edge that Levine predicted in 2017. For instance, not long ago, we interviewed Akamai CIO Kate Prouty, whose company started as a content delivery network designed to move content closer to the end user, essentially an early edge use case. Akamai has since expanded into security and cloud solutions, and it acquired Linode in 2022 for $900 million, primarily to provide a set of services to fuel its edge computing strategy.

Even Google, a major cloud provider, highlighted edge use cases at Google Cloud Next earlier this month. That is pretty interesting for a company that sells cloud services, but Google likely recognizes that we are living in a world that will increasingly require processing at the edge, and it simply wants to provide solutions to meet those needs for customers.
Alex Williams, founder of The New Stack, writing on LinkedIn, noted the irony of a cloud provider shifting to edge products: “Google on-premises? The Internet is a marvel. But lately I am marveled by what gets done at the edge, off the Internet,” he wrote.
One example he gave was running inference at the edge, that is, running a large language model locally for security and speed, a scenario that would certainly appeal to a lot of CIOs and CISOs from a data-safety perspective.
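To make the local-inference idea concrete, here is a minimal sketch of how an application might talk to a model hosted on the same machine rather than a cloud API. It assumes an Ollama server running locally; the endpoint, model name, and prompt are illustrative, not something Williams described.

```python
import json
import urllib.request

# Assumed local inference endpoint (Ollama's default port); nothing leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Package a prompt as a request to the local inference server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask_locally(model: str, prompt: str) -> str:
    """Send the prompt to the locally hosted model and return its reply."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a local server with the model already pulled):
# print(ask_locally("llama3", "Summarize this incident report in two sentences."))
```

The appeal for a CIO is visible in the shape of the code itself: the prompt and the response never cross the network boundary, so sensitive data stays on the device doing the work.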
As data becomes increasingly valuable and dense, processing it in the cloud becomes harder to do in a timely fashion for a number of use cases, and we're likely to see a growing shift toward edge computing. That doesn't mean the cloud goes away (even Levine in his initial predictions didn't go that far), but it could change the role of the cloud to focus more on storage and training. Although these shifts tend to take much longer than we imagine, it appears that Levine's vision could be playing out almost exactly as he thought it would, and that's pretty remarkable.
~Ron
Featured photo by Ivan Cujic on Pexels.