Kubernetes adoption has moved far beyond cloud data centers. The container orchestration platform now powers edge computing infrastructure in paint stores, baseball parks, and commercial airplanes—managing checkout systems, entertainment platforms, and ordering workflows at scale. Meanwhile, AI infrastructure workloads are driving 10x deployment frequency growth, with hyperscalers like CoreWeave using Kubernetes to manage thousands of customer clusters.
The Guest: Hong Wang, Co-founder and CEO at Akuity
The Bottom Line
- Kubernetes isn’t just for cloud—it’s the infrastructure backbone for AI hyperscalers, edge computing at retail/sports venues, and even flying edge locations like commercial airplanes
***
Speaking with TFiR, Hong Wang of Akuity described the current state of Kubernetes adoption and explained what is driving the company past 43 million deployments across 100+ customers.
What Is Driving Akuity’s Growth in AI Infrastructure?
Wang identified two primary growth drivers: Kubernetes becoming ubiquitous infrastructure, and AI workloads creating massive deployment frequency increases. Akuity’s largest customer, CoreWeave (an AI hyperscaler), uses the platform to manage thousands of customer clusters for AI training and inferencing workloads.
Hong Wang: “Two things. Number one is that Kubernetes is getting everywhere. You can hear a lot of stories about OpenAI—what’s used to manage the infrastructure is actually Kubernetes, managing thousands of nodes. The reason we’re gaining that momentum, and why we see that growth, is that Kubernetes is becoming the backbone system for AI—for AI training and for AI inference. Our biggest customer is CoreWeave, an AI hyperscaler, and they’re using us to deploy to their customer clusters. As they’re scaling up, as they have more and more AI applications to deploy, we saw 10× growth last year in deployment frequency. There are so many clusters we’re managing, so many apps we’re managing, and every app is being deployed more frequently.”
The 10x deployment frequency surge reflects the operational reality of AI infrastructure: models are continuously retrained, inference endpoints are frequently updated, and customer clusters are provisioned at scale. Kubernetes has become the de facto orchestration layer for these workloads, and platforms like Akuity supply the deployment automation needed to manage rollouts across thousands of clusters.
Broader Context: Edge Computing at Scale
Beyond AI infrastructure, Wang shared several edge computing use cases that illustrate how far Kubernetes adoption has spread. These deployments operate at the physical edge—inside retail stores, sports venues, and aircraft—rather than in centralized cloud data centers.
Hong Wang: “We have a customer that’s a paint store with franchises—1,000 stores across the United States. They run a small Kubernetes cluster in each store to power the checkout system, advertisement system, and ordering system. They want to centrally manage those applications across the continent. So if today is Black Friday and I want to give a discount to all the U.S. West states, rather than calling every store, they push a configuration to all the clusters in the West region. The clerk or staff doesn’t have to do anything—everything happens automatically; they just scan, and it’s 50% off today. Great. It’s centralized management.”
This pattern—centralized control over distributed edge infrastructure—solves a fundamental operational challenge for franchises and distributed organizations. Instead of manually updating software or configurations at each location, operations teams push updates to Kubernetes clusters at scale, ensuring consistency across all locations.
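Akuity's platform is built around the open-source Argo CD project, where this kind of region-scoped rollout is typically expressed declaratively. Below is a minimal, hypothetical sketch using an Argo CD `ApplicationSet` with a cluster generator: it targets every registered store cluster carrying a `region: us-west` label, so committing the Black Friday discount config to the Git repo propagates it to the whole region without anyone visiting a store. The label, repo URL, path, and app name are illustrative assumptions, not details from the interview.

```yaml
# Hypothetical sketch: roll a config change out to every store cluster
# registered with the control plane and labeled region=us-west.
# Repo URL, path, and labels are illustrative assumptions.
apiVersion: argoproj.io/v1alpha1
kind: ApplicationSet
metadata:
  name: store-apps-us-west
spec:
  generators:
    - clusters:
        selector:
          matchLabels:
            region: us-west        # assumed label set when each store cluster is registered
  template:
    metadata:
      name: 'checkout-{{name}}'    # one Application per matching cluster
    spec:
      project: default
      source:
        repoURL: https://example.com/store-configs.git  # hypothetical config repo
        targetRevision: main
        path: checkout
      destination:
        server: '{{server}}'       # filled in per cluster by the generator
        namespace: checkout
      syncPolicy:
        automated: {}              # each store cluster pulls the new config automatically
```

The point of the pattern is that the "push" is really a pull: operations commits one change centrally, and every matching edge cluster reconciles itself against it.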
Wang also highlighted Major League Baseball as a customer, running Kubernetes in every ballpark to manage applications at each venue. The same centralized management model applies: updates are pushed to clusters across all ballparks simultaneously, rather than requiring manual intervention at each location.
Perhaps the most surprising use case: commercial airplanes. Wang explained that newer aircraft run Kubernetes clusters to manage in-flight systems.
Hong Wang: “In commercial airplanes, newer ones are running Kubernetes clusters now. We have customers using our platform to manage applications inside commercial airplanes. It’s the same story—these are flying edge locations, essentially flying stores. On the airplane, you have entertainment systems and ordering systems—non-mission-critical systems. Those systems need to be maintained, upgraded, and monitored. They’re running Kubernetes; they’re using our software to manage the fleet.”
The “flying edge location” framing is apt: aircraft operate as disconnected edge environments that periodically sync with centralized infrastructure. Kubernetes provides the orchestration layer, and platforms like Akuity handle fleet-wide deployment automation. While these are non-mission-critical systems (not flight control or navigation), the operational requirements—reliability, remote updates, monitoring—are the same as any distributed infrastructure deployment.
Wang also addressed the “boring” but equally important multi-cloud use case: enterprises running clusters across AWS, Azure, and on-premises environments. Akuity provides unified management across all three, allowing platform teams to treat hybrid infrastructure as a single deployment target rather than managing separate tooling for each environment.
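In Argo CD terms (the open-source engine underneath Akuity's platform), making a hybrid fleet look like a single deployment target starts with registering each cluster as a cluster Secret in the control plane, regardless of where it runs. The sketch below is a hypothetical registration of an on-prem cluster; the API endpoint, names, and `environment` label are assumptions, and the TLS fields are placeholders.

```yaml
# Hypothetical sketch: an Argo CD cluster Secret registering an on-prem
# cluster as a deployment target alongside AWS and Azure clusters.
# Endpoint, names, and labels are illustrative assumptions.
apiVersion: v1
kind: Secret
metadata:
  name: onprem-cluster
  labels:
    argocd.argoproj.io/secret-type: cluster   # marks this Secret as a cluster registration
    environment: on-prem                      # assumed label used for targeting
type: Opaque
stringData:
  name: onprem-datacenter
  server: https://10.0.0.10:6443              # hypothetical API server endpoint
  config: |
    {
      "tlsClientConfig": {
        "caData": "...",
        "certData": "...",
        "keyData": "..."
      }
    }
```

Once AWS, Azure, and on-prem clusters are all registered this way, applications can target them by label rather than by environment-specific tooling, which is what lets a platform team treat the hybrid estate as one deployment surface.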
Watch the full TFiR interview with Hong Wang here.