AI Is Everywhere at KubeCon — But Kubernetes Isn’t an AI Platform

Guests: Kat Cosgrove | Billy Thompson
Companies: Minimus | Akamai
Show Name: KubeStruck
Topics: Kubernetes, Open Source, CNCF

Walk the floor at KubeCon today and it feels like AI has swallowed everything. Keynotes, booths, product demos — nearly every conversation is framed around AI. The implication is clear: Kubernetes has become the foundation for the AI-native future.


Kat Cosgrove, Kubernetes Release Team Subproject Lead and Head of Developer Advocacy at Minimus, strongly disagrees.

In this clip, Kat cuts through the noise and resets expectations. Kubernetes isn’t doing anything special to enable AI workloads. It hasn’t suddenly evolved into an AI platform. It’s simply doing what it has always done well: orchestrating distributed containerized applications. AI workloads run on Kubernetes for the same reason databases, web services, and control planes do — because Kubernetes is the default infrastructure layer for complex systems.

That distinction matters.

Kat challenges a popular narrative that AI will soon become the dominant use case for Kubernetes. She points to recent industry reports that extrapolate AI growth into inevitability — and calls that assumption flawed. The surge in AI workloads today, she argues, has more to do with hype cycles and venture capital incentives than with durable architectural shifts.

Her comparison is blunt: this looks a lot like Web3.

A small number of genuinely innovative tools exist. But they’re buried under an ocean of shiny wrappers — products that slap “AI” onto existing services without delivering real value. Many of these tools are expensive to run, difficult to justify in production, and unlikely to survive once investor enthusiasm cools.

That skepticism extends into the open source world as well. AI introduces unresolved questions around data provenance, model transparency, and licensing. What data trained the model? Is that data open? Does closed training data undermine the openness of the output? Foundations and projects are scrambling to define policy, but there are no settled answers yet. Kubernetes itself is standing up new working groups precisely because these questions are far from solved.

Billy Thompson, Senior Global DevOps & Platform Engineering, Office of the CTO at Akamai, adds an important developer reality check. Survey data consistently shows that AI features rank low on the priority list for working engineers. This isn’t anti-innovation — it’s pragmatism. Developers care about tools that solve concrete problems, remove friction, and justify their cost. AI, like Kubernetes before it, only delivers value when teams clearly understand the problem it’s meant to solve.

The industry pattern is familiar. New paradigms attract massive investment. Some teams use them thoughtfully and see real gains. Many others rush in, spend heavily, and walk away disappointed. AI is not unique in this regard — it’s simply the current focal point.

The takeaway from this clip is not anti-AI. It’s anti-hype. Kubernetes remains what it has always been: infrastructure. Strong cloud native fundamentals matter far more than chasing whatever dominates conference agendas this year.
