Guest: Dominic Wilde
Company: Mirantis
Show Name: An Eye on AI
Topic: AI Infrastructure
As enterprises push deeper into AI, they’re discovering that traditional cloud and virtualization strategies can’t keep up. In this clip from our conversation with Dominic Wilde, SVP & GM of the Core Business at Mirantis, he explains why the company’s latest update — k0rdent Virtualization — is designed to unify VMs and containers under a Kubernetes-native control plane. What started as a tool for managing multi-cluster sprawl is fast becoming a foundational layer for organizations moving toward performance-hungry AI workloads.
📹 Going on record for 2026? We're recording the TFiR Prediction Series through mid-February. If you have a bold take on where AI Infrastructure, Cloud Native, or Enterprise IT is heading—we want to hear it. Reserve your slot.
k0rdent was born from a clear problem: enterprises were entering an era of multi-cluster, hybrid cloud operations without the tools to manage them effectively. Over time, Kubernetes spread across teams, environments, and clouds, creating operational islands that slowed transformation. Wilde describes k0rdent as a Kubernetes-native, declarative “Uber control plane” that connects clusters across private and public clouds using technologies like Cluster API. The mission was straightforward — bring order to Kubernetes sprawl while giving companies the flexibility to modernize without being forced into a rip-and-replace rebuild.
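“Declarative” here means the clusters themselves are described as Kubernetes objects that a management plane continuously reconciles. As a rough illustration of the pattern, here is a minimal generic Cluster API `Cluster` resource — a sketch only, since k0rdent layers its own CRDs on top and the names below are illustrative:

```yaml
# Generic Cluster API object: the management control plane reconciles
# this declaration into a running child cluster. Names are illustrative.
apiVersion: cluster.x-k8s.io/v1beta1
kind: Cluster
metadata:
  name: team-a-aws          # one of many clusters the management plane owns
  namespace: fleet
spec:
  clusterNetwork:
    pods:
      cidrBlocks: ["192.168.0.0/16"]
  infrastructureRef:        # points at a provider-specific resource (AWS here)
    apiVersion: infrastructure.cluster.x-k8s.io/v1beta2
    kind: AWSCluster
    name: team-a-aws
```

Because every cluster is just another declared object, the same GitOps and RBAC workflows that manage workloads can manage the fleet itself — which is what lets one control plane span private and public clouds.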
But as AI adoption accelerates, k0rdent’s role has expanded.
Enter k0rdent Virtualization. This new layer allows VMs and containers to operate together within a single interface and management system. Wilde emphasizes that this unification isn’t just about streamlining operations; it’s a way to break down long-standing silos and help enterprises transition to modern architectures at their own pace. Many organizations still rely heavily on VM-based workloads, especially for mission-critical applications. At the same time, cloud-native teams are pushing ahead with containers and Kubernetes. AI widens this divide even further.
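The clip doesn't show k0rdent Virtualization's exact API, but the general pattern for running VMs as first-class Kubernetes objects (popularized by the KubeVirt project) gives a sense of what "one interface for both" means in practice. Treat this as an illustrative sketch, not k0rdent's specific schema:

```yaml
# A VM declared as a Kubernetes custom resource, living alongside Pods.
# Sketch based on the KubeVirt API; k0rdent's surface may differ.
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: legacy-erp-vm       # a mission-critical VM workload
spec:
  running: true
  template:
    spec:
      domain:
        devices:
          disks:
            - name: rootdisk
              disk:
                bus: virtio
        resources:
          requests:
            memory: 4Gi
      volumes:
        - name: rootdisk
          containerDisk:
            image: quay.io/containerdisks/fedora:latest
```

Since the VM is a regular custom resource, `kubectl`, RBAC, scheduling, and GitOps tooling apply to it exactly as they do to container workloads — that shared operational model is what breaks down the silo.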
Mirantis sees this tension firsthand. Wilde notes that k0rdent has been getting significant traction in the AI space, particularly among fast-moving NeoCloud providers — cloud platforms built specifically to support large-scale AI workloads. These environments surface new virtualization challenges that go far beyond simply replacing VMware or reorganizing compute resources. AI workloads require high-performance GPU access, predictable scheduling, isolation models, and the ability to carve up GPU resources efficiently.
This is where k0rdent Virtualization takes on new importance. Mirantis is introducing advanced capabilities through a k0rdent AI variant, targeting AI-specific performance patterns. GPU deployment is only the first step; organizations also need to slice and allocate GPUs in ways that align with the compute patterns of their AI training and inference pipelines. Wilde calls virtualization for AI “a more complex topic” — a space where legacy virtualization platforms fall short.
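"Slicing" a GPU typically means exposing fractional devices — for example NVIDIA MIG partitions — as schedulable Kubernetes resources, so an inference pod can claim a slice rather than a whole accelerator. A hedged sketch, assuming a device plugin such as the NVIDIA GPU Operator has MIG enabled (resource names vary by GPU model, and the image tag is illustrative):

```yaml
# Pod requesting one MIG slice rather than a full GPU.
# Assumes MIG-capable hardware and a device plugin advertising the resource.
apiVersion: v1
kind: Pod
metadata:
  name: inference-worker
spec:
  containers:
    - name: model-server
      image: nvcr.io/nvidia/tritonserver:latest   # illustrative image
      resources:
        limits:
          nvidia.com/mig-1g.5gb: 1   # one 1g.5gb partition, not a whole GPU
```

Training jobs might instead request whole GPUs (`nvidia.com/gpu`), which is why aligning slice sizes with training versus inference patterns becomes a scheduling problem in its own right.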
The ability to unify VMs and containers under one Kubernetes-native management layer creates a foundation for this new wave of AI infrastructure. Enterprises get the continuity of their existing VM estates while gaining access to modern, cloud-native operating models. Teams can incrementally modernize instead of being forced into abrupt architectural change.
Another major advantage of k0rdent Virtualization is the ability to maintain composability. Wilde highlights a guiding principle inside Mirantis: “We have an opinion, but we don’t have an agenda.” This means customers can keep the parts of their environment that matter, including their preferred Kubernetes distributions. k0rdent provides an opinionated default — k0s — but allows teams to bring their own distro if desired. This flexibility has become critical as enterprises architect their future AI infrastructure.
As companies continue to adopt GPUs and integrate AI into their operations, virtualization strategies must evolve. This includes everything from GPU passthrough to workload-specific memory mappings and fine-grained performance tuning. Mirantis’ work with NeoClouds is accelerating its understanding of these needs. Wilde explains that AI infrastructure is revealing new challenges “as you go to scale,” and k0rdent’s composable model allows Mirantis to build solutions that adapt as these challenges emerge.
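GPU passthrough itself can be expressed declaratively in this model. In KubeVirt-style APIs — used here as an illustrative stand-in for what a VM-with-GPU declaration looks like, with a hypothetical device name — a VM claims a host GPU like so:

```yaml
# VM with a host GPU passed through. Sketch only: the deviceName is
# hypothetical and depends entirely on the host's exposed devices.
apiVersion: kubevirt.io/v1
kind: VirtualMachine
metadata:
  name: training-vm
spec:
  running: true
  template:
    spec:
      domain:
        devices:
          gpus:
            - name: gpu0
              deviceName: nvidia.com/EXAMPLE_GPU   # hypothetical device name
        resources:
          requests:
            memory: 64Gi
```

Memory mappings (huge pages, NUMA pinning) and finer performance tuning layer onto the same declaration, which is the kind of AI-specific capability legacy virtualization platforms were never designed to express.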
Looking ahead, k0rdent Virtualization is likely to become a core tool for any enterprise blending legacy systems with cloud-native and AI-driven workloads. It provides the operational unification needed to manage everything in one place, while also supporting the advanced capabilities AI platforms demand. From GPU deployment to cluster lifecycle management, the platform offers a path toward modernization without sacrificing stability or control.