Cloud Native

With open source k0rdent, Mirantis lays the foundation for an AI-Native stack


Note: With this interview, we are launching our new show called ‘An Eye On AI’, focusing on innovation and real use cases of AI. Subscribe to our channel to keep up with our latest shows!

With the proliferation of artificial intelligence (AI) in the enterprise world, Mirantis Co-Founder and CEO Alex Freedland envisions a new AI-native stack designed for future workloads. In line with this vision, the pure-play platform company has announced a new open source project called k0rdent which Freedland believes will serve as the OpenStack equivalent for the infrastructure layer of the AI-native stack.

In a nutshell, k0rdent is a Kubernetes-based, multi-cluster management solution, also known as a Distributed Container Management Environment (DCME). What sets it apart, according to Freedland, is that it’s the only truly open source solution of its kind available in the market today.

k0rdent enables platform engineers to manage multiple clusters irrespective of where they are running – public cloud, private cloud, on-premises, or at the edge – with the aim of simplifying workload management across all of these environments.
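As an illustrative sketch (not drawn from the interview), k0rdent expresses managed clusters declaratively through Kubernetes-style objects. The API group, template name, and credential reference below are assumptions based on the project’s documented conventions and may differ between releases:

```yaml
# Hypothetical k0rdent ClusterDeployment: declares a managed child cluster
# on AWS. Field names and values here are illustrative assumptions, not
# verified against a specific k0rdent release.
apiVersion: k0rdent.mirantis.com/v1alpha1
kind: ClusterDeployment
metadata:
  name: demo-aws-cluster
  namespace: kcm-system
spec:
  template: aws-standalone-cp       # cluster template to instantiate (assumed name)
  credential: aws-cluster-credential # reference to cloud credentials (assumed)
  config:
    region: us-east-1
    controlPlaneNumber: 1
    workersNumber: 2
```

The same declarative pattern would apply whether the target is a public cloud, a private cloud, or an edge site; only the template and credential change.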

Mirantis emerging as a force in open source

As many traditional industry players are becoming less vocal about open source, Mirantis is making a strong comeback. After stepping away from its open-source roots for a time, the company, led by Freedland, is returning to open source with more ambitious goals. To spearhead this effort, the company hired seasoned open source veteran Randy Bias to lead its open source initiative. The company has already released several fully open source projects, including k0smotron, Rockoon, and now k0rdent.

k0rdent is fully open-source, underscoring Mirantis’ renewed focus on open-source contributions. The company, which recently upgraded its Cloud Native Computing Foundation (CNCF) membership to Gold, has since launched multiple open-source projects. k0rdent aims to support platform engineers in managing clusters across different environments while providing flexibility for enterprises that want to avoid vendor lock-in. Freedland says, “People are moving away from vertically integrated, unwieldy distributions and need the flexibility and freedom to be able to run their workloads anywhere, on any cloud.”

Building the foundation for the AI Stack

Many enterprises are transitioning from training AI models to inferencing, introducing a need for seamless infrastructure that can operate efficiently across multiple locations. Kubernetes has become the standard orchestration layer for AI workloads, which aligns with k0rdent’s ability to help enterprises run AI applications across hybrid and multi-cloud environments. Freedland emphasizes that the AI-driven demand for multi-cluster management further reinforces k0rdent’s importance in today’s computing landscape. This complexity is compounded by challenges such as regulatory restrictions, performance optimization, and integration with existing infrastructure.

Deploying AI applications presents numerous obstacles, including data privacy concerns, performance requirements, and integration complexity. Many organizations must run AI models close to proprietary or regulated data because of privacy laws and security concerns. Additionally, AI workloads often require high-performance computing (HPC) infrastructure to ensure efficient inferencing, meaning enterprises need solutions that allow them to manage compute resources seamlessly. k0rdent is designed to address these challenges by offering a flexible, open-source approach to orchestrating multi-cluster AI workloads.

Freedland envisions k0rdent becoming a foundational component of the AI-native technology stack. As AI adoption accelerates, enterprises will require infrastructure tools that can support workloads at scale. The recent emergence of DeepSeek has further validated the economic viability of open-source AI solutions, demonstrating that AI applications can be built and deployed more efficiently with open-source tools, challenging the sustainability of proprietary models like OpenAI’s. Freedland believes this shift will drive greater demand for open-source AI frameworks that enterprises can deploy locally while maintaining control over their data.

It’s all about k0s family

Discussing the evolution of Mirantis’ k0s projects, Freedland explains that k0s was designed as a lightweight Kubernetes distribution tailored for edge computing, with over a million clusters currently running. k0smotron was built as a multi-cluster manager to simplify Kubernetes operations across multiple environments. k0rdent represents the next stage in this evolution, expanding beyond Kubernetes cluster management to include monitoring, networking, and data orchestration services.

Mirantis has evolved significantly from starting as a pure-play OpenStack company into a full-fledged platform company centered around Kubernetes. Freedland believes that a true platform must earn its status through widespread adoption. To achieve this, Mirantis aims to deliver open-source projects that address real enterprise challenges while fostering industry-wide innovation.

While open-source adoption is a priority, Freedland acknowledges that many enterprises require commercial support for security, compliance, and integration needs. Mirantis plans to provide enterprise-grade support for k0rdent, enabling companies to deploy it confidently while maintaining an open-source-first approach. Freedland believes the success of open-source projects depends on balancing community-driven innovation with vendor-backed stability, ensuring enterprises have the flexibility to scale without being locked into proprietary ecosystems.

Freedland anticipates k0rdent playing a crucial role in shaping AI and multi-cluster Kubernetes infrastructure in the future. As enterprises increasingly adopt AI-driven workloads and seek scalable solutions, k0rdent can help manage complexity across distributed environments. Mirantis will continue to expand its open-source ecosystem while ensuring that organizations have access to the technologies they need to drive innovation in cloud-native and AI-driven computing.

Guest: Alex Freedland (LinkedIn)
Company: Mirantis

This summary was written by Emily Nicholls.
