AI Infrastructure

OpenInfra & StackHPC on Building the Next Generation of AI Clouds

Guests: Kendall Nelson | Stig Telfer
Organizations: OpenInfra Foundation | StackHPC
Show Name: An Eye on AI
Topic: AI Infrastructure

As AI workloads grow more complex, open collaboration is proving essential for building scalable, vendor-neutral infrastructure. At the OpenInfra Summit in Paris, Kendall Nelson of the OpenInfra Foundation and Stig Telfer of StackHPC unveiled a new white paper exploring OpenStack’s role in enabling AI-ready clouds.

The discussion centers on the OpenInfra community’s newly released white paper, “Open Infrastructure for AI: OpenStack’s Role in the Next Generation Cloud.” Kendall Nelson explains that the paper helps the broader community understand how OpenStack supports modern AI workloads — covering everything from core AI scenarios to detailed reference architectures.

Stig Telfer outlines StackHPC’s involvement, noting that the company contributed sections on high-performance networking and a real-world case study with 6G AI Sweden, which runs AI workloads using Nvidia HGX pod designs combined with open infrastructure software stacks.

Both guests emphasize that the paper itself was created through open collaboration — crowdsourced contributions from the OpenInfra AI Working Group, with input from members across the community.

The interview also dives into how OpenStack’s flexibility supports digital sovereignty across Europe, allowing organizations to maintain control over their data while adopting cutting-edge AI technologies. Nelson highlights that more than 20 OpenInfra member companies are already running AI workloads on OpenStack, signaling its growing importance in enterprise AI ecosystems.

They conclude with a look ahead: the AI Working Group plans to develop a “Containers for AI” white paper next, extending collaboration to other open-source communities. Both speakers stress the need for continued engagement — encouraging developers and enterprises alike to contribute insights that will help evolve the open-source stack for AI.

Here is an edited Q&A from the interview:

Swapnil Bhartiya: Kendall, tell us about this white paper and why it’s so significant.

Kendall Nelson: The paper helps the community understand how OpenStack supports AI workloads — it includes case studies from StackHPC, China Mobile, FPT Smart Cloud, and Rackspace.

Swapnil Bhartiya: Stig, what did collaboration look like, and what was StackHPC’s role?

Stig Telfer: The process embodied open collaboration. We contributed content on high-performance networking and an AI case study with 6G AI Sweden.

Swapnil Bhartiya: Tell us more about 6G AI Sweden and their setup.

Stig Telfer: They combine Nvidia HGX reference designs with OpenStack and Kubernetes to build flexible, vendor-neutral AI infrastructure.

Swapnil Bhartiya: How does OpenStack evolve to meet AI needs?

Kendall Nelson: OpenStack has already become a foundation for AI workloads — with millions of cores running globally. It’s evolving rapidly to support GPU enablement and large-scale performance.

Swapnil Bhartiya: What’s next for the OpenInfra AI Working Group?

Kendall Nelson: More collaboration — including a new white paper focused on containers for AI workloads and more community-led show-and-tell sessions.

Swapnil Bhartiya: Any advice for organizations starting their AI infrastructure journey?

Stig Telfer: Work iteratively and involve open-source experts early. Open technologies give you control, flexibility, and innovation at scale.

Kendall Nelson: Get involved with the community — share your challenges so we can evolve OpenStack together for AI.
