Cloud Native

Why Cloud Native Success Depends on Culture, Not Technology—And Open Models Save $25B


Guest: Hilary Carter
Company: The Linux Foundation
Show Name: KubeStruck
Topic: Cloud Native

As Kubernetes becomes invisible infrastructure, organizations face a surprising reality: the biggest barriers to cloud native success aren’t technical anymore. They’re cultural. Hilary Carter, Senior Vice President of Research at the Linux Foundation, reveals what the 2025 CNCF Annual Cloud Native Survey shows about the real challenges holding organizations back—and why the shift from building AI models to using open models could save the industry $25 billion in opportunity costs. The message is clear: culture eats strategy for breakfast, and complexity still matters.

The Misconception: It’s Not About Technology Anymore

When asked about the biggest misconception organizations have about Kubernetes and cloud native adoption at this mature stage, Carter’s answer cuts straight to the core issue: “The numbers really reveal that it’s not just about technical aspects or technical challenges. What this study revealed is that the current challenges relate to cultural change and managing change.”

This represents a fundamental shift in how we need to think about cloud native success. The early days of Kubernetes adoption were dominated by technical questions: Can we deploy it? Will it scale? How do we manage orchestration? Those questions have largely been answered. Now the challenges are human: “The human dynamics of modernization and computing decision making and strategy and how we go about best practices. So that is the current pain point.”

Carter breaks down what this means in practice: “How we manage culture, how we manage technological innovation, how we manage change, and how we manage practices—that’s really what this study reveals: that these dynamics are worth organizations paying some attention to, and that they’re really linked to success.”

Culture as Competitive Advantage

The study reveals that effective cultural management shows up in specific practices: how organizations manage the ingress of open source code, how they optimize projects, and how they release software. “How we manage that most effectively is through greater cultural collaboration, both across projects and within ecosystems,” Carter explains.

One practical recommendation stands out: participating in open source events like KubeCon. Carter describes this as “a way to help infuse your organization with an open source culture, a mindset, an awareness of best practices, and an awareness of how we can get over some of the complexities around tooling, security, and adoption, and bring out the most optimal scenario within our organizations.”

This isn’t about attending conferences for networking—it’s about cultural transformation. Organizations that succeed with cloud native technologies are those that embrace open source culture, not just open source tools. As Carter puts it directly: “Culture eats strategy for breakfast, and so too does it in the context of cloud native success.”

Complexity: The Persistent Challenge

While cultural challenges have emerged as the primary barrier, technical complexity hasn’t disappeared. Carter is candid about this: “I’ll tell you what surprised us, and I’ll say quite honestly, it wasn’t necessarily from this past study, but it was when we did the Kubernetes Turns 10 study that revealed that complexity was still a major barrier to Kubernetes adoption.”

That finding was informative because it required reflection: “How can we make these processes easier? How can we make this project less complex, more accessible? What do we need to close that gap?” The answer isn’t simple because, as Carter notes, “the goalpost is always moving.”

Complexity isn’t unique to Kubernetes—it’s a challenge across open source projects. Carter conducts qualitative interviews routinely and finds that “sometimes the technological decision-making choice comes down to: Will my ability to onboard my team members be successful? Does this project have the right documentation and the right onboarding tools to make my job easier?”

This highlights a critical point: organizations evaluate technology not just on capabilities but on practical adoption factors like documentation quality and onboarding ease. Projects that address complexity through better documentation, clearer onboarding paths, and more accessible tooling have a competitive advantage in the adoption landscape.

The AI Model Shift: From Building to Using

The biggest shift Carter has observed in how organizations approach cloud native and AI infrastructure over the past year relates to AI model strategies.

“The trend a couple of years ago was all about models and foundation models—building models and training those models. What we learned very quickly is that that’s an expensive thing to do. It’s really expensive to build a model from scratch and train it. It takes a lot of energy, resources, time, and talent.”

The CNCF survey revealed a significant change: “More than half of organizations are not doing that. They’re taking a model and using a foundation model and training it.” The inference workload data confirms the pattern: rather than building models from scratch, organizations are taking existing open foundation models and fine-tuning them with their own data.

Carter sees this as “a great sign for open models” and notes that Linux Foundation research studies have made this a recommendation: “Optimize your resources. No need to recreate the wheel. Use the models that are highly effective, optimize your costs, and bring them into your environments and train them with your data. Much more cost effective way to go.”

This shift signals organizational maturity in AI adoption. Rather than pursuing the expensive and resource-intensive path of building foundation models, organizations are thinking strategically about sovereign AI, independent AI systems, and data—while using open models within those contexts.
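The economic logic behind this shift can be made concrete with a rough back-of-envelope comparison. The sketch below is purely illustrative: every figure in it (GPU-hour counts, hourly rates) is a hypothetical assumption for demonstration, not data from the CNCF survey or the Linux Foundation research.

```python
# Illustrative back-of-envelope comparison: pretraining a foundation
# model from scratch vs. fine-tuning an existing open model.
# All numbers are hypothetical assumptions, not survey data.

def training_cost(gpu_hours: float, cost_per_gpu_hour: float) -> float:
    """Simple compute-only cost estimate in dollars."""
    return gpu_hours * cost_per_gpu_hour

COST_PER_GPU_HOUR = 2.0          # hypothetical dollars per GPU-hour

pretrain_gpu_hours = 1_000_000   # hypothetical: training from scratch
finetune_gpu_hours = 2_000       # hypothetical: fine-tuning an open model

pretrain_cost = training_cost(pretrain_gpu_hours, COST_PER_GPU_HOUR)
finetune_cost = training_cost(finetune_gpu_hours, COST_PER_GPU_HOUR)

print(f"Pretrain from scratch: ${pretrain_cost:,.0f}")
print(f"Fine-tune open model:  ${finetune_cost:,.0f}")
print(f"Compute cost ratio:    {pretrain_cost / finetune_cost:.0f}x")
```

Even with these made-up inputs, the shape of the argument holds: compute cost scales with GPU-hours, and fine-tuning needs orders of magnitude fewer of them than pretraining, before counting the talent, energy, and time Carter mentions.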

The $25 Billion Opportunity Cost

Carter points to research by Frank Nagle, the Linux Foundation’s chief economist, that quantifies the cost of not using open models. “It’s really about encouraging more organizations to use open models because of the cost savings. And it’s not unlike using other types of open source technologies.”

The numbers are striking: “While organizations are running inference workloads on open models, there’s still almost an equal number that are not, and that’s a cost. That’s an opportunity cost. Frank Nagle calculates it to be approximately $25 billion in cost from not using open models.”

This $25 billion figure represents real money organizations are leaving on the table by building proprietary models instead of leveraging existing open models. It’s the AI equivalent of building a proprietary operating system instead of using Linux—technically possible, but economically inefficient.

Carter’s conclusion is straightforward: “The big picture is that open source, open models, and open cloud orchestration tooling are still the most cost-effective way to optimize your innovation strategy today.”

What This Means for Infrastructure Leaders

For organizations navigating cloud native and AI infrastructure decisions, the implications are clear:

First, recognize that your biggest challenges aren’t technical—they’re cultural. Invest in change management, foster open source culture, and participate in community events not just for technology insights but for cultural transformation.

Second, address complexity through better documentation, onboarding processes, and team enablement. The ability to onboard team members successfully is as important as the technology’s capabilities.

Third, rethink your AI model strategy. If you’re building foundation models from scratch, ask whether that investment makes strategic sense. More than half of organizations have shifted to using open models for inference workloads—a more cost-effective path that avoids the $25 billion opportunity cost of proprietary model development.

The post-adoption era of cloud native requires different skills than the early adoption phase. Technology implementation is table stakes. Cultural transformation and strategic resource optimization—whether in how you manage open source adoption or which AI models you use—are the new differentiators.
