Guest: Randy Bias (LinkedIn)
Company: Mirantis
Show Name: The Agentic Enterprise
Topic: AI Infrastructure
The Model Context Protocol’s (MCP) move to the Linux Foundation through the Agentic AI Foundation (AAIF) isn’t just another standards announcement. According to Randy Bias, VP of Strategy & Technology at Mirantis, it’s the moment that determines which AI protocol wins—and the answer is already clear. With backing from Anthropic, OpenAI, Google, and now the Linux Foundation’s ecosystem, MCP is poised to dominate agent communication standards the way Kubernetes dominated container orchestration.
The Oxygen Gets Sucked Out
“Clearly, with MCP entering the AAIF, it’s going to suck the oxygen out of the room for many of the other aspirational agent and MCP protocols,” Bias states bluntly. “We’re going to see everyone kind of aggregate around MCP, and that’s a good thing.”
The AI infrastructure landscape has been fragmented, with multiple competing approaches to agent-to-tool communication. Each protocol had technical merits. Each had passionate advocates. But fragmentation in foundational infrastructure creates friction that slows adoption across the entire ecosystem.
MCP’s move to the Linux Foundation ends that fragmentation. The protocol now carries institutional weight that competing standards can’t match. When enterprises evaluate which standard to build on, they’re not just assessing technical capabilities—they’re assessing staying power, ecosystem support, and long-term viability.
MCP now has all three.
Mass Adoption Beats Technical Perfection
Bias delivers a lesson that technical teams often resist: “I am always kind of surprised when you still see people trying to look for the best technological solution. What we’ve seen time and time again is that the best technological solution isn’t the one that wins—it’s the one that achieves mass adoption.”
This observation echoes across technology history. VHS beat Betamax. HTTP beat Gopher. Kubernetes beat Docker Swarm. In each case, the winner wasn’t necessarily the most elegant solution—it was the one that achieved critical mass first.
MCP’s backing from major AI platform providers—Anthropic, OpenAI, Google—plus the governance and community structure of the Linux Foundation creates a momentum that competing protocols can’t overcome. Developers building agents will target MCP because that’s where the tools are. Tool builders will implement MCP servers because that’s where the agents are. The network effects become self-reinforcing.
“This is going to be the technology of the future,” Bias declares. “Now’s the time to get on board—kind of like OpenStack and the early days of Kubernetes. This is the wave to ride.”
The Emerging AI-Native Pattern
Beyond the standards consolidation, Bias identifies something more fundamental: an AI-native architectural pattern that’s crystallizing around MCP.
The pattern has three components working together. First, general-purpose agents provide the reasoning and execution engine. These agents—Claude Code, Codex, Goose, and others—handle the core intelligence work.
Second, skills encode domain expertise. “The skills are the encoded knowledge and wisdom of human beings as operators and users in that domain,” Bias explains. This is where organizational knowledge gets captured—the workflows, the decision trees, the hard-won lessons from years of operations.
Third, MCP tools provide real-time system introspection. These tools give agents the ability to query running systems, inspect configurations, retrieve logs, and gather the contextual data needed for informed decisions.
“Skills for your process and workflow, tools for real-time data acquisition, and agents tie it all together,” Bias summarizes. “I think that’s an emerging, clear AI-native pattern.”
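The tool half of this pattern rides on MCP’s wire format, which is JSON-RPC 2.0: agents invoke tools with a `tools/call` request and fold the result back into their reasoning context. As a rough sketch (the method shape follows the MCP specification; the `get_pod_logs` tool and its arguments are hypothetical, not part of any real server):

```python
import json

# Hypothetical MCP tool invocation: an agent asks an MCP server for
# real-time data from a running system. MCP messages are JSON-RPC 2.0;
# "tools/call" is the spec's method for invoking a named tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_pod_logs",  # hypothetical introspection tool
        "arguments": {"namespace": "prod", "pod": "api-7f9c", "tail": 50},
    },
}

# A conforming server answers with a result whose content blocks the
# agent treats as fresh context for its next reasoning step.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [
            {"type": "text", "text": "2025-11-30T09:14:02Z GET /healthz 200"}
        ]
    },
}

print(json.dumps(request, indent=2))
```

The division of labor is visible in the message itself: the agent decides *which* tool to call and *why* (guided by skills), while the server owns the live data behind the answer.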
What Goes Where: Skills vs. MCP Servers
As organizations implement this pattern, a practical question emerges: what belongs in skills versus what belongs in MCP servers?
Bias has been working through this question in proof-of-concept implementations. The distinction matters because it affects how systems are architected and how capabilities are distributed.
Skills represent encoded human expertise—the process knowledge that guides how work gets done. They’re relatively static, changing when organizational processes change. MCP servers, by contrast, provide access to dynamic systems—infrastructure that’s constantly evolving, generating new data, changing state.
This separation creates clean boundaries. Skills can be version-controlled and reviewed like code. MCP servers can be independently updated as infrastructure evolves. Agents can mix and match skills and tools based on the task at hand.
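One way to picture that boundary in code—every name below is illustrative, not an API from any MCP SDK: a skill is static, reviewable text that lives in version control, an MCP tool is a live query against a changing system, and the agent is the piece that combines them.

```python
from datetime import datetime, timezone

# Skill: encoded human process knowledge. Static, version-controlled,
# and updated only when the organization's process changes.
INCIDENT_TRIAGE_SKILL = """\
1. Check the current error rate on the affected service.
2. If it exceeds 5%, page the on-call engineer.
3. Otherwise, open a low-priority ticket and attach recent logs.
"""

# MCP tool (illustrative stand-in): dynamic, real-time system state.
# In a real deployment this would be a tools/call to an MCP server,
# not a local function returning canned numbers.
def get_error_rate(service: str) -> float:
    """Pretend to query a monitoring system for the live error rate."""
    live_metrics = {"checkout": 0.072, "search": 0.004}  # fake live data
    return live_metrics.get(service, 0.0)

# Agent (sketch): a deterministic stand-in for the reasoning engine,
# applying the static skill to fresh tool output.
def triage(service: str) -> str:
    rate = get_error_rate(service)
    decision = "page on-call" if rate > 0.05 else "open low-priority ticket"
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    return f"[{stamp}] {service}: error rate {rate:.1%} -> {decision}"

print(triage("checkout"))
print(triage("search"))
```

Note what changes where: editing the triage thresholds is a reviewed change to the skill, while the error-rate numbers change on every call without touching anything under version control—which is exactly the clean boundary described above.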
Why This Matters for Linux Foundation
For the Linux Foundation, MCP represents expansion into a critical new domain. The Foundation has successfully shepherded standards in cloud-native infrastructure (Kubernetes, containerd, Prometheus), networking (ONAP, OPNFV), and enterprise technology (Hyperledger, Automotive Grade Linux).
AI infrastructure represents the next frontier. As enterprises deploy AI at scale, they need the same kind of standards governance, community coordination, and neutral stewardship that the Linux Foundation provides for other technologies.
MCP joining the AAIF under the Linux Foundation umbrella brings proven governance models to an emerging technology. It signals to enterprises that investing in MCP isn’t betting on a vendor-controlled protocol—it’s building on community-governed infrastructure.
The Call to Action
For enterprises still evaluating AI infrastructure standards, Bias’s message is unambiguous: the decision has been made. MCP has the backing, the momentum, and the ecosystem support to become the dominant standard.
“I’m going to bet on it. I think a lot of other people are going to bet on it,” he says. “It’s going to be sort of the thing that eventually wins.”
The parallel to early Kubernetes is apt. Organizations that adopted Kubernetes in 2015 and 2016 gained years of experience while competitors waited for the dust to settle. By the time container orchestration was “solved,” those early adopters had production systems running and teams with deep expertise.
MCP offers the same opportunity. The protocol is young, but the trajectory is clear. Organizations that invest now in understanding MCP, building skills libraries, and developing MCP servers will be years ahead when AI agents become standard enterprise infrastructure.
The standards war is over. The implementation work begins now.