Guests: Prenil Kottayankandy | Zeke Dean
Companies: Akamai | Redpanda
Show Name: Cloud: Evolution
Topics: Edge Computing, Agentic AI
Modern applications demand real-time data processing at massive scale, whether it’s powering AI models, streaming telemetry from edge devices, or synchronizing global workloads across distributed environments. But most enterprises are stuck patching together legacy streaming infrastructure that wasn’t built for cloud-native or edge-first architectures. Akamai and Redpanda are changing that equation by bringing high-performance data streaming directly to the edge through a strategic partnership that unlocks new possibilities for latency-sensitive and globally distributed workloads.
A Strategic Partnership for Distributed Streaming
Redpanda recently joined Akamai’s Qualified Compute Partner Program, a highly curated partnership initiative designed to bring best-in-class solutions to Akamai’s distributed cloud platform. According to Prenil Kottayankandy, Director of Business Development at Akamai, the program is strategic by design. “It’s very strategic for Akamai, as we recruit best-in-class solutions and best-in-class providers to run, deploy, scale, and secure applications on the Akamai platform, ultimately delivering a differentiated solution to our customers,” he explained.
The program benefits all parties involved. Customers get access to battle-tested solutions that have proven themselves at scale. Partners gain immediate access to Akamai’s extensive customer base—more than 6,000 enterprise customers and 150,000 SMB customers. The program has seen rapid growth since its launch three years ago, with approximately 30 partners currently serving 600 to 700 customers.
The Technical Fit: Streaming Meets Distributed Edge
Redpanda is an agentic data platform built on streaming technology that’s fully compatible with the Apache Kafka API. As Zeke Dean, Senior Partner Solutions Engineer at Redpanda, described it, “We re-architected data streaming to be simpler to operate, faster under load and more reliable at scale.”
The partnership pairs Redpanda’s streaming capabilities with Akamai’s globally distributed cloud infrastructure built on Linode. This combination addresses a fundamental challenge in modern application architecture: bringing compute and data as close as possible to users and devices. “Akamai’s inference cloud runs the inference closer to the users, and then Redpanda can run right where the customers are, right at their event data,” Dean noted.
The technical advantages are measurable. Dean highlighted that Linode’s platform uses SSDs for core compute and high-performance NVMe block storage for intensive workloads, making it well-suited for low-latency, data-intensive applications. On the streaming side, Dean emphasized predictable tail latency: “We deliver consistent low latency. From p99 onward, you’ll never experience an interruption in latency,” he explained.
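For readers unfamiliar with the notation, p99 is a tail-latency percentile: the time under which 99% of requests complete. Tail percentiles matter because they surface stalls that an average hides. A minimal sketch of the idea in plain Python (illustrative only, not Redpanda tooling or benchmark data):

```python
# Why tail percentiles matter: 2 slow outliers among 100 requests barely
# move the mean, but p99 exposes them. Values are made up for illustration.

def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile: value below which p% of samples fall."""
    ordered = sorted(samples)
    rank = max(1, round(p / 100 * len(ordered)))  # nearest-rank method
    return ordered[rank - 1]

# 98 fast requests (5 ms) plus 2 stalled ones (500 ms).
latencies = [5.0] * 98 + [500.0] * 2

mean = sum(latencies) / len(latencies)  # 14.9 ms -- looks healthy
p50 = percentile(latencies, 50)         # 5.0 ms  -- median hides the stalls
p99 = percentile(latencies, 99)         # 500.0 ms -- the tail reveals them
```

Consistent latency "from p99 onward", in these terms, means the tail of the distribution stays close to the median rather than spiking.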
Real Performance Gains and Cost Efficiencies
The partnership delivers tangible benefits beyond theoretical advantages. Kottayankandy shared concrete results from Akamai’s own internal testing: “We’ve seen about a 55% improvement in TCO simply by leveraging and adopting this new technology. Some of our existing products already run on it, and the benefits will compound as we gain more customers, more data, and larger applications built on the platform.”
The efficiency gains come from multiple sources. Redpanda drives per-node efficiencies, delivering better performance, throughput, and latency on each individual machine. Akamai contributes network-layer efficiencies through its distributed architecture, placing compute close to users.
“When you combine both of these, you really see a one-plus-one-equals-three situation for customers, where they get a service that doesn’t exist in the market today—especially at this price-to-performance ratio compared with any other alternative,” Kottayankandy said.
Another significant advantage is Akamai’s approach to data egress costs. “We don’t penalize people for egress cost from the edge and the compute layers,” Kottayankandy noted. This removes a major barrier that often prevents developers from building distributed applications, as egress costs can quickly spiral when moving data between regions or cloud providers.
Unlocking New Use Cases for AI and Real-Time Applications
The timing of this partnership aligns with a fundamental shift in application requirements. As Kottayankandy observed, “There is now an explosion of applications that need access to real-time data and real-time streaming data. Everyone is talking about building applications that use AI, or agents that talk to each other, to humans, and to different data layers.”
Dean outlined three primary use cases that benefit from combining Redpanda’s streaming with Akamai’s edge infrastructure:
Agentic AI at the edge: Multi-step agents require repeated low-latency inference calls and continuously refreshed context streams. Running LLMs, GPUs, and real-time applications on the same node eliminates round-trip delays to centralized regions.
Global IoT ingestion and automation: Millions of devices producing streams need immediate filtering, enrichment, and action. Processing these streams at the edge reduces latency and bandwidth requirements.
Personalization and customer experience: Serving real-time recommendations and conversational experiences globally requires data and compute proximity to eliminate lag.
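To make the IoT ingestion case concrete, the filter-enrich-act pattern Dean describes can be sketched in a few lines. This is a pure-Python illustration with hypothetical field names and thresholds; in a real deployment the events would be consumed from a Redpanda topic via any Kafka-compatible client, with this logic running on the same edge node:

```python
# Sketch of edge-side stream processing for IoT ingestion: drop in-range
# readings locally, enrich the rest with device metadata, and forward only
# actionable events upstream. Field names and values are hypothetical.

DEVICE_METADATA = {  # hypothetical metadata cache held at the edge node
    "sensor-1": {"site": "berlin-dc", "unit": "celsius"},
    "sensor-2": {"site": "tokyo-dc", "unit": "celsius"},
}

TEMP_ALERT_THRESHOLD = 75.0  # hypothetical alerting threshold

def process(events: list[dict]) -> list[dict]:
    """Keep only over-threshold readings, enriched with device metadata."""
    alerts = []
    for event in events:
        if event["temperature"] < TEMP_ALERT_THRESHOLD:
            continue  # filtered at the edge: never consumes WAN bandwidth
        meta = DEVICE_METADATA.get(event["device_id"], {})
        alerts.append({**event, **meta, "alert": True})  # enrich + act
    return alerts

raw = [
    {"device_id": "sensor-1", "temperature": 21.5},
    {"device_id": "sensor-2", "temperature": 88.0},
]
alerts = process(raw)  # only sensor-2 survives, tagged with site metadata
```

The bandwidth saving comes from the `continue`: normal readings die at the edge instead of crossing the network to a central region.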
Simplified Operations and Unified Architecture
Beyond performance, the partnership addresses operational complexity. The commercial relationship is designed to be straightforward—customers can purchase Redpanda through Akamai’s cloud platform with unified billing. Akamai handles the commercial side while Redpanda provides first-line support for the software.
For hybrid deployments, the partnership enables a unified streaming architecture across environments. Kottayankandy explained Akamai’s vision of a compute continuum spanning core regions, distributed regions, and edge locations. “Our goal with this partnership is to allow developers to build across the continuum, and also leverage the benefits that the Akamai edge and the security solutions bring to the table,” he said.
Redpanda’s portability complements this approach. As Dean noted, “The nice thing about Redpanda is that it’s not complex to deploy or operate. You can basically run it wherever you want.” The platform includes connectors for data transformation and an agentic data plane that provides access controls, enabling teams to govern who has access to specific data streams and to trace what agents are doing across the organization.
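The governance idea Dean describes, deciding which agents may read which streams and keeping a trace of what each agent did, can be sketched as an ACL check paired with an audit log. This is an illustrative pure-Python sketch, not Redpanda's actual API; all names and structures are hypothetical:

```python
# Illustrative sketch of stream-level access control with an audit trail
# for agents. Not Redpanda's API; agent and topic names are hypothetical.

audit_log: list[tuple[str, str, bool]] = []  # (agent, topic, allowed)

ACL = {  # which agents may read which data streams
    "pricing-agent": {"orders", "inventory"},
    "support-agent": {"tickets"},
}

def read_stream(agent: str, topic: str) -> bool:
    """Record every access attempt, then allow only ACL-listed topics."""
    allowed = topic in ACL.get(agent, set())
    audit_log.append((agent, topic, allowed))  # denied attempts are logged too
    return allowed

read_stream("pricing-agent", "orders")   # permitted by the ACL
read_stream("support-agent", "orders")   # denied, but still traceable
```

Logging denials as well as grants is the point: the audit trail answers both "who read this stream" and "who tried to".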
The Bottom Line
This partnership represents more than a technology integration—it’s a strategic response to the evolving requirements of modern applications. By combining Redpanda’s high-performance streaming with Akamai’s globally distributed infrastructure, enterprises gain a path to build real-time, AI-driven applications that can operate at global scale without the latency penalties or cost overhead of centralized architectures.
As both companies emphasized, the opportunity extends beyond existing streaming use cases to enable entirely new classes of applications that simply weren’t practical with previous-generation infrastructure. For organizations evaluating streaming infrastructure for AI, edge compute, or global cloud deployments, the Akamai and Redpanda partnership offers a compelling alternative to traditional approaches.