Guest: Zeke Dean
Companies: Redpanda
Show Name: To The Point
Topics: Edge Computing, Agentic AI
The race to build effective agentic and AI-driven applications isn’t just about better models—it’s about getting the right data to those models at the right time. That’s the driving force behind Redpanda’s new partnership with Akamai, which formally brings Redpanda into Akamai’s Qualified Compute Partner Program and establishes a co-sell motion where Akamai can sell Redpanda Enterprise Edition on Akamai Cloud.
Redpanda is an agentic data platform built on streaming technology that’s fully compatible with the Apache Kafka API, but re-architected to be simpler to operate, faster under load, and more reliable at scale. The partnership pairs this real-time streaming capability with Akamai’s distributed cloud platform built on Linode, enabling customers to run high-performance, latency-sensitive workloads closer to users, devices, and data sources.
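Kafka-API compatibility means existing Kafka clients can point at a Redpanda cluster without code changes. As a minimal sketch (the broker hostname below is a hypothetical placeholder, not from the announcement), a standard Kafka producer configuration carries over unchanged:

```python
# Hypothetical sketch: because Redpanda speaks the Kafka wire protocol,
# a stock Kafka client configuration works as-is. Only the bootstrap
# address points at Redpanda instead of a Kafka broker.
producer_config = {
    # Placeholder broker address; 9092 is the conventional Kafka port.
    "bootstrap.servers": "redpanda-0.internal.example:9092",
    # Ordinary Kafka producer tuning applies unchanged:
    "acks": "all",      # wait for full replication before acknowledging
    "linger.ms": "5",   # small batching window to improve throughput
}

# A Kafka client library (e.g. confluent-kafka's Producer) would accept
# this dict directly; no Redpanda-specific settings are required.
```

The point of the sketch is that migration cost is near zero on the client side: applications, connectors, and tooling written against the Kafka API keep working when the brokers behind them are Redpanda.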
“This isn’t just a technology handshake,” says Zeke Dean, Senior Partner Solutions Engineer at Redpanda. Beyond the co-sell arrangement itself, Redpanda provides first-line support for customers who purchase Enterprise Edition through Akamai Cloud.
One of the key technical advantages is Linode’s infrastructure itself. The platform is designed to be straightforward, with fast provisioning, developer-friendly tooling, and what Dean describes as a “very, very well-done” managed Kubernetes experience: Linode Kubernetes Engine (LKE) can deliver production-like clusters in minutes. On the performance side, Linode uses SSDs for core compute and high-performance NVMe block storage for intensive workloads—a good fit for low-latency, data-intensive applications.
The timing of this partnership is particularly strategic. As more organizations attempt to build agentic and AI-driven applications, the winning architectures will be those that combine low-latency inference with fresh, trustworthy, real-time data. Akamai’s inference cloud is designed to run inference closer to users at the edge, while Redpanda can run right where customers’ event data originates.
“Akamai brings a globally distributed compute and edge reach. Redpanda brings a real-time streaming backbone,” Dean explains. “Together, we’ll make it dramatically easier for enterprises to deploy and operate next-generation real-time and agentic applications worldwide.”
The partnership addresses a critical gap in the market: the ability to process streaming data at the edge while maintaining the performance characteristics necessary for real-time AI inference. By combining Akamai’s global infrastructure footprint with Redpanda’s streaming platform, enterprises gain a clean path from raw event streams to the fresh context agents need to act, at global scale.
For organizations building the next generation of AI applications, this partnership offers a compelling value proposition—real-time data processing capabilities deployed on globally distributed infrastructure, with the simplicity and reliability needed to operate at scale.