
Hybrid Cloud Is Breaking Streaming Architecture—Here’s How to Unify It | Prenil Kottayankandy, Akamai & Zeke Dean, Redpanda | TFiR


Guests: Prenil Kottayankandy, Zeke Dean
Companies: Akamai | Redpanda
Show Name: Cloud: Evolution
Topics: Edge Computing, Agentic AI

Enterprises are running hybrid architectures out of necessity—some workloads require centralized cloud compute, others need edge proximity. But streaming data across these environments exposes a fundamental problem: traditional platforms weren’t built for this continuum. The result is operational silos, data consistency challenges, security blind spots, and governance gaps. For organizations building agentic AI systems, IoT automation, or real-time personalization engines that span core to edge, these architectural fragmentation issues aren’t minor inconveniences—they’re blockers.

In a recent discussion, Prenil Kottayankandy, Director of Business Development at Akamai, and Zeke Dean, Senior Partner Solutions Engineer at Redpanda, explained how to build unified streaming architectures that span core regions, distributed compute, and edge locations without creating operational complexity or sacrificing governance and security.

The Compute Continuum Challenge

Modern enterprise architectures aren’t monolithic—they’re distributed across multiple layers. “You’re probably aware of the Akamai plan, the strategy that we have to have this continuum of compute—from core regions to distributed regions to edge regions,” Kottayankandy explained. “When you think about the core, you’re probably talking about tens of locations; distributed regions involve hundreds of locations; and the edge spans thousands of locations.”

This distributed reality creates immediate architectural tension for streaming workloads. Centralized streaming platforms can’t deliver the latency performance required at the edge. Edge-only deployments create data silos and governance challenges. And running different streaming technologies in different environments fragments operations and breaks data consistency.

Portable Infrastructure Across the Continuum

The solution starts with portability. “Redpanda obviously runs on multiple clouds and it’s portable,” Kottayankandy noted. “But our goal with this partnership is to allow developers to build across the continuum, and also, on top of that, leverage the benefits of the Akamai edge and the security solutions that we bring to the table for our customers.”

Dean emphasized the operational simplicity required for this to work at scale. “The nice thing about Redpanda is that it’s not complex to deploy and operate. You can basically run it wherever you want—it can run on a small compute machine, and then ship that data to different storage systems, such as a global CDN that Akamai is well known for.”

This portability means the same streaming infrastructure runs consistently whether deployed in one of tens of core locations or across thousands of distributed edge locations. Developers don’t need to learn different platforms, manage different operational models, or deal with inconsistent APIs across environments.
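Because Redpanda speaks the Kafka API, in principle the only thing that changes between a core and an edge deployment is the connection endpoint, while application code and client settings stay the same. A minimal sketch of that idea, in Python, using illustrative placeholder hostnames (none of the endpoints or settings below come from the interview):

```python
# Sketch: one Kafka-compatible client configuration template reused across
# the compute continuum; only the bootstrap endpoint changes per environment.
# Hostnames are illustrative placeholders, not real Akamai/Redpanda hosts.

ENDPOINTS = {
    "core": "redpanda.core.example.internal:9092",
    "distributed": "redpanda.region-eu.example.internal:9092",
    "edge": "redpanda.edge-pop-123.example.internal:9092",
}

def make_client_config(environment: str) -> dict:
    """Return a client config for the given environment.

    Everything except the bootstrap address is identical, which is what
    lets application code stay the same from core to edge.
    """
    return {
        "bootstrap.servers": ENDPOINTS[environment],
        # Shared settings: same security and identity everywhere.
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "SCRAM-SHA-256",
        "client.id": "orders-service",
    }

core_cfg = make_client_config("core")
edge_cfg = make_client_config("edge")

# Only the endpoint differs between environments.
diff = {k for k in core_cfg if core_cfg[k] != edge_cfg[k]}
print(diff)  # {'bootstrap.servers'}
```

In practice this config dictionary would be handed to any Kafka-compatible client library; the point is that the operational model, security posture, and serialization stay uniform regardless of where the cluster runs.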

Unified Architecture Without Operational Silos

Kottayankandy described how this portability translates to architectural flexibility: “You can deploy it and create your applications using Redpanda running on the node, and you can choose to deploy it in different locations. It’s not limited to core or distributed regions, and it can work very well with the CDN, which is the front point of entry for all your user requests.”

The CDN integration is particularly important for hybrid architectures. Instead of streaming data being isolated from content delivery, the two layers can work together—with the CDN serving as the entry point for user requests while streaming infrastructure processes events and maintains state across the continuum.

Security Across Every Layer

Hybrid deployments expand the attack surface, making consistent security controls essential. Kottayankandy noted that Akamai’s security capabilities extend across the streaming layer: “You can still be secured from DDoS attacks, application data attacks, or any of the plethora of other security threats that you’re concerned about when you build those applications.”

This isn’t security bolted on after the fact—it’s integrated into the platform layer, ensuring that streaming workloads running at the edge have the same threat protection as those in core regions.

Agentic Data Plane: Governance at Scale

For organizations deploying agentic AI systems that operate autonomously across distributed environments, governance and visibility are critical. Dean explained Redpanda’s approach: “Storage, various connectors which you can run in Akamai cloud—you can transform the data. You can do access controls through our agentic data plane, so you can make sure that your agents and workloads go to specific locations, specific teams, specific destinations.”

This isn’t just about access control—it’s about accountability and traceability. “It’s all about governance and accountability at scale,” Dean emphasized. “I’m talking about tracing what those agents are doing, who has access to what, and giving you total visibility into your real-time information across your organization.”

For enterprises concerned about AI agents operating autonomously, this visibility is non-negotiable. The agentic data plane provides audit trails, access logs, and policy enforcement that span the entire compute continuum—ensuring that governance doesn’t break down at the edge.
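The interview doesn’t describe the agentic data plane’s actual API, but the governance pattern Dean outlines—per-agent access policies plus a traceable record of every decision—can be sketched in a few lines of pure Python. All agent names, topics, and policy rules below are hypothetical:

```python
import datetime

# Sketch of agent-level access control with an audit trail, illustrating
# the governance-and-accountability pattern described in the interview.
# This is NOT Redpanda's actual API; names and policies are invented.

POLICIES = {
    # agent id -> set of (operation, topic) pairs it may perform
    "pricing-agent": {("read", "orders"), ("write", "price-updates")},
    "support-agent": {("read", "tickets")},
}

AUDIT_LOG: list[dict] = []

def authorize(agent: str, operation: str, topic: str) -> bool:
    """Check the agent's policy and record the decision for traceability."""
    allowed = (operation, topic) in POLICIES.get(agent, set())
    AUDIT_LOG.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "agent": agent,
        "operation": operation,
        "topic": topic,
        "allowed": allowed,
    })
    return allowed

print(authorize("pricing-agent", "read", "orders"))   # True
print(authorize("pricing-agent", "read", "tickets"))  # False
print(len(AUDIT_LOG))  # 2; every decision, allowed or denied, is logged
```

The essential property is that denied requests are recorded just like permitted ones, which is what makes “tracing what those agents are doing” possible after the fact.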

Data Transformation and Routing at Every Layer

Hybrid architectures require data to move between environments, be transformed for different use cases, and be routed to appropriate destinations based on policy. Dean highlighted this flexibility as built into the platform: data can be transformed in flight, and the agentic data plane’s access controls direct agents and workloads to specific locations, teams, and destinations.

This means streaming data can be processed, filtered, enriched, or redacted based on where it’s deployed and who’s accessing it—critical for multi-jurisdictional compliance, data sovereignty requirements, and internal security policies.
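The filter-enrich-redact idea can be made concrete with a small sketch: a redaction step that strips personally identifiable fields whenever an event is routed across a jurisdictional boundary. The field names and the region rule below are hypothetical, chosen only to illustrate the pattern:

```python
# Sketch: policy-driven redaction before events leave a region,
# illustrating "transform/filter/redact based on deployment and accessor."
# PII field names and the cross-region rule are hypothetical examples.

PII_FIELDS = {"email", "ip_address"}

def redact_for_destination(event: dict, destination_region: str,
                           origin_region: str) -> dict:
    """Strip PII when an event crosses a jurisdictional boundary."""
    if destination_region == origin_region:
        return dict(event)  # stays in-region: pass through unchanged
    return {k: ("<redacted>" if k in PII_FIELDS else v)
            for k, v in event.items()}

event = {
    "user_id": 42,
    "email": "a@example.com",
    "ip_address": "10.0.0.1",
    "action": "click",
}

same_region = redact_for_destination(event, "eu", "eu")
cross_region = redact_for_destination(event, "us", "eu")

print(same_region["email"])   # a@example.com
print(cross_region["email"])  # <redacted>
```

In a streaming deployment, a transform like this would run where the data is produced, so sensitive fields never leave the originating region in the clear.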

Why This Matters for Hybrid Deployments

The shift to hybrid architectures isn’t temporary—it’s the new normal. Latency-sensitive workloads will continue to move to the edge. Compliance requirements will force data residency constraints. And agentic AI systems will need to operate autonomously across distributed environments while maintaining centralized visibility and control.

Traditional streaming platforms force enterprises into compromises: centralized control with poor edge performance, or edge proximity with operational fragmentation. The Akamai-Redpanda partnership offers a third option: portable streaming infrastructure that runs consistently across core, distributed, and edge locations, with unified governance, integrated security, and operational simplicity.

For enterprises building real-time applications that span hybrid environments, this removes one of the most significant architectural blockers—and opens the door to designs that were previously impractical due to operational complexity or governance gaps.
