In a significant move to enhance its cloud computing capabilities, Akamai has announced its new Managed Container Service, designed to simplify deployment and operations for containerized applications while leveraging the company’s extensive global network. This latest offering reinforces Akamai’s evolution from a content delivery network to a comprehensive cloud platform provider that prioritizes performance, security, and developer flexibility.
Scaling to the Edge with 4,300+ Points of Presence
Akamai’s Managed Container Service lets customers hand off containerized applications that Akamai will host, operate, and distribute on their behalf. What sets this service apart is its ability to draw on Akamai’s network of over 4,300 points of presence across more than 700 cities worldwide.
“Our customers tell us that they have an application that needs to reach a certain audience. They provide the business logic and the requirements for that application, and then we deploy it on the Akamai platform with the ability to scale compute resources to any edge location that we have on our distributed network,” explained Ari Weil, VP of Product Marketing at Akamai.
This extensive network creates what Weil calls a “continuum of compute” that can leverage resources from the core to the edge, providing optimal performance regardless of user location.
Addressing Platform Engineering Challenges
The new service aims to address several key challenges that enterprises face when deploying containerized applications:
- Container Platform Selection: Many enterprises struggle with choosing the right container platform. Akamai favors Kubernetes for its portability and rapid improvement through open source contributions.
- Multi-Cloud Considerations: Companies need to consider their application architecture over time and whether they need to leverage multiple clouds.
- Operational Overhead: Managing a fleet of containers across different environments introduces significant complexity and operational burden.
“As you start to deploy any containerized environment, the first question is: do my engineers—do my developers—understand how to build in Kubernetes and scale it effectively?” said Weil. “With the Akamai Managed Container Service, you can take advantage of a managed Kubernetes service in the Akamai cloud. You can take advantage of the full footprint and presence that Akamai maintains in over 700 cities worldwide.”
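To make the managed-Kubernetes point concrete, the artifact a team hands to any such service is typically a Deployment manifest. The sketch below builds a minimal, generic one in Python and prints it as JSON (which Kubernetes accepts alongside YAML); the app name, image, and port are placeholders, not Akamai-specific values:

```python
import json

# A minimal, generic Kubernetes Deployment manifest. All names here
# (example-app, the image reference, port 8080) are placeholders.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "example-app"},
    "spec": {
        "replicas": 3,  # a managed service scales this across locations
        "selector": {"matchLabels": {"app": "example-app"}},
        "template": {
            "metadata": {"labels": {"app": "example-app"}},
            "spec": {
                "containers": [
                    {
                        "name": "web",
                        "image": "registry.example.com/example-app:1.0",
                        "ports": [{"containerPort": 8080}],
                    }
                ]
            },
        },
    },
}

print(json.dumps(deployment, indent=2))
```

The point of a managed offering is that the customer supplies only a spec like this (the “business logic and requirements”), while the provider handles cluster operations, placement, and scaling.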
Breaking Free from the “Friendly Prison”
A refreshing aspect of Akamai’s approach is its emphasis on flexibility and avoiding vendor lock-in. Weil contrasted Akamai’s philosophy with that of hyperscalers, which he described as creating a “friendly prison” for customers on their platforms.
“What we’re focused on, from end to end, is bringing more ROI to your development projects and making it easier for you to launch and scale applications—without worrying about whether you’ll need to take advantage of another cloud’s services, or want to use another cloud platform, whether it’s platform-as-a-service or software-as-a-service,” Weil stated.
This approach allows customers to evolve their architectures without concerns about jeopardizing agreements or commitments with Akamai.
Use Cases: From Media Streaming to AI Inference
The Managed Container Service is already finding applications across various industries. One of Akamai’s earliest customers is using the service for streaming media, leveraging Akamai’s ability to encode, transcode, package, optimize, and deliver streams to millions of concurrent viewers globally.
Akamai is also pushing forward with AI infrastructure services, including its recently announced AI inference solution. “Gartner recently named Akamai an emerging leader in generative AI (GenAI) infrastructure services because of the way that we’ve decided to invest in GPUs that were tailored to media streaming and AI inference on our platform,” Weil noted.
These specialized GPUs are designed specifically for AI inference rather than training large language models, focusing on helping developers build applications that leverage small language models or fine-tuned models.
The Evolution of Akamai
Weil emphasized how Akamai has evolved from its roots as a content delivery network and edge computing platform:
“Akamai is evolving from its roots as a content delivery network and edge computing platform, to adding robust security services over the past—call it—10 years, and now expanding into a full cloud platform that enables developers to build and scale secure applications that require low latency, high throughput, and global reach.”
This evolution positions Akamai as a competitor in the cloud space, with a unique value proposition centered around speed, scale, and flexibility.
Looking Ahead: Developer-Focused Future
Looking forward, Akamai is focusing on making the developer experience “as easy and enjoyable as humanly possible.” This includes partnerships with companies like Fermyon for WebAssembly applications and Zuplo for API gateways.
Akamai’s roadmap includes:
- Cloud Infrastructure Services for developers who want maximum control and flexibility
- Platform as a Service Capabilities that abstract away infrastructure concerns
- Vertical-Specific Solutions for industry-aligned use cases
“Our goal is to make cloud infrastructure services available to anyone who wants maximum control and flexibility over how they build in the cloud. We also aim to offer platform-as-a-service capabilities so you can focus more on building and less on the hardcore operational, IT, and DevOps concerns of running applications at scale,” explained Weil.
Conclusion
Akamai’s Managed Container Service represents a significant step in the company’s transformation from a CDN provider to a comprehensive cloud platform. By leveraging its extensive global network and focusing on developer experience, Akamai is positioning itself as an attractive alternative to traditional hyperscalers, especially for applications that require low latency, high throughput, and global reach.
The emphasis on open source Kubernetes, predictable pricing, and flexibility without lock-in addresses many pain points that enterprises face when deploying containerized applications at scale. As organizations continue to embrace edge computing and distributed applications, Akamai’s approach may prove compelling for developers looking for performance without compromise.
Guest: Ari Weil (LinkedIn)
Company: Akamai
Show: Cloud : Evolution