Guest: Ari Weil
Companies: Akamai
Show Name: An Eye on AI
Topic: Edge Computing, Kubernetes
The generative AI boom taught enterprises how to add chatbots. But as Google’s zero-click searches reshape how users discover content, and as fraud detection and supply chain optimization demand real-time intelligence, companies face a harder question: how do you move from experimental AI interfaces to production-grade, personalized applications that actually solve business problems?
Ari Weil, VP of Product Marketing at Akamai, addresses this evolution head-on. In a conversation exploring Akamai’s acquisition of Fermyon, Weil explained how edge computing combined with WebAssembly is fundamentally changing what AI applications can do—and where they need to run.
From Chatbots to Real AI: The Developer Experience Gap
For years, enterprises hitting the limits of traditional serverless platforms faced a familiar problem: language constraints. When developers wanted to build in their preferred language and existing CDN platforms couldn’t accommodate them, the conversation stalled. Translation layers felt wrong. Workarounds created complications.
“That’s where the developer piece was always a challenge,” Weil explained. “When our answer was no, it’s really a challenge to suggest how we would create a translation layer, because you start to talk about complications and things that to a developer just don’t feel right.”
The Fermyon team changed that equation. Their developer advocacy approach—going directly into customer environments, assessing problems, guiding solutions, and feeding insights back to engineering—created a feedback loop that improved both the platform and the developer experience. This isn’t just about enabling more languages. It’s about removing friction from the entire development cycle so teams can focus on solving business problems rather than fighting infrastructure.
The Zero-Click Search Problem Retailers Can’t Ignore
The shift toward zero-click searches—where Google’s Gemini or other LLMs answer queries directly without sending users to websites—creates an existential challenge for retailers and content publishers. If users never click through, how do you maintain visibility? How do you rank well when the LLM decides what’s valuable?
Weil outlined a sophisticated strategy that requires rethinking content delivery at the edge. Instead of assembling website pieces at the edge just to serve fast pages to crawlers, organizations need to fully render applications server-side, use distributed data management rather than simple caching, and make intelligent routing decisions based on who’s asking.
“As somebody comes into our proxy, what should I serve?” Weil asked. “Should I serve something optimized for a search engine robot—fast, dynamic, personalized—that gets me ranked well? Or should I serve a richer customer experience?”
Executing that strategy also means producing AI-generated video content, conversational interfaces, and other signals that give LLMs markers of legitimacy. The goal isn't gaming search engines; it's providing the kind of rich, authoritative content that LLMs surface in zero-click results. Running this intelligence at the edge, with WebAssembly-based functions making routing decisions in real time, becomes critical.
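The routing question Weil poses can be sketched as a small edge function that inspects who is asking and picks a render strategy. The sketch below is illustrative only, not Akamai or Fermyon code: the crawler signature list is a partial sample, and the handler labels (`server_rendered`, `rich_client`) are hypothetical names for the two paths Weil describes.

```python
# Hypothetical sketch of per-requester routing at an edge proxy.
# Crawler signatures and strategy names are illustrative, not a
# real Akamai/Fermyon API.

CRAWLER_SIGNATURES = ("googlebot", "bingbot", "gptbot", "perplexitybot")

def classify_requester(user_agent: str) -> str:
    """Label a request 'crawler' or 'human' from its User-Agent header."""
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        return "crawler"
    return "human"

def choose_response(user_agent: str) -> str:
    """Pick a render strategy: fully server-rendered pages for
    crawlers and LLM bots, a richer interactive experience for people."""
    if classify_requester(user_agent) == "crawler":
        return "server_rendered"  # fast, fully rendered, crawlable
    return "rich_client"          # personalized, interactive

print(choose_response("Mozilla/5.0 (compatible; GPTBot/1.0)"))
print(choose_response("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))
```

In practice this decision would run inside a WebAssembly function at the proxy, where near-instant startup makes per-request branching cheap; production systems would also verify crawler identity (e.g., via reverse DNS) rather than trusting the User-Agent string alone.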
Beyond Static Content: The Live Application Shift
The broader transformation Weil describes extends well beyond search optimization. Organizations are rethinking entire application architectures. Fraud detection systems need to make decisions in milliseconds. Supply chain optimization requires processing distributed data where it’s generated. Payment systems demand security and speed simultaneously.
“It’s moving us from delivery of static content and optimizing networking to distribution of live applications and figuring out where inference needs to run,” Weil said.
This shift explains why Akamai's acquisition of Fermyon matters beyond adding another tool to the platform. WebAssembly's portability, combined with Fermyon's developer-friendly Spin framework, enables applications to execute logic close to users while maintaining flexibility across languages and environments. The edge becomes not just a caching layer but an application runtime capable of intelligent decision-making.
What This Means for Enterprise Architecture
For enterprise architects evaluating edge platforms, Weil’s insights point to several critical considerations. First, developer experience cannot be an afterthought. If your platform forces workarounds or translation layers, you’ll lose valuable development time and create technical debt.
Second, the use cases driving edge adoption are evolving rapidly. What started as performance optimization for content delivery now encompasses AI inference, real-time personalization, fraud detection, and preparing for a search landscape dominated by LLMs rather than blue links.
Third, the flexibility to run applications across distributed infrastructure while maintaining security and performance requires rethinking compute models. Traditional containers and serverless functions weren’t designed for this level of distribution. WebAssembly’s security sandbox and near-instant cold start times solve problems that older technologies struggle with.
As AI workloads push compute closer to users and zero-click searches reshape content discovery, the edge computing category is maturing from a performance enhancement into a fundamental application platform. Organizations that treat it as just faster CDNs will miss the opportunity. Those that embrace distributed, intelligent applications—powered by developer-friendly tools like Fermyon’s WebAssembly runtime—will be positioned to compete in a world where inference happens everywhere and decisions must be made in real-time.