Cloud Native

How Akamai is helping companies deliver AI-powered applications for the distributed edge


Guest: Ari Weil (LinkedIn)
Companies: Akamai (Twitter)
Show: Let’s Talk

Akamai, a pioneer in the content delivery network (CDN) and security space, has spent decades building a globally distributed edge network. The company acquired Linode a few years ago to expand its edge footprint. While much of the tech world focuses on centralized cloud solutions, the edge is where some of the most exciting use cases are emerging. Akamai calls it a continuum of compute that spans from edge to cloud.

In this episode of Let’s Talk #AI, Ari Weil, VP of Product Marketing at Akamai, discusses the expansion of Akamai’s edge network for low-latency compute, the role of AI in hyper-personalization and security, and partnerships with companies like Neural Magic and Nvidia. Weil also highlights real-world AI use cases, including personalized car shopping and AI-driven computer vision in insurance. When people think about integrating or using AI, “they want it to be fast, magical, and personalized to their needs,” says Weil.

Weil’s return to Akamai and market evolution over the years

  • Weil explains that he rejoined Akamai to help drive the expansion of its edge network, a foundational system that has been in place for over 25 years. He notes that the edge has become increasingly critical as digital demands grow.
  • Akamai has placed a particular focus on implementing security at the edge, closer to users, where requests originate and where breaches or attacks are most likely to occur, enhancing protection at the network’s periphery.
  • Weil highlights that customers from a variety of industries are increasingly asking for more specialized compute use cases. These requests often go beyond the standard offerings provided by hyperscaler cloud services.
  • Akamai’s strategy centers on building a connected cloud platform that delivers a full range of compute resources, giving developers the tools for low-latency, high-performance computing that seamless, latency-sensitive user experiences demand.

Akamai’s strategic approach to artificial intelligence integration

  • Weil explains that while AI is often overhyped in the media, it remains essential for businesses. Companies must develop strategies to harness AI’s capabilities and adapt to its evolving applications.
  • Weil notes that most companies will likely choose to consume large language models provided by commercial vendors rather than going through the process of building and training their own models, as this can be resource-intensive and complex.
  • Weil describes various AI-driven applications, such as intelligent chatbots used for customer interactions, hyper-personalized experiences in commerce, and enhanced gaming services that respond dynamically to player behavior in real time.
  • Akamai’s security operations are also utilizing AI technologies, particularly to identify and combat malicious activities driven by AI. This includes detecting the difference between human users and bot-generated traffic.

Understanding the role of AI inference in real-time applications

  • Weil breaks down AI inference by explaining that while a large language model (LLM) serves as a knowledge base, inference is the process of applying that knowledge in real time to make quick decisions based on user inputs or prompts.
  • Weil elaborates that inference involves taking a model that has already been trained and using it to produce results or actions instantly, drawing from the learned data to provide relevant responses or outcomes in specific contexts.
  • Weil distinguishes between the “heavyweight” processing required during the model training phase and the “lightweight” processing used during inference. This allows the application of knowledge to happen efficiently and in real time.
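The training-versus-inference split Weil describes can be sketched in a few lines. This is a hypothetical toy illustration, not Akamai’s stack or a real LLM: a tiny linear model stands in for the knowledge base, a slow iterative loop stands in for the heavyweight training phase, and a single forward pass stands in for lightweight inference.

```python
# Toy sketch: "heavyweight" training vs. "lightweight" inference,
# using a tiny linear model y = w * x + b in place of a real LLM.

def train(data, epochs=1000, lr=0.01):
    """Heavyweight phase: loop over the data many times to learn w and b."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y   # prediction error on this sample
            w -= lr * err * x       # gradient step for the weight
            b -= lr * err           # gradient step for the bias
    return w, b

def infer(params, x):
    """Lightweight phase: one quick pass through the already-trained model."""
    w, b = params
    return w * x + b

# Train once (slow, offline), then answer each request instantly (fast, online).
params = train([(1, 2), (2, 4), (3, 6)])  # learns roughly y = 2x
print(round(infer(params, 10)))  # → 20
```

The asymmetry is the point: `train` touches every example a thousand times, while `infer` is a single multiply-and-add, which is why inference can run close to the user at the edge while training stays in large data centers.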

How AI-powered applications impact infrastructure at scale

  • Weil discusses how AI-powered applications are changing the requirements for infrastructure, explaining that hyperscalers will continue to build expansive data centers to handle the training and management of large language models, which require significant computational resources.
  • Weil identifies proximity as a key challenge, as users who access compute platforms from distant locations experience higher latency and encounter issues with traffic congestion, which negatively impacts performance.
  • Companies are employing strategies like multi-cloud and hybrid deployments, as well as edge caching, to distribute infrastructure more efficiently and bring resources closer to end-users, reducing latency and improving overall responsiveness.
  • Weil stresses the importance of distributed infrastructure, noting that addressing both the speed and the complexity of compute processes is essential to meeting the growing demand for AI-powered services.
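The edge-caching strategy mentioned above can be illustrated with a minimal sketch. This is a hypothetical example, not Akamai’s API: a tiny TTL cache of the kind an edge node might use to serve repeat requests locally instead of forwarding every one to a distant origin.

```python
# Illustrative sketch (hypothetical, not Akamai's API): a TTL cache at an
# edge node that answers repeat requests locally, avoiding the long round
# trip to the origin that drives up latency for distant users.
import time

class EdgeCache:
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get(self, key, fetch_from_origin):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry and entry[1] > now:
            return entry[0]             # cache hit: served at the edge
        value = fetch_from_origin(key)  # cache miss: one trip to the origin
        self.store[key] = (value, now + self.ttl)
        return value

origin_calls = 0
def origin(key):
    """Stand-in for a distant origin server; counts how often it is hit."""
    global origin_calls
    origin_calls += 1
    return f"response for {key}"

cache = EdgeCache(ttl_seconds=60)
cache.get("/product/42", origin)  # first request pays the origin round trip
cache.get("/product/42", origin)  # repeat request is served from the edge
print(origin_calls)  # → 1
```

Only the first request reaches the origin; every repeat within the TTL is answered locally, which is the latency win that motivates pushing infrastructure closer to end users.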

Akamai’s partner network and its role in AI delivery

  • Weil says Akamai’s partner network collaborates closely with major cloud platforms to provide flexibility and scalability for businesses seeking AI solutions. These partnerships enable Akamai to extend its services beyond its own infrastructure.
  • Key partnerships include collaborations with companies like Neural Magic, which helps optimize AI models by sparsifying them, Nvidia, which provides custom GPUs designed for intensive AI workloads, and NETINT, which supports video processing at the edge.
  • Weil emphasizes the importance of developer tools and integrations with existing services, highlighting that Akamai’s partnership strategy is focused on offering an open platform. This allows customers to choose the best software or partner for their specific needs.

Real-world applications of Akamai’s AI technology

  • Weil shares real-world examples of how customers are using Akamai’s technology for AI-driven workloads, highlighting an automotive customer utilizing AI to deliver a highly personalized car shopping experience.
  • Weil also discusses how AI is being employed in the insurance industry, specifically for computer vision applications. AI models can analyze user-generated content, such as photos of damaged property, and provide prompts for the next steps in the claims process.
  • Another use case Weil mentions is in social media, where AI is used to enhance content recommendations by analyzing user interactions and ensuring compliance with regional laws and regulations regarding content distribution.

This summary was written by Emily Nicholls.
