
How Agentic AI Is Transforming Search and Analytics | Carl Meadows, OpenSearch Software Foundation


Guest: Carl Meadows
Company: OpenSearch Software Foundation
Show Name: The Source
Topics: Agentic AI, Observability

Search and observability are the backbone of enterprise systems, but most tools still struggle to balance cost, performance, and flexibility. OpenSearch, now marking its first anniversary under the Linux Foundation, is redefining this balance. With its 3.2 release, the project introduces agentic AI features that bring intelligence, efficiency, and adaptability directly into open-source search infrastructure.

Carl Meadows, Chair of the Governing Board of the OpenSearch Software Foundation, joined Swapnil Bhartiya to discuss how the open-source project is evolving from a fast-growing community into a critical engine for AI-driven innovation.

When OpenSearch transitioned to the Linux Foundation, the goal was to create a neutral home where contributors from across the ecosystem could collaborate freely. “We hoped that moving it into this neutral governance would make it easier for many more contributors to actively participate in the project,” Meadows said. “That’s exactly what’s happened — we’ve seen large growth from both big and small companies.”

The 3.2 release marks a major step forward for OpenSearch. It introduces experimental agentic AI capabilities — agentic search and agentic memory — designed to help enterprises move beyond traditional static search and toward more intelligent, context-aware systems.

“Think about agentic memory,” Meadows explained. “It can greatly enhance the quality of experience with agents by allowing them to store and learn from context within OpenSearch itself. You don’t need a separate backend to manage memory or sessions. Everything happens in one system — simple and powerful.”

By embedding context learning directly into the search engine, OpenSearch reduces the friction of building AI-powered applications. Developers can now store embeddings, manage conversational sessions, and retrieve information without complex integrations. For enterprise teams that are adopting AI at scale, this means faster development cycles, better performance, and lower infrastructure costs.
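To make that concrete, here is a minimal sketch of the request bodies a developer would send to OpenSearch to create a vector index and run a k-NN similarity query. The index name, field names, and 4-dimensional vectors are hypothetical placeholders (real embeddings typically have hundreds of dimensions); in practice these dictionaries would be passed to a client such as opensearch-py against a running cluster.

```python
# Hypothetical sketch of OpenSearch k-NN request bodies.
# "docs", "embedding", and the 4-dim vectors are illustrative only.

# Mapping for an index with a k-NN vector field alongside the source text.
index_body = {
    "settings": {"index.knn": True},
    "mappings": {
        "properties": {
            "text": {"type": "text"},
            "embedding": {"type": "knn_vector", "dimension": 4},
        }
    },
}

# A k-NN query: find the 3 documents whose embeddings are closest
# to the query vector (which would come from an embedding model).
query_body = {
    "size": 3,
    "query": {
        "knn": {
            "embedding": {
                "vector": [0.1, 0.2, 0.3, 0.4],
                "k": 3,
            }
        }
    },
}
```

Because the embeddings, the conversational context, and the retrieval all live in the same engine, there is no separate vector database or session store to wire up.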

One of the most exciting examples comes from Adobe. As Meadows highlighted, the Acrobat AI Assistant uses OpenSearch to deliver real-time responses within documents. “When you ask a question in Acrobat, it builds embeddings on the fly and uses OpenSearch to find and retrieve relevant answers — all within seconds,” he said. “That’s the power of combining agents, memory, and real-time search in one platform.”

Beyond AI, OpenSearch 3.x is also focused on performance and efficiency — two critical challenges as enterprises deal with exponentially growing volumes of data. “The world is generating more and more data, and getting real-time analytical access to it can be expensive,” Meadows said. “Our focus is on scaling both performance and cost efficiency. In the 3.x branch, we’ve achieved nearly a 10x improvement over the 2.x releases.”

This performance leap positions OpenSearch to serve not just search use cases but also large-scale observability and security workloads that demand millisecond latency and horizontal scalability. “No data engine ever went wrong by getting faster or cheaper,” Meadows joked.

OpenSearch’s Piped Processing Language (PPL) is another area of significant advancement. Originally introduced around 2020, it allows users to perform iterative data discovery using an intuitive syntax — especially valuable in observability and security analytics. “We’ve added Apache Calcite as a query planner,” Meadows said. “Now PPL supports complex joins and operators, making it as powerful as any analytical query language out there.”
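The pipe-based flow is easiest to see in a short example. The sketch below uses hypothetical index and field names and sticks to the basic `source | where | stats` pipeline that PPL has supported since its introduction; the newer Calcite-backed joins extend this same piped style.

```
source = web_logs
| where status >= 500
| stats count() as errors by host
| sort - errors
| head 10
```

Each stage refines the previous one — filter to server errors, count them per host, rank descending, keep the top ten — which is what makes PPL well suited to the iterative drill-down typical of observability and security investigations.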

This evolution reflects OpenSearch’s broader mission: making sophisticated data and AI capabilities accessible through open source. As enterprises race to integrate LLMs and vector search into their workflows, OpenSearch offers a flexible, community-driven foundation that rivals commercial offerings in both reliability and scale.

“The quality, scale, and reliability of OpenSearch are incredible,” Meadows noted. “At AWS, we run very large workloads on top of Amazon OpenSearch Service — some of the largest in the world. That’s proof of how robust the technology really is.”

Looking ahead, Meadows expects continued investment in analytical performance, vector search, and real-time AI capabilities. “We’re going to keep improving performance, lowering latency, and enhancing throughput. That benefits everything from observability to security analytics,” he said. “We’re also evolving vector search techniques to improve relevancy, efficiency, and cost. The themes you see today will only get stronger.”

The combination of open governance, technical innovation, and enterprise-grade reliability has positioned OpenSearch as one of the most significant open-source platforms in modern data infrastructure. With agentic AI now entering the mix, it’s paving the way for intelligent, adaptive systems that learn, respond, and scale — without sacrificing transparency or freedom.
