Guest: Carl Meadows
Company: OpenSearch Software Foundation
Show Name: The Source
Topics: Agentic AI, Observability
AI is redefining how enterprises search, analyze, and act on data — and OpenSearch is leading that transformation. With version 3.2, the open source platform adds agentic AI capabilities that promise more intelligent, efficient, and context-aware systems.
In this episode of TFiR: The Source, host Swapnil Bhartiya sits down with Carl Meadows, Chair of the Governing Board at the OpenSearch Software Foundation, to discuss the major innovations coming with OpenSearch 3.2 and what they mean for enterprises and developers.
Carl begins by reflecting on OpenSearch’s first year under the Linux Foundation, a move that created a neutral governance model and accelerated community contributions. The open model has encouraged companies of all sizes to build on the platform, driving rapid innovation across the ecosystem.
The centerpiece of the discussion is OpenSearch 3.2’s introduction of agentic AI capabilities — agentic search and agentic memory. These features are designed to make data systems more adaptive and context-aware. With agentic memory, developers can store and learn from context directly within OpenSearch instead of building separate back-end systems to manage memory. This simplifies architectures, reduces latency, and improves the overall experience for end users.
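To make the idea concrete, the pattern behind agentic memory can be sketched in a few lines. This is a minimal, in-process illustration of the concept only; OpenSearch 3.2's actual agentic-memory APIs differ, and a real deployment would store these records in an OpenSearch index and retrieve them with search queries rather than the naive keyword overlap used here. All class and field names below are invented for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class AgentMemory:
    """Toy sketch of agent memory: each interaction is stored as a
    record, and later turns recall prior context by naive keyword
    overlap (a real system would issue OpenSearch queries instead)."""
    records: list = field(default_factory=list)

    def remember(self, session_id: str, role: str, text: str) -> None:
        # In OpenSearch this would be an index operation on a memory index.
        self.records.append({"session": session_id, "role": role, "text": text})

    def recall(self, session_id: str, query: str, limit: int = 3) -> list:
        # Score stored records by shared terms with the query, best first.
        terms = set(query.lower().split())
        scored = []
        for rec in self.records:
            if rec["session"] != session_id:
                continue
            overlap = len(terms & set(rec["text"].lower().split()))
            if overlap:
                scored.append((overlap, rec["text"]))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [text for _, text in scored[:limit]]


memory = AgentMemory()
memory.remember("s1", "user", "our cluster runs opensearch 3.2")
memory.remember("s1", "agent", "noted: opensearch 3.2 cluster")
memory.remember("s2", "user", "an unrelated session")
print(memory.recall("s1", "which opensearch version"))
```

The point of the pattern is that memory lives next to the data: the agent writes context as it works and recalls it later with the same query engine, instead of bolting a separate memory backend onto the stack.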
Carl points out that this approach is already showing real-world impact. One of the most notable examples comes from Adobe, where Acrobat AI Assistant uses OpenSearch as its engine. When users ask questions about a document, Acrobat instantly builds embeddings, interacts with OpenSearch, and delivers answers in real time — all within seconds. This seamless integration shows how AI-driven search can power user experiences at scale while managing data efficiently and securely.
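The embed-then-retrieve flow described above can be sketched without any external services. The toy "embedding" below is just a bag-of-words vector compared by cosine similarity; a production system such as Acrobat's would use a neural embedding model and OpenSearch's vector index, and the document chunks and question here are invented for illustration.

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real pipelines use neural models.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Standard cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


chunks = [
    "refunds are processed within five business days",
    "the warranty covers manufacturing defects for two years",
]
# Stand-in for a vector index: pre-embed every chunk up front.
index = [(chunk, embed(chunk)) for chunk in chunks]

question = "within how many days are refunds processed"
best = max(index, key=lambda item: cosine(embed(question), item[1]))
print(best[0])  # the chunk most similar to the question
```

Retrieval then feeds the best-matching chunks to a language model as grounding context; keeping both the embeddings and the retrieval step inside one engine is what makes the end-to-end answer fast.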
Another key theme of the conversation is performance. As data volumes grow, enterprises face the challenge of maintaining real-time analytics without incurring massive costs. OpenSearch 3.2 addresses this through significant efficiency improvements, achieving nearly a 10x performance boost over previous versions. Carl emphasizes that speed and cost reduction are essential for supporting high-throughput, low-latency applications in fields like observability and security.
The discussion also highlights OpenSearch’s differentiation in a crowded AI and analytics market. Beyond being open source, it stands out for its scale, reliability, and proven enterprise performance — qualities reinforced by its use inside Amazon’s own massive workloads. Developers can trust OpenSearch to scale as their needs grow while maintaining the freedom and flexibility of an open platform.
Carl also shares his excitement about OpenSearch’s evolving Piped Processing Language (PPL), which enables users to explore data iteratively through intuitive, SQL-like queries. The recent integration of Apache Calcite as a query planner has made PPL more powerful, supporting complex joins and operations that make data discovery faster and more interactive.
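For readers unfamiliar with PPL, the iterative, pipe-based style looks roughly like the query below. This is an illustrative fragment only; the index and field names (`web_logs`, `status`, `host`) are invented, and exact command support depends on the OpenSearch version.

```
source = web_logs
| where status >= 500
| stats count() as errors by host
| sort - errors
| head 10
```

Each pipe stage refines the previous result, which is what makes exploration iterative: an analyst can start from `source = web_logs`, inspect the output, and keep appending stages until the question is answered.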
Looking ahead, Carl says OpenSearch will continue advancing performance, analytics, and vector capabilities. With agentic AI now part of the platform, OpenSearch is positioned to redefine how organizations handle search, observability, and AI-driven data exploration.
Here is the edited Q&A of the interview:
Swapnil Bhartiya: Enterprises heavily depend on search and observability, but the tools are often costly, rigid, or slow to keep up with innovations like AI. OpenSearch is taking a different path. It’s open source, community-driven, and now it’s AI-powered. With version 3.2 introducing agentic AI capabilities like agentic search and agentic memory, what does this mean for developers and enterprises?
Carl Meadows: Thanks for asking. When we transitioned OpenSearch to the Linux Foundation a year ago, we hoped neutral governance would make it easier for more contributors to participate. That’s exactly what’s happened. We’ve seen tremendous growth from both large and small companies, and it’s exciting to see the community expand so quickly.
Swapnil Bhartiya: With this 3.2 release, you’ve added experimental agentic AI capabilities. What do these bring to the platform?
Carl Meadows: Agentic memory can greatly enhance how agents interact. It stores and learns from context directly within OpenSearch, reducing complexity. Developers don’t need a separate backend for memory management — it’s all built in. This makes building AI-driven experiences simpler and faster.
Swapnil Bhartiya: Have you seen early use cases for these AI-powered features?
Carl Meadows: It’s early for 3.2, but I’m excited about what developers are building. For instance, Adobe shared at OpenSearchCon that Acrobat AI Assistant is powered by OpenSearch. When users ask questions about their documents, OpenSearch generates embeddings and delivers real-time answers. It’s fast, efficient, and transparent to the user.
Swapnil Bhartiya: What differentiates OpenSearch from proprietary search platforms?
Carl Meadows: The quality, reliability, and scale of OpenSearch are exceptional. Beyond being open source, it powers some of the largest workloads at Amazon. That gives developers confidence that it can scale with their business. Thousands of companies already rely on it.
Swapnil Bhartiya: What’s next for OpenSearch?
Carl Meadows: We’ll continue improving performance, analytics, and vector search capabilities. Expect lower latency, higher throughput, and more advanced AI-driven features. OpenSearch is evolving fast — and we’re just getting started.