At Open Source Summit in Denver, AWS offered a sharp lens into the rapidly evolving world of agentic AI—highlighting emerging protocols, open source infrastructure, and its own developer tooling.
Remember when APIs revolutionized software development? We’re at a similar moment with AI agents. David Nalley, AWS Director of Developer Experience, sees the parallel clearly: “We’re kind of at a similar state right now in AI,” he says, referencing the early web days when protocols like HTTP and DNS laid the foundation for everything that followed.
Antje Barth, Principal Developer Advocate at AWS, explained how we’ve evolved from basic chat interactions to autonomous agents capable of reasoning, planning, and taking actions based on dynamic inputs. “You give it an end-user input, and the agent is able to reason and take actions in a much more autonomous way,” she noted.
This transformation also requires access to tools and APIs, along with internal company data. As systems become more distributed and specialized, multi-agent collaboration is critical—and this is where new communication protocols like A2A come in.
The Agent2Agent (A2A) Protocol
AWS has joined other industry leaders in contributing to A2A, a protocol that enables inter-agent communication. “We participated in the A2A announcement and have an employee on the steering committee,” said Nalley. “We’re excited to see a de facto standard emerging—one that can accelerate how agents collaborate.”
The A2A protocol lets agents advertise their capabilities via agent cards and enables dynamic discovery to route requests to the right entity. While still in its early days, Nalley said the move to the Linux Foundation signals promising traction.
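Conceptually, an agent card is just a small JSON document an agent publishes so that other agents can discover what it does and where to reach it. The sketch below is illustrative only: the field names follow the public A2A draft, but the agent name, URL, and skill are invented, and the authoritative schema lives in the A2A specification itself.

```python
import json

# A minimal A2A-style agent card (illustrative; see the A2A spec for the
# authoritative schema). An agent serves this document so peers can
# discover its capabilities and route requests to it.
agent_card = {
    "name": "invoice-summarizer",
    "description": "Summarizes uploaded invoices and flags anomalies",
    "url": "https://agents.example.com/invoice-summarizer",
    "capabilities": {"streaming": True},
    "skills": [
        {
            "id": "summarize-invoice",
            "name": "Summarize invoice",
            "description": "Produce a short summary of an invoice document",
        }
    ],
}

print(json.dumps(agent_card, indent=2))
```

A discovery service can then match an incoming request against the `skills` entries of the cards it knows about and forward the task to the agent whose card advertises the right capability.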
Strands: An SDK for Agent Builders
To support developers, AWS recently open sourced the Strands Agents SDK—a lightweight, flexible toolkit for rapidly prototyping agents. Antje emphasized its simplicity: “You give it a model and a tool, and then you can start prompting it. It’s designed to make it really easy to build agents with just a few lines of code.”
Strands supports local testing with Ollama and scalable deployments via Amazon Bedrock, Anthropic, or Meta APIs. It also features native support for MCP (Model Context Protocol) and LiteLLM for broader model access.
Why the name Strands? “We let the AI name itself,” said Antje. “It chose ‘Strands,’ as in the strands of DNA—linking the model with tools.”
Valkey and the Bundled Future of Caching
Valkey, the Redis-compatible in-memory database project, has also seen rapid evolution. “The Valkey Bundle just shipped, combining the core engine with all key modules to simplify adoption,” said Nalley.
More than just feature-packed, Valkey represents an open, community-driven alternative that’s gaining real traction. “I remain amazed at the pace of innovation and how welcoming the community has been,” Nalley added.
Measuring Open Source Health: Pony and Elephant Factors
AWS tracks project health using two internal metrics: Pony Factor and Elephant Factor. “Pony Factor tells us how many developers account for 50% of contributions. A healthy project has a high pony factor—meaning it’s not reliant on just one person,” Nalley explained.
Elephant Factor reflects corporate diversity—how many different companies contribute significantly. “A low elephant factor suggests monoculture risk, while a high one signals project resilience.”
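Both metrics reduce to the same computation over a contribution histogram: sort contributors (or their employers) by contribution count and find the smallest group that covers half the total. AWS's internal definitions may differ in detail; this sketch uses the common community formulation, with invented example data.

```python
def pony_factor(contributions: dict[str, int]) -> int:
    """Smallest number of contributors whose combined contributions
    account for at least 50% of the total.

    Elephant Factor is the same computation with contributions
    aggregated by employer instead of by individual developer.
    """
    total = sum(contributions.values())
    running, count = 0, 0
    for n in sorted(contributions.values(), reverse=True):
        running += n
        count += 1
        if running * 2 >= total:  # reached 50% of all contributions
            return count
    return count

# Invented example data: commits per author and per employing company.
authors = {"alice": 120, "bob": 80, "carol": 40, "dan": 10}
employers = {"AcmeCo": 200, "Indie": 50}

print(pony_factor(authors))    # -> 2 (alice + bob cover half the commits)
print(pony_factor(employers))  # -> 1 (one company dominates: monoculture risk)
```

In this toy data the project has a Pony Factor of 2 but an Elephant Factor of 1—exactly the monoculture signal Nalley describes, despite healthy-looking individual diversity.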
What’s Next
Looking ahead, AWS continues to monitor A2A, MCP, and foundational tools like PGVector, Jupyter, and SciPy. Nalley’s advice to developers: “Now is the time to start paying attention to agents. Build your first agent, connect it to data with MCP, then make it talk to another agent with A2A.”
As Antje concluded: “Start building, but also start sharing. Let’s grow this space together.”