AI Infrastructure

StarTree Launches AI-Native Innovations for Real-Time Data at Enterprise Scale

StarTree has announced two major AI-native enhancements to its real-time analytics platform: Model Context Protocol (MCP) support and vector embedding model hosting. These features enable real-time Retrieval-Augmented Generation (RAG), agent-facing applications, and conversational querying at enterprise scale and speed.

MCP lets AI agents query real-time data when making decisions, rather than relying only on the static knowledge baked into a model, while vector auto embedding accelerates vector generation and ingestion for RAG use cases. MCP support will be available in June 2025; vector auto embedding is expected in Fall 2025.
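To make the RAG retrieval step concrete, here is a minimal, self-contained sketch in plain Python. Everything in it is illustrative: the `toy_embed` function, the sample documents, and the cosine-similarity search are stand-ins, not StarTree's API. In a real deployment, embeddings would come from a hosted embedding model and similarity search would run against a vector index over streaming data.

```python
import math

def toy_embed(text: str, dims: int = 8) -> list[float]:
    """Toy deterministic embedding (hashed character counts), for illustration only.
    A real pipeline would call an embedding model instead."""
    vec = [0.0] * dims
    for ch in text.lower():
        vec[ord(ch) % dims] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]  # unit-normalized so dot product = cosine

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity of two unit vectors (plain dot product)."""
    return sum(x * y for x, y in zip(a, b))

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are most similar to the query's.
    This is the 'R' in RAG; the results would then be passed to an LLM as context."""
    q = toy_embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, toy_embed(d)), reverse=True)
    return ranked[:k]

# Hypothetical real-time events that might already be ingested and embedded:
docs = [
    "order 1042 shipped from warehouse A",
    "payment failed for order 1043",
    "warehouse A inventory updated",
]
print(retrieve("what happened to order 1042?", docs, k=1))
```

The point of "auto embedding" in this picture is that the `toy_embed` step happens inside the platform at ingestion time, so fresh events become searchable by agents without a separate offline embedding job.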

The platform now supports dynamic agent interactions, seamless natural language querying, and streamlined RAG pipelines—advancing the shift from static data stores to real-time, AI-driven engines.

StarTree also introduced Bring Your Own Kubernetes (BYOK), now in private preview. BYOK offers full infrastructure control for regulated industries and hybrid deployments, joining StarTree’s SaaS and Bring Your Own Cloud options.

These updates position StarTree to meet the demands of AI systems requiring sub-second latency, real-time context, and massive concurrency.

StarTree will highlight these innovations at the Real-Time Analytics Summit 2025 on May 14, featuring speakers from Uber, Netflix, AWS, and more.

What Happened Today May 2, 2025
