Redis, the real-time data platform, and Tecton, the enterprise feature store company, have announced a partnership and a product integration that enables low-latency, highly scalable, and cost-effective feature serving for operational machine learning (ML) applications.
With the new integration, Tecton customers now have the option to use Redis Enterprise Cloud as the online store for their ML features. According to the companies, Redis Enterprise Cloud provides serving latencies up to 3x faster than Amazon DynamoDB while reducing the cost per transaction by up to 14x. This enables organizations to support more demanding ML use cases, such as recommendations and search ranking.
The Tecton feature store is a central hub for ML features, the real-time data signals that power ML models. Tecton allows data teams to define features as code using Python and SQL. Tecton then automates ML data pipelines, generates accurate training datasets and serves features online for real-time inference. With Tecton, data teams can build features collaboratively using DevOps engineering best practices and share features across models and use cases. New features can be delivered in minutes without the need to build bespoke data pipelines.
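To illustrate the "features as code" idea, the sketch below expresses a simple aggregate feature as a Python function wrapping a SQL query. The decorator and names used here (`feature_view`, `user_7d_transaction_count`, the `transactions` table) are hypothetical placeholders, not Tecton's actual SDK; they are only meant to show what a declarative feature definition of this kind might look like.

```python
# Hypothetical sketch of a "feature as code" definition. The decorator and
# names below are illustrative placeholders, not Tecton's actual SDK.

from datetime import timedelta


def feature_view(*, entity, batch_schedule):
    """Toy decorator that tags a SQL-defined feature view with metadata."""
    def register(fn):
        fn.entity = entity
        fn.batch_schedule = batch_schedule
        return fn
    return register


@feature_view(entity="user_id", batch_schedule=timedelta(days=1))
def user_7d_transaction_count():
    # The feature is expressed declaratively as SQL; the platform is then
    # responsible for materializing it to the offline and online stores.
    return """
        SELECT
            user_id,
            COUNT(*) AS transaction_count_7d
        FROM transactions
        WHERE timestamp >= CURRENT_DATE - INTERVAL '7' DAY
        GROUP BY user_id
    """
```

Because the definition lives in version-controlled code, it can be reviewed, tested, and reused across models like any other engineering artifact, which is the collaboration model described above.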
Redis Enterprise Cloud is a cost-effective, fully managed Database-as-a-Service (DBaaS) available as a hybrid and multi-cloud solution. Built on a serverless concept, Redis Enterprise Cloud simplifies and automates database provisioning on the leading cloud service providers: AWS, Microsoft Azure, and Google Cloud. Designed for modern distributed applications, Redis Enterprise Cloud delivers sub-millisecond performance at virtually infinite scale.
This allows developers and operations teams to build intelligent, high-performance, scalable, and resilient applications faster, using Redis native data structures and modern data models with the low-latency retrieval that online feature stores require.
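To make the online-store role concrete, here is a minimal sketch using the open-source redis-py client of how precomputed feature values could be written to and read from Redis hashes at inference time. The endpoint, credentials, key schema, and feature names are illustrative assumptions; in practice the Tecton integration manages materialization and serving for you.

```python
# Minimal sketch of Redis as an online feature store, using redis-py.
# The endpoint, credentials, key schema, and feature names are illustrative
# assumptions; the Tecton integration handles these details in practice.

import redis

# Connect to a Redis Enterprise Cloud database (placeholder TLS endpoint).
r = redis.Redis(
    host="redis-12345.c1.us-east-1.ec2.cloud.redislabs.com",
    port=12345,
    password="YOUR_PASSWORD",
    ssl=True,
    decode_responses=True,
)

# A materialization job writes the latest feature values per entity key.
r.hset(
    "features:user:42",
    mapping={
        "transaction_count_7d": 17,
        "avg_order_value_30d": 54.20,
    },
)

# At inference time, the model service fetches the full feature vector
# for an entity with a single low-latency hash lookup.
feature_vector = r.hgetall("features:user:42")
print(feature_vector)
```

Storing each entity's features in a single hash keeps online reads to one round trip per entity, which is the access pattern that makes Redis data structures a natural fit for real-time inference.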