
Stream Processing Helps Organizations Maximize The Value Of Their Data | Manish Devgan


Guest: Manish Devgan (LinkedIn)
Company: Hazelcast (Twitter)
Show: Let’s Talk

We live in a real-time world where a split second can mean the difference between a closed sale and an abandoned cart. While many companies are starting to transition to streaming data, a successful transition can require a mix of technology, tools, and a cultural shift. Hazelcast, maker of a unified real-time data platform, is helping organizations make that leap.

In this episode of TFiR: Let’s Talk, Manish Devgan, Chief Product Officer at Hazelcast, talks about the maturity model of stream processing. Devgan gives us an overview of Hazelcast, explains how it helps people leverage real-time data, and describes the shift from batch processing to stream processing. He also touches on the security challenges many companies face and where AI comes into the picture.

Key highlights from this video interview:

  • Devgan gives us an overview of Hazelcast, a unified data platform that allows you to leverage real-time data from a storage, compute, and streaming analytics perspective. The platform forms a base to enable developers to build real-time applications in verticals like financial services and retail.
  • Stream processing enables businesses to extract the most value from their data in real time, covering both data in motion and data at rest, in order to create hyper-personalized customer experiences that generate more opportunities for the business.
  • Devgan explains that we are living in a real-time economy: although historically it was FAANG companies that leveraged real-time data, today every company is becoming a real-time business. He talks about the shift away from processing data in batches and how real-time data can be used for generating real-time offers or for tackling fraud.
  • Devgan talks about the importance of security, since many of the applications their customers are working with are mission-critical. He discusses how AI and off-the-shelf tools are being used in online fraud attempts, and how AI and ML are being used to counteract those attempts.
  • Generative AI is a hot topic right now, and Devgan talks about how their customers are using generative AI to create synthetic data. He explains how synthetic data can be used to train AI models for better accuracy, and how tools like ChatGPT have paved the way for predictive AI.
  • Devgan gives us some insight into how much adoption of real-time data he is seeing. He tells us that while there are a lot of opportunities, they also come with threats that need to be countered using a real-time data platform like Hazelcast. He talks about how they help customers who already have a system in place with plug-in-type capabilities.
  • Organizations that process data in batches are often already collecting lots of real-time data, but this approach leads to delayed actions. He shares how their customer BNP Paribas used real-time data to increase loan originations by 400% by creating real-time loan offers, based on a customer’s credit history, while the customer uses an ATM.
  • Devgan shares his advice for how customers can get started with their real-time streaming data journey in terms of the tech stack, including real-time messaging, a real-time data store, stream processing, and real-time machine learning.
  • While there are technologies and tools to help organizations embrace streaming data, mindset also plays a significant role. Devgan talks about the world’s shift toward an automated, real-time world.

This summary was written by Emily Nicholls.