Hazelcast Upgrades its Real-Time Data Processing Platform to Version 5.4
Hazelcast has launched version 5.4 of its real-time data processing platform, introducing enhancements aimed specifically at optimizing operational and artificial intelligence (AI) workloads.
The Hazelcast platform serves as a real-time intelligent application environment, featuring both open-source and enterprise editions. Its architecture fuses a high-speed data store with stream processing capabilities, making it suitable for data analytics, business intelligence, and increasingly, machine learning (ML) and AI applications. As businesses rapidly adopt AI for latency-sensitive decision-making, the new 5.4 update enhances Hazelcast's core features to address the complex data processing needs of production AI pipelines. Notable clients include JPMorgan Chase, Volvo, New York Life, and Target.
Key Advancements in Version 5.4
“It’s another step in our multi-year leadership in supporting AI workloads for large organizations,” said Kelly Herrell, CEO of Hazelcast. “For AI to deliver value, the data processing infrastructure must perform reliably, and that’s where our strengths lie.”
Consistency in Real-Time Data Processing
Hazelcast processes data in real time as it streams into the system. In modern, highly available data systems where multiple nodes work together, keeping that data consistent across the cluster can be challenging.
“Data consistency is a hard problem,” Herrell noted. “We’ve had a strong consistency subsystem for years, and our customers have rigorously tested it.”
With the evolving demands of AI-powered applications, maintaining data consistency has become even more urgent. The 5.4 release introduces an enhanced CP subsystem that provides a strongly consistent in-memory data layer. The name comes from the CAP theorem (Consistency, Availability, Partition tolerance): when a distributed cluster is partitioned, the subsystem favors consistency over availability.
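As a rough sketch of what this looks like to a developer, the Java snippet below obtains strongly consistent data structures from the CP Subsystem. It uses Hazelcast's long-standing CP Subsystem API rather than anything 5.4-specific, and the member count, structure names, and locking pattern are illustrative assumptions, not details from the release.

```java
import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.cp.CPSubsystem;
import com.hazelcast.cp.IAtomicLong;
import com.hazelcast.cp.lock.FencedLock;

public class CpSubsystemSketch {
    public static void main(String[] args) {
        // The CP Subsystem needs a fixed group of members to run Raft consensus;
        // three is the minimum, started here in one JVM purely for illustration.
        Config config = new Config();
        config.getCPSubsystemConfig().setCPMemberCount(3);
        HazelcastInstance member1 = Hazelcast.newHazelcastInstance(config);
        Hazelcast.newHazelcastInstance(config);
        Hazelcast.newHazelcastInstance(config);

        CPSubsystem cp = member1.getCPSubsystem();

        // Linearizable counter: every increment is agreed on by a majority of CP members.
        IAtomicLong approvals = cp.getAtomicLong("approved-transactions");
        approvals.incrementAndGet();

        // Strongly consistent distributed lock for coordinating cluster-wide work.
        FencedLock lock = cp.getLock("model-refresh-lock");
        if (lock.tryLock()) {
            try {
                // ... update shared state here ...
            } finally {
                lock.unlock();
            }
        }

        Hazelcast.shutdownAll();
    }
}
```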
Hazelcast 5.4 also features a new Thread-Per-Core (TPC) architecture that improves compute performance by 30% by dedicating a thread to each CPU core, reducing context switching and cross-core contention.
“Most developers understand that prioritizing consistency can slow the system down, which is a common trade-off,” Herrell explained. “By integrating advanced consistency with TPC, we minimize this trade-off, ensuring top performance alongside strong consistency.”
Tiered Storage: Meeting AI’s Data Requirements
The Hazelcast platform's in-memory data processing capability is essential, but modern AI and ML workloads often demand far more data than fits in memory. This is where the new tiered storage capability comes in: frequently accessed data stays in fast memory, while larger, less frequently used datasets move to slower, cheaper tiers that remain available to real-time processing.
“In the AI realm, the hunger for data is insatiable,” Herrell stated. “Holding everything in memory can be costly. Tiered storage allows users to scale their storage solutions to effectively handle AI and ML workloads within one cohesive environment.”
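The sketch below illustrates the tiered-storage idea in plain Java: a bounded hot tier held in memory, backed by a larger and slower cold tier that is consulted only on a miss. It is a conceptual illustration, not Hazelcast's actual tiered-storage API; the class name, admission policy, and cold-tier lookup are all placeholders.

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Conceptual sketch only: a bounded in-memory hot tier backed by a slower cold tier.
public class TieredStoreSketch<K, V> {

    private final Map<K, V> hotTier = new ConcurrentHashMap<>();   // fast in-memory tier
    private final Function<K, Optional<V>> coldTier;               // e.g. SSD or object-storage lookup
    private final int hotCapacity;

    public TieredStoreSketch(Function<K, Optional<V>> coldTier, int hotCapacity) {
        this.coldTier = coldTier;
        this.hotCapacity = hotCapacity;
    }

    public Optional<V> get(K key) {
        V cached = hotTier.get(key);
        if (cached != null) {
            return Optional.of(cached);            // fast path: served from memory
        }
        Optional<V> loaded = coldTier.apply(key);  // slow path: read from the colder tier
        loaded.ifPresent(value -> {
            if (hotTier.size() < hotCapacity) {    // naive admission; a real store evicts by policy
                hotTier.put(key, value);
            }
        });
        return loaded;
    }
}
```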
Accelerating AI for Fraud Detection
Hazelcast's platform has been leveraged across diverse AI and ML applications, particularly in fraud detection.
Herrell highlighted one major credit card company that uses the Hazelcast platform for real-time fraud detection. When a credit card is swiped, the payment terminal must receive an approve-or-decline decision in under 50 milliseconds.
“In that brief window, we process six discrete ML algorithms for fraud detection and generate a composite score, providing a well-informed response on whether the transaction should be approved,” Herrell explained.
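The sketch below shows the general shape of such a composite score: several model scores computed in parallel and combined under a strict time budget. It is a plain-Java illustration, not the credit card company's or Hazelcast's actual pipeline; the toy scoring functions, unweighted average, and per-call timeout are placeholders standing in for real ML models and latency management.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.function.ToDoubleFunction;

public class CompositeFraudScoreSketch {

    record Transaction(String cardId, double amount, String merchant) {}

    // Each "model" stands in for one ML algorithm and returns a risk score in [0, 1].
    static double compositeScore(Transaction tx,
                                 List<ToDoubleFunction<Transaction>> models,
                                 ExecutorService pool) throws Exception {
        List<Future<Double>> scores = models.stream()
                .map(model -> pool.submit(() -> model.applyAsDouble(tx)))
                .toList();

        double sum = 0.0;
        for (Future<Double> score : scores) {
            // Per-model timeout used as a simplification of the overall 50 ms decision budget.
            sum += score.get(50, TimeUnit.MILLISECONDS);
        }
        return sum / models.size(); // unweighted average; a real system would weight the models
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(6);
        Transaction tx = new Transaction("card-123", 42.50, "grocery-store");

        // Six toy scorers standing in for the six ML algorithms mentioned in the article.
        List<ToDoubleFunction<Transaction>> models = List.of(
                t -> t.amount() > 1_000 ? 0.9 : 0.1,
                t -> t.merchant().contains("unknown") ? 0.8 : 0.2,
                t -> 0.05, t -> 0.15, t -> 0.10, t -> 0.20);

        double risk = compositeScore(tx, models, pool);
        System.out.println(risk < 0.5 ? "APPROVE" : "DECLINE");
        pool.shutdown();
    }
}
```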
In summary, Hazelcast 5.4 advances the platform's real-time data processing capabilities with stronger consistency guarantees and tiered storage aimed at AI workloads, supporting faster decision-making in applications such as fraud detection.