Analytics

Unify your data: AI and Analytics in an Open Lakehouse

Cloudera customers run some of the biggest data lakes on Earth. These lakes power mission-critical, large-scale data analytics and AI use cases, including enterprise data warehouses. Nearly two years ago, Cloudera announced the general availability of Apache Iceberg in the Cloudera platform, giving users a way to avoid vendor lock-in and implement an open lakehouse. With an open data lakehouse powered by Apache Iceberg, businesses can better tap into the power of analytics and AI.

Bringing Financial Services Business Use Cases to Life: Leveraging Data Analytics, ML/AI, and Gen AI

The financial services industry is undergoing a significant transformation, driven by the need for data-driven insights, digital modernization, and compliance with evolving regulations. In this context, Cloudera and TAI Solutions have partnered to help financial services customers accelerate their data-driven transformation, improve customer centricity, ensure regulatory compliance, enhance risk management, and drive innovation.

Future-Proofing Your App: Strategies for Building Long-Lasting Apps

The generative AI industry is changing fast. New models and technologies (hello, GPT-4o) are emerging regularly, each more advanced than the last. This rapid development cycle means that what was cutting-edge a year ago might now be considered outdated. The rate of change demands a culture of continuous learning and technological adaptation.

Transforming Enterprise Operations with Gen AI - MLOps Live #29 with McKinsey

In this webinar, we discussed the transformative impact of gen AI on enterprise operations, spotlighting advancements across manufacturing, supply chain, and procurement. We covered the main gen AI use cases, challenges to be mindful of during implementation, and key learnings from client projects, highlighting three main pillars: people, processes, and technology.

Solving the Dual-Write Problem: Effective Strategies for Atomic Updates Across Systems

The dual-write problem occurs when two external systems must be updated in an atomic fashion. A classic example is updating an application’s database while pushing an event into a messaging system like Apache Kafka. If the database update succeeds but the write to Kafka fails, the system ends up in an inconsistent state. However, the dual-write problem isn’t unique to event-driven systems or Kafka. It occurs in many situations involving different technologies and architectures.
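One widely used remedy for this class of problem is the transactional outbox pattern, sketched below as a minimal Java example. It may or may not match the strategies the post itself walks through, and the table names (orders, outbox), column names, and connection details are illustrative assumptions, not taken from the article. The idea is that the business row and the event row are committed in the same local database transaction, and a separate relay (for example, a change-data-capture connector or a polling publisher) later reads the outbox table and pushes the events to Kafka.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class OutboxExample {

        // Naive dual write (the problem): commit to the database, then send to Kafka.
        // If the send fails after the commit, the two systems disagree and the
        // event is lost; if the send happens first and the commit fails, Kafka
        // carries an event for a change that never took place.
        //
        // Outbox variant (the sketch below): write the business row and the event
        // row in ONE local transaction, so they succeed or fail together. A relay
        // process publishes outbox rows to Kafka afterwards.
        public static void placeOrder(String orderId, String payloadJson) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/shop", "app", "secret")) {
                conn.setAutoCommit(false);
                try (PreparedStatement insertOrder = conn.prepareStatement(
                         "INSERT INTO orders (id, status) VALUES (?, 'PLACED')");
                     PreparedStatement insertEvent = conn.prepareStatement(
                         "INSERT INTO outbox (aggregate_id, topic, payload) VALUES (?, 'orders', ?)")) {

                    insertOrder.setString(1, orderId);
                    insertOrder.executeUpdate();

                    insertEvent.setString(1, orderId);
                    insertEvent.setString(2, payloadJson);
                    insertEvent.executeUpdate();

                    conn.commit();   // both rows land, or neither does
                } catch (Exception e) {
                    conn.rollback(); // no partial state left behind
                    throw e;
                }
            }
        }
    }

Because the outbox row is part of the same commit as the business update, a crash between the database write and the Kafka publish can no longer leave the system in an inconsistent state; the relay only needs to provide at-least-once delivery from the outbox table, and consumers handle duplicates idempotently.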

Retail Media's Business Case for Data Clean Rooms Part 2: Commercial Models

In Part 1 of “Retail Media’s Business Case for Data Clean Rooms,” we discussed how to (1) assess your data assets and (2) define your data structures and permissions. Once you have a plan on paper, you can begin sizing the data clean room opportunity for your business.