Shift Left: Bad Data in Event Streams, Part 2

Alright, I’m back. Time for part 2. In the first part, I covered how we handle bad data in batch processing: cut out the bad data, replace it, and run the job again. But this strategy doesn’t work for immutable event streams as they are, well, immutable. You can’t cut out and replace bad data like you would in a batch-processed data set.
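To make that constraint concrete, here’s a minimal Kotlin sketch of one common workaround: publishing a corrected event under the same key instead of editing the log in place. The topic name, key, and payload are all hypothetical, and this isn’t necessarily the remediation strategy the series lands on.

```kotlin
import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord
import java.util.Properties

fun main() {
    // Hypothetical connection settings for a local broker.
    val props = Properties().apply {
        put("bootstrap.servers", "localhost:9092")
        put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    }

    KafkaProducer<String, String>(props).use { producer ->
        // The log is append-only: we can't rewrite the bad event where it
        // sits. Instead, publish a corrected event under the same key so
        // consumers that track the latest value per key converge on it.
        producer.send(ProducerRecord("orders", "order-123", """{"expiry":"12/27"}"""))
    }
}
```

On a compacted topic, the corrected event eventually supersedes the bad one for that key, which is about as close to “replace” as an immutable log gets.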

Unlocking Data Value in the Age of AI and Data Streaming

Imagine getting into your car to head to work on a hot day. Your car already knows your preferences and sets the temperature, ambient lighting, and music accordingly. Not only that, it optimizes your route, and with Level 3 autonomy, it can even drive you there. But what does the automotive industry have to do on the backend to achieve this kind of personalization?

Spring Into Confluent Cloud with Kotlin - Part 2: Kafka Streams

After a short break, we’re back with Part 2 of this series on Spring Framework, Confluent Cloud, and the Kotlin language. Many organizations that write applications and microservices for the JVM have chosen Spring Framework, leveraging its many libraries for features such as REST services, data persistence across a variety of datastores, and messaging integration. These organizations have existing investments in building, testing, deploying, and monitoring applications using Spring.
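As a taste of what Part 2 covers, here’s a minimal sketch of a Kafka Streams topology wired up through Spring in Kotlin. It assumes Spring Boot’s Kafka Streams auto-configuration supplies the connection properties, and the topic names and transformation are invented for illustration.

```kotlin
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.kstream.KStream
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.kafka.annotation.EnableKafkaStreams

@Configuration
@EnableKafkaStreams
class TopologyConfig {

    // Spring injects the StreamsBuilder when @EnableKafkaStreams is
    // present; the bean only has to describe the topology.
    @Bean
    fun upperCasedOrders(builder: StreamsBuilder): KStream<String, String> {
        val stream = builder.stream<String, String>("orders-in")
        stream
            .mapValues { value -> value.uppercase() }
            .to("orders-out")
        return stream
    }
}
```

Spring manages the `KafkaStreams` lifecycle behind the scenes, starting and stopping the topology with the application context.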

Enhancing Security with IAM Roles in Confluent Managed Connectors

As cloud environments evolve, so must the security measures that protect them. With Confluent’s latest enhancement—AWS IAM role integration for managed connectors—you can now adopt temporary security credentials, reducing both the risk of long-term credential exposure and the operational burden of key management. This feature tightens security and simplifies access management for your data flows between AWS and Confluent Cloud.
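Under the hood, role-based access like this rests on AWS STS issuing short-lived credentials. The Kotlin sketch below is not Confluent’s integration code; it just illustrates, with a made-up role ARN and external ID, what assuming a role through the AWS SDK yields.

```kotlin
import software.amazon.awssdk.services.sts.StsClient
import software.amazon.awssdk.services.sts.model.AssumeRoleRequest

fun main() {
    // Hypothetical role ARN and external ID; in a managed-connector
    // setup these would come from your Confluent configuration.
    val request = AssumeRoleRequest.builder()
        .roleArn("arn:aws:iam::123456789012:role/confluent-connector-role")
        .roleSessionName("connector-session")
        .externalId("example-external-id")
        .build()

    StsClient.create().use { sts ->
        val creds = sts.assumeRole(request).credentials()
        // Temporary credentials expire on their own, so there are no
        // long-lived access keys to store or rotate.
        println("temporary credentials expire at: ${creds.expiration()}")
    }
}
```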

Shift Left: Bad Data in Event Streams, Part 1

At a high level, bad data is data that doesn’t conform to what is expected. For example, an email address without the “@”, or a credit card expiry where the MM/YY format is swapped to YY/MM. “Bad” can also include malformed and corrupted data, such that it’s completely indecipherable and effectively garbage.
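The first two examples are easy to express as checks. Here’s a minimal Kotlin sketch; the regexes are illustrative and nowhere near production-grade validation.

```kotlin
// Illustrative patterns only: real email and card-expiry validation
// is considerably stricter than this.
private val EMAIL = Regex("^[^@\\s]+@[^@\\s]+$")
private val EXPIRY_MM_YY = Regex("^(0[1-9]|1[0-2])/\\d{2}$")

fun isValidEmail(email: String): Boolean = EMAIL.matches(email)

// Rejects a swapped YY/MM value such as "27/08", since 27 is not a month.
fun isValidExpiry(expiry: String): Boolean = EXPIRY_MM_YY.matches(expiry)

fun main() {
    println(isValidEmail("alice.example.com")) // false: missing "@"
    println(isValidExpiry("08/27"))            // true: MM/YY
    println(isValidExpiry("27/08"))            // false: swapped to YY/MM
}
```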

How Booking.com Used Data Streaming to Put Travel Decisions into Customers' Hands

Booking.com wanted to give people a “connected trip” experience, allowing customers to seamlessly book flights, accommodations, car rentals, and excursions in one visit. The company realized early on that data streaming was key to reaching this goal, but the operational effort had become overwhelming. Learn how Booking.com found the answer in Confluent’s data streaming platform: with automated configuration that required no ongoing maintenance, the team was able to prioritize innovation with data and deliver the comprehensive booking experience they had been searching for.

Your Guide to the Apache Flink Table API: An In-Depth Exploration

Apache Flink offers a variety of APIs that provide users with significant flexibility in processing data streams. Among these, the Table API stands out as one of the most popular options. Its user-friendly design allows developers to express complex data processing logic in a clear and declarative manner, making it particularly appealing for those who want to efficiently manipulate data without getting bogged down in intricate implementation details.
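To see why the Table API reads so declaratively, here’s a small, self-contained Kotlin sketch against a generated demo table; the schema and the filter are invented for illustration.

```kotlin
import org.apache.flink.table.api.EnvironmentSettings
import org.apache.flink.table.api.Expressions.`$`
import org.apache.flink.table.api.TableEnvironment

fun main() {
    val env = TableEnvironment.create(EnvironmentSettings.inStreamingMode())

    // Register a demo source backed by the datagen connector.
    env.executeSql(
        """
        CREATE TABLE orders (
            order_id BIGINT,
            amount   DOUBLE
        ) WITH ('connector' = 'datagen', 'rows-per-second' = '5')
        """.trimIndent()
    )

    // Declarative pipeline: say *what* to compute (filter, project)
    // and let Flink decide how to execute it over the stream.
    val bigOrders = env.from("orders")
        .filter(`$`("amount").isGreater(100.0))
        .select(`$`("order_id"), `$`("amount"))

    bigOrders.execute().print()
}
```

Note that the pipeline never mentions threads, state, or watermarks; that is the “without getting bogged down in intricate implementation details” part.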

Expanding Confluent's Integration with Microsoft Azure: Create and Manage Confluent Resources Directly from the Azure Portal with Confluent's Fully Managed Connectors (Preview)

We are thrilled to announce yet another milestone in our integration capabilities with Microsoft Azure. Now, you can manage Confluent resources (Preview) directly from the Azure portal. This new capability not only simplifies the setup and management process but also empowers you to leverage the full potential of Confluent's data streaming platform on Azure.