
Data Streaming

An Introduction to Apache Kafka Consumer Group Strategy

Ever dealt with a misbehaving consumer group? Imbalanced broker load? This could be due to your consumer group and partitioning strategy! Once, on a dark and stormy night, I set myself up for this error. I was creating an application to demonstrate how you can use Apache Kafka® to decouple microservices. The function of my “microservices” was to create latte objects for a restaurant ordering service.
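The kind of imbalance hinted at above often comes from the range assignor, which splits each topic's partitions among consumers independently, so leftovers pile up on the first consumers. Here is a minimal sketch of that behavior (a simplified toy model, not Kafka's actual client code; the consumer and topic names are made up):

```python
def range_assign(consumers, topic_partitions):
    """Toy range-style assignment: topic_partitions maps topic -> partition count."""
    assignment = {c: [] for c in consumers}
    members = sorted(consumers)
    for topic, n_parts in topic_partitions.items():
        # Split this topic's partitions evenly; the remainder goes to the
        # first consumers in sorted order, which is where the skew comes from.
        per, extra = divmod(n_parts, len(members))
        start = 0
        for i, c in enumerate(members):
            count = per + (1 if i < extra else 0)
            assignment[c] += [(topic, p) for p in range(start, start + count)]
            start += count
    return assignment

# Two consumers, three single-partition topics: "c0" receives every
# partition while "c1" sits idle.
print(range_assign(["c0", "c1"], {"orders": 1, "payments": 1, "lattes": 1}))
```

With many single-partition topics, a round-robin or cooperative-sticky strategy spreads the load far more evenly, which is why the choice of assignment strategy matters.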

Data Streaming Cheat Sheet and Checklist | Data Streaming Systems

Thank you for watching this course. We have a few additional resources to help you dig deeper and be fully equipped to start your data-in-motion journey: a comprehensive cheat sheet with a checklist of what to verify before going to production, and a sneak preview of what we saved for the follow-up course.

Cloud Kafka Resiliency and Fault Tolerance | Data Streaming Systems

Learn how to manage cloud volatility when running applications on Confluent Cloud. Understand how to optimally configure Kafka clients for resilient cloud operations and explore error-handling patterns in Kafka Streams. Leverage concepts like idempotent producers and consumers, and exactly-once processing semantics.
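The resiliency settings mentioned above map to a handful of client configuration keys. The sketch below shows the relevant names as plain dictionaries; the timeout values are illustrative assumptions, not Confluent Cloud recommendations:

```python
# Producer settings for idempotent, retry-safe delivery.
producer_config = {
    "enable.idempotence": True,      # broker de-duplicates retried sends
    "acks": "all",                   # wait for all in-sync replicas
    "retries": 2147483647,           # keep retrying transient cloud errors
    "delivery.timeout.ms": 120000,   # illustrative upper bound on total send time
}

# Kafka Streams turns on exactly-once processing with a single setting.
streams_config = {
    "processing.guarantee": "exactly_once_v2",
}
```

With `enable.idempotence` set, duplicate writes caused by client retries are discarded broker-side, and `exactly_once_v2` extends that guarantee across a Streams topology's consume-process-produce cycle.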

Current '23 Keynote: Streaming into the Future - The Evolution & Impact of Data Streaming Platforms

Jay Kreps (Confluent Co-Founder and CEO), Shaun Clowes (Confluent CPO), and data streaming leaders from organizations like NASA, Warner Brothers, and Notion explore the past, present, and future of data streaming. They will address two key questions: how can organizations integrate data across their applications to deliver better experiences, and how can they embed data and analytics into every part of the business to drive better decision-making?

Top 6 Reasons to Modernize Legacy Messaging Infrastructure

Traditional messaging middleware like Message Queues (MQs), Enterprise Service Buses (ESBs), and Extract, Transform and Load (ETL) tools have been widely used for decades to handle message distribution and inter-service communication across distributed applications. However, they can no longer keep up with the needs of modern applications across hybrid and multi-cloud environments for asynchronicity, heterogeneous datasets, and high-volume throughput.

Practical Data Mesh: Building Decentralized Data Architectures with Event Streams

Why a data mesh? Predicated on delivering data as a first-class product, data mesh focuses on making it easy to publish and access important data across your organization. An event-driven data mesh combines the scale and performance of data in motion with product-focused rigor and self-service capabilities, putting data front and center in both operational and analytical use cases.

Confluent unveils Apache Flink® on Confluent Cloud, making it easier to build real-time applications with stream processing on a unified platform

Confluent launches the industry's only serverless, cloud-native Flink service to simplify building high-quality, reusable data streams. Confluent expands Stream Governance capabilities with Data Portal, so teams can easily find all the real-time data streams in an organization. New Confluent Cloud Enterprise offering lowers the cost of private networking and storage for Apache Kafka.