Lenses 5.3: Robust Kafka with single-click topic backup/restore

Navigating the intricacies of Apache Kafka just got a lot more intuitive. Lenses 5.3 brings you peace of mind, regardless of where you are in your Kafka journey. Our newest release is all about smoothing out the bumps and making sure you're equipped to handle Kafka's challenges with confidence. Here's a sprinkle of what's in store, ahead of our big 6.0 release later this year.

A single-click Kafka topic backup experience

We aim to take away the most mundane, complex, and time-consuming work associated with managing a Kafka platform. One such task is backing up topic data. With a growing reliance on Kafka for various workloads, a solid backup strategy is not just a nice-to-have but a necessity. If you haven't backed up your Kafka topics and live in fear of disaster striking, worry no more.

Mission-critical data flows with the open-source Lenses Kafka Connector for Amazon S3

An effective data platform thrives on solid data integration, and for Kafka, S3 data flows are paramount. Data engineers often grapple with diverse data requests related to S3. Enter Lenses. By partnering with major enterprises, we've levelled up our S3 connector, making it the market's leading choice. We've also incorporated it into Lenses 5.3, where it powers Kafka topic backup/restore.

Overview of Cloud storage for your data platform

One of the most important questions when architecting a data platform is where to store and archive data. In this blog series, we'll cover the different storage strategies for Kafka and introduce you to the Lenses S3 Connector for backup/restore. In this first blog, we'll introduce the different Cloud storage options available. Later blogs will focus on specific solutions, explain in more depth how they map to Kafka, and show how Lenses manages your Kafka topic backups.

Lenses 5.2: for a healthier Kafka with less manual effort

Kafka adoption is growing fast. Very fast. At Lenses, we're pushing out new features to increase developer productivity, reduce manual effort, and improve the cost and hygiene of operating your Kafka platform. It's only been a few weeks since Lenses 5.1, yet here we are again with more goodies in release 5.2.

Lenses 5.1: A first-class ticket to going event-driven in AWS

Hello again. We strive to improve the productivity of developers building event-driven applications on the technology choices that best fit your organization. AWS continues to be a real powerhouse for our customers, not just for running the workloads, but for supporting them with native services: Amazon MSK, MSK Connect, and now, increasingly, Glue Schema Registry. Together, these bring a strong alternative to Confluent and its Kafka infrastructure offerings.

Secret rotation for Kafka Connect connectors with AWS Secrets Manager

With version 5.1, Lenses now offers enterprise support for our popular open-source Secret Provider. In this blog, we'll explain how secrets for Kafka Connect connectors can be safely protected using a secret manager, and walk you through configuring the Lenses S3 Sink Connector with the Lenses Secret Provider plugin and AWS Secrets Manager.
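To make the pattern concrete, here is a minimal sketch (not the exact configuration from the post) of registering a connector through the Kafka Connect REST API while referencing credentials with Kafka's standard ${provider:path:key} placeholder syntax from KIP-297, rather than hard-coding them. The connector class, the connect.s3.* property names, and the secret path are illustrative assumptions, and the worker is assumed to have the Lenses Secret Provider installed and registered under the provider name "aws".

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterS3SinkWithSecrets {

    public static void main(String[] args) throws Exception {
        // Connector config as JSON. Instead of embedding AWS credentials, we reference
        // them with Kafka Connect's ${provider:path:key} placeholders (KIP-297); the
        // worker resolves them against AWS Secrets Manager via the Secret Provider,
        // so rotating the secret never means editing or redeploying this config.
        //
        // ASSUMPTIONS: the connector class and connect.s3.* keys below are illustrative;
        // check the Lenses S3 connector docs for the exact property names. The worker
        // must have the Lenses Secret Provider plugin registered as "aws".
        String connector = """
            {
              "name": "s3-backup-sink",
              "config": {
                "connector.class": "io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector",
                "topics": "payments",
                "connect.s3.aws.access.key": "${aws:prod/kafka-connect/s3:ACCESS_KEY}",
                "connect.s3.aws.secret.key": "${aws:prod/kafka-connect/s3:SECRET_KEY}"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors")) // Connect REST endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(connector))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

On the worker side this assumes a matching config.providers=aws entry pointing at the Secret Provider class, which is the part the full post walks through.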

The Glue Schema that binds Apache Kafka

As more applications from different engineering teams are built on Kafka, the need for data governance grows. JSON is often the format of choice when streaming projects bootstrap, but it quickly becomes a problem as your applications iterate: new fields are added, old ones removed, and even the data formats change. It makes your applications brittle; chaos ensues as downstream consumers fall over due to missing data, and SREs curse you.
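To make that failure mode concrete, here is a minimal, hypothetical sketch (not code from the post) of a downstream consumer parsing schemaless JSON with Jackson. The field names and payloads are invented; the point is that a renamed field only surfaces as a runtime error, which is exactly the class of breakage a schema registry such as Glue Schema Registry catches at registration time.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class BrittleJsonConsumer {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Hypothetical handler for a record value consumed from a Kafka topic.
    // With no schema enforcing compatibility, nothing stops an upstream team
    // from renaming "amount" to "amountCents" in their next release.
    static long extractAmount(String recordValue) throws Exception {
        JsonNode json = MAPPER.readTree(recordValue);
        // get() returns null for a missing field, so the rename is discovered
        // as a NullPointerException in production, not at build time.
        return json.get("amount").asLong();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(extractAmount("{\"amount\": 4200}"));      // works today
        System.out.println(extractAmount("{\"amountCents\": 4200}")); // blows up after the upstream change
    }
}
```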

Producing Protobuf data to Kafka

Until recently, teams were building a small handful of Kafka streaming applications. These were usually associated with big data workloads (analytics, data science, etc.), and data serialization would typically be Avro or JSON. Now a wider set of engineering teams are building entire software products with microservices decoupled through Kafka. Many of these teams have adopted Google Protobuf as their serialization format, partly due to its use in gRPC.
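As a minimal sketch of what producing Protobuf to Kafka can look like (one common approach, not necessarily the setup used in the post), the example below assumes a Confluent-compatible schema registry, the io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer dependency, and a hypothetical protoc-generated Order class.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

// Hypothetical class generated by protoc from an orders.proto file; any
// protoc-generated com.google.protobuf.Message works the same way here.
import com.example.orders.OrderOuterClass.Order;

public class ProtobufOrderProducer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Confluent's Protobuf serializer registers the message schema with the
        // registry on first send and embeds the schema id in each record.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Order order = Order.newBuilder()
                .setOrderId("o-1001")
                .setAmountCents(4200)
                .build();

        try (KafkaProducer<String, Order> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", order.getOrderId(), order));
        }
    }
}
```

With compatibility checking enabled on the subject, an incompatible change to orders.proto is rejected at registration time rather than discovered by a failing consumer, unlike the schemaless JSON case above.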

Lenses 5.0: The developer experience for mass Kafka adoption

Kafka is a ubiquitous component of a modern data platform. It has acted as the buffer, landing zone, and pipeline that integrates your data to drive analytics, or perhaps to surface, a few hops later, in a business service. More recently, though, it has become the backbone for new digital services, with consumer-facing applications processing data live off the stream. As such, Kafka is being adopted by dozens (if not hundreds) of software and data engineering teams in your organization.