Kafka

New Kafka governance: approval flows & app topology

It feels like only yesterday that we announced Lenses 3.1, a release focused on helping developers be more productive when building real-time data applications. Since then, our customers have been onboarding new applications and new users onto their data platforms at a faster rate than ever. And with more apps and users come stricter requirements for compliance and governance.

Splunk your Kafka with SQL

Here at Lenses.io, we’re focused on making data technologies such as Apache Kafka and Kubernetes as accessible as possible to every organization. It’s part of our DataOps vision and company DNA. Lenses is built by developers, for developers. We understand the headaches they live with and the challenge of seemingly having to learn a new data technology every few months. We believe that’s just not the right model.
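To give a flavor of what querying Kafka with SQL looks like in practice, here is a minimal sketch of a Lenses SQL query that searches a topic the way you might search logs in Splunk. The topic name and fields (`backend_logs`, `level`, `message`) are hypothetical; check the Lenses SQL documentation for the exact syntax supported by your version.

```sql
-- Search a (hypothetical) log topic for errors, Splunk-style.
SELECT level, message
FROM backend_logs
WHERE level = 'ERROR'
LIMIT 100
```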

That 3am security call about Apache Kafka...

If you have worn the Platform or Security Engineer badge, or if you have a Sec/Ops role, you might have experienced something like this at some point in your career. Hopefully not. You receive a call at 3am, it’s your SOC, something’s not right. Oh sh*t! There’s unidentified traffic on the network from an unknown host and it’s communicating with a remote server. Sounds like a Level 3 exfiltration. It’s gonna be a long night.

Staying on top of your Kafka with email alerts from Lenses.io

Imagine being out on a Saturday morning, drinking coffee and catching some sun, when a Slack message from a colleague tells you about an incident on your Kafka infrastructure. Fortunately, you are able to identify the problem and fix it from where you are, but wouldn't it have been better to know about that Kafka broker going down the moment it happened? And is there a way to prevent it from happening again? Yes, there is: Lenses alerts.

Secure your Kafka Connect connections with Azure Key Vault

Kafka Connect is a great framework for moving data in and out of Kafka. But doing so requires connecting to some of your most important data, the crown jewels: customer information held in MongoDB, audits in an S3 bucket, payroll information in an Oracle database. Connecting to these data stores requires authentication.
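As a flavor of the approach, here is a minimal sketch of externalizing a connector password using Kafka Connect's standard config-provider mechanism (KIP-297), assuming the open-source Lenses Azure Key Vault secret provider plugin is on the worker's plugin path. The vault URL and secret name are hypothetical, and the provider's parameter names should be verified against the plugin documentation for your version.

```properties
# Kafka Connect worker properties: register a config provider named "azure".
config.providers=azure
config.providers.azure.class=io.lenses.connect.secrets.providers.AzureSecretProvider
# Service-principal credentials used to read the vault
# (parameter names as per the plugin docs; verify for your version).
config.providers.azure.param.azure.auth.method=credentials
config.providers.azure.param.azure.client.id=<service-principal-client-id>
config.providers.azure.param.azure.secret.id=<service-principal-secret>
config.providers.azure.param.azure.tenant.id=<tenant-id>

# Connector config: the password is resolved from Key Vault at runtime,
# so it never appears in plain text. Vault URL and secret name are hypothetical.
connection.password=${azure:my-vault.vault.azure.net:mongodb-password}
```

The benefit of the config-provider pattern is that secrets are fetched when the connector starts, so they live only in Key Vault, not in connector configs, REST payloads, or source control.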

Apache Kafka Example: How Rollbar Removed Technical Debt - Part 2

April 7th, 2020 • By Jon de Andrés Frías

In the first part of our series of blog posts on how we remove technical debt using Apache Kafka at Rollbar, we covered a number of important topics. In this second part of the series, we'll give an overview of how our Kafka consumer works, how we monitor it, and the deployment and release process we followed so we could replace an old system without any downtime.
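For a concrete reference point, here is a minimal sketch of the kind of poll-and-commit consumer loop the post describes, written with the standard Apache Kafka Java client. It is illustrative only, not Rollbar's actual implementation; the topic name, group id, and processing logic are hypothetical.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OccurrenceConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "occurrence-processor"); // hypothetical group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit only after processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("occurrences")); // hypothetical topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    process(record.value()); // application-specific handling goes here
                }
                consumer.commitSync(); // committed offsets mark the batch as done
            }
        }
    }

    private static void process(String payload) {
        System.out.println(payload); // stand-in for real processing
    }
}
```

Committing offsets only after a batch has been processed gives at-least-once delivery, which is what allows a new consumer deployment to take over and resume safely from the last committed offset.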