
Lenses.io joins forces with Celonis to bring streaming data to business execution

Today, I’m thrilled to announce that Lenses.io is joining Celonis, the leader in execution management. Together we will raise the bar in how businesses are run by driving them with real-time data, making the power of streaming open, operable and actionable for organizations across the world. When Lenses.io began, we could never have imagined we’d reach this moment.

Introducing the Kafka to Celonis Sink Connector

Apache Kafka has grown from an obscure open-source project into a mass-adopted streaming technology, supporting all kinds of organizations and use cases. Many began their Apache Kafka journey by feeding a data warehouse for analytics, then moved on to building event-driven applications and breaking down entire monoliths. Now we move to the next chapter: joining Celonis opens up the possibility of real-time process mining and business execution with Kafka.
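
To make that concrete, here is roughly what registering a Kafka-to-Celonis sink through the standard Kafka Connect REST API could look like. The envelope (a name plus a config map, with connector.class, tasks.max and topics) is ordinary Kafka Connect; the connector class and the connect.ems.* properties below are illustrative assumptions, not documented configuration, and the placeholder values are hypothetical:

    POST /connectors
    {
      "name": "celonis-ems-sink",
      "config": {
        "connector.class": "com.celonis.kafka.connect.ems.sink.EmsSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "connect.ems.endpoint": "https://<team>.celonis.cloud/continuous-batch-processing/api/v1/<pool>/items",
        "connect.ems.authorization.key": "AppKey <key>",
        "connect.ems.target.table": "orders"
      }
    }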

Lenses

Lenses® is a DataOps platform that provides cutting-edge visibility into data streams, integrations, processing and operations, and enables ethical, secure data usage for the modern data-driven enterprise.

Event-Driven Architecture is unblocking data-driven decisions in shipping

In March 2021, a 200,000-tonne ship got stuck in the Suez Canal, and the global shipping industry suddenly caught the world’s attention. It made us realize ships play an important role in our daily lives. Really important, in fact: 90% of the things we consume arrive by ship. Take a look at this map. By visualizing vessel routes over time, the pattern creates a map of the earth. Note the lack of vessels travelling close to the coast of Somalia, where piracy is common.

Assessing security risks with Kafka audits

Suppose that you work for the infosec department of a government agency in charge of tax collection. You recently noticed that some tax fraud incident records went missing from a certain Apache Kafka topic. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data. But for Kafka in particular, this can prove challenging.
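
One building block for such a trail, assuming you control the producing applications, is Kafka's producer interceptor hook, which sees every record before it is sent. A minimal sketch follows; it logs to stdout purely for illustration, whereas a real audit trail would go to durable, tamper-evident storage, and the class name is our own:

    import org.apache.kafka.clients.producer.ProducerInterceptor;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.clients.producer.RecordMetadata;
    import java.util.Map;

    // Audits every record a producer sends. Enable it with the producer config:
    //   interceptor.classes=com.example.AuditProducerInterceptor
    public class AuditProducerInterceptor implements ProducerInterceptor<String, String> {

        @Override
        public ProducerRecord<String, String> onSend(ProducerRecord<String, String> record) {
            // Invoked before serialization: note who wrote what, and where.
            System.out.printf("AUDIT produce topic=%s key=%s ts=%d%n",
                    record.topic(), record.key(), System.currentTimeMillis());
            return record; // pass the record through unchanged
        }

        @Override
        public void onAcknowledgement(RecordMetadata metadata, Exception exception) {
            // Optionally record the broker-assigned partition/offset once acked.
        }

        @Override public void close() {}
        @Override public void configure(Map<String, ?> configs) {}
    }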

Increase compliance with Kafka audits

Suppose that you work for a government tax agency. You recently noticed that some tax fraud incident records have been leaked on the darknet. This information is held in a Kafka topic, and the incident response team wants to know who has accessed the data over the last six months. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data, precisely so they can respond to this kind of situation.
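
There is no single command that answers "who read this topic", but committed consumer offsets give a first approximation. Here is a minimal sketch using the Kafka AdminClient, with a hypothetical topic name; note it only catches consumers that commit offsets, so access by tools that never commit would need authorizer audit logs instead:

    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.ConsumerGroupListing;
    import org.apache.kafka.clients.consumer.OffsetAndMetadata;
    import org.apache.kafka.common.TopicPartition;
    import java.util.Map;
    import java.util.Properties;

    public class WhoReadThisTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            try (Admin admin = Admin.create(props)) {
                // Walk every consumer group and check for committed offsets
                // on the sensitive topic ("tax-fraud-incidents" is hypothetical).
                for (ConsumerGroupListing g : admin.listConsumerGroups().all().get()) {
                    Map<TopicPartition, OffsetAndMetadata> offsets = admin
                            .listConsumerGroupOffsets(g.groupId())
                            .partitionsToOffsetAndMetadata().get();
                    if (offsets.keySet().stream()
                            .anyMatch(tp -> tp.topic().equals("tax-fraud-incidents"))) {
                        System.out.println(g.groupId() + " has read the topic");
                    }
                }
            }
        }
    }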

How to create a Kafka topic (the safe way)

We live in a dynamic world. Companies aim to speed up time-to-market and out-innovate their competition with Kafka, but at the same time they struggle with its limitations. These can range from compliance-related setbacks under regulations such as GDPR, CCPA and HIPAA, to self-service slip-ups that could bring a whole Kafka cluster down. Even something as seemingly innocuous as configuring and creating a Kafka topic can lead to operational U-turns, slowdowns and even downtime.
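
The safe way starts with making every choice explicit instead of inheriting broker defaults. A minimal sketch with the Kafka AdminClient follows; the topic name and the exact numbers are illustrative, and the right values depend on your cluster and retention requirements:

    import org.apache.kafka.clients.admin.Admin;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    public class CreateTopicSafely {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            try (Admin admin = Admin.create(props)) {
                // 6 partitions, replication factor 3: both chosen explicitly.
                NewTopic topic = new NewTopic("payments.incidents", 6, (short) 3)
                        .configs(Map.of(
                                // Tolerate one broker failure without losing acked writes.
                                "min.insync.replicas", "2",
                                // Bound storage explicitly: 7 days, not "whatever the default is".
                                "retention.ms", "604800000"));
                admin.createTopics(List.of(topic)).all().get();
            }
        }
    }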

Lenses magnified: Enhanced, secure, self-serve developer experience for Kafka

In our world of streaming applications, developers are forever climbing a steep learning curve to stay successful with technologies such as Apache Kafka. There is no end to the debt and the detail you need to manage when it comes to Kafka, and since it doesn’t come with guardrails to help you out, the stakes for making mistakes are high.
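
Some of those guardrails can at least be bolted on in client code. As one example, here is a producer factory that pins down durability-critical settings explicitly rather than trusting whatever a given client version defaults to; the names and values are illustrative choices, not the only safe ones:

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.StringSerializer;
    import java.util.Properties;

    public class GuardedProducer {
        public static KafkaProducer<String, String> create(String bootstrapServers) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Guardrails made explicit instead of inherited from defaults:
            props.put(ProducerConfig.ACKS_CONFIG, "all");                  // wait for in-sync replicas
            props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);     // no duplicates on retry
            props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, 120_000); // bounded retry window
            return new KafkaProducer<>(props);
        }
    }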