Lenses

  |  By Guillaume Aymé
Buckle up, we’re past the AI hype. Now, it’s about making intelligent systems that act on our behalf. In 2025, AI isn’t just a tool; it’s becoming our core way of operating, powered by real-time data. How we stream, manage and monetize that data will define the next generation of business. Here, we zoom into four examples of what autonomous real-time intelligence could look like in the coming year.
  |  By Drew Oetzel
Democratizing and sharing data inside and outside your organization, as a real-time data stream, has never been more in demand. Treating real-time data as a product, and adopting Data Mesh practices, is the way forward. Here, we explain the concept through a real-life example of an airline building applications that process data across different domains.
  |  By Andrew Stevenson
Organizations today face complex data challenges as they scale, with more distributed data architectures and a growing number of teams building streaming applications. They will need to implement Data Mesh principles for sharing data across business domains, ensure data sovereignty across different jurisdictions and clouds, and maintain real-time operations.
  |  By Guillaume Aymé
Every enterprise is modernizing its business systems and applications to respond to real-time data. Within the next few years, we predict that most of an enterprise's data products will be built using a streaming fabric – a rich tapestry of real-time data, abstracted from the infrastructure it runs on. This streaming fabric spans not just one Apache Kafka cluster, but dozens, hundreds, maybe even thousands of them.
  |  By Andrew Stevenson
As Kafka evolves in your business, adopting best practices becomes a must. The GitOps methodology makes sure deployments match intended outcomes, anchored by a single source of truth. When integrating Apache Kafka with GitOps, many will think of Strimzi. Strimzi uses the Operator pattern for synchronization. This approach, whilst effective, primarily caters to Kubernetes-based Kafka resources (e.g. Topics), which isn’t ideal once your Kafka estate extends beyond Kubernetes.
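For context, declaring a topic as a Kubernetes resource with Strimzi looks roughly like the following KafkaTopic custom resource (cluster name, topic name and sizing are illustrative placeholders, not from the original post):

```yaml
# A Strimzi-managed topic, reconciled by the Topic Operator.
# Names and sizing below are placeholders for illustration.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: payments
  labels:
    strimzi.io/cluster: my-cluster
spec:
  partitions: 3
  replicas: 2
```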
  |  By Alex Durham
It was lovely to see so many of the community and hear about the latest data streaming initiatives at Kafka Summit this year. We always try to distill the sea of content from the industry’s premier event into a digestible blog post. This time we’ll do it slightly differently and summarize some broader learnings, not only from the sessions we saw, but the conversations we had across the two days.
  |  By Andrew Stevenson
In this age of AI, the demand for real-time data integration is greater than ever. For many, these data pipelines should no longer be configured and deployed by centralized teams, but distributed, so that each owner creates their flows independently. But how do you simplify this whilst practicing good software and data governance? Enter Lenses 5.5.
  |  By Guillaume Aymé
If 2023 was the year we woke up to how generative AI would change our world, 2024 is the year we realize the change. The real-time AI-driven enterprise may not be pixel-perfect yet, but we’re well on the way. Gen AI has a knock-on effect on all the trends and challenges we will see in 2024. Here’s our take.
  |  By Mateus Henrique Cândido de Oliveira
Amazon S3 is a standout storage service known for its ease of use, power, and affordability. When combined with Apache Kafka, a popular streaming platform, it can significantly reduce costs and enhance service levels. In this post, we’ll explore various ways S3 is put to work in streaming data platforms.
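One common pattern in that family is sinking Kafka topics into S3 with Kafka Connect. As a minimal sketch (not the post’s own example), using the widely deployed Confluent S3 sink connector, where the bucket, region and topic names are placeholders:

```json
{
  "name": "s3-sink-sketch",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "payments",
    "s3.bucket.name": "example-bucket",
    "s3.region": "eu-west-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000",
    "tasks.max": "1"
  }
}
```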
  |  By Lenses
Software should be more than a product. For us, it is about the experience of working with real-time data. Meet the new Developer Experience.
  |  By Lenses
Lenses 6.0 is now available as a Community Edition for you to use for free, forever. You can connect it to up to 2 Kafka clusters, or, if you prefer, run it against localhost for local development. Simply run the Docker command from the docs (a rough sketch follows below) and follow the steps in the video to get started.
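The exact command isn’t reproduced here. As a hypothetical sketch only, launching a Lenses container could look like the following; the image name, tag and port are assumptions, so check the official Lenses documentation for the real command:

```sh
# Hypothetical invocation -- image name, tag and port are assumptions,
# not the documented command; see the lenses.io docs for exact steps.
docker run --name lenses -p 9991:9991 lensesio/lenses:6.0
```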
  |  By Lenses
Lenses can now help you work with data streams on one screen, regardless of which streaming data architecture you choose. This is how we see autonomy in data streaming.
  |  By Lenses
With the new branding, we’ve also redefined how developers work with real-time data and data architectures. Lenses 6 is a new version of the Developer Experience, designed to empower developers to operate data seamlessly across multiple clusters and environments with Global SQL Studio. This is what we mean by Autonomy in Data Streaming.
  |  By Lenses
How can engineers enable real-time insights when working with high-throughput, data-intensive streams? In this 30-minute session, Imply and Lenses.io show you how to enable self-service access for developers working with critical, high-velocity data flows in Apache Kafka, and how to ingest and normalize complex data structures for real-time analytics at scale via modern databases like Apache Druid.
  |  By Lenses
As a developer, you want to deploy a new Kafka connector without bothering the platform admin. How? Here you'll learn how to integrate Kafka connectors with your CI/CD toolchain, and manage your connectors as-code with Lenses.
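To give a flavour of the as-code approach (a generic sketch, not Lenses’ own mechanism): keep the connector config in Git and have the CI pipeline apply it through the Kafka Connect REST API. The host, port, file and connector names below are placeholders:

```sh
# CI step: create or update a connector from a version-controlled
# config file (connector-config.json holds the config key/value map).
curl -X PUT -H "Content-Type: application/json" \
  --data @connector-config.json \
  http://connect.example.com:8083/connectors/my-sink/config
```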
  |  By Lenses
DataOps is the art of progressing from data to value in seconds. For us, it’s all about making data operations as easy and fast as using email.
  |  By Lenses
Apache Kafka is a popular and powerful component of modern data platforms. But it's complicated: complicated to run, complex to manage and, crucially, it's near impossible to drive Kafka adoption from the command line across your organization. So here's your how-to for seeing it through to production (... and possibly fame and fortune). We cover key principles for Kafka observability and monitoring.
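One such principle is keeping an eye on consumer lag. As a minimal illustration (not taken from the post itself), the stock Kafka CLI can report per-partition lag for a consumer group; the broker address and group name are placeholders:

```sh
# Show current offsets, log-end offsets and lag for each partition
# consumed by the group. Broker and group names are placeholders.
kafka-consumer-groups.sh --bootstrap-server broker.example.com:9092 \
  --describe --group payments-service
```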
  |  By Lenses
Lenses, a DataOps platform, accelerates time to value, opening up data streams to a wide audience. Lenses enables rapid construction and deployment of data pipelines at scale, for enterprises, with governance and security.

Lenses® is a DataOps platform that provides cutting-edge visibility into data streams, integrations, processing and operations, and enables ethical and secure data usage for the modern data-driven enterprise.

Accelerate time to value, open up data in motion to a wide audience, and enable rapid construction and deployment of data pipelines at scale, with enterprise-grade governance and security.

Why Lenses?

  • Confidence in production: Everyone’s scared of the dark. Lenses gives you visibility into your streams and data platform that you’ve never had before. That means you don’t need to worry about running in production.
  • Keeping things simple: Life’s hard enough without having to learn new languages and manage code deployments. Build data flows in minutes with just SQL (see the sketch after this list), and reduce the skills needed to develop, package and deploy applications.
  • Being more productive: Build, debug and deploy new flows in minutes, not days or weeks. In fact, many of our customers build and deploy apps to production 95% faster.
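As promised above, here is a rough sketch of what a SQL-defined flow can look like in Lenses. The topic and field names are invented for illustration, and the exact dialect may differ; see the Lenses SQL docs for the authoritative syntax:

```sql
-- Continuously route large payments into their own topic.
-- Topic and field names are illustrative placeholders.
SET defaults.topic.autocreate = true;

INSERT INTO payments_large
SELECT STREAM customer_id, amount, currency
FROM payments
WHERE amount > 1000;
```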

25,000+ Developers Trust Lenses for data operations over Kafka.