
October 2024

Cloudera and Snowflake Partner to Deliver the Most Comprehensive Open Data Lakehouse

In August, we wrote that in a future where distributed data architectures are inevitable, unifying and managing operational and business metadata is critical to maximizing the value of data, analytics, and AI. Among the most important innovations in data management are open table formats, specifically Apache Iceberg, which fundamentally transforms how data teams manage operational metadata in the data lake.
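To make the metadata point concrete, here is a minimal sketch of creating and inspecting an Iceberg table from Spark. It assumes a Spark session with an Iceberg catalog already configured; the catalog name ("lakehouse"), schema, and table names are hypothetical, not taken from the article.

```python
# Minimal sketch, assuming a Spark build with the Iceberg runtime on the classpath.
# Catalog, database, table, and column names below are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-metadata-example")
    # Register a hypothetical Iceberg catalog backed by a Hive metastore.
    .config("spark.sql.catalog.lakehouse", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lakehouse.type", "hive")
    .getOrCreate()
)

# Creating the table writes Iceberg's own metadata (manifests, snapshots)
# alongside the data files, instead of relying on directory listings.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lakehouse.sales.orders (
        order_id BIGINT,
        customer_id BIGINT,
        amount DECIMAL(10, 2),
        order_ts TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(order_ts))
""")

# Snapshot history is queryable as a metadata table, which is what enables
# time travel and auditing without scanning the data files themselves.
spark.sql(
    "SELECT snapshot_id, committed_at FROM lakehouse.sales.orders.snapshots"
).show()
```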

The Evolution of LLMOps: Adapting MLOps for GenAI

In recent years, machine learning operations (MLOps) has become the standard practice for developing, deploying, and managing machine learning models. MLOps standardizes processes and workflows for faster, more scalable, and lower-risk model deployment: it centralizes model management, automates CI/CD for deployment, provides continuous monitoring, and enforces governance and release best practices.
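As a small illustration of what "centralizing model management" looks like in practice, the sketch below logs a trained model to a registry using MLflow. MLflow is an assumption here for illustration only; the article does not prescribe a specific tool, and the model and registry names are hypothetical.

```python
# Minimal sketch of centralized model tracking and registration with MLflow
# (an assumed tool); dataset, metric, and model names are hypothetical.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run():
    # Log metrics and the model artifact so every candidate release is
    # tracked in one place and can be promoted through a CI/CD pipeline.
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="churn-classifier",
    )
```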

Cloudera Lakehouse Optimizer Makes it Easier Than Ever to Deliver High-Performance Iceberg Tables

The open data lakehouse is quickly becoming the standard architecture for unified multifunction analytics on large volumes of data. It combines the flexibility and scalability of data lake storage with the data analytics, data governance, and data management functionality of the data warehouse.
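High-performance Iceberg tables depend on routine maintenance such as compacting small files and expiring old snapshots, which is the kind of housekeeping the Lakehouse Optimizer automates. As a point of reference, the same operations can be run by hand with Iceberg's built-in Spark procedures; the sketch below assumes the hypothetical catalog and table names used earlier.

```python
# A sketch of manual Iceberg table maintenance via built-in Spark procedures;
# catalog and table names are hypothetical. The Lakehouse Optimizer automates
# this kind of housekeeping.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-maintenance").getOrCreate()

# Compact many small data files into fewer, larger ones to speed up scans.
spark.sql("""
    CALL lakehouse.system.rewrite_data_files(
        table => 'sales.orders',
        options => map('target-file-size-bytes', '134217728')
    )
""")

# Expire old snapshots so metadata and storage do not grow without bound.
spark.sql("""
    CALL lakehouse.system.expire_snapshots(
        table => 'sales.orders',
        older_than => TIMESTAMP '2024-09-01 00:00:00'
    )
""")
```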

Deploy and Scale AI Applications With Cloudera AI Inference Service

We are thrilled to announce the general availability of the Cloudera AI Inference service. Powered by NVIDIA NIM microservices, part of the NVIDIA AI Enterprise platform, it accelerates generative AI deployments for enterprises. The service supports a range of optimized AI models, enabling seamless and scalable AI inference.
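Because NVIDIA NIM microservices expose an OpenAI-compatible API, a deployed model endpoint can be called with a standard client. The sketch below is illustrative only: the endpoint URL, API key, and model identifier are placeholders, not actual Cloudera AI Inference service values.

```python
# Hedged sketch of calling an OpenAI-compatible NIM endpoint; the base_url,
# api_key, and model name are placeholders, not real service values.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-inference-endpoint.example.com/v1",  # placeholder
    api_key="YOUR_API_KEY",                                      # placeholder
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # example NIM model identifier
    messages=[{"role": "user", "content": "Summarize our Q3 sales trends."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```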