
Cloudera

Fine-Tuning a Foundation Model for Multiple Tasks

In this video we discuss why fine-tuning is needed to create more contextually accurate LLMs, and the methods you can use to accomplish it. We also demo our newest Applied ML Prototype (AMP), which demonstrates how to implement LLM fine-tuning jobs using the QLoRA and Accelerate implementations available in the PEFT open-source library from Hugging Face, along with an example application that swaps the fine-tuned adapters in real time for inference targeting different tasks. Learn more at cloudera.com. #ai #ml
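As a rough illustration of the technique the AMP demonstrates, here is a minimal sketch that loads a base model in 4-bit precision and attaches LoRA adapter weights with the Hugging Face PEFT library. The base model name and hyperparameters are illustrative assumptions, not values taken from the AMP itself.

```python
# Minimal QLoRA setup sketch: quantized base model + LoRA adapters via PEFT.
# Model name and hyperparameters are placeholders, not the AMP's actual values.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_model = "bigscience/bloom-1b1"  # hypothetical base model for illustration

# Load the base model in 4-bit precision (the "Q" in QLoRA)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(base_model, quantization_config=bnb_config)

# Attach a small set of trainable low-rank adapter weights (LoRA)
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# At inference time, separately trained adapters can be attached and switched,
# e.g. model.load_adapter("path/to/task_a_adapter", adapter_name="task_a")
# followed by model.set_adapter("task_a"), which mirrors the kind of adapter
# swapping the demo application performs for different tasks.
```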

Revolutionize Your Data Experience With Cloudera on Private Cloud

In the age of the AI revolution, where chatbots, generative AI, and large language models (LLMs) are taking the business world by storm, enterprises are fast realizing the need for strong data control and privacy to protect their confidential and commercially sensitive data, while still providing access to this data for context-specific AI insights.

A Hybrid, Open Data Lakehouse That Can Handle it All

Open more possibilities with an open data lakehouse. As the industry’s first open data lakehouse, Cloudera Data Platform delivers scalable performance and efficiency to enable smarter business decisions—paving the safest, fastest path from data to AI. Cloudera, together with Intel and HPE, can power analytics and AI at scale across any CSP or private cloud infrastructure.

Red Hat + Cloudera | A Hybrid Data Platform for Generative AI for FSI

Red Hat and Cloudera have joined forces to enable customers to take advantage of the cloud with full confidence, especially in the financial services industry, where data protection is critical. Red Hat Payment Industry Lead Ramon Villarreal describes how collaborating with Cloudera provides leading financial services organizations with data resiliency, performance, and expedited time to market as they leverage the cloud to move and manipulate massive amounts of data.

How Financial Services and Insurance Streamline AI Initiatives with a Hybrid Data Platform

With the emergence of new generative AI technologies like large language models (LLMs), including OpenAI's ChatGPT, Google's Bard, Meta's LLaMA, and Bloomberg's BloombergGPT, awareness, interest, and adoption of AI use cases across industries are at an all-time high. But in highly regulated industries where these technologies may be prohibited, the focus is less on off-the-shelf generative AI and more on the relationship between an organization's data and how AI can transform its business.

Expanding Possibilities: Cloudera's Teen Accelerator Program Completes Its Second Year

At Cloudera, we’re known for making innovative technological solutions that drive change and impact the world. Our mission is to make data and analytics easy and accessible to everyone. And that doesn’t end with our customer base. We also aim to provide equitable access to career opportunities within data and analytics to the workforce of tomorrow.

Apache Ozone Odyssey | Exploring the Future of Scalable Storage with Apache Ozone

This collaborative meetup is designed to bring together individuals interested in exploring the basics of Apache Ozone. Expert Ozone developer Nandakumar Vadivelu will guide you through the basics of setting up and configuring Ozone, as well as highlight its key features and benefits. The session begins with an overview of Apache Ozone's fundamentals before diving into its architecture and core components. It is perfect for those who are new to Ozone or want to explore its potential as a highly scalable and efficient storage solution.
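For a taste of what working with Ozone looks like in practice, here is a minimal sketch that writes and reads an object through Ozone's S3-compatible gateway. The endpoint, credentials, and bucket name are illustrative assumptions, not values from the session.

```python
# Minimal sketch: use boto3 against an Ozone S3 Gateway.
# Endpoint and credentials below are illustrative placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9878",  # assumed S3 Gateway address
    aws_access_key_id="testuser",          # placeholder credentials
    aws_secret_access_key="secret",
)

s3.create_bucket(Bucket="demo-bucket")
s3.put_object(Bucket="demo-bucket", Key="hello.txt", Body=b"Hello, Ozone!")
print(s3.get_object(Bucket="demo-bucket", Key="hello.txt")["Body"].read())
```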

Installing MiNiFi agents has never been so easy!

This video walks you through one of the new features coming with Edge Flow Manager 1.6.0: the one-line installer command. Did you ever think that installing a MiNiFi (C++ or Java) agent was complicated? Did you ever struggle with generating and configuring the certificates for mTLS communication between the agents and Edge Flow Manager?

Deploying an LLM ChatBot Augmented with Enterprise Data

The release of ChatGPT pushed interest in and expectations of Large Language Model (LLM) based use cases to record heights. Every company is looking to experiment with, qualify, and eventually release LLM-based services to improve their internal operations and to level up their interactions with their users and customers. At Cloudera, we have been working with our customers to help them benefit from this new wave of innovation.
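To make the idea of augmenting a chatbot with enterprise data more concrete, the sketch below shows one common pattern, retrieval augmentation: embed internal documents, retrieve the passages most relevant to a question, and fold them into the prompt sent to the LLM. The library, model name, and sample documents are illustrative assumptions rather than details of the deployment described above.

```python
# Retrieval-augmentation sketch: find relevant enterprise passages and
# prepend them to the prompt. Model and documents are placeholders.
from sentence_transformers import SentenceTransformer, util

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support is available 24/7 through the customer portal.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_embeddings = encoder.encode(documents, convert_to_tensor=True)

def build_prompt(question: str, top_k: int = 1) -> str:
    """Retrieve the most relevant documents and fold them into the prompt."""
    q_emb = encoder.encode(question, convert_to_tensor=True)
    hits = util.semantic_search(q_emb, doc_embeddings, top_k=top_k)[0]
    context = "\n".join(documents[hit["corpus_id"]] for hit in hits)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# The resulting prompt would then be sent to whichever LLM backs the chatbot.
print(build_prompt("How long do customers have to return a product?"))
```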

The Art of Data Leadership | A discussion with Synchrony's Head of Provisioning, Ram Karnati

Did you know there are 1.4 million open tech jobs, but global educational institutions produce only 400,000 qualified individuals annually to fill them? In our third episode of The Art of Data Leadership, Synchrony’s Ram Karnati explains why he believes the answer lies with #AI: “Day-to-day coding is going to get easier; AI is going to take care of it. So, the next phase of talent transformation will not be focused on being the best coder out there. People will start to look for generalists who also understand business.”