
AI

5 Prerequisites to Consider While Building Trustworthy AI

Artificial intelligence (AI) is advancing at a breakneck pace, and it is swiftly emerging as a potential disruptor and vital enabler for almost every business in every sector. At this point, the technology itself is not a barrier to the mainstream use of AI; the real barriers are a set of challenges that, unfortunately, are far more human: ethics, governance, and human values.

Simplify Data Integration with Artificial Intelligence

Data-driven decision-making is fundamental for any business that wants to thrive in today’s cut-throat environment. There is ample evidence that data-driven decision-making powered by artificial intelligence (AI) platforms helps businesses expedite their operations, saving valuable time and money. Such decisions leverage past information to predict the challenges and opportunities that await an enterprise in the future.

AI at Scale isn't Magic, it's Data - Hybrid Data

A recent VentureBeat article, “4 AI trends: It’s all about scale in 2022 (so far),” highlighted the importance of scalability. I recommend reading the entire piece, but to me the key takeaway (AI at scale isn’t magic, it’s data) is reminiscent of the 1992 presidential election, when political consultant James Carville succinctly summarized the key to winning: “it’s the economy.”

How to Accelerate HuggingFace Throughput by 193%

Deploying models is becoming easier every day, especially thanks to excellent tutorials like Transformers-Deploy, which walks through converting and optimizing a Hugging Face model and deploying it on the Nvidia Triton inference engine. Nvidia Triton is an exceptionally fast and robust tool and should be very high on the list when searching for ways to deploy a model. Our developers know this, of course, so ClearML Serving uses Nvidia Triton on the backend when a model needs GPU acceleration.
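To give a flavor of what that conversion involves, here is a minimal sketch of the usual first step: exporting a Hugging Face model to ONNX so an engine like Triton can serve it. The model name, file path, and opset are illustrative placeholders, not the article's exact pipeline, which also covers further optimization.

```python
# Hypothetical sketch: export a Hugging Face classifier to ONNX as a
# first step toward serving it with Nvidia Triton. Names are placeholders.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "distilbert-base-uncased-finetuned-sst-2-english"  # example model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()
model.config.return_dict = False  # return a plain tuple so tracing is clean

# Trace with a dummy input; dynamic axes let Triton batch variable-length requests.
dummy = tokenizer("hello world", return_tensors="pt")
torch.onnx.export(
    model,
    (dummy["input_ids"], dummy["attention_mask"]),
    "model.onnx",                                   # placeholder output path
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
        "logits": {0: "batch"},
    },
    opset_version=13,
)
```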

Is AI/ML Transforming the Banking Industry?

Artificial Intelligence (AI) is powerful, constantly evolving, and seemingly knows no bounds, continually pushing past its limits with the power of Machine Learning (ML). AI empowers computers to do things that human beings cannot do efficiently or effectively, and machine learning helps them do so by breaking the rules of traditional programming.

How to Do Data Labeling, Versioning, and Management for ML

A few months ago, Toloka and ClearML came together to create this joint project. Our goal was to show other ML practitioners how to first gather data, then version and manage that data before it is fed to an ML model. We believe that following these best practices will help others build better and more robust AI solutions. If you are curious, have a look at the project we created together.
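As a taste of the versioning side, here is a minimal sketch using the ClearML Dataset API. The project and dataset names are hypothetical placeholders, and in the joint project this step sits downstream of labeling with Toloka.

```python
# Minimal sketch of dataset versioning with ClearML; names are illustrative.
from clearml import Dataset

# Create a new dataset version and register local files with it.
dataset = Dataset.create(
    dataset_name="labeled-images",          # hypothetical dataset name
    dataset_project="toloka-clearml-demo",  # hypothetical project name
)
dataset.add_files(path="data/labeled/")  # e.g. the folder produced by labeling
dataset.upload()     # push the files to the configured storage
dataset.finalize()   # freeze this version so it is immutable

# Later, any training job can fetch an exact, reproducible copy.
local_copy = Dataset.get(
    dataset_name="labeled-images",
    dataset_project="toloka-clearml-demo",
).get_local_copy()
print(local_copy)  # local path to the versioned data
```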

A Guide to Principal Component Analysis (PCA) for Machine Learning

Principal Component Analysis (PCA) is one of the most commonly used unsupervised machine learning algorithms across a variety of applications: exploratory data analysis, dimensionality reduction, information compression, data de-noising, and plenty more. In this blog, we will cover how it works step by step. Before we delve into its inner workings, let’s first get a better understanding of PCA. Imagine we have a 2-dimensional dataset.
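To make that 2-dimensional example concrete, here is a short, self-contained sketch using scikit-learn's PCA on synthetic correlated data; the data and parameters are illustrative, not from the original post.

```python
# Illustrative PCA on a synthetic 2-D dataset with correlated features.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Correlated 2-D data: most of the variance lies along one direction.
x = rng.normal(size=500)
data = np.column_stack([x, 0.5 * x + rng.normal(scale=0.3, size=500)])

pca = PCA(n_components=2)
pca.fit(data)

# Principal components are orthogonal directions of maximum variance.
print(pca.components_)                # 2x2 matrix of directions
print(pca.explained_variance_ratio_)  # share of variance per component

# Dimensionality reduction: project onto the first component only.
reduced = PCA(n_components=1).fit_transform(data)
print(reduced.shape)                  # (500, 1)
```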