

Securely Query Confluent Cloud from Amazon Redshift with mTLS

Querying databases comes with costs: wall-clock time, CPU usage, memory consumption, and potentially actual dollars. As your application scales, optimizing these costs becomes crucial. Materialized views offer a powerful solution by creating a pre-computed, optimized data representation. Imagine a retail scenario with separate customer and product tables. Typically, retrieving product details for a customer's purchase requires cross-referencing both tables.
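The retail scenario above can be sketched with a small, self-contained example. SQLite has no native `CREATE MATERIALIZED VIEW`, so this sketch emulates one by pre-computing the customer/product join into its own table; all table and column names are illustrative assumptions, not from the article.

```python
import sqlite3

# In-memory database with hypothetical customer, product, and purchase tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE products  (id INTEGER PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE purchases (customer_id INTEGER, product_id INTEGER);

    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO products  VALUES (10, 'Keyboard', 49.99), (11, 'Monitor', 199.0);
    INSERT INTO purchases VALUES (1, 10), (2, 11);
""")

# "Materialize" the join once: later reads hit this pre-computed table
# instead of re-running the cross-reference between both source tables.
conn.executescript("""
    CREATE TABLE purchase_details AS
    SELECT c.name AS customer, p.name AS product, p.price AS price
    FROM purchases pu
    JOIN customers c ON c.id = pu.customer_id
    JOIN products  p ON p.id = pu.product_id;
""")

rows = conn.execute(
    "SELECT customer, product, price FROM purchase_details ORDER BY customer"
).fetchall()
print(rows)  # [('Ada', 'Keyboard', 49.99), ('Grace', 'Monitor', 199.0)]
```

In a warehouse such as Redshift, the engine maintains and refreshes the materialized result for you; the trade-off is the same either way: pay the join cost once, then serve many cheap reads.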

Node.js v22: "Jod" Binaries Available

At NodeSource, we pride ourselves on delivering the best tools and open source support for Node.js users. Staying aligned with the latest releases is a key part of that commitment. With the latest Long-Term Support (LTS) release, Node.js 22 (codenamed "Jod"), we continue to ensure our users have access to reliable, secure, and up-to-date **Node.js binaries** through our NodeSource Binary Distributions.

EP 2: Beyond Just Data in Today's Market

Airports are an interconnected system where one unforeseen event can tip the scale into chaos. For a smaller airport in Canada, data has grown to be its North Star in an industry full of surprises. But in order for data to bring true value to operations, and ultimately customer experience, those data insights must be grounded in trust. Ryan Garnett, Senior Manager of Business Solutions at Halifax International Airport Authority, joins The AI Forecast to share how the airport revamped its approach to data, creating a predictions engine that drives operational efficiency and improved customer experience.

Error Monitoring Across the SDLC with Mac Clark

Can your software handle the pressure when bugs slip through the cracks? In this episode of Test Case Scenario, Jason Baum and Evelyn Coleman chat with Mac Clark, Senior Solutions Engineer at Sauce Labs, about the dynamic world of shift-left and shift-right testing. Mac shares how gaming and software industries leverage AI-driven testing, real-time error monitoring, and feature flags to catch issues before they snowball into costly problems in production.

Cloudera AI Inference Service Enables Easy Integration and Deployment of GenAI Into Your Production Environments

Welcome to the first installment of a series of posts discussing the recently announced Cloudera AI Inference service. Today, Artificial Intelligence (AI) and Machine Learning (ML) are more crucial than ever for organizations to turn data into a competitive advantage. To unlock the full potential of AI, however, businesses need to deploy models and AI applications at scale, in real-time, and with low latency and high throughput. This is where the Cloudera AI Inference service comes in.

Fueling the Future of GenAI with NiFi: Cloudera DataFlow 2.9 Delivers Enhanced Efficiency and Adaptability

For more than a decade, Cloudera has been an ardent supporter of and committer to Apache NiFi, long recognizing its power and versatility for data ingestion, transformation, and delivery. Our customers rely on NiFi as well as the associated sub-projects (Apache MiNiFi and Registry) to connect to structured, unstructured, and multi-modal data from a variety of data sources, from edge devices to SaaS tools to server logs and change data capture streams.

Cloudera announces 'Interoperability Ecosystem' with founding members AWS and Snowflake

Today enterprises can leverage the combination of Cloudera and Snowflake, two best-of-breed tools for ingestion, processing, and consumption of data, for a single source of truth across all data, analytics, and AI workloads. And now AWS customers will gain more flexibility and data utility with less complexity, supporting the modern data architecture.

Why Short-Lived Connections Are Killing Your Performance! | Kafka Developer Mistakes

Constantly starting and stopping Apache Kafka producers and consumers? That's a recipe for high resource usage and inefficiency. Short-lived connections are heavy on resources and can slow down your whole cluster. Keep them running to boost performance, cut latency, and get the most out of your Kafka setup.
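The cost pattern described above can be sketched without a live cluster. The stub `Producer` below is a stand-in for a real Kafka client (e.g. `confluent_kafka.Producer`, whose construction triggers broker handshakes and metadata fetches); it only counts connection setups, which is the part the teaser is warning about.

```python
class Producer:
    """Stand-in for a real Kafka producer client; connection setup is
    simulated here so the cost pattern is visible without a broker."""
    connections_opened = 0

    def __init__(self):
        # In a real client: TCP connect, TLS/SASL handshake, metadata fetch.
        Producer.connections_opened += 1

    def produce(self, topic, value):
        pass  # a real client batches records and sends them asynchronously

    def flush(self):
        pass  # a real client blocks until in-flight records are delivered

# Anti-pattern: a new producer (and a new connection) per message.
for i in range(100):
    p = Producer()
    p.produce("events", f"msg-{i}")
    p.flush()
per_message = Producer.connections_opened

# Better: one long-lived producer reused for the whole workload.
Producer.connections_opened = 0
p = Producer()
for i in range(100):
    p.produce("events", f"msg-{i}")
p.flush()
reused = Producer.connections_opened

print(per_message, reused)  # 100 1
```

With a real client the gap is worse than a counter suggests: each short-lived producer also forfeits batching and compression, and churning consumers additionally trigger group rebalances across the cluster.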

Coverage Requirements Discovery with WSO2 Accelerator for Healthcare

In this video, we showcase how the Coverage Requirements Discovery (CRD) workflow integrates seamlessly into clinical actions, such as ordering a CT scan within Electronic Health Record (EHR) systems, using the WSO2 Healthcare solution. The demo setup consists of two main components: a Backend/CDS Server, developed in the Ballerina language, which implements the Clinical Decision Support (CDS) server API (Source Code), and a Web Application, a demo application built with React to showcase EHR functionality with the necessary CDS workflows (Source Code).