
Can ChatGPT Replace Google Search?

Since ChatGPT introduced the ability to display website names and clickable links to sources, it has generated a lot of buzz in the web community. This new feature enables users to get answers that directly link to websites for suggestions, services, and businesses. As of late 2024 and early 2025, some business owners have reported not only receiving web traffic from ChatGPT but also converting these visits into leads and sales through their web forms.

Optimizing Supply Chains with Data Streaming and Generative AI

It’s a truism that global supply chains are complex. The process of sourcing raw materials, transforming them into finished products, and distributing them to customers encompasses numerous systems (e.g., ERPs, WMSs, and TMSs). All systems within “the supply chain” are trending in the same direction; they’re aiming to be more efficient, resilient, and agile. Various technological developments have facilitated this directional trend.

Benchmarking llama.cpp on Arm Neoverse-based AWS Graviton instances with ClearML

By Erez Schnaider, Technical Product Marketing Manager, ClearML. In a previous blog post, we demonstrated how easy it is to leverage Arm Neoverse-based Graviton instances on AWS to run training workloads. In this post, we’ll explore how ClearML simplifies the management and deployment of LLM inference using llama.cpp on Arm-based instances and helps deliver up to 4x performance compared to x86 alternatives on AWS. (Want to run llama.cpp directly?

SwiftKV from Snowflake AI Research Reduces Inference Costs of Meta Llama LLMs up to 75% on Cortex AI

Large language models (LLMs) are at the heart of generative AI transformations, driving solutions across industries — from efficient customer support to simplified data analysis. Enterprises need performant, cost-effective and low-latency inference to scale their gen AI solutions. Yet, the complexity and computational demands of LLM inference present a challenge. Inference costs remain prohibitive for many workloads. That’s where SwiftKV and Snowflake Cortex AI come in.

The AI Tipping Point: What Manufacturing Leaders Need to Know for 2025

AI is proving that it’s here to stay. While 2023 brought wonder, and 2024 saw widespread experimentation, 2025 will be the year that manufacturing enterprises get serious about AI's applications. But it’s complicated: AI proofs of concept are graduating from the sandbox to production, just as some of AI’s biggest cheerleaders are turning a bit dour.

Announcing Strategic Distribution Partnerships to Scale AI

As we head into 2025, Qlik is taking a significant step forward in the evolution of our go-to-market approach by placing an even greater emphasis on our partnerships. This move is aimed at capturing the growing market opportunity in data integration, data quality, analytics and AI.

Vector Databases: Why QA Professionals Need to Care About Them in the Age of AI | Toni Ramchandani

In the rapidly evolving age of AI, vector databases have become the backbone of modern systems, revolutionizing the way high-dimensional data is managed and queried. In this insightful session, Toni Ramchandani explores why QA professionals must adapt their skills and approaches to meet the unique demands of vector databases. Traditional testing methods fall short in addressing challenges like similarity search, vector indexing, and performance optimization.
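To make the similarity-search testing challenge concrete, here is a minimal sketch (all names and data are hypothetical, not from the session) of how a QA engineer might verify an approximate nearest-neighbor result against a brute-force cosine-similarity baseline:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def brute_force_nearest(query, corpus):
    # Exhaustive search: the exact answer any ANN index should approximate.
    # corpus is a list of (doc_id, vector) pairs.
    return max(corpus, key=lambda item: cosine_similarity(query, item[1]))

# Hypothetical toy embeddings for illustration only
corpus = [
    ("doc_a", [1.0, 0.0, 0.0]),
    ("doc_b", [0.9, 0.1, 0.0]),
    ("doc_c", [0.0, 1.0, 0.0]),
]
query = [1.0, 0.0, 0.0]

best = brute_force_nearest(query, corpus)
```

In practice, a test like this would compare the index's top-k results against the brute-force baseline and assert a minimum recall, rather than expecting exact equality from an approximate index.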

Event-Driven AI: Building a Research Assistant with Kafka and Flink

This post was originally published on Medium on Nov. 20, 2024. The rise of agentic AI has fueled excitement around agents that autonomously perform tasks, make recommendations, and execute complex workflows blending AI with traditional computing. But creating such agents in real-world, product-driven environments presents challenges that go beyond the AI itself.

Deploy AI Infrastructure in 2025: Serverless GPUs, Autoscaling, Scale to Zero, and More!

We’re on a mission to simplify application deployment for developers and businesses worldwide, whether those deployments are AI-driven models, full-stack applications, APIs, or databases. Our next-generation serverless platform significantly accelerates your deployments and improves efficiency, enabling you to build more with less spend. 2024 was a major year for us, packed with crucial serverless milestones.

EP 6: To Prevent the Artificial Charlatan, Data Management Has to be Fun

The AI explosion has led to non-stop hype cycles as the technology continues to develop. But AI is only as good as the data behind it. The threat of lousy data is bad AI. Andrew Brust, Founder and CEO of Blue Badge Insights, joins The AI Forecast to discuss the AI hype, and how to prevent what he calls an “Artificial Charlatan” of bad AI. He emphasizes the dependent relationship between data and AI and the former’s role in the success of the latter. Specifically, he addresses the data governance conundrum and why, for data technology to be successful, it has to be fun.