
AI in QA: What leading quality experts want every team to know

Our goal with the Tricentis blog is to distill insights that help QA professionals navigate the massive, AI-driven transformation happening across the software delivery landscape. To that end, I reached out to experts across Tricentis, from product and services to marketing and strategy, to hear what they’re really thinking about AI in QA right now. This group brings decades of experience building testing products, guiding enterprise transformations, and shaping how organizations adopt AI.

SpotCache: Scale AI-ready data without cloud-spend surprises

AI is changing how work gets done. But for many data leaders, it’s also creating a new challenge: managing the cloud bill. As more people (and more AI agents) query data, cloud data warehouse (CDW) spend can spike fast. Costs become harder to predict, and teams end up choosing between scaling AI insights and staying within budget. That tension creates a real bottleneck on the path to becoming AI-ready.
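Much of that spend comes from many agents issuing near-identical queries. A minimal sketch of the general idea behind result caching (a generic illustration, not SpotCache's actual implementation; `run_warehouse_query` is a hypothetical stand-in for a billed CDW call):

```python
import time

class QueryResultCache:
    """Serve repeated queries from memory so only cache misses hit the CDW."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}   # sql -> (result, timestamp)
        self.hits = 0
        self.misses = 0

    def get(self, sql, run_warehouse_query):
        entry = self._store.get(sql)
        if entry and time.time() - entry[1] < self.ttl:
            self.hits += 1          # served from memory: no warehouse spend
            return entry[0]
        self.misses += 1            # billed warehouse execution
        result = run_warehouse_query(sql)
        self._store[sql] = (result, time.time())
        return result

# Ten agents asking the same question cost one warehouse scan, not ten.
cache = QueryResultCache()
calls = []
fake_cdw = lambda sql: calls.append(sql) or [("total", 42)]
for _ in range(10):
    cache.get("SELECT count(*) FROM orders", fake_cdw)
print(cache.hits, cache.misses, len(calls))  # 9 1 1
```

Real systems add invalidation and freshness policies, but the budget math is the same: repeated reads stop multiplying warehouse cost.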

Powering agentic software quality with MCP servers | From the Bear Cave

In this From the Bear Cave session, Dan Faulkner, CEO of SmartBear, and Vineeta Puranik, CTO/CPO of SmartBear, discuss why agentic automation is becoming essential in software development and delivery, how MCP Server enables connected autonomy across the SDLC, and what this transformation means for business outcomes and the future of human + AI collaboration.

2026 Predictions: What's Next for Data Streaming and AI | Life Is But A Stream

AI isn’t just evolving—it’s reshaping who your customers are, how systems operate, and what real time really means. From machines making purchase decisions to agents increasing query volume across databases, the realities of 2026 are forcing leaders to rethink data architecture and governance strategies at a fundamental level. In this episode, Joseph is joined by Will LaForest (Field CTO, Confluent), Adi Polak (Director of Developer Advocacy & Experience, Confluent), and independent analyst Sanjeev Mohan to break down critical insights from Confluent’s 2026 Predictions Report.

Identity Passthrough for AI: Why Your LLM Needs to Know Who's Asking

When a user asks your AI assistant a question, who actually runs the database query? In most enterprise AI deployments, the answer is troubling: a shared service account with broad access to everything. The user's identity evaporates the moment their request enters the AI system. This architectural pattern creates security gaps, compliance failures, and data leakage risks that undermine enterprise AI adoption.
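The fix described here is to forward the requesting user's identity to the data layer instead of querying as a privileged shared account. A minimal sketch of the pattern (all names are illustrative, not a specific vendor API):

```python
# Identity passthrough: the query executes under the caller's own
# permissions, so the AI layer cannot leak data the user couldn't
# already see via a broad shared service account.

class Database:
    def __init__(self, acl):
        self.acl = acl  # user -> set of tables they may read

    def query(self, user, table):
        if table not in self.acl.get(user, set()):
            raise PermissionError(f"{user} may not read {table}")
        return f"rows from {table}"

def answer_with_passthrough(db, user, table):
    # The AI assistant forwards the end user's identity
    # rather than substituting its own service identity.
    return db.query(user, table)

db = Database({"alice": {"sales"}, "bob": {"hr", "sales"}})
print(answer_with_passthrough(db, "bob", "hr"))   # rows from hr
try:
    answer_with_passthrough(db, "alice", "hr")    # alice lacks hr access
except PermissionError as e:
    print(e)
```

In production this is typically done with token exchange or delegated credentials rather than an in-process ACL, but the access decision is made against the user, not the pipeline.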

What Is MCP? Connecting AI Across the Software Delivery Lifecycle

AI promises speed and automation — but most teams are still stuck jumping between disconnected tools across development, testing, and operations. In this video, we introduce the Model Context Protocol (MCP) and how it enables AI assistants to securely access tools, systems, and real-time context across the software delivery lifecycle. MCP is the foundation of Perforce Intelligence. The result: less friction, faster feedback, and AI that works with your existing systems, not around them.
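Concretely, MCP carries tool access over JSON-RPC 2.0: a client lists a server's tools and invokes one with a `tools/call` request. A sketch of the request shape (the tool name and arguments below are hypothetical, not part of Perforce Intelligence):

```python
import json

# An MCP tool invocation is a JSON-RPC 2.0 message whose params name
# the tool and supply its arguments. A server validates the arguments
# against the schema it advertised via "tools/list".
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_test_suite",  # hypothetical tool
        "arguments": {"project": "checkout-service", "suite": "smoke"},
    },
}
print(json.dumps(request, indent=2))
```

Because every tool speaks the same envelope, one AI assistant can reach development, testing, and operations systems without a bespoke integration per tool.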

Software Quality Gates: How Do They Work?

Shipping fast feels great – until something breaks in production. Even solid-looking builds can fail because a single small issue slipped through testing. That’s where software quality gates step in: automated checks that stop risky code before it moves further along the pipeline. Rather than instinct, they rely on data – code coverage numbers, test results, and security signals.
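The pattern can be sketched in a few lines: evaluate objective signals against thresholds and block the build on any failure. The thresholds below are illustrative assumptions, not a standard:

```python
# Minimal quality-gate sketch: fail the pipeline on measurable signals
# (coverage, test results, security findings) instead of gut feel.

def quality_gate(metrics):
    """Return a list of failures; an empty list means the build may proceed."""
    failures = []
    if metrics["coverage"] < 0.80:
        failures.append(f"coverage {metrics['coverage']:.0%} below 80%")
    if metrics["failed_tests"] > 0:
        failures.append(f"{metrics['failed_tests']} failing test(s)")
    if metrics["critical_vulns"] > 0:
        failures.append(f"{metrics['critical_vulns']} critical vulnerability(ies)")
    return failures

build = {"coverage": 0.76, "failed_tests": 0, "critical_vulns": 1}
for problem in quality_gate(build):
    print("BLOCKED:", problem)
```

In a CI system the same check would exit nonzero to halt the pipeline; tools like SonarQube implement this idea with configurable conditions.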

Empowering Customers: The Role of Confluent's Trust Center

The foundation of every successful customer relationship is trust. At Confluent, we understand that for our customers and prospects to innovate with confidence, they must have complete trust in the security and integrity of our platform. Our commitment goes beyond simply providing a secure product. It’s about empowering our customers with the tools and transparency they need to feel confident in their data streaming architectures.