
A Guide to Automated Data Governance: Importance & Benefits

Automated data governance is a relatively new concept that is fundamentally altering data governance practices. Traditionally, organizations have relied on manual processes to ensure effective data governance, an approach that has given governance a reputation as a restrictive discipline. But as organizations increasingly adopt automation in their governance processes, this perception is changing.

What is an AI Gateway?

In this session, Marco Palladino, CTO and co-founder of Kong, discusses the importance of AI gateways in supporting the growing demand for AI and API usage. He explains how Kong's AI Gateway simplifies the creation of AI applications by providing a unified infrastructure layer that abstracts common requirements, such as authentication, authorization, traffic control, and observability.
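The gateway pattern described above can be sketched in a few lines. This is not Kong's API; it is a minimal, hypothetical illustration of how a single entry point can apply the cross-cutting policies mentioned (authentication, traffic control, observability) before dispatching to any backend model:

```python
import time
from collections import defaultdict

VALID_KEYS = {"demo-key"}   # stand-in for a real credential store
RATE_LIMIT = 5              # max requests per caller per window
WINDOW_SECONDS = 60

class AIGateway:
    """Hypothetical gateway applying shared policies before dispatch."""

    def __init__(self, backends):
        self.backends = backends           # name -> callable(prompt) -> str
        self.request_log = []              # observability: per-request records
        self.counters = defaultdict(list)  # traffic control: timestamps per key

    def handle(self, api_key, backend, prompt):
        # Authentication/authorization check
        if api_key not in VALID_KEYS:
            return {"status": 401, "error": "invalid API key"}
        # Traffic control: sliding-window rate limit per caller
        now = time.monotonic()
        recent = [t for t in self.counters[api_key] if now - t < WINDOW_SECONDS]
        if len(recent) >= RATE_LIMIT:
            return {"status": 429, "error": "rate limit exceeded"}
        self.counters[api_key] = recent + [now]
        # Observability: record the call before dispatching
        self.request_log.append({"backend": backend, "prompt_chars": len(prompt)})
        # Dispatch to the requested backend
        handler = self.backends.get(backend)
        if handler is None:
            return {"status": 404, "error": f"unknown backend {backend!r}"}
        return {"status": 200, "body": handler(prompt)}

gateway = AIGateway({"echo": lambda p: p.upper()})
```

Because every request flows through `handle`, applications get consistent policy enforcement without re-implementing it per model provider, which is the point of a unified infrastructure layer.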

Down the AI Rabbit Hole: Leveraging AI in Your Projects Without Ending Up Lost in Wonderland

Generative AI is transforming the world around us and is quickly becoming part of the conversation as we greenfield new features and applications. It is alluring to add AI features to our existing products and to think about new projects we might build around AI. However, you might have already found that the journey into the realm of AI often feels like tumbling down the rabbit hole into Wonderland: a maze of complexity and uncertainty.

Where Does Data Governance Fit Into Hybrid Cloud?

At a time when artificial intelligence (AI) and tools like generative AI (GenAI) and large language models (LLMs) have exploded in popularity, getting the most out of organizational data is critical to driving business value and carving out a competitive market advantage. To reach that goal, more businesses are turning toward hybrid cloud infrastructure – with data on-premises, in the cloud, or both – as a means to tap into valuable data.

Ensuring the Performance of Your Kafka-Dependent Applications

In today’s data-driven world, Apache Kafka has emerged as an essential component in building real-time data pipelines and streaming applications. Its fault tolerance, scalability, and ability to handle high throughput make it a great choice for businesses handling high volumes of data.
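A common way to keep tabs on a Kafka-dependent application is to track consumer lag: the gap between the latest offset written to each partition and the offset the consumer group has committed. A minimal sketch of that calculation, using hypothetical per-partition offsets (in a real deployment these come from tooling such as `kafka-consumer-groups.sh` or the AdminClient API):

```python
def consumer_lag(log_end_offsets, committed_offsets):
    """Return per-partition lag and the total lag across partitions.

    log_end_offsets:   partition -> latest offset written by producers
    committed_offsets: partition -> last offset committed by the consumer group
    """
    lag = {
        partition: log_end_offsets[partition] - committed_offsets.get(partition, 0)
        for partition in log_end_offsets
    }
    return lag, sum(lag.values())

# Hypothetical topic with three partitions:
per_partition, total = consumer_lag(
    log_end_offsets={0: 1500, 1: 980, 2: 2040},
    committed_offsets={0: 1500, 1: 950, 2: 1900},
)
# per_partition == {0: 0, 1: 30, 2: 140}; total == 170
```

A steadily growing total is an early warning that consumers are falling behind producer throughput, typically prompting a look at consumer scaling or partition counts.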

Ingest Data Faster, Easier and Cost-Effectively with New Connectors and Product Updates

The journey toward achieving a robust data platform that secures all your data in one place can seem like a daunting one. But at Snowflake, we’re committed to making the first step the easiest: seamless, cost-effective data ingestion to help bring your workloads into the AI Data Cloud with ease. Snowflake is launching native integrations with some of the most popular databases, including PostgreSQL and MySQL.

Addressing the Elephant in the Room - Welcome to Today's Cloudera

Hadoop. The first time that I really became familiar with this term was at Hadoop World in New York City some ten years ago. There were thousands of attendees at the event, lining up for book signings and meetings with recruiters to fill the endless job openings for developers experienced with MapReduce and managing Big Data. This was the gold rush of the 21st century, except the gold was data.

QA Testing Best Practices

Today, as businesses invest approximately 23% of their annual IT budget in QA and testing, the field of QA is undergoing a transformative shift. QA teams are often tasked with developing comprehensive test plans based on application development methodologies, architecture styles, frameworks, and other factors. However, for QA teams to develop better-quality software, they need the right mindset rather than simply rigid review processes.

Software Test Estimation & 6 Techniques

Software testing evolved from a simple debugging activity in the 1950s to become integral to software development, with advanced testing tools and test estimation techniques. As a C-level executive or business developer, ensuring your teams provide accurate QA effort estimates is crucial. This precision influences the project outcome and bolsters your credibility with clients, while underestimating QA efforts can lead to underperformance and unclear requirements.