
Reach for the Clouds: A Crawl/Walk/Run Strategy with Kong and AWS - Part 2: Walk

Brought to you by @KongInc Senior Partner Developer Danny Freese. Welcome to the "Walk" stage of the cloud migration journey, the second phase of the migration process. In this stage, we will show you how to de-risk and lift-and-shift your connections to the cloud using Konnect and Kong Mesh.

Reach for the Clouds: A Crawl/Walk/Run Strategy with Kong and AWS - Part 3: Run

Brought to you by @KongInc Senior Partner Developer Danny Freese. In this video, we'll guide you through the "Run" stage of the cloud migration journey, the final step of our crawl-walk-run tutorial. By this point, you should have already deployed the monolith and the Konnect runtime instance, onboarded the monolith to Konnect, deployed the Kong Mesh control plane and the on-prem mesh zone, and reconfigured the Konnect runtime instance so that runtime-instance-to-monolith communication occurs over the mesh.

Building GraphQL APIs with PostgreSQL: Top Developer Tools to Consider

Developers often pair GraphQL with PostgreSQL to build scalable, reliable, high-performing applications with a well-defined data structure. Selecting the right framework, however, is crucial to simplifying and streamlining development when building a GraphQL API with PostgreSQL. This blog will explore the top tools for building GraphQL APIs with PostgreSQL, including Hasura, PostGraphile, Prisma, and GraphQL Nexus.

Building a Data-Centric Platform for Generative AI and LLMs at Snowflake

Generative AI and large language models (LLMs) are revolutionizing many aspects of both developer and non-coder productivity with automation of repetitive tasks and fast generation of insights from large amounts of data. Snowflake users are already taking advantage of LLMs to build really cool apps with integrations to web-hosted LLM APIs using external functions, and using Streamlit as an interactive front end for LLM-powered apps such as AI plagiarism detection, AI assistant, and MathGPT.

Using Dead Letter Queues with SQL Stream Builder

Cloudera SQL Stream Builder gives non-technical users the power of a unified stream processing engine, so they can integrate, aggregate, query, and analyze both streaming and batch data sources in a single SQL interface. This allows business users to define events of interest that they need to continuously monitor and respond to quickly. A dead letter queue (DLQ) can be used to capture events that fail deserialization when they are consumed from a Kafka topic.
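The DLQ pattern described above can be sketched in plain Python. This is a minimal, hypothetical illustration of the idea (the function name and in-memory lists are assumptions for the sketch; in SQL Stream Builder the DLQ is configured declaratively on the Kafka source, not written by hand):

```python
import json

def consume_with_dlq(raw_messages):
    """Deserialize messages, routing failures to a dead letter queue.

    Messages that fail JSON deserialization are captured along with the
    error, instead of halting the whole pipeline, so they can be
    inspected or replayed later.
    """
    processed, dlq = [], []
    for raw in raw_messages:
        try:
            processed.append(json.loads(raw))
        except json.JSONDecodeError as err:
            dlq.append({"payload": raw, "error": str(err)})
    return processed, dlq

# Good events flow through; the malformed payload lands in the DLQ.
events, dead_letters = consume_with_dlq(['{"id": 1}', "not-json", '{"id": 2}'])
```

The key design choice is that a bad record never stops consumption of the topic: the pipeline keeps processing valid events while the DLQ preserves the failing payloads for diagnosis.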