Build Agentic Workflows: Expose API Orchestration as MCP Tools with Kong AI Gateway
Learn how to expose an API orchestration workflow as an MCP server using Kong AI Gateway, configure semantic guardrails, and build an agent with the Volcano SDK. We onboard GPT-4 behind /llm, orchestrate with DataKit, and debug MCP tools in Insomnia—end-to-end without adding server code.
What you’ll learn:
Onboard an LLM to Kong AI Gateway (/llm) and apply semantic guardrails
Build API orchestration with DataKit and expose it as an MCP server (Model Context Protocol)
Debug and call MCP tools with Insomnia’s MCP client
Implement a TypeScript agent with Volcano SDK to chain tool calls and LLM context
Prevent prompt injection and secret exfiltration (400 responses) via gateway policies
Learn more:
Kong: https://konghq.com
Kong Gateway: https://konghq.com/products/kong-ai-gateway
Insomnia: https://konghq.com/products/kong-insomnia
Subscribe for more developer-first demos and tutorials.
#KongGateway #AIGateway #AgenticAI #MCP #LLM #APISecurity #Insomnia #APIGateway #APIDevelopment