Volcano SDK:
Kong and its Agents
In the world of Artificial Intelligence development, the challenge has moved from simple text generation to the creation of autonomous agents capable of taking concrete actions in the real world. These agents must be able to interact with APIs, query databases, create documents, send emails/notifications, orchestrate complex tasks, and so on.
This is where complexity arises: how do you build such systems so they are robust, maintainable, and ready for real production environments?
This is the question Kong, a well-known name in the API management field, tries to answer with the release of Volcano SDK: an open-source framework in TypeScript born out of an internal need to simplify the creation of multi-step, multi-provider AI workflows, without sacrificing essential production features.
Unlike heavier, all-in-one frameworks, Kong’s SDK focuses on one clear goal: providing lightweight, intuitive APIs to chain calls to different large language models (LLMs) and integrate them with external tools.
Its philosophy relies on several key pillars:
The real elegance of Volcano SDK shows up in the code. Let’s start with the official sample: imagine you want to create an agent that first determines the astrological sign for a birth date (using an external MCP tool) and then writes a short horoscope.
Supported providers include OpenAI, Anthropic, Mistral, Llama, AWS Bedrock, Google Vertex, and Azure AI; you can also create custom providers.
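Before diving into the full example, here is a minimal sketch of what mixing providers in a single chain could look like. It assumes that llmAnthropic is exported alongside llmOpenAI and that each .then() step accepts its own llm override, so treat it as an illustration of the concept rather than a verbatim API reference:

import { agent, llmOpenAI, llmAnthropic } from "volcano-sdk";

// Default provider for the whole workflow
const defaultLlm = llmOpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
  model: "gpt-4o-mini",
});

// A second provider (assumed export), used only where we override it
const claude = llmAnthropic({
  apiKey: process.env.ANTHROPIC_API_KEY!,
  model: "claude-3-5-sonnet-latest",
});

const results = await agent({ llm: defaultLlm })
  .then({ prompt: "Summarize this support ticket in one sentence: ..." })
  .then({ llm: claude, prompt: "Write a polite reply based on that summary." }) // per-step override (assumption)
  .run();

console.log(results[1].llmOutput);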
We’ll add some comments to the code to make certain points clearer:
import { agent, llmOpenAI, mcp } from "volcano-sdk";
// 1. Define default LLM model
const llm = llmOpenAI({
apiKey: process.env.OPENAI_API_KEY!,
model: "gpt-4o-mini",
});
// 2. Set the address of our MCP tool
const astroTool = mcp("http://localhost:3211/mcp");
// 3. Build the "chainable" workflow
const results = await agent({ llm })
.then({
prompt: "Determine the astrological sign for 1993-07-11.",
mcps: [astroTool] // The agent automatically discovers and uses the tool
})
.then({
// Context (i.e. astrological sign) is automatically passed
prompt: "Now write a one-line fortune for that sign."
})
.run();
console.log(results[1].llmOutput);
// Sample output: "A creative wave is coming; ride it to new professional heights."

As you can see, with just a few lines of code we've created a two-step workflow that integrates an external tool and manages context between steps. The complexity of discovering which tool function to call and how to pass parameters is handled automatically by the framework.
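One small practical note: the chain resolves to a list of results, and the index in results[1] suggests one entry per .then() step, in order. Assuming that shape holds, you can inspect the intermediate outputs as well as the final one:

// One entry per .then() step, in chain order (assumed shape)
console.log(results[0].llmOutput); // step 1: the astrological sign (e.g. "Cancer")
console.log(results[1].llmOutput); // step 2: the one-line fortune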
Of course, no tool is perfect in every scenario. As a technology partner, our job is to assess solutions with a critical and pragmatic eye. Here are some strengths and aspects to watch out for when choosing this framework.
Kong’s Volcano SDK shouldn’t be seen as a universal replacement for every other solution, but rather as an additional powerful, streamlined, production-grade option for an increasingly pivotal challenge in AI development: the reliable orchestration of multi-provider agents. Don’t be fooled by its simplicity: under the hood are robust features that help you go from prototype to scalable business service.
Choosing the right framework always depends on the specific project context, the team’s skills, and business priorities.
If you are designing the next generation of agent-based AI applications and need a partner to navigate this technological complexity, Volcanic Minds is here to help you build solid, high-performance, future-proof solutions.
You can consult the following resources for further information:
Publication date: October 22, 2025
Last updated: October 22, 2025