This simple MCP client leverages the utility function `convertMcpToLangchainTools()` from `@h1deya/langchain-mcp-tools`. This function handles the parallel initialization of multiple specified MCP servers and converts their available tools into an array of LangChain-compatible tools (`StructuredTool[]`).

LLMs from Anthropic, OpenAI, and Groq are currently supported.
A Python version of this MCP client is available here.
- Node.js 16+
- npm 7+ (`npx`) to run Node.js-based MCP servers
- [optional] `uv` (`uvx`) installed to run Python-based MCP servers
- API keys from Anthropic, OpenAI, and/or Groq as needed
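A quick way to check the prerequisites from a terminal (the version numbers are only what this README requires):

```shell
node --version        # expect v16.0.0 or later
npm --version         # expect 7 or later
# Optional; only needed for Python-based MCP servers:
uvx --version || echo "uvx not found (optional)"
```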
- Install dependencies: `npm install`
- Set up API keys: `cp .env.template .env`
  - Update `.env` as needed. `.gitignore` is configured to ignore `.env` to prevent accidental commits of the credentials.
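A resulting `.env` might look like the following (the variable names here are assumptions based on the providers listed above; check `.env.template` for the actual names):

```
# .env -- never commit this file (already covered by .gitignore)
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GROQ_API_KEY=gsk_...
```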
- Configure the LLM and MCP server settings in `llm_mcp_config.json5` as needed.
  - The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name `mcpServers` has been changed to `mcp_servers` to follow the snake_case convention commonly used in JSON configuration files.
  - The file format is JSON5, where comments and trailing commas are allowed.
  - The format is further extended to replace `${...}` notations with the values of the corresponding environment variables.
  - Keep all the credentials and private info in the `.env` file and refer to them with the `${...}` notation as needed.
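Putting the points above together, a minimal `llm_mcp_config.json5` might look like this (the key names in the LLM section and the server entries are illustrative assumptions; see the bundled file for the authoritative format):

```json5
{
  // JSON5: comments and trailing commas are allowed
  llm: {
    provider: "anthropic",   // assumption: key names for the LLM section
    model: "claude-3-5-haiku-latest",
  },
  mcp_servers: {             // snake_case, unlike Claude for Desktop's "mcpServers"
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
    "some-server": {
      command: "uvx",
      args: ["some-mcp-server"],
      env: { API_KEY: "${API_KEY}" },  // expanded from the .env file
    },
  },
}
```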
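The `${...}` substitution described above behaves roughly like the following sketch (a hypothetical stand-in for illustration, not the app's actual implementation):

```typescript
// Replace "${NAME}" occurrences with values from an environment map,
// leaving unknown names untouched.
function expandEnvVars(
  text: string,
  env: Record<string, string | undefined>,
): string {
  return text.replace(/\$\{(\w+)\}/g, (match, name) => env[name] ?? match);
}

// For example, with API_KEY loaded from .env:
const raw = '{ api_key: "${API_KEY}" }';
console.log(expandEnvVars(raw, { API_KEY: "secret" }));
// → { api_key: "secret" }
```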
Run the app: `npm start`

Run in verbose mode: `npm run start:v`

See the command-line options: `npm run start:h`

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations. The example queries can be configured in `llm_mcp_config.json5`.