This simple MCP client demonstrates MCP server tool invocations via a LangChain agent.
It leverages the utility function `convertMcpToLangchainTools()` from
`@h1deya/langchain-mcp-tools`.
This function handles the parallel initialization of multiple specified MCP servers
and converts their available tools into an array of LangChain-compatible tools
(`StructuredTool[]`).
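Based on the description above, typical usage looks roughly like the following sketch. The server definition shown is an illustrative example, and the exact return shape of `convertMcpToLangchainTools()` should be checked against the package's documentation:

```typescript
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

async function main(): Promise<void> {
  // Illustrative MCP server definition (any MCP server works here).
  const mcpServers = {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
  };

  // Initializes the listed servers in parallel and converts their tools
  // into LangChain-compatible StructuredTool instances.
  const { tools, cleanup } = await convertMcpToLangchainTools(mcpServers);

  // `tools` can now be handed to a LangChain agent.
  console.log(tools.map((t) => t.name));

  await cleanup(); // shut the MCP servers down when done
}

main().catch(console.error);
```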
LLMs from Anthropic, OpenAI and Groq are currently supported.
A Python version of this MCP client is available here.
- Node.js 16+
- npm 7+ (`npx`) to run Node.js-based MCP servers
- [optional] `uv` (`uvx`) installed to run Python-based MCP servers
- API keys from Anthropic, OpenAI, and/or Groq as needed
- Install dependencies:

  `npm install`

- Set up API keys:

  `cp .env.template .env`

  - Update `.env` as needed.
  - `.gitignore` is configured to ignore `.env` to prevent accidental commits of the credentials.
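For reference, `.env` would then contain entries like the following. The variable names shown are the conventional ones for these providers' SDKs (include only the ones you need); verify them against `.env.template`:

```
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GROQ_API_KEY=gsk_...
```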
- Configure the LLM and MCP server settings:

  - Update `llm_mcp_config.json5` as needed.
  - The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name `mcpServers` has been changed to `mcp_servers` to follow the snake_case convention commonly used in JSON configuration files.
  - The file format is JSON5, where comments and trailing commas are allowed.
  - The format is further extended to replace `${...}` notations with the values of the corresponding environment variables.
  - Keep all credentials and private information in the `.env` file and refer to them with the `${...}` notation as needed.
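Putting the points above together, the MCP-servers part of the config might look like the following sketch. The server names and packages are illustrative examples, not part of this project:

```json5
{
  // Comments and trailing commas are allowed (JSON5).
  mcp_servers: { // snake_case, unlike Claude for Desktop's "mcpServers"
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
    // ${...} is replaced with the value of the environment variable,
    // so the actual credential can stay in the .env file.
    "example-server": {
      command: "npx",
      args: ["-y", "example-mcp-server"],
      env: { API_KEY: "${EXAMPLE_API_KEY}" },
    },
  },
}
```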
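The `${...}` substitution described above can be sketched as a small helper. This is an illustrative reimplementation, not this project's actual code, and `expandEnvVars` is a hypothetical name:

```typescript
// Hypothetical sketch of the ${...} expansion step described above.
// The real config loader may differ; this only illustrates the idea.
function expandEnvVars(
  text: string,
  env: Record<string, string | undefined>
): string {
  // Replace every ${NAME} with env[NAME], leaving unknown names empty.
  return text.replace(/\$\{([^}]+)\}/g, (_match, name) => env[name] ?? "");
}

// Example: a config value referring to a credential kept in .env
console.log(expandEnvVars("Bearer ${API_KEY}", { API_KEY: "secret" }));
// prints "Bearer secret"
```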
Run the app:

`npm start`

Run in verbose mode:

`npm run start:v`

See command-line options:

`npm run start:h`

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in `llm_mcp_config.json5`.
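For instance, example queries might be listed in the config along these lines. The `example_queries` key name and the query strings are assumptions based on the option's purpose; check the shipped `llm_mcp_config.json5` for the exact spelling:

```json5
{
  // Queries offered when you just press Enter at the prompt
  // (key name assumed; see the shipped config for the exact key).
  example_queries: [
    "Tell me about the files in the current directory",
    "Summarize the latest news headlines",
  ],
}
```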