mcp-use

MCP.Pizza Chef: zandko

mcp-use is a TypeScript client library that connects LangChain.js-compatible large language models (LLMs) to MCP servers. It helps developers build powerful, flexible AI agents with dynamic tool access and multi-server support, enabling workflows such as web browsing, file operations, and 3D modeling without relying on closed-source dependencies.

Use This MCP Client To

  • Connect LangChain.js LLMs to multiple MCP servers dynamically
  • Build AI agents with real-time tool access and multi-server support
  • Integrate web browsing capabilities into AI workflows
  • Enable file operations through LLM-driven commands
  • Create custom AI workflows combining diverse MCP servers
  • Develop open-source AI agents without vendor lock-in
  • Facilitate multi-step reasoning across different MCP servers

README

Unified MCP Client Library


🌐 MCP Client is the open-source way to connect any LLM to any MCP server in TypeScript/Node.js, letting you build custom agents with tool access without closed-source dependencies.

💡 Lets developers easily connect any LLM via LangChain.js to tools like web browsing, file operations, 3D modeling, and more.


✨ Key Features

| Feature | Description |
| --- | --- |
| 🔄 Ease of use | Create an MCP-capable agent in just a few lines of TypeScript. |
| 🤖 LLM Flexibility | Works with any LangChain.js-supported LLM that supports tool calling. |
| 🌐 HTTP Support | Direct SSE/HTTP connection to MCP servers. |
| ⚙️ Dynamic Server Selection | Agents select the right MCP server from a pool on the fly. |
| 🧩 Multi-Server Support | Use multiple MCP servers in one agent. |
| 🛡️ Tool Restrictions | Restrict potentially unsafe tools such as filesystem or network access. |
| 🔧 Custom Agents | Build your own agents with the LangChain.js adapter or implement new adapters. |

🚀 Quick Start

Requirements

  • Node.js 22.0.0 or higher
  • npm, yarn, or pnpm (examples use npm)

Installation

# Install from npm
npm install mcp-use
# LangChain.js and your LLM provider (e.g., OpenAI)
npm install langchain @langchain/openai dotenv

Create a .env:

OPENAI_API_KEY=your_api_key
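
Before constructing the LLM, it helps to fail fast when the key is missing. A minimal sketch, assuming a loaded .env; `requireEnv` is a hypothetical helper, not part of mcp-use:

```typescript
// Hypothetical helper: read a required environment variable or fail loudly.
function requireEnv(name: string): string {
  const value = process.env[name]
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
  return value
}

// Simulate a loaded .env for this sketch.
process.env.OPENAI_API_KEY = 'sk-test'
console.log(requireEnv('OPENAI_API_KEY')) // → sk-test
```

Failing at startup with a clear message is easier to debug than an authentication error surfacing mid-run inside the agent.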

Basic Usage

import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config'

async function main() {
  // 1. Configure MCP servers
  const config = {
    mcpServers: {
      playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
    }
  }
  const client = MCPClient.fromDict(config)

  // 2. Create LLM
  const llm = new ChatOpenAI({ modelName: 'gpt-4o' })

  // 3. Instantiate agent
  const agent = new MCPAgent({ llm, client, maxSteps: 20 })

  // 4. Run query
  const result = await agent.run('Find the best restaurant in Tokyo using Google Search')
  console.log('Result:', result)
}

main().catch(console.error)

📂 Configuration File

You can store servers in a JSON file:

{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}

Load it:

import { MCPClient } from 'mcp-use'
const client = MCPClient.fromConfigFile('./mcp-config.json')
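
For reference, the config shape shown above can be described with illustrative TypeScript interfaces. The names below are assumptions made for documentation purposes, not the library's exported types:

```typescript
// Illustrative types for the JSON config shape (interface names are
// assumptions, not mcp-use's official type definitions).
interface StdioServerConfig {
  command: string
  args?: string[]
}

interface MCPConfig {
  mcpServers: Record<string, StdioServerConfig>
}

// The same configuration as mcp-config.json, expressed as a typed object:
const config: MCPConfig = {
  mcpServers: {
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
  }
}

console.log(Object.keys(config.mcpServers)) // → [ 'playwright' ]
```

Each key under `mcpServers` names one server; the value tells the client how to spawn it over stdio.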

🔄 Multi-Server Example

const config = {
  mcpServers: {
    airbnb: { command: 'npx', args: ['@openbnb/mcp-server-airbnb'] },
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
  }
}
const client = MCPClient.fromDict(config)
const agent = new MCPAgent({ llm, client, useServerManager: true })
await agent.run('Search Airbnb in Barcelona, then Google restaurants nearby')

🔒 Tool Access Control

const agent = new MCPAgent({
  llm,
  client,
  disallowedTools: ['file_system', 'network']
})
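
Conceptually, `disallowedTools` acts as a blocklist applied to the tool names discovered from the connected servers. A self-contained sketch of that filtering idea (not mcp-use internals; the tool names below are made up for illustration):

```typescript
// Sketch of blocklist filtering, illustrating the disallowedTools concept.
function filterTools(toolNames: string[], disallowed: string[]): string[] {
  const blocked = new Set(disallowed)
  return toolNames.filter((name) => !blocked.has(name))
}

const available = filterTools(
  ['browser_navigate', 'file_system', 'network', 'screenshot'],
  ['file_system', 'network']
)
console.log(available) // → [ 'browser_navigate', 'screenshot' ]
```

Tools removed this way are never exposed to the LLM, so the agent cannot call them even if the underlying server provides them.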

👥 Contributors

Zane

📜 License

MIT © Zane

mcp-use FAQ

How does mcp-use integrate with LangChain.js?
mcp-use acts as a client library that connects LangChain.js-compatible LLMs to MCP servers, enabling dynamic tool access and multi-server support in TypeScript/Node.js environments.
Can mcp-use connect to multiple MCP servers simultaneously?
Yes, mcp-use supports multi-server connections, allowing AI agents to access various tools and data sources concurrently.
Is mcp-use limited to specific LLM providers?
No, mcp-use is provider-agnostic and can connect any LangChain.js-compatible LLM, including models from providers such as OpenAI, Anthropic (Claude), and Google (Gemini).
Does mcp-use require closed-source dependencies?
No, mcp-use is fully open-source, ensuring transparency and flexibility for developers.
What programming languages and environments does mcp-use support?
mcp-use is built for TypeScript and Node.js environments, making it ideal for modern JavaScript-based AI development.
How does mcp-use handle tool access for AI agents?
It provides dynamic tool access by connecting LLMs to MCP servers exposing various functionalities like web browsing, file operations, and 3D modeling.
Can mcp-use be used to build custom AI agents?
Yes, it is designed to help developers build powerful, flexible AI agents with tailored toolsets and multi-server capabilities.
Where can I find the source code and license for mcp-use?
The source code and license are available on GitHub at https://github.com/zandko/mcp-use under an open-source license.