
openapi_mcp_server

Author: rahgadda

OpenAPI MCP Server is a lightweight MCP server that exposes configured REST APIs as structured context to large language models (LLMs). It enables LLMs to perform HTTP API calls (GET, PUT, POST, PATCH) dynamically via natural language prompts, bridging LLMs with external RESTful services. This server simplifies integrating real-time API data and actions into AI workflows, supporting rapid deployment with minimal configuration.

Use This MCP Server To

  • Enable LLMs to call REST APIs via natural language prompts
  • Provide real-time REST API data as context to LLMs
  • Automate API interactions within AI-enhanced workflows
  • Test and debug REST API calls through LLM interfaces
  • Integrate external HTTP services into MCP-enabled applications


OpenAPI MCP Server


Overview

  • This project installs an MCP (Model Context Protocol) server that provides configured REST APIs as context to LLMs.
  • It enables LLMs to interact with REST APIs and perform REST API calls through LLM prompts.
  • The HTTP methods GET, PUT, POST, and PATCH are currently supported.

Installation

  • Install package
    pip install openapi_mcp_server
  • Create a .env file in a folder with, at minimum, values for OPENAPI_SPEC_PATH and API_BASE_URL. A sample file is available here
  • Test the openapi_mcp_server server by running uv run openapi_mcp_server from the above folder.
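A minimal .env might look like the following, using the Pet Store values from the Claude Desktop example below (only these two variables are required):

```shell
# .env — minimal configuration; only the two required variables
OPENAPI_SPEC_PATH=https://petstore.swagger.io/v2/swagger.json
API_BASE_URL=https://petstore.swagger.io/v2
```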

Claude Desktop

  • Configuration details for Claude Desktop
    {
      "mcpServers": {
        "openapi_mcp_server":{
          "command": "uv",
          "args": ["run","openapi_mcp_server"],
          "env": {
              "DEBUG":"1",
              "API_BASE_URL":"https://petstore.swagger.io/v2",
              "OPENAPI_SPEC_PATH":"https://petstore.swagger.io/v2/swagger.json",
              "API_HEADERS":"Accept:application/json",
              "API_WHITE_LIST":"addPet,updatePet,findPetsByStatus"
          }
        }
      }
    }
    Pet Store Demo

Configuration

  • List of available environment variables
    • DEBUG: Enable debug logging (optional; default is False)
    • OPENAPI_SPEC_PATH: Path or URL of the OpenAPI document (required)
    • API_BASE_URL: Base URL for API requests (required)
    • API_HEADERS: Headers to include in API requests (optional)
    • API_WHITE_LIST: Whitelisted operationIds in list format ["operationId1", "operationId2"] (optional)
    • API_BLACK_LIST: Blacklisted operationIds in list format ["operationId3", "operationId4"] (optional)
    • HTTP_PROXY: HTTP Proxy details (optional)
    • HTTPS_PROXY: HTTPS Proxy details (optional)
    • NO_PROXY: No Proxy details (optional)
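The server's actual parsing logic is not published here; as a rough sketch (function name and parsing details are hypothetical, not taken from the project), the variables above might be validated and normalized like this:

```python
def load_config(env):
    """Hypothetical sketch: validate and normalize the server's env settings."""
    # The two required settings must be present and non-empty.
    required = ("OPENAPI_SPEC_PATH", "API_BASE_URL")
    missing = [k for k in required if not env.get(k)]
    if missing:
        raise ValueError(f"Missing required settings: {', '.join(missing)}")

    # Parse "Name:Value" header pairs (comma-separated is an assumption).
    headers = {}
    for pair in env.get("API_HEADERS", "").split(","):
        if ":" in pair:
            name, value = pair.split(":", 1)
            headers[name.strip()] = value.strip()

    return {
        "debug": env.get("DEBUG", "").lower() in ("1", "true"),
        "spec_path": env["OPENAPI_SPEC_PATH"],
        "base_url": env["API_BASE_URL"].rstrip("/"),
        "headers": headers,
        "whitelist": [s for s in env.get("API_WHITE_LIST", "").split(",") if s],
    }

# Typical use: config = load_config(dict(os.environ))
```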

Contributing

Contributions are welcome.
Please feel free to submit a Pull Request.

License

This project is licensed under the terms of the MIT license.

Github Stars

Star History Chart

Appendix

UV

mkdir -m777 openapi_mcp_server
cd openapi_mcp_server
uv init
uv add "mcp[cli]" pydantic python-dotenv requests
uv add --dev twine setuptools
uv sync
uv run openapi_mcp_server
uv build
pip install --force-reinstall --no-deps .\dist\openapi_mcp_server-*fileversion*.whl
export TWINE_USERNAME="rahgadda"
export TWINE_PASSWORD="<<API Key>>"
uv run twine upload --verbose dist/*

Reference

openapi_mcp_server FAQ

How do I install the OpenAPI MCP Server?
Install via pip using 'pip install openapi_mcp_server' and configure environment variables.
What HTTP methods does the server support?
It supports GET, PUT, POST, and PATCH HTTP methods for API calls.
How do I configure the server to connect to my API?
Set 'OPENAPI_SPEC_PATH' and 'API_BASE_URL' in a .env file for your API specification and base URL.
Can this server be used with different LLM providers?
Yes, it is provider-agnostic and works with OpenAI, Claude, Gemini, and others.
How do I run the OpenAPI MCP Server?
Use the command 'uv run openapi_mcp_server' from the folder containing your .env configuration.
Is the server limited to any specific API specification format?
It primarily uses OpenAPI specifications to configure REST API endpoints.
Can I extend the server to support additional HTTP methods?
Currently supports core methods; extending requires modifying the server code.
How does the server handle authentication for APIs?
Authentication details can be included in the API spec or environment configuration as needed.
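For example, a bearer token could be supplied through the generic API_HEADERS variable in the .env file (the token value below is a placeholder, and this usage is an assumption rather than documented behavior):

```shell
# .env — hypothetical example: pass an Authorization header to every request
API_HEADERS=Authorization:Bearer <your-api-token>
```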