
mcp-server-collector

Author: chatmcp

The mcp-server-collector is a specialized MCP Server designed to discover, extract, and collect MCP Servers from various internet sources. It provides tools to extract MCP Server information from URLs and raw content, and supports submitting these servers to centralized directories like mcp.so. This server facilitates aggregation and indexing of MCP Servers, making it easier to manage and explore the MCP ecosystem. It requires configuration with environment variables including OpenAI API keys and submission URLs, and supports integration with popular LLM providers such as OpenAI, Claude, and Gemini.

Use this MCP server to

  • Extract MCP Servers from web URLs for discovery
  • Parse raw content to find embedded MCP Servers
  • Submit discovered MCP Servers to public directories
  • Aggregate MCP Server data for ecosystem indexing
  • Automate MCP Server collection for research or monitoring

README

mcp-server-collector MCP server

An MCP Server used to collect MCP Servers from across the internet.

Components

Resources

No resources yet.

Prompts

No prompts yet.

Tools

The server implements three tools:

  • extract-mcp-servers-from-url: Extracts MCP Servers from a given URL.
    • Takes "url" as a required string argument
  • extract-mcp-servers-from-content: Extracts MCP Servers from given content.
    • Takes "content" as a required string argument
  • submit-mcp-server: Submits an MCP Server to an MCP Server directory such as mcp.so.
    • Takes "url" as a required string argument and "avatar_url" as an optional string argument
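The argument shapes above can be captured in a small validation sketch. This is illustrative, not part of the server itself: `TOOL_SCHEMAS` and `validate_tool_call` are hypothetical helpers that mirror the required/optional arguments listed for the three tools.

```python
# Hypothetical helper mirroring the three tools' argument schemas.
TOOL_SCHEMAS = {
    "extract-mcp-servers-from-url": {"required": ["url"], "optional": []},
    "extract-mcp-servers-from-content": {"required": ["content"], "optional": []},
    "submit-mcp-server": {"required": ["url"], "optional": ["avatar_url"]},
}

def validate_tool_call(tool: str, arguments: dict) -> None:
    """Raise ValueError if a call to one of the collector's tools is malformed."""
    schema = TOOL_SCHEMAS.get(tool)
    if schema is None:
        raise ValueError(f"unknown tool: {tool}")
    # All required arguments must be present...
    missing = [k for k in schema["required"] if k not in arguments]
    if missing:
        raise ValueError(f"missing required arguments: {missing}")
    # ...and no unknown arguments are accepted.
    allowed = set(schema["required"]) | set(schema["optional"])
    extra = [k for k in arguments if k not in allowed]
    if extra:
        raise ValueError(f"unexpected arguments: {extra}")
```

For example, `submit-mcp-server` accepts `{"url": ..., "avatar_url": ...}` but rejects a call with no `url`.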

Configuration

A .env file is required:

OPENAI_API_KEY="sk-xxx"
OPENAI_BASE_URL="https://api.openai.com/v1"
OPENAI_MODEL="gpt-4o-mini"

MCP_SERVER_SUBMIT_URL="https://mcp.so/api/submit-project"

Quickstart

Install

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration
```
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  },
  "mcp-server-collector": {
    "command": "uv",
    "args": [
      "--directory",
      "path-to/mcp-server-collector",
      "run",
      "mcp-server-collector"
    ],
    "env": {
      "OPENAI_API_KEY": "sk-xxx",
      "OPENAI_BASE_URL": "https://api.openai.com/v1",
      "OPENAI_MODEL": "gpt-4o-mini",
      "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
    }
  }
}
```
Published Servers Configuration
```
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  },
  "mcp-server-collector": {
    "command": "uvx",
    "args": ["mcp-server-collector"],
    "env": {
      "OPENAI_API_KEY": "sk-xxx",
      "OPENAI_BASE_URL": "https://api.openai.com/v1",
      "OPENAI_MODEL": "gpt-4o-mini",
      "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
    }
  }
}
```

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory path-to/mcp-server-collector run mcp-server-collector

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.


mcp-server-collector FAQ

How do I configure the mcp-server-collector?
You must set environment variables including OPENAI_API_KEY, OPENAI_BASE_URL, OPENAI_MODEL, and MCP_SERVER_SUBMIT_URL in a .env file.
What tools does the mcp-server-collector provide?
It offers tools to extract MCP Servers from URLs and content, and to submit MCP Servers to directories like mcp.so.
Can the mcp-server-collector submit MCP Servers automatically?
Yes, it can submit MCP Servers to configured directories using the submit-mcp-server tool with URL and optional avatar URL.
Is the mcp-server-collector compatible with multiple LLM providers?
Yes, it supports OpenAI, Claude, and Gemini models via configurable API endpoints.
Where can I find the MCP Server submission endpoint?
The submission URL is set via the MCP_SERVER_SUBMIT_URL environment variable, typically pointing to services like https://mcp.so/api/submit-project.
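For illustration, a submission to that endpoint could be assembled with the standard library as below. Note the JSON field names (`url`, `avatar_url`) are assumed to mirror the submit-mcp-server tool's arguments; the endpoint's actual payload schema is not documented here, so treat this as a sketch rather than the API contract.

```python
import json
from typing import Optional
from urllib.request import Request

def build_submit_request(submit_url: str, server_url: str,
                         avatar_url: Optional[str] = None) -> Request:
    """Build a POST request submitting an MCP Server to a directory.

    Assumption: the endpoint accepts JSON with "url" and optional "avatar_url".
    """
    payload = {"url": server_url}
    if avatar_url:
        payload["avatar_url"] = avatar_url
    return Request(
        submit_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The returned `Request` could then be sent with `urllib.request.urlopen`, or the same payload posted with any HTTP client.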
Does the mcp-server-collector require any special installation steps?
Installation involves setting up the .env file with required keys and optionally configuring for platforms like Claude Desktop.
What input formats are supported for extracting MCP Servers?
It supports extraction from both URLs and raw content strings.
Can I customize the extraction behavior?
Extraction tools take parameters like URL or content, allowing flexible input sources for MCP Server discovery.