wolframalpha-llm-mcp

MCP.Pizza Chef: Garoth

The wolframalpha-llm-mcp is an MCP server that integrates WolframAlpha's LLM API, enabling natural language queries for complex math, science, history, and geography. It returns structured, LLM-optimized responses with options for simplified or detailed answers, facilitating advanced knowledge retrieval and computation within AI workflows.

Use This MCP Server To

  • Query WolframAlpha for complex mathematical problem solutions
  • Retrieve structured scientific and historical facts for AI models
  • Generate detailed or simplified answers for knowledge-based queries
  • Validate WolframAlpha API keys within MCP workflows
  • Integrate WolframAlpha knowledge into AI-powered applications
  • Enhance LLM responses with precise computational data
  • Support multi-domain question answering in AI assistants

README

WolframAlpha LLM MCP Server


A Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API. https://products.wolframalpha.com/llm-api/documentation


Features

  • Query WolframAlpha's LLM API with natural language questions
  • Answer complicated mathematical questions
  • Query facts about science, physics, history, geography, and more
  • Get structured responses optimized for LLM consumption
  • Support for simplified answers and detailed responses with sections
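Because the responses are meant for direct LLM consumption, it can help to see the shape of the underlying API call. The sketch below shows how a tool like ask_llm might build its request to WolframAlpha's LLM API; the endpoint comes from the documentation linked above, while the helper name and options object are illustrative assumptions, not code from this repository.

```javascript
// Illustrative sketch: construct a WolframAlpha LLM API request URL.
// The endpoint is from the official LLM API documentation; the function
// name buildLlmApiUrl and its options are hypothetical, not from this repo.
function buildLlmApiUrl(appId, input, { maxchars } = {}) {
  const url = new URL("https://www.wolframalpha.com/api/v1/llm-api");
  url.searchParams.set("appid", appId);   // your WOLFRAM_LLM_APP_ID
  url.searchParams.set("input", input);   // natural-language question
  if (maxchars) url.searchParams.set("maxchars", String(maxchars));
  return url.toString();
}

console.log(buildLlmApiUrl("DEMO-APPID", "integrate x^2", { maxchars: 500 }));
```

The server would then fetch this URL and return the plain-text body, which WolframAlpha already formats for LLM consumption.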

Available Tools

  • ask_llm: Ask WolframAlpha a question and get a structured, LLM-friendly response
  • get_simple_answer: Get a simplified answer
  • validate_key: Validate the WolframAlpha API key
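Over MCP, a client invokes these tools with a standard tools/call request. As a sketch, a call to ask_llm might look like the following on the wire; the "query" argument name is an assumption, so check the tool's actual schema via tools/list:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_llm",
    "arguments": { "query": "What is the integral of x^2?" }
  }
}
```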

Installation

git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
cd wolframalpha-llm-mcp
npm install

Configuration

  1. Get your WolframAlpha API key from developer.wolframalpha.com

  2. Add it to your Cline MCP settings file in VS Code (e.g. ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):

{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-mcp-server/build/index.js"],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": [
        "ask_llm",
        "get_simple_answer",
        "validate_key"
      ]
    }
  }
}
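Before wiring the server into Cline, you can smoke-test it from a terminal. This sketch reuses the placeholder path and key from the config above; assuming a stdio-based MCP server, it will start and wait for JSON-RPC requests on stdin:

```sh
WOLFRAM_LLM_APP_ID=your-api-key-here node /path/to/wolframalpha-mcp-server/build/index.js
```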

Development

Setting Up Tests

The tests use real API calls to ensure accurate responses. To run the tests:

  1. Copy the example environment file:

    cp .env.example .env
  2. Edit .env and add your WolframAlpha API key:

    WOLFRAM_LLM_APP_ID=your-api-key-here
    

    Note: The .env file is gitignored to prevent committing sensitive information.

  3. Run the tests:

    npm test

Building

npm run build

License

MIT

wolframalpha-llm-mcp FAQ

How do I authenticate the WolframAlpha LLM MCP server?
You authenticate by providing a valid WolframAlpha API key, which can be validated using the built-in validate_key tool.
Can this MCP server handle complex mathematical queries?
Yes, it is designed to solve complicated math problems using WolframAlpha's computational engine.
What types of responses does the server provide?
It offers structured, LLM-friendly responses with options for simplified or detailed answers including sections.
Is the server limited to math queries only?
No, it supports a wide range of topics including science, physics, history, and geography.
How does this server improve LLM interactions?
By providing precise, structured knowledge and computations that LLMs can consume directly for better accuracy.
Can I use this server with multiple LLM providers?
Yes, it is provider-agnostic and can be integrated with models like OpenAI, Claude, and Gemini.
What tools are available in this MCP server?
Tools include ask_llm for queries, get_simple_answer for simplified responses, and validate_key for API key validation.
How do I get started with this MCP server?
Refer to the GitHub documentation for setup instructions and API usage examples.