
py-mcp-line

MCP.Pizza Chef: amornpan

py-mcp-line is a Python-based MCP server that provides a standardized interface for Language Models to access and analyze LINE Bot messages. It supports asynchronous operation with asyncio, environment-based configuration, comprehensive logging, and FastAPI integration for API endpoints. This server handles LINE Bot webhook events and stores messages in JSON format, enabling real-time conversational data access for AI workflows.

Use this MCP server to

  • Access LINE Bot messages for real-time analysis
  • Store and retrieve LINE conversation data in JSON
  • Integrate LINE messaging data into AI workflows
  • Handle LINE webhook events asynchronously
  • Provide a standardized API for LINE message access
  • Enable LLMs to read and analyze LINE chats

README

Python LINE MCP Server


A Model Context Protocol server implementation in Python that provides access to LINE Bot messages. This server enables Language Models to read and analyze LINE conversations through a standardized interface.

Features

Core Functionality

  • Asynchronous operation using Python's asyncio
  • Environment-based configuration using python-dotenv
  • Comprehensive logging system
  • LINE Bot webhook event handling
  • Message storage in JSON format
  • FastAPI integration for API endpoints
  • Pydantic models for data validation
  • Support for text, sticker, and image messages
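The JSON message store in the feature list above can be sketched with nothing but the standard library. This is an illustrative approximation of how the server might append incoming events to its store, not the actual implementation; the function name `append_message` and the record fields are assumptions.

```python
import json
import tempfile
from pathlib import Path

def append_message(record: dict, path: Path) -> list:
    """Append one message record to a JSON store (illustrative sketch;
    the real server writes to the file named in MESSAGES_FILE)."""
    messages = json.loads(path.read_text()) if path.exists() else []
    messages.append(record)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(messages, ensure_ascii=False, indent=2))
    return messages

# Demo against a temporary file rather than data/messages.json
store = Path(tempfile.mkdtemp()) / "messages.json"
append_message({"type": "text", "user_id": "U123", "text": "hello"}, store)
print(json.loads(store.read_text())[0]["text"])  # hello
```

Storing the whole history as a single JSON array keeps reads simple for the MCP resource handlers, at the cost of rewriting the file on every append.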

Prerequisites

  • Python 3.8+
  • Required Python packages:
    • fastapi
    • pydantic
    • python-dotenv
    • mcp-server
    • line-bot-sdk
    • uvicorn

Installation

git clone https://github.com/amornpan/py-mcp-line.git
cd py-mcp-line
pip install -r requirements.txt

Project Structure

PY-MCP-LINE/
├── src/
│   └── line/
│       ├── __init__.py
│       └── server.py
├── data/
│   └── messages.json
├── tests/
│   ├── __init__.py
│   └── test_line.py
├── .env
├── .env.example
├── .gitignore
├── README.md
├── Dockerfile
└── requirements.txt

Directory Structure Explanation

  • src/line/ - Main source code directory
    • __init__.py - Package initialization
    • server.py - Main server implementation
  • data/ - Data storage directory
    • messages.json - Stored LINE messages
  • tests/ - Test files directory
    • __init__.py - Test package initialization
    • test_line.py - LINE functionality tests
  • .env - Environment configuration file (not in git)
  • .env.example - Example environment configuration
  • .gitignore - Git ignore rules
  • README.md - Project documentation
  • Dockerfile - Docker configuration
  • requirements.txt - Project dependencies

Configuration

Create a .env file in the project root:

LINE_CHANNEL_SECRET=your_channel_secret
LINE_ACCESS_TOKEN=your_access_token
SERVER_PORT=8000
MESSAGES_FILE=data/messages.json
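A minimal sketch of how these variables might be read at startup. In the server, python-dotenv's `load_dotenv()` would first populate `os.environ` from the `.env` file; the variable names and defaults below follow the example configuration above, while the `load_config` helper itself is hypothetical.

```python
import os

def load_config() -> dict:
    """Read the server's settings from the environment (illustrative sketch)."""
    return {
        "channel_secret": os.getenv("LINE_CHANNEL_SECRET", ""),
        "access_token": os.getenv("LINE_ACCESS_TOKEN", ""),
        "port": int(os.getenv("SERVER_PORT", "8000")),
        "messages_file": os.getenv("MESSAGES_FILE", "data/messages.json"),
    }

os.environ.setdefault("SERVER_PORT", "8000")
print(load_config()["port"])  # 8000
```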

API Implementation Details

Resource Listing

@app.list_resources()
async def list_resources() -> list[Resource]
  • Lists available message types from the LINE Bot
  • Returns resources with URIs in the format line://<message_type>/data
  • Includes resource descriptions and MIME types
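The listing behavior described above might look roughly like this. The sketch stands in for the real handler: in the server, `app` is an MCP `Server` instance and each entry would be an `mcp.types.Resource`; here plain dictionaries are used so the example is self-contained, and the message types are the three the README mentions.

```python
MESSAGE_TYPES = ["text", "sticker", "image"]  # types listed under Features

def list_resources() -> list:
    """Advertise one line://<message_type>/data resource per message type."""
    return [
        {
            "uri": f"line://{message_type}/data",
            "name": f"LINE {message_type} messages",
            "description": f"Stored {message_type} messages from the LINE Bot",
            "mimeType": "application/json",
        }
        for message_type in MESSAGE_TYPES
    ]

for resource in list_resources():
    print(resource["uri"])
```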

Resource Reading

@app.read_resource()
async def read_resource(uri: AnyUrl) -> str
  • Reads messages of the specified type
  • Accepts URIs in the format line://<message_type>/data
  • Returns messages in JSON format
  • Supports filtering by date, user, or content
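The URI parsing and type-based lookup can be sketched as follows. This is an assumption-laden stand-in for the real handler: it uses a small in-memory list instead of `data/messages.json`, and omits the date/user/content filtering the server supports.

```python
import json
from urllib.parse import urlparse

# Hypothetical in-memory store standing in for data/messages.json
MESSAGES = [
    {"type": "text", "user_id": "U1", "text": "hi"},
    {"type": "sticker", "user_id": "U2", "package_id": "1"},
]

def read_resource(uri: str) -> str:
    """Return messages of the type encoded in a line://<message_type>/data URI."""
    parsed = urlparse(uri)
    if parsed.scheme != "line" or parsed.path != "/data":
        raise ValueError(f"Invalid resource URI: {uri}")
    message_type = parsed.netloc  # the <message_type> segment of the URI
    matching = [m for m in MESSAGES if m["type"] == message_type]
    return json.dumps(matching)

print(read_resource("line://text/data"))
```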

Usage with Claude Desktop

Add to your Claude Desktop configuration:

On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

{
  "mcpServers": {
    "line": {
      "command": "python",
      "args": [
        "server.py"
      ],
      "env": {
        "LINE_CHANNEL_SECRET": "your_channel_secret",
        "LINE_ACCESS_TOKEN": "your_access_token",
        "SERVER_PORT": "8000",
        "MESSAGES_FILE": "data/messages.json"
      }
    }
  }
}

Error Handling

The server implements comprehensive error handling for:

  • Webhook validation failures
  • Message storage errors
  • Resource access errors
  • URI validation
  • LINE API response errors

All errors are logged and returned with appropriate error messages.

Security Features

  • Environment variable based configuration
  • LINE message signature validation
  • Proper error handling
  • Input validation through Pydantic
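LINE's webhook signature scheme is documented by LINE: the platform signs the raw request body with HMAC-SHA256 keyed by the channel secret, base64-encodes the digest, and sends it in the `X-Line-Signature` header. The check below follows that scheme; the function name `valid_signature` is illustrative (in practice `line-bot-sdk`'s `WebhookHandler` performs this for you).

```python
import base64
import hashlib
import hmac

def valid_signature(channel_secret: str, body: bytes, signature: str) -> bool:
    """Verify an X-Line-Signature header against the raw request body."""
    digest = hmac.new(channel_secret.encode("utf-8"), body, hashlib.sha256).digest()
    expected = base64.b64encode(digest).decode("utf-8")
    # Constant-time comparison avoids timing side channels
    return hmac.compare_digest(expected, signature)

body = b'{"events": []}'
secret = "test_secret"
good_sig = base64.b64encode(
    hmac.new(secret.encode(), body, hashlib.sha256).digest()
).decode()
print(valid_signature(secret, body, good_sig))  # True
```

Rejecting requests that fail this check ensures the server only stores events that genuinely came from the LINE platform.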

Contact Information

Amornpan Phornchaicharoen

Email LinkedIn HuggingFace GitHub

Feel free to reach out to me if you have any questions about this project or would like to collaborate!


Made with ❤️ by Amornpan Phornchaicharoen

Author

Amornpan Phornchaicharoen

Requirements

Create a requirements.txt file with:

fastapi>=0.104.1
pydantic>=2.10.6
uvicorn>=0.34.0 
python-dotenv>=1.0.1
line-bot-sdk>=3.5.0
anyio>=4.5.0
mcp==1.2.0

These versions have been tested and verified to work together. The key components are:

  • fastapi and uvicorn for the API server
  • pydantic for data validation
  • line-bot-sdk for LINE Bot integration
  • mcp for Model Context Protocol implementation
  • python-dotenv for environment configuration
  • anyio for asynchronous I/O support

Acknowledgments

  • LINE Developers for the LINE Messaging API
  • Model Context Protocol community
  • Python FastAPI community
  • Contributors to the python-dotenv project

py-mcp-line FAQ

How does py-mcp-line handle LINE Bot webhook events?
It processes webhook events asynchronously using Python's asyncio and FastAPI for efficient event handling.
What configuration method does py-mcp-line use?
It uses environment-based configuration via python-dotenv for flexible setup.
How are LINE messages stored in py-mcp-line?
Messages are stored in JSON format for easy access and processing by LLMs.
Which Python framework does py-mcp-line use for its API?
It uses FastAPI to provide RESTful API endpoints for LINE message access.
Is py-mcp-line suitable for real-time conversational AI applications?
Yes, its asynchronous design and webhook handling make it ideal for real-time LINE message analysis.
Can py-mcp-line integrate with multiple LLM providers?
Yes. MCP is a provider-agnostic protocol, so any client that implements MCP can use this server regardless of the underlying model.
What logging capabilities does py-mcp-line offer?
It includes a comprehensive logging system to monitor server operations and message processing.