
mcp-vertexai-search

MCP.Pizza Chef: ubie-oss

The mcp-vertexai-search is an MCP server that integrates Vertex AI Search with Gemini grounding to enable high-quality document search using private data stored in Vertex AI Datastore. It supports multiple data stores and improves search relevance by grounding responses in your own data. This server can be deployed via Docker or directly from source, facilitating seamless integration of Vertex AI's generative AI capabilities into MCP workflows.

Use This MCP Server To

  • Search private documents using Vertex AI with Gemini grounding
  • Integrate multiple Vertex AI data stores for unified search
  • Deploy the document search server via Docker for easy setup
  • Enhance search relevance by grounding results in private data
  • Enable real-time document retrieval in AI workflows
  • Combine generative AI with enterprise data for search
  • Use Vertex AI Search in MCP-enabled applications

README

MCP Server for Vertex AI Search

This is an MCP server that searches documents using Vertex AI.

Architecture

This solution uses Gemini with Vertex AI grounding to search documents in your private data. Grounding improves the quality of search results by anchoring Gemini's responses in documents stored in Vertex AI Datastore. The MCP server can integrate one or more Vertex AI data stores. For more details on grounding, refer to the Vertex AI Grounding documentation.


How to use

There are two ways to use this MCP server: clone the repository and run it from source, or install the Python package directly from GitHub. If you want to run it in Docker, the first approach is recommended, as a Dockerfile is provided in the repository.

1. Clone the repository

# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git

# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras

# Check the command
uv run mcp-vertexai-search

2. Install the Python package

The package isn't published to PyPI yet, but you can install it from the repository. You also need a config file derived from config.yml.template to run the MCP server, because the Python package doesn't include the config template. Refer to Appendix A: Config file for details.

# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git

# Check the command
mcp-vertexai-search --help

Development

Prerequisites

Set up Local Environment

# Optional: Install uv
python -m pip install -r requirements.setup.txt

# Create a virtual environment
uv venv
uv sync --all-extras

Run the MCP server

The server supports two transports: SSE (Server-Sent Events) and stdio (standard input/output). Select the transport with the --transport flag.

The MCP server is configured with a YAML file; config.yml.template is a template for it. Copy the template and modify it to fit your needs.

uv run mcp-vertexai-search serve \
    --config config.yml \
    --transport <stdio|sse>
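
With the stdio transport, an MCP client can launch the server as a subprocess. As an illustration only, a client that uses the common `mcpServers` JSON convention (e.g. Claude Desktop) could be configured roughly as follows; the server name, command path, and config file location here are assumptions, not values mandated by this project:

```json
{
  "mcpServers": {
    "vertexai-search": {
      "command": "uv",
      "args": [
        "run", "mcp-vertexai-search", "serve",
        "--config", "/path/to/config.yml",
        "--transport", "stdio"
      ]
    }
  }
}
```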

Test the Vertex AI Search

You can test Vertex AI Search directly, without starting the MCP server, using the mcp-vertexai-search search command.

uv run mcp-vertexai-search search \
    --config config.yml \
    --query <your-query>

Appendix A: Config file

config.yml.template is a template for the config file.

  • server
    • server.name: The name of the MCP server
  • model
    • model.model_name: The name of the Vertex AI model
    • model.project_id: The project ID of the Vertex AI model
    • model.location: The location of the model (e.g. us-central1)
    • model.impersonate_service_account: The service account to impersonate
    • model.generate_content_config: The configuration for the generate content API
  • data_stores: The list of Vertex AI data stores
    • data_stores.project_id: The project ID of the Vertex AI data store
    • data_stores.location: The location of the Vertex AI data store (e.g. us)
    • data_stores.datastore_id: The ID of the Vertex AI data store
    • data_stores.tool_name: The name of the tool
    • data_stores.description: The description of the Vertex AI data store
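
Putting the fields above together, a minimal config.yml might look like the following sketch. The project IDs, datastore ID, model name, and descriptions are placeholders; consult config.yml.template for the authoritative structure and defaults:

```yaml
server:
  name: mcp-vertexai-search

model:
  model_name: gemini-1.5-flash        # placeholder Vertex AI model name
  project_id: my-gcp-project          # placeholder GCP project
  location: us-central1
  impersonate_service_account: null   # optional service account to impersonate
  generate_content_config:
    temperature: 0.0

data_stores:
  - project_id: my-gcp-project        # placeholder GCP project
    location: us
    datastore_id: my-datastore-id     # placeholder Vertex AI datastore ID
    tool_name: search_internal_docs
    description: Searches our internal engineering documents
```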

mcp-vertexai-search FAQ

How do I deploy the mcp-vertexai-search server?
You can deploy it using the provided Dockerfile or by cloning the repository and installing dependencies manually.
What is Gemini grounding in this context?
Gemini grounding refers to enhancing search results by anchoring Gemini's generative responses in your private data stored in Vertex AI Datastore.
Can I connect multiple Vertex AI data stores to this server?
Yes, the server supports integration with one or multiple Vertex AI data stores for comprehensive search.
Is this server limited to any specific data formats?
The server is designed to work with data stored in Vertex AI Datastore, which supports various document types compatible with Vertex AI Search.
Does this MCP server support real-time search queries?
Yes, it enables real-time document retrieval by leveraging Vertex AI Search capabilities.
What are the prerequisites for running this server?
You need access to Vertex AI with configured data stores and a compatible environment for Docker or Python dependencies.
How does this server improve search quality?
By grounding Gemini's generative AI responses in your private data, it ensures more accurate and relevant search results.
Can this server be integrated with other MCP clients?
Yes, it is designed to work within the MCP ecosystem, allowing integration with various MCP clients and workflows.