
MCP Server for Vertex AI Search

by ubie-oss

This is an MCP server that enables searching documents with Vertex AI Search. It uses Gemini with Vertex AI grounding to improve search result quality by anchoring responses in your private data stored in a Vertex AI Datastore.



What is MCP Server for Vertex AI Search?

This MCP server allows you to search documents using Google's Vertex AI, leveraging Gemini with Vertex AI grounding. It integrates with one or more Vertex AI data stores to provide search results grounded in your private data.

How to use MCP Server for Vertex AI Search?

You can use this server either by running it in Docker or by installing the Python package. Both methods require configuring the server with a YAML file based on the provided template. The server supports SSE and stdio transports, selected with the --transport flag.
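
For example, a minimal sketch of the installation options; the pip package name below is an assumption based on the CLI name (mcp-vertexai-search), so verify it against the repository before installing:

    # Option 1: install the Python package
    # (assumes a release is published under the same name as the CLI)
    pip install mcp-vertexai-search

    # Option 2: run in Docker (see the sketch after the feature list below)

    # In both cases, create a YAML config from config.yml.template before
    # starting the server with the serve command described in the FAQ.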

Key features of MCP Server for Vertex AI Search

  • Uses Gemini with Vertex AI grounding for enhanced search quality

  • Integrates with multiple Vertex AI data stores

  • Supports SSE and stdio transports

  • Configurable via YAML file

  • Provides a search command for direct testing

  • Can be deployed using Docker (see the sketch after this list)
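
As a rough sketch of a Docker-based deployment, assuming the repository's Dockerfile uses the mcp-vertexai-search CLI as its entrypoint; the image name, container mount path, and published port below are illustrative assumptions, not project defaults:

    # Build an image from the repository checkout (image name is illustrative)
    docker build -t mcp-vertexai-search .

    # Run the SSE transport with your config mounted into the container
    # (mount path and port are placeholders; adjust to the actual Dockerfile)
    docker run --rm -p 8080:8080 \
        -v "$(pwd)/config.yml:/config.yml" \
        mcp-vertexai-search serve --config /config.yml --transport sse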

Use cases of MCP Server for Vertex AI Search

  • Searching internal documentation

  • Knowledge base retrieval

  • Finding relevant information in large datasets

  • Building AI-powered search applications

  • Improving the accuracy of chatbot responses

FAQ from MCP Server for Vertex AI Search

What is Vertex AI Grounding?

Grounding anchors Gemini's responses in your own data stored in a Vertex AI Datastore, which improves the quality and relevance of search results.

How do I configure the server?

You need to create a config file derived from config.yml.template and modify it to fit your needs.
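
For example (a minimal sketch; the exact keys, such as the Google Cloud project, Gemini model, and data store IDs, are defined in config.yml.template itself):

    # Copy the template and edit it for your environment
    cp config.yml.template config.yml
    # Fill in at least your Google Cloud project, the Gemini model to use,
    # and the Vertex AI data store(s) the server should ground against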

What are the prerequisites for development?

You need uv and a Vertex AI data store. Refer to the official documentation for more information on data stores.
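
A typical setup might look like the following sketch; the clone URL and directory name are placeholders, and authenticating via Application Default Credentials is an assumption rather than a documented requirement:

    # Clone the repository and install its dependencies with uv
    git clone <repository-url>
    cd mcp-vertexai-search
    uv sync

    # Authenticate to Google Cloud so the server can call Vertex AI
    gcloud auth application-default login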

How do I run the MCP server?

Use the command uv run mcp-vertexai-search serve --config config.yml --transport <stdio|sse> after setting up your environment and configuring the server.
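
For example, the two transports simply fill in the placeholder above; which one you choose depends on how your MCP client connects:

    # stdio: typically used when an MCP client launches the server as a subprocess
    uv run mcp-vertexai-search serve --config config.yml --transport stdio

    # sse: exposes the server over HTTP for clients that connect to it remotely
    uv run mcp-vertexai-search serve --config config.yml --transport sse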

How do I test the Vertex AI Search?

You can use the mcp-vertexai-search search command without running the full MCP server: uv run mcp-vertexai-search search --config config.yml --query <your-query>
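
For example, with a hypothetical query:

    # Query the configured data store(s) directly, without starting the MCP server
    uv run mcp-vertexai-search search --config config.yml --query "What is our incident response process?"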