MCP REST API and CLI Client

by rakesh-eltropy

A simple REST API and CLI client to interact with Model Context Protocol (MCP) servers. It supports multiple MCP-compatible servers and integrates with LangChain to execute LLM prompts.

What is MCP REST API and CLI Client?

This is a REST API and CLI client designed to interact with Model Context Protocol (MCP) servers, enabling users to query and retrieve information across various data sources using Large Language Models (LLMs). It leverages LangChain for prompt execution and supports multiple LLM providers.

How to use MCP REST API and CLI Client?

  1. Clone the repository.
  2. Navigate to the project directory.
  3. Set the required API keys (OPENAI_API_KEY, BRAVE_API_KEY) as environment variables or in the mcp-server-config.json file.
  4. Run the CLI with uv run cli.py, or start the REST API with uvicorn app:app --reload (see the example commands below).
  5. Use the help command in the CLI to explore the available commands, or use a curl request to chat with the LLM through the REST API.
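A minimal shell sketch of these steps, assuming a local checkout (the repository URL below is a placeholder) and placeholder API keys:

    # Clone the repository and enter the project directory
    # (placeholder URL -- replace with the actual GitHub repository URL)
    git clone https://github.com/rakesh-eltropy/mcp-client.git
    cd mcp-client

    # Provide the required API keys as environment variables
    export OPENAI_API_KEY="sk-your-openai-key"
    export BRAVE_API_KEY="your-brave-api-key"

    # Start the interactive CLI
    uv run cli.py

    # ...or start the REST API with auto-reload
    uvicorn app:app --reload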

Key features of MCP REST API and CLI Client

  • MCP-Compatible Servers

  • Integrated with LangChain

  • LLM Provider Support

  • Pre-configured default servers (SQLite, Brave Search)

Use cases of MCP REST API and CLI Client

  • Querying local databases using LLMs

  • Searching the web and integrating results with local data

  • Chatting with LLMs using data from multiple MCP servers

  • Accessing information from different AI models and information sources simultaneously

  • Orchestrating multi-agent collaborations driven by LLMs for complex tasks

FAQ from MCP REST API and CLI Client

Which MCP-compatible servers are supported?

The client supports any MCP-compatible servers and comes pre-configured with SQLite and Brave Search. Additional servers can be added in the mcp-server-config.json file.
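For illustration, a server entry in mcp-server-config.json might look like the sketch below. It follows the common MCP server configuration convention (a command and arguments per server); the exact keys expected by this project's config file may differ, and the database path and API key shown are placeholders:

    {
      "mcpServers": {
        "sqlite": {
          "command": "uvx",
          "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
        },
        "brave-search": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-brave-search"],
          "env": { "BRAVE_API_KEY": "your-brave-api-key" }
        }
      }
    }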

Which LLM providers can I use?

The client is compatible with any LLM provider whose API supports function calling, such as OpenAI, Claude, Gemini, AWS Nova, Groq, and Ollama.

How do I set up the API keys?

You can set the OPENAI_API_KEY and BRAVE_API_KEY environment variables globally, or define them in the mcp-server-config.json file.

How do I chat with LLM using REST API?

You can use a curl -X POST command to send a message to the /chat endpoint.
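For example, assuming the REST API is running locally on uvicorn's default port (8000); the "message" field name is an assumption, so check the project's documentation for the exact request schema:

    # Send a chat prompt to the /chat endpoint of the locally running API.
    # The payload field name ("message") is an assumption and may differ.
    curl -X POST http://localhost:8000/chat \
      -H "Content-Type: application/json" \
      -d '{"message": "list all products in the database"}'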

What is MCP?

MCP stands for Model Context Protocol. Please refer to https://modelcontextprotocol.io/ for more details.