Uber Eats MCP Server
by ericzakariasson
This is a proof-of-concept server demonstrating how to build an MCP server on top of Uber Eats. It leverages the Model Context Protocol for seamless integration between LLM applications and external tools.
What is Uber Eats MCP Server?
Uber Eats MCP Server is a proof of concept showing how to build an MCP server on top of Uber Eats. It uses the Model Context Protocol (MCP) to connect LLM applications with external tools.
How to use Uber Eats MCP Server?
To use this server, you need Python 3.12 or higher, an Anthropic API key (or a key for another supported LLM provider), and a virtual environment. Install the required packages with `uv pip install -r requirements.txt` and `playwright install`, then update the `.env` file with your API key. You can run the MCP inspector tool with `uv run mcp dev server.py`.
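Collected into one session, the setup steps above might look like this (a sketch that assumes `uv` is already installed and a Unix-like shell):

```shell
# Create and activate a virtual environment
uv venv
source .venv/bin/activate

# Install the Python dependencies and the Playwright browser binaries
uv pip install -r requirements.txt
playwright install

# After adding your API key to .env, launch the MCP inspector
uv run mcp dev server.py
```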
Key features of Uber Eats MCP Server
Model Context Protocol (MCP) integration
Uber Eats data access
LLM Application Integration
Stdio MCP transport
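The "Stdio MCP transport" listed above means the client and server exchange JSON-RPC 2.0 messages over the server process's standard input and output. A minimal sketch of what one such message looks like on the wire (the tool name `search_restaurants` and its arguments are hypothetical, not taken from this repo):

```python
import json

# MCP is built on JSON-RPC 2.0; over the stdio transport, each message is a
# JSON object written as a single line. A tool invocation uses the
# "tools/call" method. The tool name below is illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_restaurants",
        "arguments": {"query": "pizza"},
    },
}

# The client writes this line to the server's stdin; the server
# replies with a JSON-RPC response on its stdout.
wire_message = json.dumps(request)
print(wire_message)
```

The inspector started with `uv run mcp dev server.py` lets you issue exactly this kind of tool call interactively instead of crafting the JSON by hand.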
Use cases of Uber Eats MCP Server
Integrating Uber Eats data into LLM applications
Building conversational interfaces for ordering food
Creating AI-powered food recommendations
Automating Uber Eats tasks with LLMs
FAQ from Uber Eats MCP Server
What is MCP?
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external tools.
What are the prerequisites?
Python 3.12 or higher and an Anthropic API key (or a key for another supported LLM provider).
How do I install the dependencies?
Use `uv pip install -r requirements.txt` followed by `playwright install`.
How do I set up the API key?
Update the `.env` file with your API key, for example: `ANTHROPIC_API_KEY=your_api_key_here`
How do I run the MCP inspector tool?
Use the command `uv run mcp dev server.py`.