MCP Dev Server
by ianrichard
The MCP Dev Server provides a UI and client for interacting with various LLMs. It allows users to test and develop applications using different LLM providers such as OpenAI, Azure, Groq, and Ollama.
What is MCP Dev Server?
The MCP Dev Server is a development environment for building and testing applications that utilize Large Language Models (LLMs). It includes a UI and a chat client for easy interaction with different LLM providers.
How to use MCP Dev Server?
To start the server, run mcp dev main.py, then access the UI at http://127.0.0.1:6274/#tools. To start the chat client, run python client/client.py. You configure the LLM provider through the LLMFactory class, with options such as LLMFactory.ollama(), LLMFactory.openai(), LLMFactory.azure(), and LLMFactory.groq(). You can also specify a model, e.g., LLMFactory.openai("gpt-4o") or LLMFactory.groq("llama-3.3-70b").
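To illustrate the provider-selection pattern described above, here is a minimal, hypothetical sketch of how a factory class like LLMFactory typically works. The real LLMFactory in this repo may be implemented differently; the LLMConfig type and the default model names below are assumptions for illustration only.

```python
# Hypothetical sketch of a provider-selection factory in the style of LLMFactory.
# The actual class in this repo may differ; LLMConfig and default models are illustrative.
from dataclasses import dataclass


@dataclass
class LLMConfig:
    provider: str  # e.g. "openai", "azure", "groq", "ollama"
    model: str     # model identifier passed through to the provider


class LLMFactory:
    """Builds a provider-specific config, mirroring calls like LLMFactory.openai(...)."""

    @staticmethod
    def openai(model: str = "gpt-4o") -> LLMConfig:
        return LLMConfig(provider="openai", model=model)

    @staticmethod
    def azure(model: str = "gpt-4o") -> LLMConfig:
        return LLMConfig(provider="azure", model=model)

    @staticmethod
    def groq(model: str = "llama-3.3-70b") -> LLMConfig:
        return LLMConfig(provider="groq", model=model)

    @staticmethod
    def ollama(model: str = "llama3") -> LLMConfig:
        return LLMConfig(provider="ollama", model=model)


# Selecting a provider and model, as in the examples above:
config = LLMFactory.groq("llama-3.3-70b")
print(config.provider, config.model)  # groq llama-3.3-70b
```

The point of the pattern is that the chat client only depends on the config object, so swapping providers means changing a single factory call.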
Key features of MCP Dev Server
UI for LLM interaction
Chat client
Support for multiple LLM providers (OpenAI, Azure, Groq, Ollama)
Easy LLM configuration using LLMFactory
Model selection
Use cases of MCP Dev Server
Testing LLM integrations
Developing chat applications
Experimenting with different LLM providers
Prototyping LLM-powered features
Comparing LLM performance
FAQ from MCP Dev Server
How do I start the server?
Run mcp dev main.py
Where is the UI located?
The UI is accessible at http://127.0.0.1:6274/#tools
How do I start the chat client?
Run python client/client.py
How do I select an LLM provider?
Use the LLMFactory class with options like LLMFactory.ollama(), LLMFactory.openai(), LLMFactory.azure(), or LLMFactory.groq().
How do I specify a specific model?
Pass the model name to the LLMFactory method, e.g., LLMFactory.openai("gpt-4o").