
LocalMcpServer

by DimonSmart

LocalMcpServer is a demonstration implementation of a Model Context Protocol (MCP) server. It showcases client-server interaction over the STDIO transport and provides tools that MCP clients can call.



What is LocalMcpServer?

LocalMcpServer is a demonstration MCP server built with .NET 9.0. It provides a basic server setup that implements the Model Context Protocol, allowing language models to interact with external tools over the STDIO transport.
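
The repository's actual code may differ, but a server like this is typically wired up with the official ModelContextProtocol C# SDK in only a few lines; the sketch below is illustrative, not the project's exact Program.cs:

    using Microsoft.Extensions.DependencyInjection;
    using Microsoft.Extensions.Hosting;

    var builder = Host.CreateApplicationBuilder(args);

    // Register the MCP server, expose it over the STDIO transport,
    // and discover tool classes in this assembly automatically.
    builder.Services
        .AddMcpServer()
        .WithStdioServerTransport()
        .WithToolsFromAssembly();

    await builder.Build().RunAsync();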

How to use LocalMcpServer?

To use this server, clone the repository, build the application using dotnet build, and run it with dotnet run. A compatible MCP client, such as OllamaChat, is needed to interact with the server.
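
For reference, the steps boil down to a handful of commands; the repository URL is not reproduced here, and the directory name is an assumption:

    git clone <repository-url>   # URL available from the project's GitHub page
    cd LocalMcpServer            # directory name assumed to match the project name
    dotnet build
    dotnet run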

Key features of LocalMcpServer

  • Basic MCP server implementation

  • STDIO for client communication

  • Sample TimeTool that returns the current server time (see the sketch after this list)

  • Automatic tool discovery and registration
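
In the official C# SDK, tool discovery is attribute-driven: classes marked with McpServerToolType are scanned and their McpServerTool methods are registered automatically. A TimeTool along these lines is a plausible sketch of the sample tool, not the repository's exact code:

    using System;
    using System.ComponentModel;
    using ModelContextProtocol.Server;

    [McpServerToolType]
    public static class TimeTool
    {
        [McpServerTool, Description("Returns the current server time.")]
        public static string GetCurrentTime() =>
            // ISO 8601 round-trip format, e.g. 2025-01-01T12:00:00.0000000+01:00
            DateTimeOffset.Now.ToString("O");
    }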

Use cases of LocalMcpServer

  • Demonstrating MCP server capabilities

  • Testing MCP client-server interactions

  • Integrating with local chat clients like OllamaChat

  • Providing tool-using capabilities to language models in a local environment

FAQ from LocalMcpServer

What is MCP?

The Model Context Protocol (MCP) is an open protocol that standardizes communication between language models and external tools.
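
On the wire, MCP is JSON-RPC 2.0; with the STDIO transport each message is a JSON object exchanged over the server process's standard input and output. A tool invocation looks roughly like this (illustrative values; the actual tool name and payload depend on how the server registers its tools):

    Request (client to server):
    { "jsonrpc": "2.0", "id": 1, "method": "tools/call",
      "params": { "name": "GetCurrentTime", "arguments": {} } }

    Response (server to client):
    { "jsonrpc": "2.0", "id": 1,
      "result": { "content": [ { "type": "text", "text": "2025-01-01T12:00:00Z" } ] } }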

What is the purpose of this server?

This server demonstrates how to create a simple MCP server and showcases the interaction between a client and server using STDIO transport.

What are the prerequisites for running this server?

You need .NET 9.0 SDK or higher and a compatible MCP client for testing, such as OllamaChat.

What tools are available in this server?

The server includes a TimeTool that returns the current server time and an InterfaceLookupService that extracts C# interface definitions from NuGet packages.

How can I integrate this server with OllamaChat?

By connecting this MCP server to OllamaChat, you can demonstrate the capabilities of tool-using language models in a local environment. Follow the instructions provided in the OllamaChat documentation for integration.