Booner_MCP

by vespo92

Booner_MCP is an AI infrastructure-as-code platform that leverages the Model Context Protocol (MCP) with Ollama for agentic coding and server management. It enables AI agents to interact with and manage local infrastructure, deploying various server types.

What is Booner_MCP?

Booner_MCP is an AI-driven infrastructure-as-code platform built on the Model Context Protocol (MCP). It integrates with Ollama so that AI agents can manage and deploy server types such as web servers, game servers, and databases within a local infrastructure.

How to use Booner_MCP?

To use Booner_MCP: clone the repository with its submodules, create an environment file, generate a secure AUTH_SECRET, and deploy the stack with Docker Compose. Once deployed, the web UI, API, and Ollama server are reachable at the URLs provided, and AI agents can interact with the infrastructure to manage server deployments.
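The setup steps above can be sketched as a short shell session. The repository URL, the `.env.example` template, and the Compose file layout are assumptions based on common conventions; adjust them to match the actual repository.

```shell
# Clone the repository together with its submodules
# (URL assumed from the author/project name; verify against the actual repo)
git clone --recurse-submodules https://github.com/vespo92/Booner_MCP.git
cd Booner_MCP

# Create the environment file (assuming an example template is provided)
cp .env.example .env

# Generate a secure AUTH_SECRET (64 hex characters) and append it to the env file
echo "AUTH_SECRET=$(openssl rand -hex 32)" >> .env

# Deploy the stack in the background with Docker Compose
docker compose up -d
```

`openssl rand -hex 32` produces 32 random bytes hex-encoded, which is a common way to generate an authentication secret; any cryptographically secure 32-byte value works equally well.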

Key features of Booner_MCP

  • AI-powered infrastructure management

  • Model Context Protocol (MCP) integration

  • Ollama integration for AI capabilities

  • Agentic coding for automated tasks

  • Support for various server types (web, game, databases)

Use cases of Booner_MCP

  • Automated server deployment and management

  • AI-driven infrastructure scaling

  • Agentic coding for infrastructure tasks

  • Local AI infrastructure experimentation

FAQ from Booner_MCP

What is the primary purpose of Booner_MCP?

To provide an AI-driven platform for managing and deploying infrastructure using agentic coding and the Model Context Protocol.

What LLM is used with Booner_MCP?

Mixtral, running via Ollama.

What are the hardware requirements for Booner_Ollama?

An AMD Ryzen 7 5700X3D CPU, an NVIDIA RTX 4070 Ti Super, 64GB RAM, and an NVIDIA Quadro P4000.

What are the main programming languages used in the project?

Python, Go, and Next.js.

How do I access the Web UI after deployment?

In this setup, the Web UI is served at http://10.0.0.1:3000 (the host address and port configured in the deployment; substitute your own host if it differs).