MCP Servers - OpenAI and Flux Integration

This repository contains MCP (Model Context Protocol) servers for integrating with OpenAI's o1 model and Flux capabilities.

Server Configurations

OpenAI o1 MCP Server

The o1 server enables interaction with OpenAI's o1-preview model over the Model Context Protocol.

{
  "mcpServers": {
    "openai": {
      "command": "openai-server",
      "env": {
        "OPENAI_API_KEY": "apikey"
      }
    }
  }
}

Key features:

  • Direct access to o1-preview model
  • Streaming support
  • Temperature and top_p parameter control
  • System message configuration
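
An MCP client launches this server as a stdio subprocess using the configuration above. As a minimal sketch, assuming the Python mcp SDK and a hypothetical chat tool name (the real tool names and argument schemas come from the server's tool list), a client could connect and send a prompt like this:

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio, mirroring the "openai" entry above.
    params = StdioServerParameters(
        command="openai-server",
        env={"OPENAI_API_KEY": os.environ["OPENAI_API_KEY"]},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server actually exposes.
            tools = await session.list_tools()
            print("tools:", [t.name for t in tools.tools])

            # "chat" and its arguments are assumptions; adjust to the listed tools.
            result = await session.call_tool(
                "chat",
                arguments={
                    "prompt": "Summarize MCP in one sentence.",
                    "temperature": 0.2,
                },
            )
            print(result.content)


asyncio.run(main())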

Flux MCP Server

The Flux server provides image generation with the Flux model, accessed via Replicate, over MCP.

{
  "mcpServers": {
    "flux": {
      "command": "flux-server",
      "env": {
        "REPLICATE_API_TOKEN": "your-replicate-token"
      }
    }
  }
}

Key features:

  • State-of-the-art (SOTA) image generation with the Flux model
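
As with the o1 server, a client can launch flux-server over stdio and invoke its image tool. The sketch below assumes the Python mcp SDK and a hypothetical generate_image tool taking a prompt argument; check the server's tool list for the names and schema it really exposes.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the Flux server over stdio, mirroring the "flux" entry above.
    params = StdioServerParameters(
        command="flux-server",
        env={"REPLICATE_API_TOKEN": os.environ["REPLICATE_API_TOKEN"]},
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # "generate_image" and its arguments are assumptions; consult
            # session.list_tools() for the tools this server actually provides.
            result = await session.call_tool(
                "generate_image",
                arguments={"prompt": "a lighthouse at dusk, photorealistic"},
            )
            print(result.content)


asyncio.run(main())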

Usage

  1. Clone or fork the repository:
git clone https://github.com/AllAboutAI-YT/mcp-servers.git
  2. Set up environment variables in your .env file, using the same variable names as the server configurations above:
OPENAI_API_KEY=your_openai_key_here
REPLICATE_API_TOKEN=your_replicate_token_here
  3. Start the servers using the configurations above (a quick connectivity check is sketched below).
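
To verify the setup, a small script can launch both servers with the environment variables above and print the tools each one exposes. This is a sketch assuming the Python mcp SDK; the command names mirror the configurations earlier in this README.

import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirror the two server entries from the configurations above.
SERVERS = {
    "openai": StdioServerParameters(
        command="openai-server",
        env={"OPENAI_API_KEY": os.environ.get("OPENAI_API_KEY", "")},
    ),
    "flux": StdioServerParameters(
        command="flux-server",
        env={"REPLICATE_API_TOKEN": os.environ.get("REPLICATE_API_TOKEN", "")},
    ),
}


async def check(name: str, params: StdioServerParameters) -> None:
    # Start the server, initialize the session, and list its tools.
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"{name}: {[t.name for t in tools.tools]}")


async def main() -> None:
    for name, params in SERVERS.items():
        await check(name, params)


asyncio.run(main())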

Security

  • Store API keys securely
  • Use environment variables for sensitive data
  • Follow security best practices in SECURITY.md

License

MIT License - See LICENSE file for details.