Tokens MCP

by antonkulaga

Tokens MCP is an MCP server for the Token Metrics API, providing a standardized interface for AI systems to access cryptocurrency market data. It simplifies working with crypto data and developing trading strategies, whether for algorithmic trading bots or for market research.

What is Tokens MCP?

Tokens MCP is an MCP (Model Context Protocol) server that provides a standardized interface for AI systems to access the TokenMetrics API. It lets users access comprehensive cryptocurrency market data and implement trading strategies.

How to use Tokens MCP?

To use Tokens MCP, clone the repository, install dependencies with uv sync, configure your TokenMetrics API key in .env and mcp_server_config.json, and then start the server with uv run mcp run run.py. The server can then be integrated with MCP-capable IDEs such as Cursor.
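
A minimal setup sketch is shown below; the clone URL and directory name are placeholders rather than values taken from the project:

```bash
# Clone the repository (URL shown as a placeholder) and enter it
git clone <repository-url>
cd tokens-mcp   # directory name is illustrative

# Install dependencies
uv sync

# Add your TokenMetrics API key to .env and mcp_server_config.json (see the FAQ below)

# Start the MCP server
uv run mcp run run.py
```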

Key features of Tokens MCP

  • Access comprehensive cryptocurrency market data

  • Implement and backtest trading strategies

  • Generate visual performance metrics

  • Analyze token performance across different timeframes

Use cases of Tokens MCP

  • Algorithmic trading bots

  • Market research

  • Automated trading systems

  • AI-powered crypto analysis

FAQ from Tokens MCP

What is MCP?

MCP (Model Context Protocol) provides a standardized interface for AI systems to access external tools and data sources.

What API does this server use?

This server uses the TokenMetrics API to access cryptocurrency market data.

How do I configure the API key?

Copy .env.example to .env and configure your API keys. You must also update the mcp_server_config.json file with your TokenMetrics API key.
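
A rough sketch of that step, assuming the key is read from an environment variable; the variable name below is a guess, so use whatever name .env.example actually defines:

```bash
# Create your local environment file from the template
cp .env.example .env

# Hypothetical variable name; check .env.example for the real one
echo "TOKENMETRICS_API_KEY=your-api-key-here" >> .env
```

The same key also has to be added to mcp_server_config.json by hand; as noted in the known issues below, that file may additionally need its absolute paths adjusted for your machine.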

What IDEs are supported?

Cursor provides native support for MCP, allowing AI assistants to directly interact with the TokenMetrics API through this server.

Are there any known issues?

Yes. The mcp_server_config.json file currently contains absolute paths to the server that need to be updated manually. The test files are also standalone scripts that must be run manually rather than proper pytest tests. Finally, many TokenMetrics API endpoints had to be implemented directly because they are not available in the existing tmai-api library.