Deep-Co
by succlz123
Deep-Co is a chat client for LLMs, written in Compose Multiplatform. It supports various API providers and allows configuration of OpenAI-compatible APIs or native models.
What is Deep-Co?
Deep-Co is a desktop application designed as a chat client for interacting with Large Language Models (LLMs). It's built using Compose Multiplatform, enabling cross-platform compatibility.
How to use Deep-Co?
To use Deep-Co, you need to configure your preferred LLM API provider (e.g., OpenRouter, OpenAI, DeepSeek) by providing the necessary API keys. You can also use native models via LM Studio/Ollama. The application provides features for managing prompts, users, and MCP servers. Follow the build instructions in the README to run the application.
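To make "OpenAI-compatible API" concrete, here is a minimal sketch of the request body such endpoints accept, following the Chat Completions shape. The model name and message content below are placeholders for illustration, not Deep-Co defaults:

```python
import json

# Build a Chat Completions request body as accepted by any
# OpenAI-compatible endpoint (OpenAI, OpenRouter, LM Studio, Ollama's /v1).
def build_chat_request(model: str, user_message: str, stream: bool = False) -> dict:
    return {
        "model": model,  # placeholder model name
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "stream": stream,  # True -> server streams the reply incrementally
    }

body = build_chat_request("some-model", "Hello!", stream=True)
print(json.dumps(body))
```

Any provider that accepts this shape can be plugged into a client like Deep-Co by swapping the base URL and API key.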
Key features of Deep-Co
Chat with MCP Server (Stream & Complete)
Chat History
MCP Support
Prompt Management
User-defined settings
DeepSeek LLM Support
Grok LLM Support
Google Gemini LLM Support
TTS (Edge API)
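The "Stream" half of the chat feature refers to incremental responses: OpenAI-compatible servers deliver them as server-sent events, one small JSON delta per "data:" line, ending with a "[DONE]" sentinel. A minimal sketch of assembling the reply text from such a stream (the example chunks below are invented for illustration):

```python
import json

def collect_stream(lines):
    """Assemble assistant text from the 'data:' lines of a streaming
    Chat Completions response; returns the concatenated delta content."""
    text = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":  # sentinel marking the end of the stream
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        text.append(delta.get("content", ""))
    return "".join(text)

# Invented example chunks in the streaming wire format:
sse = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print(collect_stream(sse))  # -> Hello
```

"Complete" mode is the simpler case: one request, one full JSON response, no event stream.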
Use cases of Deep-Co
Interacting with various LLMs through a single client
Managing and organizing prompts for different tasks
Configuring and utilizing MCP servers for enhanced functionality
Developing and testing LLM-based applications
Experimenting with different LLM models and API providers
FAQ from Deep-Co
What platforms does Deep-Co support?
Deep-Co supports Windows, macOS, and Linux.
What LLM providers are supported?
Deep-Co supports API providers such as OpenRouter, Anthropic, Grok, OpenAI, DeepSeek, Coze, Dify, Google Gemini, etc. You can also configure any OpenAI-compatible API or use native models via LM Studio/Ollama.
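Because these providers all speak the same Chat Completions protocol, switching between them is largely a matter of changing the base URL. The endpoints below are the commonly documented defaults for each tool, not values taken from Deep-Co's configuration:

```python
# Commonly documented OpenAI-compatible base URLs (assumptions -- verify
# against each provider's docs; the local ports are the tools' defaults).
PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "openrouter": "https://openrouter.ai/api/v1",
    "lm-studio": "http://localhost:1234/v1",   # LM Studio local server
    "ollama": "http://localhost:11434/v1",     # Ollama's OpenAI-compat layer
}

def chat_completions_url(provider: str) -> str:
    """Resolve the full Chat Completions endpoint for a provider."""
    return PROVIDER_BASE_URLS[provider] + "/chat/completions"

print(chat_completions_url("ollama"))
```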
What is MCP?
MCP stands for Model Context Protocol, an open standard that lets chat clients connect to external servers exposing tools, prompts, and data sources to the model. The README itself does not explain it in detail.
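On the wire, MCP messages are JSON-RPC 2.0 (over stdio or HTTP): the client asks a server which tools it offers, then invokes them on the model's behalf. A sketch of the two core requests (method names per the MCP specification; the tool name and arguments are invented examples):

```python
import itertools
import json

_ids = itertools.count(1)  # JSON-RPC request ids must be unique per session

def jsonrpc_request(method: str, params: dict) -> str:
    """Serialize an MCP client request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": method,
        "params": params,
    })

# Ask the server which tools it exposes.
list_req = jsonrpc_request("tools/list", {})

# Invoke one of them (tool name and arguments are invented examples).
call_req = jsonrpc_request("tools/call", {
    "name": "get_weather",
    "arguments": {"city": "Berlin"},
})
print(list_req)
```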
How do I install the necessary dependencies?
The README provides instructions for installing dependencies on macOS and Windows using brew and winget, respectively. You will also need uv and node.
How do I build and run the application?
Use ./gradlew desktopApp:run to run the application, and ./gradlew :desktop:packageDistributionForCurrentOS to build the desktop distribution.