My Own Assistant
by Interpause
This project aims to simplify the creation of fully integrated, self-hosted assistants using MCPs. It leverages OpenAI-compatible APIs so that self-hosted models such as Qwen3-30B-A3B can serve as the assistant's backend.
What is My Own Assistant?
My Own Assistant is a template and framework for building a fully integrated, self-hosted AI assistant. It provides a foundation for creating custom assistants that can leverage large language models through OpenAI-compatible APIs.
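To make "leverage large language models through OpenAI-compatible APIs" concrete, here is a minimal sketch (not code from this repository) that talks to a self-hosted endpoint using the official openai Python client. The base URL and API key are placeholder assumptions; any server exposing the OpenAI chat-completions API, for example one serving Qwen3-30B-A3B, would be used the same way.

```python
# Minimal sketch, not the project's actual code: calling a self-hosted,
# OpenAI-compatible endpoint with the official `openai` Python client.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical self-hosted endpoint
    api_key="not-needed-locally",         # many local servers ignore the key
)

response = client.chat.completions.create(
    model="Qwen3-30B-A3B",  # model name as exposed by your server
    messages=[
        {"role": "system", "content": "You are my own assistant."},
        {"role": "user", "content": "Summarise today's calendar."},
    ],
)
print(response.choices[0].message.content)
```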
How to use My Own Assistant?
The project provides a FastAPI-based API template. To use it, replace all instances of py-api-template with your desired project name, optionally rename the src directory, and follow the provided commands for development, testing, building, and publishing the application. Make sure Poetry is installed and has been used to install the project's dependencies.
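For orientation, the kind of FastAPI service such a template scaffolds might look roughly like the sketch below. The module, app title, and endpoint names are hypothetical and not taken from the repository.

```python
# Illustrative sketch only -- not the template's actual source. It shows the
# general shape of a FastAPI service that could back an assistant.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="my-own-assistant")  # hypothetical name replacing py-api-template


class ChatRequest(BaseModel):
    message: str


@app.get("/healthz")
def healthz() -> dict[str, str]:
    # Simple liveness probe, useful once the Docker image is deployed.
    return {"status": "ok"}


@app.post("/chat")
def chat(req: ChatRequest) -> dict[str, str]:
    # A real implementation would forward the message to an
    # OpenAI-compatible backend; this stub just echoes it.
    return {"reply": f"You said: {req.message}"}
```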
Key features of My Own Assistant
CPU/GPU acceleration support
Poetry package management
Ruff formatting & linting
VSCode debugging tasks
Dev Containers support
Use cases of My Own Assistant
Creating a personal AI assistant
Developing a custom chatbot
Building an AI-powered service with local hosting
Experimenting with large language models
Integrating AI into existing applications
FAQ from My Own Assistant
What is MCP?
The README does not define the acronym, but in this context MCP refers to the Model Context Protocol, an open standard that lets AI assistants connect to external tools and data sources.
What OpenAI compatible API should I use?
The project is designed to work with any OpenAI-compatible API, so you can choose one based on your needs and budget.
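As a rough guide, switching providers usually comes down to changing the base URL and API key the client is configured with. The URLs below are common defaults for popular OpenAI-compatible servers, not settings from this project.

```python
# Sketch: selecting an OpenAI-compatible backend via environment variables.
import os

from openai import OpenAI

# Examples of OpenAI-compatible endpoints:
#   vLLM (default):      http://localhost:8000/v1
#   Ollama:              http://localhost:11434/v1
#   Hosted OpenAI API:   https://api.openai.com/v1
client = OpenAI(
    base_url=os.environ.get("OPENAI_BASE_URL", "http://localhost:8000/v1"),
    api_key=os.environ.get("OPENAI_API_KEY", "unused-for-local-servers"),
)
```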
How do I deploy this to production?
The README provides commands for building and publishing a Docker image, which can then be deployed to a container orchestration platform like Kubernetes or Docker Swarm.
What are the system requirements?
The system requirements will depend on the specific language model you are using and the amount of traffic you expect. A GPU is recommended for optimal performance.
Can I use a different language model?
Yes, as long as it is accessible through an OpenAI-compatible API.