
Llama MCP Streamlit

by nikunj2003

Llama MCP Streamlit is an interactive AI assistant that leverages Streamlit, NVIDIA NIM's API (Llama 3.3 70B) or Ollama, and the Model Context Protocol (MCP). It provides a conversational interface for interacting with an LLM, executing external tools in real time via MCP, retrieving data, and performing actions.



What is Llama MCP Streamlit?

Llama MCP Streamlit is a Streamlit application that implements an AI assistant. It combines an LLM backend (NVIDIA NIM or Ollama), MCP for tool execution, and a user-friendly chat interface.

How to use Llama MCP Streamlit?

First, configure the .env file with the API endpoint and key for NVIDIA NIM or Ollama. Then run the application either with Poetry (install dependencies and launch Streamlit) or with Docker (build and run the container). To choose which MCP server the assistant connects to (NPX- or Docker-based), edit the utils/mcp_server.py file.
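The exact commands depend on the repository layout, but the two run paths look roughly like this (the Streamlit entry-point file name and the Docker image tag below are assumptions, not taken from the repository):

```bash
# Option 1: Poetry
poetry install
poetry run streamlit run main.py        # entry-point name assumed

# Option 2: Docker (Streamlit serves on port 8501 by default)
docker build -t llama-mcp-streamlit .
docker run --env-file .env -p 8501:8501 llama-mcp-streamlit
```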

Key features of Llama MCP Streamlit

  • Real-time tool execution via MCP

  • LLM-powered chat interface

  • Streamlit UI with interactive chat elements

  • Support for multiple LLM backends (NVIDIA NIM & Ollama); see the backend-switching sketch after this list
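Because NVIDIA NIM and Ollama both expose OpenAI-compatible endpoints, switching backends mostly comes down to pointing the client at a different base URL. A minimal sketch, assuming the API_ENDPOINT and API_KEY variables from the project's .env and the standard public endpoints (the model ids shown are illustrative, not necessarily what the repository uses):

```python
import os
from openai import OpenAI

# One OpenAI-compatible client serves both backends:
#   NVIDIA NIM: https://integrate.api.nvidia.com/v1
#   Ollama:     http://localhost:11434/v1
client = OpenAI(
    base_url=os.environ["API_ENDPOINT"],
    api_key=os.environ.get("API_KEY", "ollama"),  # Ollama ignores the key
)

response = client.chat.completions.create(
    model="meta/llama-3.3-70b-instruct",  # e.g. "llama3.3" when running on Ollama
    messages=[{"role": "user", "content": "What tools can you call?"}],
)
print(response.choices[0].message.content)
```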

Use cases of Llama MCP Streamlit

  • Executing external tools via chat interface

  • Retrieving data and performing actions

  • Automation of tasks through LLM and tools

  • Interactive AI-powered assistance

FAQ about Llama MCP Streamlit

What LLMs are supported?

The assistant supports NVIDIA NIM (Llama 3.3 70B) and Ollama.

How do I configure the API keys?

Set the API_ENDPOINT and API_KEY environment variables in the .env file.
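For example, a minimal .env for the NVIDIA NIM backend could look like the following (the endpoint is NVIDIA's public NIM URL; point it at your Ollama server instead, e.g. http://localhost:11434/v1, to use local models):

```
API_ENDPOINT=https://integrate.api.nvidia.com/v1
API_KEY=nvapi-xxxxxxxxxxxxxxxx
```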

How can I run this project?

You can run the project using either Poetry or Docker.

How do I specify the MCP server?

Update the utils/mcp_server.py file and choose either the NPX or Docker configuration.
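The repository's actual file isn't reproduced here, but with the official MCP Python SDK a stdio server definition typically looks like the sketch below; the NPX and Docker variants differ only in the launch command (the filesystem server and the /tmp path are illustrative):

```python
from mcp import StdioServerParameters

# NPX variant: launch an MCP server with npx.
npx_server = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
)

# Docker variant: run the equivalent server in a container.
docker_server = StdioServerParameters(
    command="docker",
    args=["run", "-i", "--rm", "mcp/filesystem", "/tmp"],
)
```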

What is MCP?

The Model Context Protocol (MCP) is an open protocol that standardizes how applications expose tools and data sources to LLMs, allowing an assistant to discover and securely invoke external tools in real time.