
ZIO-LLM-Proxy

by pizzaeueu

The ZIO-LLM-Proxy integrates OpenAI chat models with local MCP servers via Function Calling and applies custom PII checks to verify sensitive data. It lets OpenAI models make MCP calls and runs PII checks on the data provided to the LLM.



What is ZIO-LLM-Proxy?

The ZIO-LLM-Proxy is a proxy server that sits between OpenAI chat models and local MCP (Model Context Protocol) servers. It uses OpenAI's Function Calling feature to expose MCP tools to the model and runs custom PII (Personally Identifiable Information) checks on data before it is sent to the LLM.
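
At a high level, each user message goes through a PII check, an optional user-consent step, and then a Function Calling round-trip against the configured MCP servers. The plain Scala sketch below only illustrates that flow; the names, types, and the naive SSN pattern are hypothetical and are not taken from the project's codebase.

    // Hypothetical sketch of the proxy's flow; not the project's actual code.
    object ProxyFlowSketch {

      sealed trait Verdict
      case object Clean                              extends Verdict
      final case class PiiFound(kinds: List[String]) extends Verdict

      // 1. Run a PII check on the user's message before it reaches the LLM
      //    (here: a naive US SSN pattern, purely for illustration).
      def checkPii(message: String): Verdict =
        if (message.matches(".*\\b\\d{3}-\\d{2}-\\d{4}\\b.*")) PiiFound(List("SSN"))
        else Clean

      // 2. If PII is found, ask the user for consent; if consent is denied, stop the dialog.
      // 3. Otherwise forward the message to the chat model together with tool
      //    definitions derived from the configured MCP servers.
      def handle(message: String, userConsents: => Boolean): Either[String, String] =
        checkPii(message) match {
          case PiiFound(kinds) if !userConsents =>
            Left(s"Dialog stopped: consent denied for ${kinds.mkString(", ")}")
          case _ =>
            Right(callModelWithMcpTools(message))
        }

      // Placeholder for the OpenAI Function Calling + MCP round-trip.
      private def callModelWithMcpTools(message: String): String =
        s"(model reply for: $message)"
    }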

How to use ZIO-LLM-Proxy?

To use the proxy, set your OpenAI API key and the absolute path to the shared directory in the docker-compose.yml file, then run docker compose up. Open the proxy at http://localhost:4173/ and start chatting with the model. The filesystem MCP server is pre-installed as a Docker image; other MCP servers can be configured in the data/McpConfig.conf file.
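
The exact schema of data/McpConfig.conf is defined by the project; the snippet below is only a hypothetical illustration of how an additional MCP server entry could look (a server name, a launch command, and its arguments), and the real key names and the shared-directory path may differ.

    # data/McpConfig.conf -- hypothetical example; the real key names may differ
    mcp-servers = [
      {
        name    = "filesystem"
        command = "npx"
        args    = ["-y", "@modelcontextprotocol/server-filesystem", "/data/shared"]
      }
    ]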

Key features of ZIO-LLM-Proxy

  • MCP call integration for OpenAI models

  • PII checks for data provided to the LLM

  • User consent for sharing sensitive information

  • Configuration-based MCP server support

  • Uses Function Calling to expose MCP tools to the model (see the example after this list)
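
With Function Calling, each MCP tool is presented to the OpenAI model as a tool definition in the chat request. The JSON below shows the general shape of such a definition in OpenAI's Chat Completions format; the read_file tool and its parameters are an illustrative example of a filesystem MCP tool, not necessarily what the proxy actually sends.

    {
      "type": "function",
      "function": {
        "name": "read_file",
        "description": "Read a file from the shared directory via the filesystem MCP server",
        "parameters": {
          "type": "object",
          "properties": {
            "path": { "type": "string", "description": "Absolute path of the file to read" }
          },
          "required": ["path"]
        }
      }
    }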

Use cases of ZIO-LLM-Proxy

  • Protecting sensitive data when using LLMs

  • Integrating LLMs with local data sources via MCP

  • Implementing custom PII detection logic

  • Building LLM-powered applications that require data privacy

  • Enabling LLMs to access and utilize data from MCP servers

FAQ about ZIO-LLM-Proxy

Is authentication supported?

No, there is no authentication mechanism.

Are metrics collected?

No, metrics are not collected.

What happens if sensitive data is found?

The user is asked for consent before sensitive information is shared with the LLM. If consent is denied, the dialog is stopped.

Which languages are supported for PII detection?

PII detection currently supports only the English language (EN).

What happens if the data retrieved from MCP servers exceeds the context window?

The app will return an error to the user and stop the dialog.