AI Image Generation Pipeline

by lalanikarim

This project demonstrates the use of the Model Context Protocol (MCP) with LangGraph for AI image generation workflows. It includes scripts showcasing different LangGraph APIs and integration with Open WebUI Pipelines.

What is AI Image Generation Pipeline?

This project is a collection of scripts that demonstrate how to construct AI image generation pipelines using LangGraph and the Model Context Protocol (MCP). It showcases several approaches: the LangGraph Functional API, the LangGraph Graph API, and integration with Open WebUI Pipelines.

How to use AI Image Generation Pipeline?

  1. Install dependencies: pip install aiosqlite langgraph langgraph-checkpoint-sqlite mcp[cli] comfy-mcp-server.

  2. Set the environment variables used to configure the MCP server (COMFY_URL, COMFY_URL_EXTERNAL, etc.); a small configuration check is sketched after these steps.

  3. Run the desired script (app.py, graph.py, or ai-image-gen-pipeline.py) with the appropriate command-line arguments (e.g., --topic, --thread_id, --feedback). Alternatively, manage the environment and run the scripts with the uv utility.
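
Before running any of the scripts, the configuration from step 2 can be sanity-checked with a few lines of Python. This is a minimal sketch assuming only the two variables named above; comfy-mcp-server may require additional ones, and the default-port value in the comment is illustrative.

    import os

    # COMFY_URL and COMFY_URL_EXTERNAL are the variables named in step 2; the
    # authoritative list comes from comfy-mcp-server's documentation.
    # A typical value would be "http://localhost:8188" (ComfyUI's default port).
    required = ["COMFY_URL", "COMFY_URL_EXTERNAL"]

    missing = [name for name in required if not os.environ.get(name)]
    if missing:
        raise SystemExit("Set these environment variables first: " + ", ".join(missing))

    print("MCP server configuration looks complete:")
    for name in required:
        print(f"  {name}={os.environ[name]}")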

Key features of AI Image Generation Pipeline

  • AI Image Generation

  • Human-in-the-Loop (HIL) feedback integration (a minimal sketch follows this list)

  • LangGraph Functional and Graph API usage

  • Integration with Open WebUI Pipelines

  • Model Context Protocol (MCP) support
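
The interplay between the Functional API and HIL feedback can be sketched in a few lines. The example below is a minimal, hypothetical illustration rather than code from this repository: a task drafts an image prompt, interrupt() pauses the run for approval, and Command(resume=...) continues it. The function names and messages are assumptions.

    from langgraph.checkpoint.memory import MemorySaver
    from langgraph.func import entrypoint, task
    from langgraph.types import Command, interrupt

    @task
    def draft_prompt(topic: str) -> str:
        # Placeholder for the LLM call that turns a topic into an image prompt.
        return f"A detailed, photorealistic image of {topic}"

    @entrypoint(checkpointer=MemorySaver())
    def image_pipeline(topic: str) -> dict:
        prompt = draft_prompt(topic).result()
        # Pause here and surface the draft prompt for human review; the value
        # passed to Command(resume=...) is what interrupt() returns.
        feedback = interrupt({"prompt": prompt, "question": "Generate this image? (y/n)"})
        if feedback != "y":
            return {"status": "rejected", "prompt": prompt}
        # In the real pipeline, this is roughly where the MCP image tool would run.
        return {"status": "approved", "prompt": prompt}

    config = {"configurable": {"thread_id": "demo-thread"}}
    image_pipeline.invoke("a lighthouse at dusk", config)  # pauses at interrupt()
    image_pipeline.invoke(Command(resume="y"), config)     # resumes with the feedback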

Use cases of AI Image Generation Pipeline

  • Generating AI images based on user-defined topics

  • Creating interactive AI workflows with user feedback

  • Building AI pipelines within Open WebUI

  • Experimenting with different LangGraph API approaches

  • Automated creation of image generation prompts

FAQ from AI Image Generation Pipeline

What is the purpose of MCP?

MCP, or Model Context Protocol, is an open standard for connecting applications to external tools and data sources. In this project it provides a standardized way for the LangGraph pipeline to run tools exposed by comfy-mcp-server and exchange data with the ComfyUI image generation backend.

What are the dependencies required for this project?

The required dependencies are aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli], and comfy-mcp-server. These can be installed using pip install aiosqlite langgraph langgraph-checkpoint-sqlite mcp[cli] comfy-mcp-server.

How do I provide feedback in the graph.py script?

You can provide feedback by re-running the script against the same thread with the --feedback argument (e.g., python graph.py --thread_id "your-thread-id" --feedback "y" or --feedback "n").
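
The project's graph.py is not reproduced here, but the following minimal sketch shows the standard LangGraph pattern such a script would typically follow: build a checkpointed graph, pause at an interrupt(), and map --feedback onto Command(resume=...). The node names, state schema, and checkpoint file name are illustrative assumptions.

    import argparse
    import sqlite3
    from typing import TypedDict

    from langgraph.checkpoint.sqlite import SqliteSaver
    from langgraph.graph import StateGraph, START, END
    from langgraph.types import Command, interrupt

    class State(TypedDict, total=False):
        topic: str
        prompt: str
        feedback: str

    def draft_prompt(state: State) -> dict:
        # Placeholder for the LLM step that writes an image prompt from the topic.
        return {"prompt": f"A detailed illustration of {state['topic']}"}

    def ask_user(state: State) -> dict:
        # Pause the graph; the value passed to Command(resume=...) comes back here.
        answer = interrupt({"prompt": state["prompt"], "question": "Proceed? (y/n)"})
        return {"feedback": answer}

    builder = StateGraph(State)
    builder.add_node("draft_prompt", draft_prompt)
    builder.add_node("ask_user", ask_user)
    builder.add_edge(START, "draft_prompt")
    builder.add_edge("draft_prompt", "ask_user")
    builder.add_edge("ask_user", END)

    # Persist checkpoints to SQLite so a later invocation can resume the thread.
    conn = sqlite3.connect("checkpoints.sqlite", check_same_thread=False)
    graph = builder.compile(checkpointer=SqliteSaver(conn))

    parser = argparse.ArgumentParser()
    parser.add_argument("--topic")
    parser.add_argument("--thread_id", required=True)
    parser.add_argument("--feedback")
    args = parser.parse_args()

    config = {"configurable": {"thread_id": args.thread_id}}
    if args.feedback is not None:
        # Second run: resume the interrupted thread with the user's y/n answer.
        print(graph.invoke(Command(resume=args.feedback), config))
    else:
        # First run: executes until ask_user, then pauses awaiting feedback.
        print(graph.invoke({"topic": args.topic}, config))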

What is the role of the run_tool function?

The run_tool function is used to interact with the MCP server. It takes a tool name and arguments as input, and returns the result of running the tool on the MCP server.
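
The project's actual run_tool implementation is not shown on this page; the sketch below shows what a helper with that shape commonly looks like when using the MCP Python SDK's stdio client. The launch command (uvx comfy-mcp-server) and the tool name generate_image are assumptions; use the session's tool listing to confirm the real names.

    import asyncio
    import os

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Assumed way to launch comfy-mcp-server; adjust command/args to your setup.
    server_params = StdioServerParameters(
        command="uvx",
        args=["comfy-mcp-server"],
        env=dict(os.environ),  # inherit COMFY_URL and friends from the parent environment
    )

    async def run_tool(tool_name: str, arguments: dict):
        """Open an MCP session, call a single tool, and return its result."""
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                return await session.call_tool(tool_name, arguments)

    # "generate_image" is a hypothetical tool name; call session.list_tools()
    # to discover the tools the server actually exposes.
    result = asyncio.run(run_tool("generate_image", {"prompt": "a lighthouse at dusk"}))
    print(result)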

What is LangGraph?

LangGraph is a library for building LLM applications as sequential or cyclical graphs, making it easier to create robust, stateful workflows with features such as checkpointing and human-in-the-loop interrupts, both of which this project relies on.