
Jinni

by smat-dev

Jinni is a tool that efficiently provides Large Language Models with the context of your projects. It gives a consolidated view of relevant project files, complete with metadata, overcoming the limitations and inefficiencies of reading files one by one.


What is Jinni?

Jinni is a tool designed to provide Large Language Models (LLMs) with the context of your projects. It consists of an MCP server for integration with AI tools and a command-line utility (CLI) for manual use, both aimed at efficiently gathering and filtering relevant project files for LLM consumption.

How to use Jinni?

Jinni can be used either through its MCP server, which integrates with IDEs and other AI tools, or via its CLI. The MCP server requires configuration in your MCP client of choice (e.g., Cursor, Claude Desktop) so that the client runs the jinni-server command. The CLI can be invoked directly from the command line with options to specify target paths, override rules, and control output. File filtering is configured with .contextfiles placed in your project directories.
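For the CLI path, the install command and basic invocation from the FAQ below are enough for a first run (the ./my_project/ path is just the example used later on this page):

```shell
# Install the CLI (either command works, per the install FAQ below)
pip install jinni          # or: uv pip install jinni

# Dump the context of a project to the console
jinni ./my_project/
```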

Key features of Jinni

  • Efficient Context Gathering

  • Intelligent Filtering (Gitignore-Style Inclusion)

  • Customizable Configuration (.contextfiles / Overrides)

  • Large Context Handling

  • Metadata Headers

  • Encoding Handling

  • List Only Mode

Use cases of Jinni

  • Providing project context to LLMs for code generation

  • Enabling LLMs to answer questions about your codebase

  • Automating code review and refactoring tasks

  • Generating documentation from code comments

  • Debugging code with LLM assistance

FAQ about Jinni

What is the purpose of .contextfiles?

.contextfiles are used to define which files and directories to include or exclude from the project context, using .gitignore-style patterns.
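As an illustrative sketch only (the file names, patterns, and the exact negation semantics here are assumptions, not taken from the project's documentation), a .contextfiles might look like:

```
# .contextfiles — gitignore-style patterns (illustrative example)
# Include Python sources and Markdown docs
src/**/*.py
docs/*.md
# Exclude test files (assuming '!' negation behaves as in .gitignore)
!**/test_*.py
```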

How do I configure the MCP server?

You need to configure your MCP client (e.g., Cursor, Claude Desktop) to run the jinni-server command using uvx or python -m. Refer to your client's documentation for specific setup steps.
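As a rough sketch only, an entry for Jinni in a typical MCP client configuration looks something like the following; the exact schema, file name, and location vary by client, so treat this as an assumption and follow your client's documentation:

```json
{
  "mcpServers": {
    "jinni": {
      "command": "uvx",
      "args": ["jinni-server"]
    }
  }
}
```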

What do I do if I encounter a Context Size Error?

Review the largest files reported in the error message and use .contextfiles or the --overrides / rules options to exclude unnecessary files or directories. You can also increase the size limit, but be mindful of LLM context window limits.
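For example, a large directory can be excluded without touching the project's .contextfiles by passing an overrides file on the command line. This sketch assumes --overrides takes a path to a rules file and that '!' exclusion works gitignore-style; the file name and the node_modules directory are just examples:

```shell
# Write an exclusion rule to an overrides file and pass it to the CLI
printf '!node_modules/**\n' > overrides.txt
jinni ./my_project/ --overrides overrides.txt
```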

How do I run the Jinni CLI?

Use the jinni command followed by the desired options and target paths. For example, jinni ./my_project/ dumps the context of my_project/ to the console.

How do I install Jinni?

You can install Jinni using pip install jinni or uv pip install jinni.