Jira Weekly Reporter MCP Server

by Jongryong

This project provides a FastMCP server that connects to your Jira instance to generate weekly reports based on issue activity. It leverages the `pycontribs-jira` library for Jira interaction and can optionally use the connected client's Large Language Model (LLM) for summarizing the generated report.

What is Jira Weekly Reporter MCP Server?

A FastMCP server that connects to Jira (Cloud or Server/Data Center) to generate weekly reports based on issue activity. It uses the pycontribs-jira library and can optionally summarize reports using the client's LLM.

How to use Jira Weekly Reporter MCP Server?

  1. Install dependencies using uv or pip.

  2. Create a `.env` file with Jira connection details (URL, username, API token).

  3. Run the server with `python jira_reporter_server.py` or `fastmcp run jira_reporter_server.py`.

  4. Integrate with Claude Desktop by adding a server entry to `claude_desktop_config.json` (see the examples below).
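
For reference, a minimal setup might look like the following. The environment variable names, the server name in the Claude Desktop entry, and the file paths are illustrative; check the repository for the exact values it expects.

```
# .env — example values only; keep this file out of version control
JIRA_URL=https://your-domain.atlassian.net
JIRA_USERNAME=you@example.com
JIRA_API_TOKEN=your-api-token
```

```
# Start the server directly, or via the FastMCP CLI
python jira_reporter_server.py
# or
fastmcp run jira_reporter_server.py
```

```json
{
  "mcpServers": {
    "jira-weekly-reporter": {
      "command": "python",
      "args": ["/absolute/path/to/jira_reporter_server.py"]
    }
  }
}
```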

Key features of Jira Weekly Reporter MCP Server

  • Secure Jira connection using API tokens

  • Exposes a `generate_jira_report` tool via the Model Context Protocol (a sketch follows this list)

  • Flexible reporting with custom JQL queries and project key filtering

  • Optional report summarization using the connected client's LLM

  • Asynchronous handling of Jira library calls
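
To make the feature list above concrete, here is a minimal sketch of how such a tool could be wired up with FastMCP and the pycontribs `jira` package. It is not the project's actual code: the default JQL, parameter names, and environment variable names are assumptions; only the tool name `generate_jira_report` comes from the description above.

```python
# Sketch of a FastMCP server exposing a Jira reporting tool.
# Blocking pycontribs-jira calls are pushed to a worker thread via anyio.
import os

import anyio
from dotenv import load_dotenv
from fastmcp import FastMCP
from jira import JIRA

load_dotenv()  # read JIRA_URL / JIRA_USERNAME / JIRA_API_TOKEN from .env

mcp = FastMCP("Jira Weekly Reporter")


def _fetch_issues(jql: str, max_results: int):
    """Blocking Jira call; run off the event loop."""
    client = JIRA(
        server=os.environ["JIRA_URL"],
        basic_auth=(os.environ["JIRA_USERNAME"], os.environ["JIRA_API_TOKEN"]),
    )
    return client.search_issues(jql, maxResults=max_results)


@mcp.tool()
async def generate_jira_report(
    jql: str = "updated >= -7d ORDER BY updated DESC",
    project: str | None = None,
    max_results: int = 50,
) -> str:
    """Generate a weekly report of recently updated Jira issues."""
    if project:
        # Naive prefixing; sufficient for the simple default JQL above.
        jql = f"project = {project} AND {jql}"
    issues = await anyio.to_thread.run_sync(_fetch_issues, jql, max_results)
    lines = [
        f"- {issue.key}: {issue.fields.summary} ({issue.fields.status.name})"
        for issue in issues
    ]
    return "Weekly Jira report:\n" + "\n".join(lines)


if __name__ == "__main__":
    mcp.run()
```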

Use cases of Jira Weekly Reporter MCP Server

  • Automated weekly Jira report generation

  • Summarizing Jira activity for project stakeholders

  • Integrating Jira data into Claude Desktop workflows

  • Customized reporting based on specific JQL queries

FAQ from Jira Weekly Reporter MCP Server

What Jira versions are supported?

The server supports Jira Cloud, Server, and Data Center.

How do I secure my Jira API token?

Store your API token in a .env file and ensure this file is not committed to version control (add it to .gitignore).
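
As an illustration, python-dotenv (already a dependency) lets the token stay out of the source entirely; the variable name `JIRA_API_TOKEN` is the same placeholder used in the examples above:

```python
# Load the API token from .env at startup instead of hardcoding it.
import os
from dotenv import load_dotenv

load_dotenv()
api_token = os.environ["JIRA_API_TOKEN"]  # KeyError if missing, so misconfiguration fails fast
```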

Can I customize the report generation?

Yes, you can use a custom JQL query, filter by project key, and limit the number of results.
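
For example, a custom JQL query restricted to one project and the past week might look like this (the project key `MYPROJ` is a placeholder):

```
project = MYPROJ AND updated >= -7d ORDER BY updated DESC
```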

Is LLM summarization required?

No, LLM summarization is optional and requires the client to have an LLM available via ctx.sample().
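
A rough sketch of how the optional summarization step could look, assuming FastMCP's sampling support via `ctx.sample()`; the exact signature and return shape may differ from the project's code:

```python
# Optional summarization: ask the *client's* LLM to condense the report text,
# falling back to the raw report if the client does not support sampling.
from fastmcp import Context


async def summarize_report(report_text: str, ctx: Context) -> str:
    try:
        result = await ctx.sample(f"Summarize this Jira activity report:\n\n{report_text}")
        return getattr(result, "text", str(result))
    except Exception:
        return report_text
```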

What are the server dependencies?

The server depends on fastmcp, jira[cli], python-dotenv, httpx, and anyio.
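
For example, they can be installed with uv or pip (package specifiers mirror the list above):

```
uv pip install fastmcp "jira[cli]" python-dotenv httpx anyio
# or
pip install fastmcp "jira[cli]" python-dotenv httpx anyio
```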