JIRA MCP Server (Async)

Python 3.13+ | MCP Compatible | UV

A high-performance, asynchronous Model Context Protocol (MCP) server that integrates with JIRA using stdio transport, allowing AI assistants to:

  • Connect to your company's JIRA instance with async operations
  • Search for issues using JQL (JIRA Query Language) with concurrent processing
  • Get detailed issue information including comments with improved performance
  • Track issue relationships (links, parent/child, epics) efficiently
  • Create new issues and update existing ones
  • View available workflow transitions

🚀 Performance Features

This async implementation provides significant performance improvements over traditional synchronous JIRA clients:

  • Concurrent API Calls: Process multiple JIRA requests simultaneously
  • Connection Pooling: Efficient HTTP connection management with aiohttp
  • Rate Limiting: Built-in throttling to respect JIRA API limits
  • Non-blocking I/O: True async operations that don't block the event loop
  • Stdio Transport: Optimized for MCP client integration
  • Clean Architecture: Focused on essential tools without unnecessary complexity

Performance Comparison

  • Synchronous: each JIRA request blocks until the previous one completes
  • Asynchronous (this server): requests run concurrently over pooled, non-blocking connections, as sketched below

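To make the asynchronous side of that comparison concrete, here is a minimal sketch of how aiohttp connection pooling, a semaphore-based rate limit, and concurrent requests can be combined. The URL, token, and issue keys are placeholders, and the code is illustrative rather than this server's actual implementation:

    import asyncio
    import aiohttp

    MAX_CONCURRENT_REQUESTS = 2  # mirrors the MAX_CONCURRENT_REQUESTS setting described below

    async def fetch_issue(session, semaphore, base_url, key):
        # The semaphore caps how many requests are in flight at once (simple throttling)
        async with semaphore:
            async with session.get(f"{base_url}/rest/api/2/issue/{key}") as resp:
                resp.raise_for_status()
                return await resp.json()

    async def main():
        base_url = "https://your-company.atlassian.net"            # placeholder
        headers = {"Authorization": "Bearer your_api_token_here"}  # auth scheme depends on your JIRA setup
        semaphore = asyncio.Semaphore(MAX_CONCURRENT_REQUESTS)
        timeout = aiohttp.ClientTimeout(total=30, connect=10)      # REQUEST_TIMEOUT / CONNECT_TIMEOUT
        connector = aiohttp.TCPConnector(limit=MAX_CONCURRENT_REQUESTS)  # pooled, reused connections
        async with aiohttp.ClientSession(headers=headers, timeout=timeout, connector=connector) as session:
            # The three lookups run concurrently instead of back to back
            issues = await asyncio.gather(
                *(fetch_issue(session, semaphore, base_url, key)
                  for key in ("PROJ-1", "PROJ-2", "PROJ-3"))
            )
            for issue in issues:
                print(issue["key"], issue["fields"]["summary"])

    asyncio.run(main())
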
Features

This MCP server provides all of its functionality through MCP tools:

MCP Tools

The server exposes the following MCP tools, each prefixed with jira_ to avoid conflicts with other MCP servers (such as a GitHub MCP server); example argument payloads follow the table:

| Tool | Description | Parameters |
| --- | --- | --- |
| jira_search_issues | Search for JIRA issues using JQL | jql: JQL query string; max_results: Maximum number of results to return |
| jira_get_issue_details | Get detailed information about a specific JIRA issue | issue_key: The JIRA issue key (e.g., "PROJECT-123") |
| jira_get_issue_comments | Get all comments for a specific JIRA issue | issue_key: The JIRA issue key |
| jira_get_issue_links | Get all links for a specific JIRA issue | issue_key: The JIRA issue key |
| jira_get_epic_issues | Get all issues that belong to a specific epic | epic_key: The JIRA epic issue key |
| jira_get_subtasks | Get all subtasks for a specific JIRA issue | issue_key: The parent JIRA issue key |
| jira_get_available_transitions | Lists available workflow transitions for a given JIRA issue | issue_key: The JIRA issue key |
| jira_create_issue | Creates a new issue in a specified JIRA project | project_key: Key of the project; summary: Issue summary; description: Issue description; issue_type_name: Type of the issue; assignee_name: (Optional) Name of the assignee; priority_name: (Optional) Name of the priority; labels: (Optional) List of labels; custom_fields: (Optional) Dictionary of custom fields |
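
As a concrete example of the parameter shapes, these are argument payloads an MCP client might pass to two of the tools above (the project key, JQL, and field values are placeholders; custom field IDs vary per JIRA instance):

    # Arguments for jira_search_issues; any valid JQL works in the jql field
    search_args = {
        "jql": 'project = PROJ AND status = "In Progress" ORDER BY created DESC',
        "max_results": 20,
    }

    # Arguments for jira_create_issue; only the first four fields are required
    create_args = {
        "project_key": "PROJ",
        "summary": "Investigate slow search responses",
        "description": "Search requests intermittently exceed 5 seconds.",
        "issue_type_name": "Task",
        "labels": ["performance"],                              # optional
        "custom_fields": {"customfield_10011": "Q3 rollout"},   # optional
    }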

Architecture

The server uses a clean, tool-focused architecture (a sketch follows at the end of this section):

  • 8 MCP Tools: All essential JIRA operations as simple, focused functions
  • No Resources: Simplified design without MCP resources for easier maintenance
  • Async Client: High-performance AsyncJiraClient with connection pooling
  • Comprehensive Logging: Detailed logging for monitoring and debugging

This approach provides:

  • Simplicity: Easy to understand and maintain
  • Performance: Async operations with connection pooling
  • Reliability: Focused functionality with comprehensive error handling
  • Flexibility: All essential JIRA operations available through clean tool interfaces
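
A minimal sketch of what this tool-focused layout can look like, assuming the official MCP Python SDK's FastMCP helper (the tool body and the placeholder search function are illustrative, not the project's actual code):

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("jira")

    async def _search_jira(jql: str, max_results: int) -> list:
        # Placeholder for the real AsyncJiraClient call (see the aiohttp sketch above)
        return [{"key": "PROJ-1", "summary": f"Example result for: {jql}"}][:max_results]

    @mcp.tool()
    async def jira_search_issues(jql: str, max_results: int = 50) -> str:
        """Search for JIRA issues using JQL."""
        issues = await _search_jira(jql, max_results)
        return "\n".join(f"{i['key']}: {i['summary']}" for i in issues)

    if __name__ == "__main__":
        mcp.run(transport="stdio")

In this kind of layout, each of the eight tools would follow the same pattern: a thin async function registered with a decorator, delegating to a shared async client.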

Setup

Prerequisites

  • Python 3.13+
  • uv package manager
  • JIRA API token from your Atlassian account

Installation

  1. Clone this repository:

    git clone https://github.com/yourusername/mcp-jira.git
    cd mcp-jira
    
  2. Install dependencies:

    uv sync
    
  3. Create a .env file with your JIRA credentials:

    cp config.env.example .env
    
  4. Edit the .env file with your JIRA credentials:

    # JIRA Configuration
    JIRA_SERVER_URL=https://your-company.atlassian.net
    JIRA_API_TOKEN=your_api_token_here
    
    # Performance Configuration
    MAX_CONCURRENT_REQUESTS=2
    LOG_LEVEL=INFO
    
    # Timeouts (in seconds)
    REQUEST_TIMEOUT=30
    CONNECT_TIMEOUT=10
    

Running the Server

This is a stdio MCP server designed to be launched and managed by MCP clients such as Claude Desktop. For Claude Desktop:

  1. Add to Claude Desktop Configuration:

    {
      "mcpServers": {
        "jira": {
          "command": "python",
          "args": ["/path/to/your/jira_mcp_server.py"],
          "env": {
            "JIRA_SERVER_URL": "https://your-company.atlassian.net",
            "JIRA_API_TOKEN": "your_api_token_here"
          }
        }
      }
    }
    
  2. Restart Claude Desktop to load the new server configuration.
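
Outside Claude Desktop, you can also exercise the server from a small standalone script using the MCP Python SDK's stdio client helpers. This is a sketch for ad-hoc testing; the script path, credentials, and JQL are placeholders:

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main():
        # Launch the server as a subprocess and talk to it over stdio
        server = StdioServerParameters(
            command="python",
            args=["/path/to/your/jira_mcp_server.py"],
            env={
                "JIRA_SERVER_URL": "https://your-company.atlassian.net",
                "JIRA_API_TOKEN": "your_api_token_here",
            },
        )
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])  # should list the jira_* tools
                result = await session.call_tool(
                    "jira_search_issues",
                    {"jql": "project = PROJ ORDER BY created DESC", "max_results": 5},
                )
                print(result)

    asyncio.run(main())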

Environment Variables

The server uses the following environment variables with built-in defaults:

| Variable | Description | Default | Required |
| --- | --- | --- | --- |
| JIRA_SERVER_URL | Your JIRA instance URL | None | Required |
| JIRA_API_TOKEN | Your JIRA API token | None | Required |
| MAX_CONCURRENT_REQUESTS | Max concurrent requests & rate limit (req/sec) | 2 | Optional |
| REQUEST_TIMEOUT | HTTP request timeout (seconds) | 30 | Optional |
| CONNECT_TIMEOUT | HTTP connection timeout (seconds) | 10 | Optional |
| LOG_LEVEL | Logging level (DEBUG, INFO, WARNING, ERROR) | ERROR | Optional |
| LOG_TO_STDOUT | Enable stdout logging (interferes with MCP) | false | Optional |

Only JIRA_SERVER_URL and JIRA_API_TOKEN are required - all other settings have sensible defaults.
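
A sketch of how these variables and their defaults might be loaded at startup with python-dotenv (the variable names and defaults match the table; the loading code itself is illustrative):

    import os

    from dotenv import load_dotenv  # python-dotenv

    load_dotenv()  # reads .env if present; real environment variables still take precedence

    # Required settings: fail fast if they are missing
    JIRA_SERVER_URL = os.environ["JIRA_SERVER_URL"]
    JIRA_API_TOKEN = os.environ["JIRA_API_TOKEN"]

    # Optional settings, with the defaults from the table above
    MAX_CONCURRENT_REQUESTS = int(os.getenv("MAX_CONCURRENT_REQUESTS", "2"))
    REQUEST_TIMEOUT = float(os.getenv("REQUEST_TIMEOUT", "30"))
    CONNECT_TIMEOUT = float(os.getenv("CONNECT_TIMEOUT", "10"))
    LOG_LEVEL = os.getenv("LOG_LEVEL", "ERROR")
    LOG_TO_STDOUT = os.getenv("LOG_TO_STDOUT", "false").lower() == "true"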

Logging

The server includes comprehensive logging:

  • Console Output: Real-time status and errors
  • Log File: Detailed logs saved to jira_mcp_server.log
  • Configurable Levels: Set LOG_LEVEL in your .env file

Log levels:

  • DEBUG: Detailed debugging information
  • INFO: General operational messages
  • WARNING: Warning messages and rate limiting notices
  • ERROR: Error conditions
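
A minimal logging setup consistent with this behavior might look like the following sketch: the log file and console (stderr) handlers are always on, while a stdout handler is added only when LOG_TO_STDOUT is enabled, since stdout carries the MCP stdio protocol:

    import logging
    import os
    import sys

    handlers = [
        logging.FileHandler("jira_mcp_server.log"),  # detailed log file
        logging.StreamHandler(sys.stderr),           # console output; stays off the MCP stdout channel
    ]
    if os.getenv("LOG_TO_STDOUT", "false").lower() == "true":
        handlers.append(logging.StreamHandler(sys.stdout))  # optional; can interfere with MCP stdio

    logging.basicConfig(
        level=os.getenv("LOG_LEVEL", "ERROR"),  # default per the environment-variable table
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
        handlers=handlers,
    )

    logger = logging.getLogger("jira_mcp_server")
    logger.info("JIRA MCP server starting")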