
🚀 Atlassian AI Agent

Created by cnoe-io
Atlassian (Jira/Confluence) AI Agent powered by a first-party MCP server built with OpenAPI Codegen, LangGraph, and LangChain MCP Adapters. The agent is exposed over multiple agent transport protocols (AGNTCY ACP, Google A2A, MCP Server).



  • 🤖 Atlassian Agent is an LLM-powered agent built using the LangGraph ReAct Agent workflow and MCP tools.
  • 🌐 Protocol Support: Compatible with ACP and A2A protocols for integration with external user clients.
  • 🛡️ Secure by Design: Enforces Atlassian API token-based RBAC and supports external authentication for strong access control.
  • 🔌 Integrated Communication: Uses langchain-mcp-adapters to connect with the Atlassian MCP server within the LangGraph ReAct Agent workflow.
  • 🏭 First-Party MCP Server: The MCP server is generated by our first-party openapi-mcp-codegen utility, ensuring version/API compatibility and software supply chain integrity.
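To make the codegen idea above concrete, here is an illustrative, stdlib-only sketch (not the actual openapi-mcp-codegen output) of the core mapping: each OpenAPI operation becomes an MCP-style tool whose name, description, and input schema are derived from the spec. The operation fragment shown is hypothetical.

```python
def operation_to_tool(path: str, method: str, op: dict) -> dict:
    """Map one OpenAPI operation onto an MCP-style tool definition."""
    params = {
        p["name"]: {"type": p.get("schema", {}).get("type", "string")}
        for p in op.get("parameters", [])
    }
    return {
        "name": op.get("operationId", f"{method}_{path.strip('/').replace('/', '_')}"),
        "description": op.get("summary", ""),
        "inputSchema": {
            "type": "object",
            "properties": params,
            "required": [p["name"] for p in op.get("parameters", []) if p.get("required")],
        },
    }

# Hypothetical fragment of the Jira OpenAPI spec:
op = {
    "operationId": "getIssue",
    "summary": "Return a Jira issue by key",
    "parameters": [{"name": "issueIdOrKey", "required": True, "schema": {"type": "string"}}],
}
tool = operation_to_tool("/rest/api/3/issue/{issueIdOrKey}", "get", op)
print(tool["name"])  # getIssue
```

Because the tool schema is derived mechanically from the spec, regenerating the server against a new OpenAPI version keeps the tools and the API in lockstep.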

🚦 Getting Started

1️⃣ Configure Environment

Create or update a .env file in the project root:

########### LLM Configuration ###########
# Refer to: https://github.com/cnoe-io/cnoe-agent-utils#-usage
LLM_PROVIDER=

AGENT_NAME=atlassian

ATLASSIAN_TOKEN=
ATLASSIAN_EMAIL=
ATLASSIAN_API_URL=
ATLASSIAN_VERIFY_SSL=

Use the following link to get your own Atlassian API Token:

https://id.atlassian.com/manage-profile/security/api-tokens
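Atlassian Cloud REST APIs accept the API token via HTTP Basic authentication, with your account email as the username and the token as the password. A minimal sketch of building that header (the email and token values are placeholders; use the ones from your .env file):

```python
import base64

def basic_auth_header(email: str, api_token: str) -> dict:
    """Atlassian Cloud takes the API token as the Basic-auth password,
    with the account email as the username."""
    creds = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    return {"Authorization": f"Basic {creds}", "Accept": "application/json"}

# Hypothetical values for illustration:
headers = basic_auth_header("me@example.com", "my-api-token")
print(headers["Authorization"])
```

A quick way to verify the token is a GET to Jira Cloud's /rest/api/3/myself endpoint with this header, which returns the authenticated user.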

2️⃣ Start the Agent (A2A Mode)

Run the agent in a Docker container using your .env file:

docker run -p 0.0.0.0:8000:8000 -it \
   -v "$(pwd)/.env:/app/.env" \
   ghcr.io/cnoe-io/agent-atlassian:a2a-latest

3️⃣ Run the Client

Use the agent-chat-cli to interact with the agent:

uvx https://github.com/cnoe-io/agent-chat-cli.git a2a

🏗️ Architecture

flowchart TD
  subgraph Client Layer
    A[User Client ACP/A2A]
  end

  subgraph Agent Transport Layer
    B[AGNTCY ACP<br/>or<br/>Google A2A]
  end

  subgraph Agent Graph Layer
    C[LangGraph ReAct Agent]
  end

  subgraph Tools/MCP Layer
    D[LangGraph MCP Adapter]
    E[Atlassian MCP Server]
    F[Atlassian API Server]
  end

  A --> B --> C
  C --> D
  D -.-> C
  D --> E --> F --> E
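The graph above is a ReAct loop: the agent node proposes a tool call, the MCP adapter executes it against the Atlassian MCP server, and the observation is fed back until the model can answer. A toy, stdlib-only sketch of that control flow (the "model" and "tool" here are stand-ins, not the LangGraph or MCP APIs):

```python
# Toy ReAct loop mirroring the flowchart: agent -> adapter -> tool -> agent.
def fake_model(messages):
    """Stand-in LLM: call a tool once, then answer using the observation."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "get_issue", "args": {"key": "PROJ-1"}}}
    obs = next(m["content"] for m in messages if m["role"] == "tool")
    return {"answer": f"Issue summary: {obs}"}

# Stand-in for a tool exposed by the Atlassian MCP server:
TOOLS = {"get_issue": lambda key: f"{key}: Fix login bug"}

def react_loop(question: str, max_steps: int = 5) -> str:
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):
        step = fake_model(messages)
        if "answer" in step:
            return step["answer"]
        call = step["tool_call"]
        result = TOOLS[call["name"]](**call["args"])          # adapter invokes tool
        messages.append({"role": "tool", "content": result})  # observation fed back
    raise RuntimeError("no answer within step budget")

print(react_loop("summarize PROJ-1"))  # Issue summary: PROJ-1: Fix login bug
```

In the real agent, create_react_agent plays the role of this loop and the MCP adapter supplies the tool implementations.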

✨ Features

  • 🤖 LangGraph + LangChain MCP Adapter for agent orchestration
  • 🧠 Pluggable LLM backend (e.g., Azure OpenAI GPT-4o), selected via LLM_PROVIDER
  • 🔗 Connects to Atlassian via a dedicated Atlassian MCP agent
  • 🔄 Multi-protocol support: Compatible with both ACP and A2A protocols for flexible integration and multi-agent orchestration

2️⃣ Start Workflow Server (ACP or A2A)

You can start the workflow server in either ACP or A2A mode:

  • ACP Mode:
    make run-acp
    
  • A2A Mode:
    make run-a2a
    

🧪 Usage

▶️ Test with Atlassian Server

🏃 Quick Start: Run Atlassian Locally with Minikube

If you don't have an existing Atlassian server, you can quickly spin one up using Minikube:

  1. Start Minikube:
minikube start
  2. Install Atlassian in the atlassian namespace:
kubectl create namespace atlassian
kubectl apply -n atlassian -f https://raw.githubusercontent.com/argoproj/argo-cd/stable/manifests/install.yaml
  3. Expose the Atlassian API server:
kubectl port-forward svc/atlassian-server -n atlassian 8080:443

The API will be available at https://localhost:8080.

  4. Get the Atlassian admin password:
kubectl -n atlassian get secret atlassian-initial-admin-secret -o jsonpath="{.data.password}" | base64 -d && echo
  5. (Optional) Install the Atlassian CLI:
brew install atlassian
# or see https://argo-cd.readthedocs.io/en/stable/cli_installation/

For more details, see the official getting started guide.

1️⃣ Run the ACP Client

To interact with the agent in ACP mode:

make run-acp-client

Configure Environment Variables

Create or update a .env file in your project root with the following:

AGENT_ID="<YOUR_AGENT_ID>"
API_KEY="<YOUR_API_KEY>"
WFSM_PORT="<YOUR_ACP_SERVER_PORT>"

Example Interaction

> Your Question: how can you help?
Agent: I can assist you with managing applications in Atlassian, including tasks such as:
  • Listing Applications: Retrieve a list of applications with filtering options.
  • Getting Application Details: Fetch detailed information about a specific application.
  • Creating Applications: Create new applications in Atlassian.
  • Updating Applications: Update existing applications.
  • Deleting Applications: Remove applications from Atlassian.
  • Syncing Applications: Synchronize applications to a specific Git revision.
  • Getting User Info: Retrieve information about the current user.
  • Getting Atlassian Settings: Access server settings.
  • Getting Plugins: List available plugins.
  • Getting Version Information: Retrieve Atlassian API server version.

2️⃣ Run the A2A Client

To interact with the agent in A2A mode:

make run-a2a-client

Sample Streaming Output

When running in A2A mode, you’ll see streaming responses like:

============================================================
RUNNING STREAMING TEST
============================================================

--- Single Turn Streaming Request ---
--- Streaming Chunk ---
The current version of Atlassian is **v2.13.3+a25c8a0**. Here are some additional details:

- **Build Date:** 2025-01-03
- **Git Commit:** a25c8a0eef7830be0c2c9074c92dbea8ff23a962
- **Git Tree State:** clean
- **Go Version:** go1.23.1
- **Compiler:** gc
- **Platform:** linux/amd64
- **Kustomize Version:** v5.4.3
- **Helm Version:** v3.15.4+gfa9efb0
- **Kubectl Version:** v0.31.0
- **Jsonnet Version:** v0.20.0

🧬 Internals

  • 🛠️ Uses create_react_agent for tool-calling
  • 🔌 Tools loaded from the Atlassian MCP server (submodule)
  • ⚡ MCP server launched via uv run with stdio transport
  • 🕸️ Single-node LangGraph for inference and action routing
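The stdio transport mentioned above exchanges newline-delimited JSON-RPC 2.0 messages over the server subprocess's stdin/stdout. A small sketch of how a request such as tools/list is framed (illustrative only; MultiServerMCPClient handles this wire format for you):

```python
import json
from typing import Optional

def frame_request(req_id: int, method: str, params: Optional[dict] = None) -> bytes:
    """Frame a JSON-RPC 2.0 request for MCP's stdio transport:
    one JSON object per line, written to the server's stdin."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return (json.dumps(msg) + "\n").encode()

line = frame_request(1, "tools/list")
print(line.decode().strip())
```

The server replies on stdout with a JSON-RPC response carrying the tool definitions, which the adapter turns into LangChain tools.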

📁 Project Structure

agent_atlassian/
├── agent.py              # LLM + MCP client orchestration
├── langgraph.py          # LangGraph graph definition
├── __main__.py           # CLI entrypoint
├── state.py              # Pydantic state models
└── atlassian_mcp/           # Git submodule: Atlassian MCP server

client/
└── client_agent.py       # Agent ACP Client

🧩 MCP Submodule (Atlassian Tools)

This project uses a first-party MCP module generated from the Atlassian OpenAPI specification using our openapi-mcp-codegen utility. The generated MCP server is included as a git submodule in atlassian_mcp/.

All Atlassian-related LangChain tools are defined by this MCP server implementation, ensuring up-to-date API compatibility and supply chain integrity.


🔌 MCP Integration

The agent uses MultiServerMCPClient to communicate with MCP-compliant services.

Example (stdio transport):

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent

async with MultiServerMCPClient(
  {
    "atlassian": {
      "command": "uv",
      "args": ["run", "/abs/path/to/atlassian_mcp/server.py"],
      "env": {
        "ATLASSIAN_TOKEN": atlassian_token,
        "ATLASSIAN_API_URL": atlassian_api_url,
        "ATLASSIAN_VERIFY_SSL": "false"
      },
      "transport": "stdio",
    }
  }
) as client:
  agent = create_react_agent(model, client.get_tools())

Example (SSE transport):

async with MultiServerMCPClient(
  {
    "atlassian": {
      "transport": "sse",
      "url": "http://localhost:8000"
    }
  }
) as client:
  ...

Evals

Running Evals

This evaluation uses agentevals to perform strict trajectory match evaluation of the agent's behavior. To run the evaluation suite:

make evals

This will:

  • Set up and activate the Python virtual environment
  • Install evaluation dependencies (agentevals, tabulate, pytest)
  • Run strict trajectory matching tests against the agent
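Strict trajectory matching compares the exact sequence of graph nodes the agent visited against one or more reference trajectories; the test passes only on an exact match. A toy stand-in for the idea (not the agentevals API):

```python
def strict_trajectory_match(actual, references):
    """Pass only if the agent's node sequence exactly equals one reference trajectory."""
    return {"score": any(list(actual) == list(ref) for ref in references)}

# Mirrors the sample eval in this section: the graph runs __start__ -> agent_atlassian.
result = strict_trajectory_match(
    ["__start__", "agent_atlassian"],
    [["__start__", "agent_atlassian"]],
)
print(result)  # {'score': True}
```

Because the match is strict, any extra, missing, or reordered node in the trajectory fails the test, which makes it a good regression check on the graph's routing.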

Example Output

=======================================
 Setting up the Virtual Environment
=======================================
Virtual environment already exists.
=======================================
 Activating virtual environment
=======================================
To activate venv manually, run: source .venv/bin/activate
. .venv/bin/activate
Running Agent Strict Trajectory Matching evals...
Installing agentevals with Poetry...
. .venv/bin/activate && uv add agentevals tabulate pytest
...
set -a && . .env && set +a && uv run evals/strict_match/test_strict_match.py
...
Test ID: atlassian_agent_1
Prompt: show atlassian version
Reference Trajectories: [['__start__', 'agent_atlassian']]
Note: Shows the version of the Atlassian Server Version.
...
Results:
{'score': True}
...

Evaluation Results

Latest Strict Match Eval Results


📜 License

Apache 2.0 (see LICENSE)


👥 Maintainers

See MAINTAINERS.md

  • Contributions welcome via PR or issue!

🙏 Acknowledgements
