Grafana MCP server


A Model Context Protocol (MCP) server for Grafana.

This provides access to your Grafana instance and the surrounding ecosystem.

Features

The following features are currently available in the MCP server. This list is for informational purposes only and does not represent a roadmap or commitment to future features.

Dashboards

  • Search for dashboards: Find dashboards by title or other metadata
  • Get dashboard by UID: Retrieve full dashboard details using its unique identifier
  • Update or create a dashboard: Modify existing dashboards or create new ones. Note: Use with caution due to context window limitations; see issue #101
  • Get panel queries and datasource info: Get the title, query string, and datasource information (including UID and type, if available) from every panel in a dashboard

Datasources

  • List and fetch datasource information: View all configured datasources and retrieve detailed information about each.
    • Supported datasource types: Prometheus, Loki.

Prometheus Querying

  • Query Prometheus: Execute PromQL queries (supports both instant and range metric queries) against Prometheus datasources.
  • Query Prometheus metadata: Retrieve metric metadata, metric names, label names, and label values from Prometheus datasources.

Loki Querying

  • Query Loki logs and metrics: Run both log queries and metric queries using LogQL against Loki datasources.
  • Query Loki metadata: Retrieve label names, label values, and stream statistics from Loki datasources.

Incidents

  • Search, create, update, and close incidents: Manage incidents in Grafana Incident, including searching, creating, updating, and resolving incidents.

Sift Investigations

  • Create Sift investigations: Start a new Sift investigation for analyzing logs or traces.
  • List Sift investigations: Retrieve a list of Sift investigations, with support for a limit parameter.
  • Get Sift investigation: Retrieve details of a specific Sift investigation by its UUID.
  • Get Sift analyses: Retrieve a specific analysis from a Sift investigation.
  • Find error patterns in logs: Detect elevated error patterns in Loki logs using Sift.
  • Find slow requests: Detect slow requests using Sift (Tempo).

Alerting

  • List and fetch alert rule information: View alert rules and their statuses (firing/normal/error/etc.) in Grafana.
  • List contact points: View configured notification contact points in Grafana.

Grafana OnCall

  • List and manage schedules: View and manage on-call schedules in Grafana OnCall.
  • Get shift details: Retrieve detailed information about specific on-call shifts.
  • Get current on-call users: See which users are currently on call for a schedule.
  • List teams and users: View all OnCall teams and users.

Admin

  • List teams: View all configured teams in Grafana.

The list of tools is configurable, so you can choose which tools you want to make available to the MCP client. This is useful if you don't use certain functionality or if you don't want to take up too much of the context window. To disable a category of tools, use the --disable-<category> flag when starting the server. For example, to disable the OnCall tools, use --disable-oncall.
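For instance, the flag goes after the other server arguments, whether you run the binary directly or use the Docker image in stdio mode as shown later in the Usage section:

mcp-grafana --disable-oncall
docker run --rm -i -e GRAFANA_URL=http://localhost:3000 -e GRAFANA_API_KEY=<your service account token> mcp/grafana -t stdio --disable-oncall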

Tools

| Tool | Category | Description |
| --- | --- | --- |
| list_teams | Admin | List all teams |
| search_dashboards | Search | Search for dashboards |
| get_dashboard_by_uid | Dashboard | Get a dashboard by uid |
| update_dashboard | Dashboard | Update or create a new dashboard |
| get_dashboard_panel_queries | Dashboard | Get panel title, queries, datasource UID and type from a dashboard |
| list_datasources | Datasources | List datasources |
| get_datasource_by_uid | Datasources | Get a datasource by uid |
| get_datasource_by_name | Datasources | Get a datasource by name |
| query_prometheus | Prometheus | Execute a query against a Prometheus datasource |
| list_prometheus_metric_metadata | Prometheus | List metric metadata |
| list_prometheus_metric_names | Prometheus | List available metric names |
| list_prometheus_label_names | Prometheus | List label names matching a selector |
| list_prometheus_label_values | Prometheus | List values for a specific label |
| list_incidents | Incident | List incidents in Grafana Incident |
| create_incident | Incident | Create an incident in Grafana Incident |
| add_activity_to_incident | Incident | Add an activity item to an incident in Grafana Incident |
| resolve_incident | Incident | Resolve an incident in Grafana Incident |
| query_loki_logs | Loki | Query and retrieve logs using LogQL (either log or metric queries) |
| list_loki_label_names | Loki | List all available label names in logs |
| list_loki_label_values | Loki | List values for a specific log label |
| query_loki_stats | Loki | Get statistics about log streams |
| list_alert_rules | Alerting | List alert rules |
| get_alert_rule_by_uid | Alerting | Get alert rule by UID |
| list_oncall_schedules | OnCall | List schedules from Grafana OnCall |
| get_oncall_shift | OnCall | Get details for a specific OnCall shift |
| get_current_oncall_users | OnCall | Get users currently on-call for a specific schedule |
| list_oncall_teams | OnCall | List teams from Grafana OnCall |
| list_oncall_users | OnCall | List users from Grafana OnCall |
| get_investigation | Sift | Retrieve an existing Sift investigation by its UUID |
| get_analysis | Sift | Retrieve a specific analysis from a Sift investigation |
| list_investigations | Sift | Retrieve a list of Sift investigations with an optional limit |
| find_error_pattern_logs | Sift | Find elevated error patterns in Loki logs |
| find_slow_requests | Sift | Find slow requests from the relevant Tempo datasources |

Usage

  1. Create a service account in Grafana with enough permissions to use the tools you want to use, generate a service account token, and copy it to the clipboard for use in the configuration file. Follow the Grafana documentation for details.

  2. You have several options to install mcp-grafana:

    • Docker image: Use the pre-built Docker image from Docker Hub.

      Important: The Docker image's entrypoint is configured to run the MCP server in SSE mode by default, but most users will want to use STDIO mode for direct integration with AI assistants like Claude Desktop:

      1. STDIO Mode: For stdio mode you must explicitly override the default with -t stdio and include the -i flag to keep stdin open:
      docker pull mcp/grafana
      docker run --rm -i -e GRAFANA_URL=http://localhost:3000 -e GRAFANA_API_KEY=<your service account token> mcp/grafana -t stdio
      
      2. SSE Mode: In this mode, the server runs as an HTTP server that clients connect to. You must expose port 8000 using the -p flag:
      docker pull mcp/grafana
      docker run --rm -p 8000:8000 -e GRAFANA_URL=http://localhost:3000 -e GRAFANA_API_KEY=<your service account token> mcp/grafana
      
    • Download binary: Download the latest release of mcp-grafana from the releases page and place it in your $PATH.

    • Build from source: If you have a Go toolchain installed, you can also build and install it from source, using the GOBIN environment variable to specify the directory where the binary should be installed. This directory should also be in your PATH.

      GOBIN="$HOME/go/bin" go install github.com/grafana/mcp-grafana/cmd/mcp-grafana@latest
      
  3. Add the server configuration to your client configuration file. For example, for Claude Desktop:

    If using the binary:

    {
      "mcpServers": {
        "grafana": {
          "command": "mcp-grafana",
          "args": [],
          "env": {
            "GRAFANA_URL": "http://localhost:3000",
            "GRAFANA_API_KEY": "<your service account token>"
          }
        }
      }
    }
    

Note: if you see Error: spawn mcp-grafana ENOENT in Claude Desktop, you need to specify the full path to the mcp-grafana binary in the command field.
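For example, if you installed the binary with go install as above, point command at the absolute path instead (the path below is a placeholder; use the output of which mcp-grafana on your machine):

{
  "mcpServers": {
    "grafana": {
      "command": "/absolute/path/to/mcp-grafana",
      "args": [],
      "env": {
        "GRAFANA_URL": "http://localhost:3000",
        "GRAFANA_API_KEY": "<your service account token>"
      }
    }
  }
}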

If using Docker:

{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "GRAFANA_URL",
        "-e",
        "GRAFANA_API_KEY",
        "mcp/grafana",
        "-t",
        "stdio"
      ],
      "env": {
        "GRAFANA_URL": "http://localhost:3000",
        "GRAFANA_API_KEY": "<your service account token>"
      }
    }
  }
}

Note: The -t stdio argument is essential here because it overrides the default SSE mode in the Docker image.

Using VSCode with remote MCP server

If you're using VSCode and running the MCP server in SSE mode (which is the default when using the Docker image without overriding the transport), make sure your .vscode/settings.json includes the following:

"mcp": {
  "servers": {
    "grafana": {
      "type": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}

Debug Mode

You can enable debug mode for the Grafana transport by adding the -debug flag to the command. This will provide detailed logging of HTTP requests and responses between the MCP server and the Grafana API, which can be helpful for troubleshooting.

To use debug mode with the Claude Desktop configuration, update your config as follows:

If using the binary:

{
  "mcpServers": {
    "grafana": {
      "command": "mcp-grafana",
      "args": ["-debug"],
      "env": {
        "GRAFANA_URL": "http://localhost:3000",
        "GRAFANA_API_KEY": "<your service account token>"
      }
    }
  }
}

If using Docker:

{
  "mcpServers": {
    "grafana": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "GRAFANA_URL",
        "-e",
        "GRAFANA_API_KEY",
        "mcp/grafana",
        "-t",
        "stdio",
        "-debug"
      ],
      "env": {
        "GRAFANA_URL": "http://localhost:3000",
        "GRAFANA_API_KEY": "<your service account token>"
      }
    }
  }
}

Note: As with the standard configuration, the -t stdio argument is required to override the default SSE mode in the Docker image.

Development

Contributions are welcome! Please open an issue or submit a pull request if you have any suggestions or improvements.

This project is written in Go. Install Go following the instructions for your platform.

To run the server locally in STDIO mode (which is the default for local development), use:

make run

To run the server locally in SSE mode, use:

go run ./cmd/mcp-grafana --transport sse

You can also run the server using the SSE transport inside a custom-built Docker image. Just like the published Docker image, this custom image's entrypoint defaults to SSE mode. To build the image, use:

make build-image

And to run the image in SSE mode (the default), use:

docker run -it --rm -p 8000:8000 mcp-grafana:latest

If you need to run it in STDIO mode instead, override the transport setting:

docker run -it --rm mcp-grafana:latest -t stdio

Testing

There are three types of tests available:

  1. Unit Tests (no external dependencies required):
make test-unit

You can also run unit tests with:

make test
  2. Integration Tests (requires Docker containers to be up and running):
make test-integration
  3. Cloud Tests (requires a cloud Grafana instance and credentials):
make test-cloud

Note: Cloud tests are automatically configured in CI. For local development, you'll need to set up your own Grafana Cloud instance and credentials.
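A minimal sketch of what that local setup might look like, assuming the cloud tests read the same GRAFANA_URL and GRAFANA_API_KEY environment variables used elsewhere in this document (check the Makefile for the exact variable names it expects):

export GRAFANA_URL=https://<your-stack>.grafana.net
export GRAFANA_API_KEY=<your service account token>
make test-cloud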

More comprehensive integration tests will require a Grafana instance to be running locally on port 3000; you can start one with Docker Compose:

docker-compose up -d

The integration tests can be run with:

make test-all

If you're adding more tools, please add integration tests for them. The existing tests should be a good starting point.

Linting

To lint the code, run:

make lint

This includes a custom linter that checks for unescaped commas in jsonschema struct tags. The commas in description fields must be escaped with \\, to prevent silent truncation. You can run just this linter with:

make lint-jsonschema

See the JSONSchema Linter documentation for more details.
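To make the rule concrete, here is a minimal sketch of an escaped tag. The struct and field names are hypothetical, not taken from this repository; follow the existing tags in the codebase for the exact escape convention the linter enforces.

// Hypothetical tool-parameter struct: the comma inside the description is
// escaped as \\, so the jsonschema tag is not silently truncated at the comma.
type SearchDashboardsParams struct {
	Query string `json:"query" jsonschema:"description=The search query\\, matched against dashboard titles"`
}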

License

This project is licensed under the Apache License, Version 2.0.
