Steampipe Model Context Protocol (MCP) Server

Enable AI assistants to explore and query your Steampipe data!

Unlock the power of AI-driven infrastructure analysis with Steampipe! This Model Context Protocol server seamlessly connects AI assistants like Claude to your cloud infrastructure data, enabling natural language exploration and analysis of your entire cloud estate.

Steampipe MCP bridges AI assistants and your infrastructure data, allowing natural language:

  • Queries across AWS, Azure, GCP and 100+ cloud services
  • Security and compliance analysis
  • Cost and resource optimization
  • Query development assistance

Works with both local Steampipe installations and Turbot Pipes workspaces, providing safe, read-only access to all your cloud and SaaS data.

Installation

Prerequisites

  • Node.js v16 or higher (includes npx)
  • For local use: Steampipe installed and running (steampipe service start)
  • For Turbot Pipes: A Turbot Pipes workspace and connection string
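
To confirm the prerequisites before configuring the server, a quick check from a terminal works well (a minimal sketch; version output will vary):

node --version        # should report v16 or higher
steampipe --version   # confirms the Steampipe CLI is installed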

Configuration

Add Steampipe MCP to your AI assistant's configuration file:

{
  "mcpServers": {
    "steampipe": {
      "command": "npx",
      "args": [
        "-y",
        "@turbot/steampipe-mcp"
      ]
    }
  }
}

By default, this connects to your local Steampipe installation at postgresql://steampipe@localhost:9193/steampipe. Make sure to run steampipe service start first.
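
For example, to start the service and confirm the database is reachable before pointing your assistant at it (a minimal sketch using standard Steampipe CLI commands):

steampipe service start      # starts the local database service on port 9193
steampipe service status     # shows the connection string and service state
steampipe query "select 1"   # verifies that queries succeed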

To connect to a Turbot Pipes workspace instead, add your connection string to the args:

{
  "mcpServers": {
    "steampipe": {
      "command": "npx",
      "args": [
        "-y",
        "@turbot/steampipe-mcp",
        "postgresql://my_name:my_pw@workspace-name.usea1.db.pipes.turbot.com:9193/abc123"
      ]
    }
  }
}

AI Assistant Setup

Assistant        Config File Location          Setup Guide
Claude Desktop   claude_desktop_config.json    Claude Desktop MCP Guide →
Cursor           ~/.cursor/mcp.json            Cursor MCP Guide →

Save the configuration file and restart your AI assistant for the changes to take effect.

Prompting Guide

First, run the best_practices prompt included in the MCP server to teach your LLM how best to work with Steampipe. Then, ask anything!

Explore your cloud infrastructure:

What AWS accounts can you see?

Simple, specific questions work well:

Show me all S3 buckets that were created in the last week

Generate infrastructure reports:

List my EC2 instances with their attached EBS volumes

Dive into security analysis:

Find any IAM users with access keys that haven't been rotated in the last 90 days

Get compliance insights:

Show me all EC2 instances that don't comply with our tagging standards

Explore potential risks:

Analyze my S3 buckets for security risks including public access, logging, and encryption
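
Behind the scenes, the assistant turns prompts like these into read-only PostgreSQL queries via the steampipe_query tool. A hedged sketch of what the S3 prompt above might translate to (column names follow the AWS plugin's aws_s3_bucket table and should be verified with steampipe_table_show):

-- Hypothetical query for the S3 security-risk prompt
select
  name,
  region,
  bucket_policy_is_public,
  block_public_acls,
  logging,
  server_side_encryption_configuration
from
  aws_s3_bucket
order by
  bucket_policy_is_public desc;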

Remember to:

  • Be specific about which cloud resources you want to analyze (EC2, S3, IAM, etc.)
  • Mention regions or accounts if you're interested in specific ones
  • Start with simple queries before adding complex conditions
  • Use natural language - the LLM will handle the SQL translation
  • Be bold and exploratory - the LLM can help you discover insights across your entire infrastructure!

Capabilities

Tools

  • steampipe_query

    • Query cloud and security logs with SQL.
    • For best performance, use CTEs instead of joins and limit the columns requested (see the example call after this list).
    • All queries are read-only and use PostgreSQL syntax.
    • Input: sql (string): The SQL query to execute using PostgreSQL syntax
  • steampipe_table_list

    • List all available Steampipe tables.
    • Optional input: schema (string): Filter tables by specific schema
    • Optional input: filter (string): Filter tables by ILIKE pattern (e.g. '%ec2%')
  • steampipe_table_show

    • Get detailed information about a specific table, including column definitions, data types, and descriptions.
    • Input: name (string): The name of the table to show details for (can be schema qualified e.g. 'aws_account' or 'aws.aws_account')
    • Optional input: schema (string): The schema containing the table
  • steampipe_plugin_list

    • List all Steampipe plugins installed on the system. Plugins provide access to different data sources like AWS, GCP, or Azure.
    • No input parameters required
  • steampipe_plugin_show

    • Get details for a specific Steampipe plugin installation, including version, memory limits, and configuration.
    • Input: name (string): Name of the plugin to show details for
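
As a concrete illustration, an MCP client invokes steampipe_query with a standard tools/call request. The sketch below assumes the JSON-RPC envelope defined by the MCP specification and uses an illustrative query:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "steampipe_query",
    "arguments": {
      "sql": "select name, region from aws_s3_bucket limit 5"
    }
  }
}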

Prompts

  • best_practices
    • Best practices for working with Steampipe data
    • Provides detailed guidance on:
      • Response style and formatting conventions
      • Using CTEs (WITH clauses) vs joins (see the sketch after this list)
      • SQL syntax and style conventions
      • Column selection and optimization
      • Schema exploration and understanding
      • Query structure and organization
      • Performance considerations and caching
      • Error handling and troubleshooting
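
To make the CTE guidance concrete, a query in that style pre-filters a table in a WITH clause and selects only the columns it needs, for example (a hedged sketch; table and column names come from the AWS plugin and should be confirmed with steampipe_table_show):

-- Hypothetical example of the CTE-first, limited-column style
with running_instances as (
  select instance_id, instance_type, region
  from aws_ec2_instance
  where instance_state = 'running'
)
select
  region,
  instance_type,
  count(*) as instance_count
from running_instances
group by region, instance_type
order by instance_count desc;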

Resources

  • status
    • Represents the current state of the Steampipe connection
    • Properties include:
      • connection_string: The current database connection string
      • status: The connection state (connected/disconnected)

This resource enables AI tools to check and verify the connection status to your Steampipe instance.
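
Based on the properties listed above, the resource content is shaped roughly like this (a sketch only; the exact field layout is defined by the server):

{
  "connection_string": "postgresql://steampipe@localhost:9193/steampipe",
  "status": "connected"
}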

Development

Clone and Setup

  1. Clone the repository and navigate to the directory:
git clone https://github.com/turbot/steampipe-mcp.git
cd steampipe-mcp
  2. Install dependencies:
npm install
  3. Build the project:
npm run build

Testing

To test your local development build with AI tools that support MCP, update your MCP configuration to use the local dist/index.js instead of the npm package. For example:

{
  "mcpServers": {
    "steampipe": {
      "command": "node",
      "args": [
        "/absolute/path/to/steampipe-mcp/dist/index.js",
        "postgresql://steampipe@localhost:9193/steampipe"
      ]
    }
  }
}

Or, use the MCP Inspector to validate the server implementation:

npx @modelcontextprotocol/inspector dist/index.js

Environment Variables

The following environment variables can be used to configure the MCP server:

  • STEAMPIPE_MCP_LOG_LEVEL: Control server logging verbosity (default: info)
  • STEAMPIPE_MCP_WORKSPACE_DATABASE: Override the default Steampipe connection string (default: postgresql://steampipe@localhost:9193/steampipe)
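
If your MCP client supports passing environment variables to servers (Claude Desktop's configuration file accepts an env block), they can be set alongside the server entry; a hedged sketch:

{
  "mcpServers": {
    "steampipe": {
      "command": "npx",
      "args": [
        "-y",
        "@turbot/steampipe-mcp"
      ],
      "env": {
        "STEAMPIPE_MCP_LOG_LEVEL": "debug",
        "STEAMPIPE_MCP_WORKSPACE_DATABASE": "postgresql://steampipe@localhost:9193/steampipe"
      }
    }
  }
}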

Open Source & Contributing

This repository is published under the Apache 2.0 license. Please see our code of conduct. We look forward to collaborating with you!

Steampipe is a product produced from this open source software, exclusively by Turbot HQ, Inc. It is distributed under our commercial terms. Others are allowed to make their own distribution of the software, but they cannot use any of the Turbot trademarks, cloud services, etc. You can learn more in our Open Source FAQ.

Get Involved

Join #steampipe on Slack →

Want to help but don't know where to start? Pick up one of the help wanted issues in the GitHub repository.
