
OpenStack MCP + Agent PoC

Created by Akrog (6 months ago)

An OpenStack MCP server PoC

This repository contains the code and instructions to run an OpenStack MCP server and a very basic agent companion program to test it.

The approach taken in this PoC is:

  • Use the OpenStack codegenerator project (openstack-code-generator) to create OpenAPI Specs for OpenStack.

  • Use the mcp-openapi project to start multiple MCP servers (one per OpenStack component) from a filtered version of their full OpenAPI specifications.

  • Use the agent code to interact with the MCP servers via an LLM.

You don't need all of this for a quick test: just follow the QuickStart guide. If you want to do something different, you'll need to go through the Running things guide.

QuickStart

We assume you have access to an OpenStack deployment and to an LLM endpoint, as both are required to run things.

You'll also need the uv package manager. If you don't have it, on Fedora you can install it with sudo dnf install uv.

We'll be using two terminals: one for the MCP server and another for the Agent.

Clone this repository

We'll clone this repository with just the minimum external projects necessary to run things.

$ git clone --shallow-submodules \
  --recurse-submodules=mcp-openapi \
  https://github.com/Akrog/mcp-openstack.git

$ cd mcp-openstack

Run the MCP server

First we need to configure the MCP server, and for that we'll edit the servers.yaml file and replace the <NOVA_PUBLIC_URL>, <CINDER_PUBLIC_URL>, and <GLANCE_PUBLIC_URL> placeholders with the values of our cluster.

You'll notice that only a subset of REST API paths is enabled, to avoid having too many tools, which could overflow our model's context window or induce hallucinations.
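As an illustration of what the edited file might look like, here is a sketch of a servers.yaml with the placeholders replaced and a path filter per service. The exact key names are defined by the mcp-openapi project and the servers.yaml shipped in this repository; the field names and URLs below are assumptions for illustration only:

```yaml
# Illustrative sketch only -- the real schema comes from the repository's
# servers.yaml; key names and URLs here are placeholders/assumptions.
servers:
  nova:
    spec: specs/nova.yaml
    base_url: https://overcloud.example.com:13774/v2.1   # was <NOVA_PUBLIC_URL>
    paths:                                               # only expose a few tools
      - /servers
      - /flavors
  cinder:
    spec: specs/cinder.yaml
    base_url: https://overcloud.example.com:13776/v3     # was <CINDER_PUBLIC_URL>
    paths:
      - /volumes
  glance:
    spec: specs/glance.yaml
    base_url: https://overcloud.example.com:13292        # was <GLANCE_PUBLIC_URL>
    paths:
      - /v2/images
```

Keeping the path lists short is what keeps the tool count (and therefore the prompt size) manageable.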

Now we can run the server:

$ cd mcp-openapi

$ uv run main.py --config ../servers.yaml

We'll leave this service running in this terminal and switch to a second one.

Run the Agent

In another terminal we'll go into the agent directory:

$ cd agent

OpenStack client config

Now we'll make sure the configuration for the OpenStack client is available in one of the standard locations as described in the documentation:

  • agent/
  • ~/.config/openstack/

If we are running an OSP 18 cloud, we can get these files with:

$ oc cp openstackclient:/home/cloud-admin/.config/openstack/clouds.yaml ./clouds.yaml
$ oc cp openstackclient:/home/cloud-admin/.config/openstack/secure.yaml ./secure.yaml 
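If you are not on OSP 18 and need to write the client configuration by hand, a minimal clouds.yaml in the standard openstacksdk format looks like the sketch below. The cloud name, URLs, and credentials are placeholders to adapt to your deployment:

```yaml
# clouds.yaml -- placeholder values, adjust to your deployment
clouds:
  overcloud:
    auth:
      auth_url: https://keystone.example.com:5000/v3
      username: admin
      password: secret            # usually kept in secure.yaml instead
      project_name: admin
      user_domain_name: Default
      project_domain_name: Default
    region_name: regionOne
```

secure.yaml has the same structure and is merged over clouds.yaml, which lets you keep the password out of the main file.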

LLM config

We'll create an llm.token file containing the secret/token for the OpenAI-compatible LLM endpoint we want to use.
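A minimal way to create that file, assuming it holds just the API key on a single line (replace the placeholder with your real key):

```shell
# llm.token holds only the API key; <YOUR_API_TOKEN> is a placeholder.
printf '%s' '<YOUR_API_TOKEN>' > llm.token
chmod 600 llm.token   # keep the secret readable only by you
```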

Then edit the agent.json file, replacing the <LLM_URL> placeholder with our LLM's address and <MODEL> with the model we want to use. For example, for Anthropic we could have something like this:

{
  "base_url": "https://api.anthropic.com/v1",
  "model": "claude-3-5-haiku-20241022",
  < ... >
}

Run the agent

Now we just need to start the agent:

$ uv run main.py

Enjoy

We can now ask things about our deployment and see the REST API calls on the MCP server terminal.

For example, to get the Nova flavors we could do:

$ uv run main.py
LLM: claude-3-5-haiku-20241022 @ https://api.anthropic.com/v1 Tools: 46
>>> What are my flavors?

Be aware that the agent doesn't have memory, so each prompt starts a fresh conversation with the LLM.
