
Proxmox LangChain Agent

Created by johnstetter
langchain-based client for proxmox mcp server - co-pilot experiment

A lightweight FastAPI service that exposes a chat endpoint backed by LangChain. Responses come from an Ollama LLM, and conversation history is persisted in Redis. The service is designed to interact with a Proxmox MCP (Model Context Protocol) server, providing conversational access to Proxmox cluster and VM information.

Repository Structure

agent-proxmox/
├── Dockerfile
├── app.py
├── requirements.txt
└── README.md
  • Dockerfile: Builds a slim Python image with the application and dependencies.
  • app.py: FastAPI application defining a /chat endpoint that runs a LangChain agent with Proxmox-specific tools and Redis-backed history.
  • requirements.txt: Python dependencies needed to run the agent.

Prerequisites

  • Docker Engine (20.x+)
  • Redis instance accessible by the agent
  • Proxmox MCP server accessible by the agent (see below)
  • (Optional) Docker Compose if integrating into a larger stack

Proxmox MCP Server Integration

This agent is designed to work with a Proxmox MCP (Model Context Protocol) server. The MCP server exposes HTTP endpoints for cluster and VM information, which the agent accesses using custom LangChain tools:

  • get_vm_list: Calls the MCP endpoint /mcp/context/vms to retrieve a list of all VMs and their statuses.
  • get_cluster_info: Calls the MCP endpoint /mcp/context/cluster to retrieve high-level Proxmox cluster status info.

You must have a running MCP server accessible at the address configured in app.py (default: http://mcp:8008).
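The two tools above are essentially thin HTTP wrappers around the MCP context endpoints. A minimal standard-library sketch of that idea follows; the actual app.py wraps equivalent functions in LangChain Tool objects, and the `mcp_url`/`mcp_get` helper names here are ours, introduced only for illustration:

```python
import json
import urllib.request

MCP_BASE_URL = "http://mcp:8008"  # default MCP server address configured in app.py

def mcp_url(path: str) -> str:
    """Build the full URL for an MCP context endpoint."""
    return f"{MCP_BASE_URL}{path}"

def mcp_get(path: str) -> dict:
    """GET a JSON context document from the Proxmox MCP server."""
    with urllib.request.urlopen(mcp_url(path), timeout=10) as resp:
        return json.load(resp)

def get_vm_list() -> dict:
    """Tool backend: list all VMs and their statuses."""
    return mcp_get("/mcp/context/vms")

def get_cluster_info() -> dict:
    """Tool backend: high-level Proxmox cluster status."""
    return mcp_get("/mcp/context/cluster")
```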

Environment Variables

Variable            Default     Description
REDIS_HOST          localhost   Hostname or IP of your Redis server
REDIS_PORT          6379        Port number of your Redis server
REDIS_PASSWORD      (none)      Password for Redis (required if Redis is secured)
LANGCHAIN_LOG_DIR   ./logs      Directory path (inside the container) for logs

Make sure to set REDIS_PASSWORD if your Redis instance requires authentication.
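In app.py-style code, these variables would typically be read with defaults matching the table above. A sketch of the pattern, not the exact source:

```python
import os

# Defaults mirror the environment-variable table; override via the container environment.
REDIS_HOST = os.environ.get("REDIS_HOST", "localhost")
REDIS_PORT = int(os.environ.get("REDIS_PORT", "6379"))
REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD")  # None if Redis is unsecured
LANGCHAIN_LOG_DIR = os.environ.get("LANGCHAIN_LOG_DIR", "./logs")
```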

Building the Docker Image

From within the project directory:

docker build -t agent-proxmox .

Running the Container

A minimal docker run example:

docker run -d \
  --name agent-proxmox \
  -p 8501:8501 \
  -e REDIS_HOST=ai-redis \
  -e REDIS_PORT=6379 \
  -e REDIS_PASSWORD=$REDIS_PASSWORD \
  -e LANGCHAIN_LOG_DIR=/logs \
  -v $(pwd)/logs:/logs \
  agent-proxmox

This exposes the FastAPI app on port 8501 and mounts a local logs/ directory for persistent logging.

Integrating with Docker Compose

If you have a larger Docker Compose setup, add this service:

services:
  agent-proxmox:
    build: ./agent-proxmox
    container_name: agent-proxmox
    networks:
      - ai-stack
    depends_on:
      - ai-redis
      - mcp
    environment:
      - REDIS_HOST=ai-redis
      - REDIS_PORT=6379
      - REDIS_PASSWORD=${REDIS_PASSWORD}
      - LANGCHAIN_LOG_DIR=${LANGCHAIN_LOG_DIR}
    volumes:
      - ./agent-proxmox/logs:${LANGCHAIN_LOG_DIR}
    ports:
      - "8501:8501"

API Usage

Send a POST request to /chat with a JSON body { "message": "Your question here" }:

curl -X POST http://localhost:8501/chat \
  -H "Content-Type: application/json" \
  -d '{"message":"Show me all VMs in the cluster"}'

Response:

{ "response": "<LLM reply>" }
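Equivalently, the endpoint can be called from Python. A small stdlib-only client sketch (the helper names are ours, not part of the service):

```python
import json
import urllib.request

def build_chat_request(message: str, base_url: str = "http://localhost:8501") -> urllib.request.Request:
    """Build the POST /chat request with the JSON body the endpoint expects."""
    return urllib.request.Request(
        f"{base_url}/chat",
        data=json.dumps({"message": message}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def chat(message: str) -> str:
    """Send a message to the agent and return the LLM reply."""
    with urllib.request.urlopen(build_chat_request(message)) as resp:
        return json.load(resp)["response"]
```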

Logs

Conversation logs are written to langchain_YYYYMMDD.log in the LANGCHAIN_LOG_DIR directory. Adjust verbosity by modifying the logging.basicConfig settings in app.py.
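A plausible shape of that logging setup, writing one file per day to the configured directory (a sketch of the pattern, not the exact app.py code):

```python
import logging
import os
from datetime import datetime

log_dir = os.environ.get("LANGCHAIN_LOG_DIR", "./logs")
os.makedirs(log_dir, exist_ok=True)

# One log file per day: langchain_YYYYMMDD.log
log_file = os.path.join(log_dir, f"langchain_{datetime.now():%Y%m%d}.log")
logging.basicConfig(
    filename=log_file,
    level=logging.INFO,  # raise to DEBUG for more verbose agent traces
    format="%(asctime)s %(levelname)s %(message)s",
)
```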

CI/CD Pipeline (GitLab)

This project uses a GitLab CI/CD pipeline to automate validation, linting, building, and deployment of the Docker container to the GitLab Container Registry. The pipeline is defined in .gitlab-ci.yml and includes the following stages:

  • validate: Installs dependencies and checks Python syntax.
  • lint: Runs flake8 to enforce code style and quality.
  • deploy: Builds and pushes the Docker image to the registry (runs only on the main branch and on tags).

The pipeline uses GitLab's built-in CI/CD variables for authentication and registry information. No additional variables are required for standard operation.
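The pipeline described above might look roughly like this in .gitlab-ci.yml. This is a hedged sketch, not the project's actual file; image versions and script details may differ:

```yaml
stages:
  - validate
  - lint
  - deploy

validate:
  stage: validate
  image: python:3.11-slim
  script:
    - pip install -r requirements.txt
    - python -m py_compile app.py

lint:
  stage: lint
  image: python:3.11-slim
  script:
    - pip install flake8
    - flake8 app.py

deploy:
  stage: deploy
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG"
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_TAG
```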

FastAPI Application Overview

  • The main application is in app.py and uses FastAPI to provide a /chat endpoint.
  • The endpoint expects a POST request with a JSON body: { "message": "Your question here" }.
  • The backend uses LangChain's agent framework with two custom tools for Proxmox MCP integration.
  • Conversation history is stored in Redis using langchain_community.chat_message_histories.RedisChatMessageHistory.
  • The LLM is provided by an Ollama server (default: http://ollama:11434).
  • Logging is configured to write conversation logs to the directory specified by LANGCHAIN_LOG_DIR (default: ./logs).

Developer Notes

  • Code Style: The project enforces PEP8 compliance using flake8. Ensure your code passes flake8 app.py before committing.
  • Environment Variables: See the table above. You must set REDIS_PASSWORD if your Redis instance is secured.
  • Testing the API: Use the provided curl example or a tool like Postman to interact with the /chat endpoint.
  • Extending the Agent: To add new tools, define a new function and add it to the tools list in app.py using the Tool class from LangChain.
  • Pipeline Troubleshooting: If the Docker image fails to push, ensure you are pushing from the main branch or a tag, as the pipeline only deploys in those cases.
  • Logs: All conversations are logged with timestamps. Check the logs/ directory for daily log files.
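As a concrete illustration of the "Extending the Agent" note above, a new tool backend could look like the sketch below. The /mcp/context/nodes endpoint and the function name are hypothetical, and the Tool registration is shown as a comment because it depends on the existing tools list in app.py:

```python
import urllib.request

MCP_BASE_URL = "http://mcp:8008"

def get_node_status(_query: str = "") -> str:
    """Hypothetical new tool backend: fetch per-node status from the MCP server."""
    with urllib.request.urlopen(f"{MCP_BASE_URL}/mcp/context/nodes", timeout=10) as resp:
        return resp.read().decode("utf-8")

# In app.py, register the function with the agent via LangChain's Tool class:
#   tools.append(Tool(
#       name="get_node_status",
#       func=get_node_status,
#       description="Returns status information for each Proxmox node.",
#   ))
```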

Contributing

  1. Fork the repository and create a feature branch.
  2. Ensure your code passes linting and validation locally:
    pip install -r requirements.txt
    flake8 app.py
    python -m py_compile app.py
    
  3. Push your branch and create a merge request.

For questions or issues, please open an issue in the GitLab repository.

Happy chaining!
