MCP Chat Adapter

Created by aiamblichus

MCP server for using OpenAI-compatible chat endpoints

An MCP (Model Context Protocol) server that gives LLMs a clean interface to chat completion capabilities. It acts as a bridge between an LLM client and any OpenAI-compatible API. Only chat models are supported; the server does not provide text completions.

Overview

MCP Chat Adapter implements the Model Context Protocol (MCP), allowing language models to interact with any OpenAI-compatible chat completion API in a standardized way. It enables seamless conversations between users and language models while handling the complexities of API interactions, conversation management, and state persistence.

Features

  • Built with FastMCP for robust and clean implementation
  • Provides tools for conversation management and chat completion
  • Proper error handling and timeouts
  • Supports conversation persistence with local storage
  • Easy setup with minimal configuration
  • Configurable model parameters and defaults
  • Compatible with OpenAI and OpenAI-compatible APIs

Typical Workflow

The idea is that you can have Claude spin off and maintain multiple conversations with other models in the background. All conversations are stored in the CONVERSATION_DIR directory, which you should set in the env section of your mcp.json file.

You can tell Claude either to create a new conversation or to continue an existing one, identified by its conversation_id. You can pick up an old conversation even when starting fresh in a new context; in that case you may want to tell Claude to read the old conversation first using the get_conversation tool.

Note that you can also edit the conversations in the CONVERSATION_DIR directory manually. In this case, you may need to restart the server to see the changes.
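
To make the workflow concrete, here is a minimal sketch of file-per-conversation persistence of the kind described above. The on-disk layout (one JSON file per conversation_id under the conversation directory) and the function names are assumptions for illustration, not the package's actual code.

```python
import json
from pathlib import Path

def save_conversation(convo_dir: str, conversation_id: str, messages: list[dict]) -> Path:
    """Write one conversation as a JSON file named by its conversation_id.

    The file layout here is hypothetical; mcp-chat-adapter's real format
    may differ.
    """
    d = Path(convo_dir)
    d.mkdir(parents=True, exist_ok=True)
    path = d / f"{conversation_id}.json"
    path.write_text(json.dumps({"id": conversation_id, "messages": messages}, indent=2))
    return path

def load_conversation(convo_dir: str, conversation_id: str) -> dict:
    """Read a conversation back; manual edits to the file show up here."""
    return json.loads((Path(convo_dir) / f"{conversation_id}.json").read_text())
```

Because each conversation is just a file, hand-editing it and restarting the server is enough for the changes to take effect.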

Configuration

Required Environment Variables

These environment variables must be set for the server to function:

OPENAI_API_KEY=your-api-key  # Your API key for OpenAI or compatible service
OPENAI_API_BASE=https://openrouter.ai/api/v1 # The base URL for the API (can be changed for compatible services)

You should also set the CONVERSATION_DIR environment variable to the directory where you want to store the conversation data. Use an absolute path.

Optional Environment Variables

The following environment variables are optional and have default values:

# Model Configuration
DEFAULT_MODEL=google/gemini-2.0-flash-001 # Default model to use if not specified
DEFAULT_SYSTEM_PROMPT="You are a helpful assistant."  # Default system prompt
DEFAULT_MAX_TOKENS=50000 # Default maximum tokens for completion
DEFAULT_TEMPERATURE=0.7  # Default temperature setting
DEFAULT_TOP_P=1.0 # Default top_p setting
DEFAULT_FREQUENCY_PENALTY=0.0 # Default frequency penalty
DEFAULT_PRESENCE_PENALTY=0.0  # Default presence penalty

# Storage Configuration
CONVERSATION_DIR=./convos # Directory to store conversation data
MAX_CONVERSATIONS=1000 # Maximum number of conversations to store
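
As an illustration of how such defaults are typically resolved, the sketch below reads the variables above from the environment, falling back to the documented defaults. The variable names match this README, but the parsing code itself is an assumption, not the package's implementation.

```python
import os

def env_defaults() -> dict:
    """Resolve model parameters from the environment, using the
    documented fallbacks when a variable is unset."""
    return {
        "model": os.environ.get("DEFAULT_MODEL", "google/gemini-2.0-flash-001"),
        "max_tokens": int(os.environ.get("DEFAULT_MAX_TOKENS", "50000")),
        "temperature": float(os.environ.get("DEFAULT_TEMPERATURE", "0.7")),
        "top_p": float(os.environ.get("DEFAULT_TOP_P", "1.0")),
        "frequency_penalty": float(os.environ.get("DEFAULT_FREQUENCY_PENALTY", "0.0")),
        "presence_penalty": float(os.environ.get("DEFAULT_PRESENCE_PENALTY", "0.0")),
    }
```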

Integrating with Claude Desktop and other MCP clients

Your mcp.json file should look something like this:

{
  "mcpServers": {
    "chat-adapter": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-chat-adapter"
      ],
      "env": {
          "CONVERSATION_DIR": "/Users/aiamblichus/mcp-convos",
          "OPENAI_API_KEY": "xoxoxo",
          "OPENAI_API_BASE": "https://openrouter.ai/api/v1",
          "DEFAULT_MODEL": "qwen/qwq-32b"
      }
    }
  }
}

The latest version of the package is published to npm as mcp-chat-adapter.

Available Tools

1. Create Conversation

Creates a new chat conversation.

{
  "name": "create_conversation",
  "arguments": {
    "model": "gpt-4",
    "system_prompt": "You are a helpful assistant.",
    "parameters": {
      "temperature": 0.7,
      "max_tokens": 1000
    },
    "metadata": {
      "title": "My conversation",
      "tags": ["important", "work"]
    }
  }
}

2. Chat

Adds a message to a conversation and gets a response.

{
  "name": "chat",
  "arguments": {
    "conversation_id": "123",
    "message": "Hello, how are you?",
    "parameters": {
      "temperature": 0.8
    }
  }
}

3. List Conversations

Gets a list of available conversations.

{
  "name": "list_conversations",
  "arguments": {
    "filter": {
      "tags": ["important"]
    },
    "limit": 10,
    "offset": 0
  }
}

4. Get Conversation

Gets the full content of a conversation.

{
  "name": "get_conversation",
  "arguments": {
    "conversation_id": "123"
  }
}

5. Delete Conversation

Deletes a conversation.

{
  "name": "delete_conversation",
  "arguments": {
    "conversation_id": "123"
  }
}

Development

Installation

# Clone the repository
git clone https://github.com/aiamblichus/mcp-chat-adapter.git
cd mcp-chat-adapter

# Install dependencies
yarn install

# Build the project
yarn build

Running the server

To run the server with the FastMCP CLI:

yarn cli

To run the FastMCP inspector:

yarn inspect

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT
