Quickstart Resources

Created by tomoro-ai

Extends Anthropic's MCP tutorial to support OpenAI and custom GPT connections through an HTTP bridge.

A repository of servers and clients from the Model Context Protocol tutorials:

Extended Features

This repository has been enhanced with additional implementations:

Integration Methods

Method               For Custom GPTs?   For CLI/Dev Use?
index.ts (Claude)    No                 Yes
index-openai.ts      No                 Yes
http-bridge.ts       Yes                Yes
  • Claude Integration (index.ts): Original CLI client using Anthropic's Claude
  • OpenAI Integration (index-openai.ts): Alternative CLI client using OpenAI GPT models (not for custom GPTs)
  • HTTP API Bridge (http-bridge.ts): REST API server for connecting custom GPTs via Actions (recommended for GPT Builder)

Custom GPT Integration

The HTTP API bridge is the only supported method for connecting custom GPTs in OpenAI's GPT Builder:

  • REST API server for custom GPT Actions
  • File: http-bridge.ts
  • Web-based integration via OpenAPI schema
  • Endpoints: /weather/forecast, /weather/alerts
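The two endpoints above should appear in the bridge's OpenAPI document. As a quick sanity check on a downloaded schema, a minimal sketch (the helper name is hypothetical, not part of the repository; it assumes a standard OpenAPI 3 top-level `paths` object):

```typescript
// Sketch: verify a parsed OpenAPI document exposes the bridge's two
// endpoints. `hasBridgePaths` is a hypothetical helper; it only checks
// for the path keys named in this README.
function hasBridgePaths(schema: { paths?: Record<string, unknown> }): boolean {
  const required = ["/weather/forecast", "/weather/alerts"];
  return required.every((p) => p in (schema.paths ?? {}));
}

// Example: a schema stub with both endpoints present.
const stub = { paths: { "/weather/forecast": {}, "/weather/alerts": {} } };
console.log(hasBridgePaths(stub)); // true
```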

How to Connect Your Custom GPT to Your MCP Server

Step 1: Start Your Servers
  1. Start the MCP Weather Server

    cd weather-server-typescript
    npm run build
    node build/index.js
    
  2. Start the HTTP Bridge

    cd mcp-client-typescript
    npm run build-bridge
    node build/http-bridge.js ../weather-server-typescript/build/index.js
    
Step 2: Expose Your Local Server to the Internet

OpenAI requires a public HTTPS endpoint for Actions. The easiest way is to use ngrok:

ngrok http 3000
  • This will give you a public HTTPS URL like https://randomstring.ngrok.io.
Step 3: Prepare Your OpenAPI Schema
  • Download or copy your OpenAPI schema from http://localhost:3000/openapi.json.
  • Update the "servers" section to use your ngrok URL:
    "servers": [
      { "url": "https://randomstring.ngrok.io" }
    ]
    
  • Add a privacy policy and terms of service to the "info" section:
    "info": {
      "title": "Weather API",
      "description": "Weather forecasts and alerts via MCP",
      "version": "1.0.0",
      "termsOfService": "https://www.example.com/terms",
      "x-privacy-policy-url": "https://www.example.com/privacy"
    }
    
  • You can use placeholder URLs or your own.
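The schema edits above can be scripted rather than done by hand. A minimal sketch, assuming standard OpenAPI 3 JSON (the function name and the placeholder terms/privacy URLs are illustrative, not part of the repository):

```typescript
// Sketch: rewrite the "servers" and "info" sections of an OpenAPI document
// before importing it into GPT Builder. `patchSchemaForActions` is a
// hypothetical helper; the terms/privacy URLs are placeholders.
type OpenApiDoc = {
  servers?: { url: string }[];
  info?: Record<string, unknown>;
  [key: string]: unknown;
};

function patchSchemaForActions(schema: OpenApiDoc, publicUrl: string): OpenApiDoc {
  return {
    ...schema,
    servers: [{ url: publicUrl }], // must be the public HTTPS (ngrok) URL
    info: {
      ...schema.info,
      termsOfService: "https://www.example.com/terms",          // placeholder
      "x-privacy-policy-url": "https://www.example.com/privacy", // placeholder
    },
  };
}
```

Fetch the schema from http://localhost:3000/openapi.json, run it through a helper like this with your ngrok URL, and paste the result into GPT Builder.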
Step 4: Configure Your Custom GPT in GPT Builder
  1. Go to ChatGPT GPT Builder
  2. Click the Actions tab.
  3. Import the schema from your ngrok URL (e.g., https://randomstring.ngrok.io/openapi.json) or paste the edited JSON.
  4. Set Authentication to "None".
  5. Save and publish your GPT.
Step 5: Test Your Custom GPT

Ask your GPT:

  • "What's the weather forecast for San Francisco?"
  • "Are there any weather alerts in California?"
  • "Get me the forecast for latitude 40.7, longitude -74.0"

Your GPT will call your HTTP bridge, which will call your MCP server and return real weather data!


Using the HTTP Bridge with Remote or Third-Party MCP Servers

By default, the HTTP bridge is designed to launch and connect to a local MCP server (e.g., your weather server). However, you can also use it to connect to any MCP server that is accessible from your machine, including:

  • A remote MCP server running on another machine or in the cloud
  • A third-party MCP-compatible service
  • A Dockerized MCP server with a mapped port

How to Connect the HTTP Bridge to a Remote MCP Server:

  1. Ensure the remote MCP server is running and accessible

    • The server must be reachable from the machine running the HTTP bridge (e.g., via public IP, VPN, or SSH tunnel).
    • The server must speak MCP over a stdio or TCP transport.
  2. Modify the HTTP bridge launch command
    Instead of launching a local script, you can point the bridge to a remote server by:

    • Using an SSH command to start the server remotely and pipe stdio over SSH
    • (Or, for advanced users) Modifying the bridge to use a TCP transport if the MCP server exposes a TCP socket

    Example: Using SSH to connect to a remote MCP server

    node build/http-bridge.js "ssh user@remotehost 'node /path/to/remote/build/index.js'"
    
    • This will use SSH to start the MCP server on the remote host and connect the stdio streams to your bridge.
  3. (Optional) Use TCP Transport for Direct Network Connections

    • If your MCP server supports TCP, you can modify the bridge to use TcpClientTransport instead of StdioClientTransport and provide the remote host/port.

    Example (pseudo-code — the TCP transport import shown is illustrative; check the SDK for the network transports it actually ships):

    import { TcpClientTransport } from '@modelcontextprotocol/sdk/client/tcp.js';
    // ...
    this.transport = new TcpClientTransport({ host: 'remotehost', port: 12345 });
    
  4. Continue with the rest of the setup

    • Expose your HTTP bridge with ngrok or deploy it to a public server as before.
    • Your custom GPT will now be able to access any MCP server you can reach!
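The SSH approach in step 2 boils down to spawning `ssh` as the child process whose stdio carries the MCP protocol. A minimal sketch of the command construction (the helper name, host, and remote path are illustrative; the resulting command/args pair is what a StdioClientTransport-style spawner would receive instead of a local `node build/index.js`):

```typescript
// Hypothetical helper: build the command/args pair for launching a remote
// MCP server over SSH. "-T" disables pseudo-terminal allocation so stdio
// stays a clean byte stream for the MCP protocol.
function sshLaunch(host: string, remoteScript: string): { command: string; args: string[] } {
  return { command: "ssh", args: ["-T", host, "node", remoteScript] };
}

const launch = sshLaunch("user@remotehost", "/path/to/remote/build/index.js");
console.log(launch.args.join(" ")); // "-T user@remotehost node /path/to/remote/build/index.js"
```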

Tip:
This flexibility means you can use the HTTP bridge as a universal adapter for any MCP-compatible tool, whether it's running locally, in the cloud, or provided by a third party.


Troubleshooting & Tips

  • Public URL Required: OpenAI will not accept localhost or http:// URLs for public GPTs. Always use your ngrok HTTPS URL.
  • Privacy Policy: A valid privacy policy URL is required for public GPTs. Use a placeholder or generate one at privacypolicies.com.
  • Schema Import Issues: If importing from URL fails, copy the JSON and paste it manually.
  • ngrok Free Plan: The URL changes each time you restart ngrok. Update your schema and GPT config if you restart ngrok.
  • Production: For a permanent solution, deploy your bridge to a public HTTPS server and use your real domain.
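The first troubleshooting point can be checked mechanically before you touch GPT Builder. A small sketch (the function name is illustrative) that rejects URLs OpenAI's Actions will refuse for public GPTs:

```typescript
// Sketch: reject URLs that GPT Builder's Actions will not accept for
// public GPTs — anything that is not HTTPS or that points at localhost.
function isUsableActionUrl(raw: string): boolean {
  try {
    const u = new URL(raw);
    const localHosts = ["localhost", "127.0.0.1", "[::1]"];
    return u.protocol === "https:" && !localHosts.includes(u.hostname);
  } catch {
    return false; // not a parseable URL at all
  }
}

console.log(isUsableActionUrl("https://randomstring.ngrok.io")); // true
console.log(isUsableActionUrl("http://localhost:3000"));         // false
```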

Architecture Overview

Custom GPT ──Actions──> HTTP Bridge ──MCP──> Weather Server ──API──> National Weather Service

Quick Demo

  1. Start weather server: cd weather-server-typescript && npm run build && node build/index.js
  2. Start HTTP bridge: cd mcp-client-typescript && npm run build-bridge && node build/http-bridge.js ../weather-server-typescript/build/index.js
  3. Create custom GPT with Actions pointing to your ngrok URL (see above)
  4. Ask: "What's the weather in San Francisco?"

See mcp-client-typescript/README.md for detailed setup instructions.
