Transport for London MCP Server

⚠️ Important Disclaimer: This is not an official Transport for London (TfL) MCP server. This is an independent project that uses the publicly available TfL Unified API to provide transport data. It is not affiliated with, endorsed by, or officially supported by Transport for London.

A Model Context Protocol (MCP) server providing real-time Transport for London data, including line status, journey planning, and disruption information. This server is deployed on Cloudflare Workers and can be used with any MCP-compatible client like Claude Desktop.

Features

This MCP server provides the following tools:

🚇 get_line_status

Get the current status of a specific TfL transport line.

  • Parameters:
    • lineId: The line ID to query (e.g., 'central', 'piccadilly', 'northern', 'victoria')

🗺️ plan_journey

Plan journeys between two locations in London.

  • Parameters:
    • from: Starting location (postcode, address, or station name)
    • to: Destination location (postcode, address, or station name)
    • modes (optional): Transport modes to use (e.g., 'tube,bus,walking')
    • time (optional): Journey time in HH:MM format or 'now'
    • timeIs (optional): 'Departing' or 'Arriving'
    • date (optional): Journey date in YYYYMMDD format
    • walkingSpeed (optional): 'Slow', 'Average', or 'Fast'
    • cyclePreference (optional): Cycling preference options
    • optimize (optional): 'Time', 'LeastInterchange', or 'LeastWalking'
    • maxTransferMinutes (optional): Maximum transfer time in minutes
    • maxWalkingMinutes (optional): Maximum walking time in minutes
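
For illustration, the argument objects an MCP client sends for the two tools above might look like the following TypeScript sketch. The values are invented; per the parameter lists, only lineId, from, and to are required:

// Illustrative argument shapes for the tools above (values are made up).
const getLineStatusArgs = {
  lineId: "victoria",             // required line ID
};

const planJourneyArgs = {
  from: "King's Cross",           // required start location
  to: "Heathrow Airport",         // required destination
  modes: "tube,bus,walking",      // optional transport modes
  time: "now",                    // optional time, HH:MM or 'now'
  timeIs: "Departing",            // optional: 'Departing' or 'Arriving'
  optimize: "Time",               // optional: 'Time', 'LeastInterchange', or 'LeastWalking'
};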

Getting Started

Try It Now (No Deployment Required)

Want to test the server before deploying your own? You can use the public demo instance!

For Claude Desktop:

  1. Open Claude Desktop Settings > Developer > Edit Config
  2. Add this configuration:
{
  "mcpServers": {
    "tfl": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://london-transport-mcp.anoopt.workers.dev/mcp"
      ]
    }
  }
}
  3. Restart Claude Desktop
  4. Start asking questions about London transport!

For VS Code (GitHub Copilot):

  1. Open VS Code Settings > Extensions > GitHub Copilot > MCP Servers
  2. Add this configuration:
{
  "servers": {
    "tfl": {
      "type": "http",
      "url": "https://london-transport-mcp.anoopt.workers.dev/mcp"
    }
  }
}
  3. Start the MCP server
  4. Start using TfL tools in Copilot Chat!

Quick Deploy

Deploy to Cloudflare Workers

Use the Deploy to Cloudflare Workers button above to deploy this MCP server to your own Cloudflare Workers account with one click!

Manual Deployment

  1. Clone this repository
  2. Install dependencies:
    npm install
    
  3. Deploy to Cloudflare Workers:
    npm run deploy
    

Your MCP server will be deployed to a URL like: https://london-transport-mcp.<your-account>.workers.dev
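
To verify the deployment, you can connect with the MCP TypeScript SDK and list the available tools. This is a rough sketch (not part of the repository), assuming the SDK's Streamable HTTP client transport; replace the URL with your own workers.dev address:

// Minimal smoke test (sketch): list tools exposed by the deployed server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://london-transport-mcp.<your-account>.workers.dev/mcp")
);
const client = new Client({ name: "tfl-smoke-test", version: "1.0.0" });

await client.connect(transport);
console.log(await client.listTools()); // should include get_line_status and plan_journey
await client.close();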

Authentication (Optional)

This server supports optional TfL API key authentication. While the TfL API works without authentication, using an API key provides higher rate limits and better performance.

Without API Key: The server still works, but requests are subject to TfL's lower anonymous rate limits.

With API Key (Recommended):

  1. Get a free API key from the TfL API Portal (https://api-portal.tfl.gov.uk)
  2. Pass the API key in the X-API-Key header when connecting to the server
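
If you connect with the MCP TypeScript SDK rather than mcp-remote, the key can likely be attached through the Streamable HTTP transport's fetch options. A sketch, assuming the SDK's requestInit option:

// Sketch: send the TfL key as an X-API-Key header on every request
// (requestInit is assumed to be available on StreamableHTTPClientTransport).
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://london-transport-mcp.<your-account>.workers.dev/mcp"),
  { requestInit: { headers: { "X-API-Key": "YOUR_TFL_API_KEY" } } }
);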

Connect to Claude Desktop

To connect to your MCP server from Claude Desktop, update your Claude Desktop configuration:

  1. Go to Claude Desktop Settings > Developer > Edit Config
  2. Add one of the following configurations:

Using mcp-remote

With API Key:

{
  "mcpServers": {
    "tfl": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://london-transport-mcp.<your-account>.workers.dev/mcp",
        "--header",
        "X-API-Key: YOUR_TFL_API_KEY"
      ]
    }
  }
}

Without API Key:

{
  "mcpServers": {
    "tfl": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://london-transport-mcp.<your-account>.workers.dev/mcp"
      ]
    }
  }
}

Note: For local development, replace the URL with http://localhost:8787/mcp

Connect to Cloudflare AI Playground

You can also connect to your MCP server from the Cloudflare AI Playground:

  1. Go to https://playground.ai.cloudflare.com/
  2. Enter your deployed MCP server URL with the /mcp endpoint
  3. Add the X-API-Key header with your TfL API key (optional)
  4. Start using the TfL tools directly from the playground!

Local Development

For local development:

npm run dev

This will start the server locally at http://localhost:8787
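
You can then exercise a tool end to end against the local endpoint. The sketch below mirrors the deployment smoke test, pointed at localhost; the lineId value is just an example:

// Sketch: call get_line_status against the local dev server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(new URL("http://localhost:8787/mcp"));
const client = new Client({ name: "tfl-local-test", version: "1.0.0" });

await client.connect(transport);
const result = await client.callTool({
  name: "get_line_status",
  arguments: { lineId: "central" },
});
console.log(result.content);
await client.close();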

API Key (Optional)

This server supports optional TfL API key authentication for higher rate limits. The API key is not required but recommended for production use.

To use without an API Key: Simply omit the X-API-Key header when connecting. The TfL API will apply standard rate limits.

To use with an API Key:

  1. Visit the TfL API Portal (https://api-portal.tfl.gov.uk)
  2. Register for a free account
  3. Create a new application to get your API key
  4. Pass the key in the X-API-Key header when connecting

Examples

Once connected, you can ask Claude things like:

  • "Is the Northern line running ok?"
  • "What's the status of the Victoria line?"
  • "Plan a journey from King's Cross to Heathrow Airport"
  • "How do I get from Oxford Circus to Canary Wharf?"
  • "What's the fastest route from Wimbledon to Liverpool Street?"

Tech Stack

  • Runtime: Cloudflare Workers
  • Framework: Hono.js
  • MCP SDK: @modelcontextprotocol/sdk
  • Language: TypeScript
  • API: Transport for London Unified API
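
For orientation only, registering a tool such as get_line_status with this stack might look roughly like the sketch below. This is not the project's actual source: the Hono routing and Workers transport wiring are omitted, and the URL shown is the TfL Unified API's public Line Status endpoint.

// Sketch only - how a get_line_status tool might be registered with the MCP SDK.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "london-transport-mcp", version: "1.0.0" });

server.tool(
  "get_line_status",
  { lineId: z.string().describe("TfL line ID, e.g. 'central' or 'victoria'") },
  async ({ lineId }) => {
    // TfL Unified API line status endpoint; an app_key query parameter could be
    // appended when the client supplied an X-API-Key header.
    const res = await fetch(`https://api.tfl.gov.uk/Line/${lineId}/Status`);
    const data = await res.json();
    return { content: [{ type: "text", text: JSON.stringify(data, null, 2) }] };
  }
);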

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

This project is open source and available under the MIT License.
