# Crawleo MCP Server

Real-time web search and crawling capabilities for AI assistants through the Model Context Protocol (MCP).
## Overview

Crawleo MCP enables AI assistants to access live web data through two tools:

- **web.search** - Real-time web search with multiple output formats
- **web.crawl** - Deep content extraction from any URL
## Features

- ✅ **Real-time web search** from any country/language
- ✅ **Multiple output formats** - Enhanced HTML, Raw HTML, Markdown, Plain Text
- ✅ **Device-specific results** - Desktop, mobile, or tablet view
- ✅ **Deep content extraction** with JavaScript rendering
- ✅ **Zero data retention** - Complete privacy
- ✅ **Auto-crawling option** for search results
## Getting Your API Key

1. Visit crawleo.dev
2. Sign up for a free account
3. Navigate to your dashboard
4. Copy your API key (it starts with `sk_`)
## Setup Instructions

### 1. Claude Desktop

**Location of config file:**

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
**Configuration:**

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```
Replace `YOUR_API_KEY_HERE` with your actual API key from crawleo.dev.
**Steps:**
1. Open the config file in a text editor
2. Add the Crawleo MCP configuration
3. Save the file
4. Restart Claude Desktop completely (quit and reopen)
5. Start a new conversation and ask Claude to search the web!
**Example usage:**

```
"Search for the latest AI news and summarize the top 5 articles"
"Find Python web scraping tutorials and extract code examples"
```
### 2. Cursor IDE

**Location of config file:**

- macOS: `~/.cursor/config.json` or `~/Library/Application Support/Cursor/config.json`
- Windows: `%APPDATA%\Cursor\config.json`
- Linux: `~/.config/Cursor/config.json`
**Configuration:**

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```
**Steps:**
1. Locate and open your Cursor config file
2. Add the Crawleo MCP configuration
3. Save the file
4. Restart Cursor
5. The MCP tools will be available in your AI assistant
**Example usage in Cursor:**

```
"Search for React best practices and add them to my code comments"
"Find the latest documentation for this API endpoint"
```
### 3. Windsurf IDE

**Location of config file:**

- macOS: `~/Library/Application Support/Windsurf/config.json`
- Windows: `%APPDATA%\Windsurf\config.json`
- Linux: `~/.config/Windsurf/config.json`
**Configuration:**

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```
**Steps:**

1. Open the Windsurf config file
2. Add the Crawleo MCP server configuration
3. Save and restart Windsurf
4. Start using web search in your coding workflow
### 4. GitHub Copilot

**Location of config file:**

GitHub Copilot in VS Code and compatible editors reads MCP servers from a dedicated MCP config file (typically `.vscode/mcp.json` in your workspace; check your editor's MCP documentation for the exact location).

**Configuration:**

Create or edit your MCP config file and add:
```json
{
  "servers": {
    "Crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```
**Steps:**
1. Open your GitHub Copilot MCP configuration
2. Add the Crawleo server configuration
3. Save the file
4. Restart VS Code or your IDE
5. GitHub Copilot can now use Crawleo for web searches!
**Example usage:**

```
Ask Copilot: "Search for the latest Python best practices"
Ask Copilot: "Find documentation for this library"
```
### 5. OpenAI Platform (Direct Integration)

OpenAI's Responses API supports calling remote MCP servers directly. Here's how to use Crawleo with it:

**Python example:**
```python
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-5-nano",
    input=[
        {
            "role": "user",
            "content": [
                {
                    "type": "input_text",
                    "text": "search for latest news about openai models"
                }
            ]
        }
    ],
    text={
        "format": {"type": "text"},
        "verbosity": "medium"
    },
    reasoning={"effort": "medium"},
    tools=[
        {
            "type": "mcp",
            "server_label": "Crawleo",
            "server_url": "https://api.crawleo.dev/mcp",
            "server_description": "Crawleo MCP Server - Real-Time Web Knowledge for AI",
            "authorization": "YOUR_API_KEY_HERE",
            "allowed_tools": ["web.search", "web.crawl"],
            "require_approval": "always"
        }
    ],
    store=True,
    include=[
        "reasoning.encrypted_content",
        "web_search_call.action.sources"
    ]
)

print(response.output_text)
```
**Key parameters:**

- `server_url` - The Crawleo MCP endpoint
- `authorization` - Your Crawleo API key (passed as-is, without a `Bearer ` prefix)
- `allowed_tools` - Enable `web.search` and/or `web.crawl`
- `require_approval` - Set to `"always"`, `"never"`, or `"conditional"`
**Node.js example:**

```javascript
import OpenAI from 'openai';

const client = new OpenAI();

const response = await client.responses.create({
  model: 'gpt-5-nano',
  input: [
    {
      role: 'user',
      content: [
        {
          type: 'input_text',
          text: 'search for latest AI developments'
        }
      ]
    }
  ],
  tools: [
    {
      type: 'mcp',
      server_label: 'Crawleo',
      server_url: 'https://api.crawleo.dev/mcp',
      server_description: 'Crawleo MCP Server - Real-Time Web Knowledge for AI',
      authorization: 'YOUR_API_KEY_HERE',
      allowed_tools: ['web.search', 'web.crawl'],
      require_approval: 'always'
    }
  ]
});

console.log(response.output_text);
```
---
## Available Tools
### web.search
Search the web in real-time with customizable parameters.
**Parameters:**
- `query` *(required)* - Search term
- `max_pages` - Number of result pages (default: 1)
- `setLang` - Language code (e.g., "en", "ar")
- `cc` - Country code (e.g., "US", "EG")
- `device` - Device type: "desktop", "mobile", "tablet" (default: "desktop")
- `enhanced_html` - Get clean HTML (default: true)
- `raw_html` - Get raw HTML (default: false)
- `markdown` - Get Markdown format (default: true)
- `page_text` - Get plain text (default: false)
- `auto_crawling` - Auto-crawl result URLs (default: false)
**Example:**
```
Ask your AI: "Search for 'Python web scraping' and return results in Markdown"
```
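The MCP-client configs above handle the transport for you, but for debugging it can help to call the endpoint directly. Below is a minimal sketch, assuming a simple `{"method": ..., "params": ...}` JSON body POSTed to the MCP endpoint with a Bearer key — the actual MCP wire format may differ, so treat the request shape as illustrative only. Parameter names follow the list above.

```python
# Sketch of a direct web.search call over HTTP. The {"method": ...,
# "params": ...} body shape is an assumption, not the documented MCP
# wire format; parameter names follow the web.search parameter list.
import json
import urllib.request

API_KEY = "YOUR_API_KEY_HERE"  # from your crawleo.dev dashboard

def build_search_request(query, markdown=True, cc="US", device="desktop"):
    """Assemble the JSON body for a web.search call."""
    return {
        "method": "web.search",
        "params": {
            "query": query,
            "markdown": markdown,
            "cc": cc,
            "device": device,
        },
    }

def web_search(query, **params):
    """POST the request to the Crawleo MCP endpoint and return parsed JSON."""
    body = json.dumps(build_search_request(query, **params)).encode()
    req = urllib.request.Request(
        "https://api.crawleo.dev/mcp",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Anything not in the parameter list above (the helper names, the body shape) is an assumption for illustration.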
---
### web.crawl
Extract content from specific URLs.
**Parameters:**
- `urls` *(required)* - List of URLs to crawl
- `rawHtml` - Return raw HTML (default: false)
- `markdown` - Convert to Markdown (default: false)
- `screenshot` - Capture screenshot (optional)
- `country` - Geographic location
**Example:**
```
Ask your AI: "Crawl https://example.com and extract the main content in Markdown"
```
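A crawl request can be assembled the same way — a sketch under the same assumed `{"method": ..., "params": ...}` body shape, with parameter names taken from the list above (the wire format is illustrative, not authoritative):

```python
# Sketch: assemble a web.crawl request body. The {"method": ..., "params": ...}
# shape is an assumption; parameter names (urls, markdown, rawHtml, country)
# follow the web.crawl parameter list above.
def build_crawl_request(urls, markdown=False, raw_html=False, country=None):
    """Assemble the JSON body for a web.crawl call."""
    params = {"urls": urls, "markdown": markdown, "rawHtml": raw_html}
    if country is not None:
        params["country"] = country
    return {"method": "web.crawl", "params": params}
```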
---
## Troubleshooting
### MCP server not appearing
1. **Check config file location** - Make sure you're editing the correct file
2. **Verify JSON syntax** - Use a JSON validator to check for syntax errors
3. **Restart the application** - Completely quit and reopen (not just reload)
4. **Check API key** - Ensure your API key is valid and active at crawleo.dev
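For step 2, Python's standard library can serve as a quick JSON validator — a sketch using the macOS Claude Desktop path from the setup section (substitute your own config path):

```python
# Check that a config file parses as JSON and report where it fails.
# The path shown is the macOS Claude Desktop location from the setup
# section; substitute the path for your OS and application.
import json
from pathlib import Path

CONFIG = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

def validate_json(path):
    """Return 'valid JSON', 'file not found', or the error location."""
    try:
        json.loads(Path(path).read_text())
        return "valid JSON"
    except FileNotFoundError:
        return "file not found"
    except json.JSONDecodeError as e:
        return f"invalid JSON at line {e.lineno}, column {e.colno}"

print(validate_json(CONFIG))
```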
### Authentication errors
- Verify your API key is correct (should start with `sk_`)
- Make sure the key is wrapped in quotes
- Check that "Bearer " prefix is included in the Authorization header (for Claude/Cursor/Windsurf)
- For OpenAI Platform, use the key directly in the `authorization` field
- Confirm your account has available credits at crawleo.dev
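The checklist above can be sketched as a small helper that checks the header shape used by the Claude/Cursor/Windsurf configs (the `sk_` prefix follows the "Getting Your API Key" section; the helper itself is illustrative):

```python
# Sanity-check the Authorization header used in the MCP client configs:
# present as a string, "Bearer " prefix included, key starting with sk_.
def check_auth_header(headers):
    auth = headers.get("Authorization")
    if not isinstance(auth, str):
        return "Authorization header missing or not a string"
    if not auth.startswith("Bearer "):
        return "missing 'Bearer ' prefix"
    key = auth[len("Bearer "):]
    if not key.startswith("sk_"):
        return "key does not start with 'sk_'"
    return "ok"
```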
### No results returned
- Check your internet connection
- Verify the search query is not empty
- Try a simpler search query first
- Check API status at crawleo.dev
### Tool names not recognized
Make sure you're using the correct tool names:
- Use `web.search` (not `search_web`)
- Use `web.crawl` (not `crawl_web`)
---
## Usage Examples
### Research Assistant
```
"Search for recent developments in quantum computing and summarize the key findings"
```
### Content Analysis
```
"Search for competitor pricing pages and extract their pricing tiers"
```
### Code Documentation
```
"Find the official documentation for FastAPI and extract the quickstart guide"
```
### News Monitoring
```
"Search for today's news about artificial intelligence from US sources"
```
### Market Research

```
"Search for customer reviews of iPhone 15 and analyze sentiment"
```

---

## Pricing

Crawleo MCP uses the same pricing as our API:

- 10,000 searches → $20
- 100,000 searches → $100
- 250,000 searches → $200

Check your usage and manage your subscription at crawleo.dev.
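For comparison, the effective per-search cost at each tier works out as follows (simple arithmetic on the prices above):

```python
# Effective per-search cost, derived from the tier prices above.
tiers = {10_000: 20, 100_000: 100, 250_000: 200}
per_search = {n: price / n for n, price in tiers.items()}
# 10k tier: $0.002/search; 100k tier: $0.001/search; 250k tier: $0.0008/search
```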
---

## Privacy & Security

- ✅ **Zero data retention** - We never store your search queries or results
- ✅ **Secure authentication** - API keys are transmitted over HTTPS
- ✅ **No tracking** - Your usage patterns remain private
---

## Support

- Documentation: crawleo.dev/docs
- API Status: crawleo.dev/status
- Contact: support@crawleo.dev
---

## Links

- 🌐 Website: crawleo.dev
- 📚 Documentation: crawleo.dev/docs
- 🔑 Get API Key: crawleo.dev

---

*Built with ❤️ by Ahmed Ellaban*

*Empowering AI with real-time web knowledge.*