# MCP-Ollama Client

Lightweight MCP client that uses a local Ollama LLM to query multiple MCP servers defined in `config.json`.
Small command‑line chat client that:

- Runs entirely offline with a local LLM via Ollama
- Talks to any number of Model Context Protocol (MCP) servers, all declared in one `config.json`
At start‑up the client launches every server, fetches their tool schemas, prefixes the tool names (postgres.*, filesystem.*, …), and hands the merged list to the model. The LLM then decides which server to call for each user request.
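Under the hood, the aggregation step can look roughly like the sketch below, which uses the official `mcp` Python SDK to start one server over stdio and collect its tools under a `<server>.<tool>` prefix. The helper name `load_server` is illustrative, not the actual `client.py` API.

```python
# Sketch: start one MCP server subprocess over stdio and return its tools
# with a "<server>.<tool>" prefix in Ollama's function-calling format.
from contextlib import AsyncExitStack

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def load_server(name: str, command: str, args: list[str], stack: AsyncExitStack):
    """Launch one server and return (session, prefixed tool specs)."""
    params = StdioServerParameters(command=command, args=args)
    read, write = await stack.enter_async_context(stdio_client(params))
    session = await stack.enter_async_context(ClientSession(read, write))
    await session.initialize()

    tools = (await session.list_tools()).tools
    specs = [
        {
            "type": "function",
            "function": {
                "name": f"{name}.{tool.name}",      # collision-free, e.g. "postgres.list_schemas"
                "description": tool.description or "",
                "parameters": tool.inputSchema,      # JSON Schema published by the server
            },
        }
        for tool in tools
    ]
    return session, specs
```

Repeating this for every entry in `config.json` and concatenating the `specs` lists yields the merged tool list that is passed to the model.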
## Features
| Feature | Notes |
|---|---|
| Local LLM first | Default model is qwen3:14b, but any function‑calling model that Ollama exposes will work. No cloud keys required. |
| Multi‑server out‑of‑the‑box | Postgres, filesystem, or your own MCP servers can run side‑by‑side; all are defined in config.json. |
| Collision‑free tool names | Tools are exposed as <server>.<tool> so names never clash. |
## Requirements
| Component | Version tested |
|---|---|
| Python | ≥ 3.12 |
| Ollama | ≥ 0.8.0 |
| MCP server(s) | Anything that supports stdio transport |
## Quick start

```bash
# 1. clone
git clone https://github.com/Nagharjun17/MCP-Ollama-Client.git
cd mcp-ollama-client

# 2. set up environment
uv venv
source .venv/bin/activate
uv sync                      # installs dependencies from uv.lock

# 3. pull a local model
ollama pull qwen3:14b

# 4. edit the model name, DATABASE_URI, etc. in config.json

# 5. run
uv run client.py
```
Example boot log:

```text
🔌 Starting MCP server: postgres
🔌 Starting MCP server: filesystem
🛠️ Aggregated tools: ['postgres.list_schemas', 'filesystem.read_file', ...]
>>>
```

Type natural language queries; the model will decide when and how to call the exposed tools.
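A single turn of that loop can look roughly like the sketch below, assuming `tool_specs` and `sessions` were built as in the aggregation sketch above; the function and variable names are illustrative, not the real `client.py` internals.

```python
# Sketch: one chat turn. The model picks tools by their prefixed names and
# the client routes each call back to the MCP session that owns the tool.
import ollama


async def chat_turn(user_input, messages, tool_specs, sessions, model="qwen3:14b"):
    """Ask the model, execute any tool calls, then ask for a final answer."""
    llm = ollama.AsyncClient()
    messages.append({"role": "user", "content": user_input})
    response = await llm.chat(model=model, messages=messages, tools=tool_specs)
    messages.append(response.message)

    for call in response.message.tool_calls or []:
        # Undo the "<server>." prefix to find which session owns the tool.
        server, tool = call.function.name.split(".", 1)
        result = await sessions[server].call_tool(tool, arguments=dict(call.function.arguments))
        messages.append({"role": "tool", "name": call.function.name, "content": str(result.content)})

    if response.message.tool_calls:
        # Second pass lets the model turn raw tool output into a readable answer.
        response = await llm.chat(model=model, messages=messages)
    return response.message.content
```

In the real client the model name and sampling options would come from the `llm` block of `config.json` rather than being hard-coded.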
## config.json format

```json
{
  "llm": {
    "model": "qwen3:14b",
    "temperature": 0.7,
    "max_tokens": 2048
  },
  "mcpServers": {
    "postgres": {
      "command": "postgres-mcp",
      "args": ["--access-mode=restricted"],
      "env": {
        "DATABASE_URI": "postgresql://user:pass@localhost:5432/db"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/mnt/smbshare/"
      ]
    }
  }
}
```
- Keys under `mcpServers` become prefixes (`postgres.*`, `filesystem.*`), as sketched below.
- Each server starts as its own stdio subprocess.
- Add or remove servers without touching `client.py`.

Everything stays local, everything is configurable in one file.
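For illustration, each `mcpServers` entry maps naturally onto stdio launch parameters. This is a sketch under the config layout shown above, not the actual `client.py` code:

```python
# Sketch: read config.json and build stdio launch parameters for every
# declared server; the dict keys later become the tool-name prefixes.
import json
import os

from mcp import StdioServerParameters

with open("config.json") as f:
    config = json.load(f)

server_params = {
    name: StdioServerParameters(
        command=entry["command"],
        args=entry.get("args", []),
        # Per-server env vars (e.g. DATABASE_URI) merged over the inherited environment.
        env={**os.environ, **entry.get("env", {})} if entry.get("env") else None,
    )
    for name, entry in config["mcpServers"].items()
}
```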