# 🧠 Advanced MCP Server Setup with uv, llama-index, ollama, and Cursor IDE
## ✅ Prerequisites
- Python 3.10+ installed
- uv (by Astral) installed globally (`pip install uv`)
- Ollama installed and running locally
- Cursor IDE installed
## 🛠 Step 1: Project Setup

### 1.1 Create a New Project Directory

```bash
uv init mcp-server
cd mcp-server
```

### 1.2 Create and Activate Virtual Environment

```bash
uv venv
.venv\Scripts\activate     # On Windows
# OR
source .venv/bin/activate  # On Linux/Mac
```
## 🔐 Step 2: Environment Configuration

Create a `.env` file in the root of your project and add your Linkup API key:

```env
LINKUP_API_KEY=your_api_key_here
```
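To verify the key is picked up before wiring anything bigger, a minimal sanity check like the sketch below works. It assumes you also add python-dotenv (`uv add python-dotenv`) to load the `.env` file; that package is not part of Step 3's list.

```python
# check_env.py — sanity-check that the .env file is readable
# Assumes python-dotenv is installed: uv add python-dotenv
import os

from dotenv import load_dotenv

load_dotenv()  # loads LINKUP_API_KEY from .env in the project root
assert os.getenv("LINKUP_API_KEY"), "LINKUP_API_KEY is missing from .env"
print("LINKUP_API_KEY loaded")
```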
## 📦 Step 3: Install Required Dependencies

Run these commands one by one inside your virtual environment:

```bash
# Core MCP CLI and HTTP utilities (the quotes keep the brackets safe from your shell)
uv add "mcp[cli]" httpx

# Linkup SDK for orchestrating agents
uv add linkup-sdk

# LlamaIndex integrations
uv add llama-index
uv add llama-index-embeddings-huggingface
uv add llama-index-llms-ollama

# Optional: for using notebooks
uv add ipykernel
```
## 🧪 Step 4: Confirm Installation

After installation, check your uv-managed `pyproject.toml`. `uv add` records dependencies under the standard `[project]` table, so you should see something like this (your exact version pins will vary):

```toml
[project]
dependencies = [
    "httpx",
    "ipykernel",
    "linkup-sdk",
    "llama-index",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-ollama",
    "mcp[cli]",
]
```
## ⚙️ Step 5: Create a Minimal Server Entry Point

Create a `server.py` file inside the project root. A minimal FastMCP server with one placeholder tool is enough for Cursor to connect to:

```python
# server.py
from mcp.server.fastmcp import FastMCP

# The name given here is what Cursor will show for this server
mcp = FastMCP("weather")


@mcp.tool()
def ping() -> str:
    """Health-check tool so Cursor has something to call."""
    return "pong"


if __name__ == "__main__":
    mcp.run()
```

You can later extend this with your own tools or an agent orchestrator script.
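Since linkup-sdk is already installed, here is a hedged sketch of how a Linkup-backed search tool could be added to `server.py`. It assumes `LinkupClient` reads `LINKUP_API_KEY` from the environment and that the `search()` parameters shown match the SDK version you installed:

```python
# A possible Linkup-backed tool to merge into server.py (sketch, verify against your SDK version)
from linkup import LinkupClient

client = LinkupClient()  # picks up LINKUP_API_KEY from the environment


@mcp.tool()
def web_search(query: str) -> str:
    """Search the web via Linkup and return a sourced answer."""
    response = client.search(
        query=query,
        depth="standard",
        output_type="sourcedAnswer",
    )
    return str(response)
```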
## 🧠 Step 6: Run Ollama Locally

Make sure Ollama is installed and running:

```bash
ollama run llama3.2   # or any model you want
```

This starts the LLM backend at http://localhost:11434.
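To confirm the llama-index ↔ Ollama wiring from Python, a quick sketch like this should print a completion. The model name and timeout are assumptions; use whichever model you pulled:

```python
# ollama_check.py — confirm llama-index can talk to the local Ollama server
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama3.2", request_timeout=120.0)  # assumes llama3.2 is pulled
print(llm.complete("Say hello in five words."))
```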
## 🖥️ Step 7: Configure MCP Server in Cursor IDE

### 7.1 Open Cursor Settings

- Open Settings → go to the MCP section.
- Click on "Add New Global MCP Server".

### 7.2 Fill Out the Configuration

Replace the paths with your actual machine paths. You can get the full path to uv by running:

```bash
where uv   # Windows
which uv   # Linux/Mac
```
Now add this to your Cursor IDE settings. Note that JSON does not allow `//` comments, so keep the file comment-free; `"command"` must point at your own `uv` executable and `--directory` at your project folder:

```json
{
  "mcpServers": {
    "weather": {
      "command": "C:\\Users\\SIDHYA\\AppData\\Roaming\\Python\\Python311\\Scripts\\uv.exe",
      "args": [
        "--directory",
        "C:\\Users\\SIDHYA\\Development\\Ai\\mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```
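On macOS/Linux the same entry uses POSIX paths. The paths below are placeholders, not real defaults; substitute the output of `which uv` and your own project location:

```json
{
  "mcpServers": {
    "weather": {
      "command": "/usr/local/bin/uv",
      "args": [
        "--directory",
        "/home/you/Development/Ai/mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```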
## 🧪 Step 8: Test the Integration

- Open any `.py` file in Cursor.
- Use the MCP tools (usually accessible via ⌘K or Ctrl+K) to run the "weather" MCP server.
- You should see the server spin up using your `server.py`.
## 📘 Suggested Directory Structure

```
mcp-server/
├── .env
├── pyproject.toml
├── server.py
└── rag.py
```
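The `rag.py` in the tree above is where a local RAG pipeline would live. Here is a hedged sketch using only the packages installed in Step 3; the model names and the `data/` folder are assumptions, not part of the original setup:

```python
# rag.py — sketch of a fully local RAG pipeline (Ollama LLM + HuggingFace embeddings)
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama

# Route all LlamaIndex calls through local models
Settings.llm = Ollama(model="llama3.2", request_timeout=120.0)
Settings.embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

# Index whatever documents you drop into ./data
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

print(index.as_query_engine().query("What are these documents about?"))
```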
## 🔁 Keep Things Updated

To update dependencies:

```bash
uv pip install --upgrade llama-index
uv pip install --upgrade linkup-sdk
```
## ✍️ Author

👋 Hey, I'm Asutosh Sidhya

### 🌐 Connect with Me

- 📧 Email: sidhyaasutosh@gmail.com
- 🧑‍💻 GitHub: @asutosh7
- 💼 LinkedIn: linkedin.com/in/asutosh-sidhya
If you're building something around AI agents, local LLMs, or automated RAG pipelines, I'd love to connect or collaborate!