
Ollama Pydantic Project

This project demonstrates how to use a local Ollama model with the Pydantic agent framework to create an intelligent agent. The agent is connected to an MCP server to utilize tools and provides a user-friendly interface using Streamlit.

Overview

The main goal of this project is to showcase:

  • Local Ollama Model Integration: Using a locally hosted Ollama model for generating responses.
  • Pydantic Agent Framework: Creating an agent with Pydantic for data validation and interaction.
  • MCP Server Connection: Enabling the agent to use tools via an MCP server.
  • Streamlit UI: Providing a web-based chatbot interface for user interaction.
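
In code, these pieces come together roughly as in the sketch below. This is illustrative rather than copied from the repository: it assumes the agent is built with the pydantic-ai package and talks to Ollama's OpenAI-compatible endpoint; class names such as OpenAIModel and OpenAIProvider, the model name llama3.2, and the result attribute can differ between pydantic-ai releases.

    from pydantic_ai import Agent
    from pydantic_ai.models.openai import OpenAIModel
    from pydantic_ai.providers.openai import OpenAIProvider

    # Point the OpenAI-compatible client at the local Ollama server.
    # "llama3.2" is a placeholder; use whichever model you have pulled locally.
    model = OpenAIModel(
        "llama3.2",
        provider=OpenAIProvider(base_url="http://localhost:11434/v1"),
    )

    # A plain agent; the MCP tool connection is omitted here because its
    # wiring (pydantic_ai.mcp) differs between pydantic-ai releases.
    agent = Agent(model, system_prompt="You are a helpful assistant.")

    result = agent.run_sync("Hello, who are you?")
    print(result.output)  # older pydantic-ai releases expose this as result.data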

Prerequisites

Before setting up the project, ensure the following:

  1. Python: Install Python 3.8 or higher. You can download it from python.org.
  2. Ollama Model: Install and run the Ollama server locally:
    • Download the Ollama CLI from Ollama's official website.
    • Install the CLI by following the instructions provided on their website.
    • Start the Ollama server:
      ollama serve
      
    • Ensure the server's OpenAI-compatible API is reachable at http://localhost:11434/v1 (a quick check is shown after this list).
  3. MCP Server: Set up an MCP server to enable agent tools. For more details, refer to MCP Server Sample.
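
To confirm the Ollama server is really up before moving on (the check referenced in step 2), you can list the models its OpenAI-compatible API exposes. This is a generic standard-library sketch, not part of the repository; the /v1/models path comes from Ollama's OpenAI compatibility layer.

    import json
    import urllib.request

    # List the models Ollama serves through its OpenAI-compatible API.
    # This fails fast if the server is not running on localhost:11434.
    with urllib.request.urlopen("http://localhost:11434/v1/models", timeout=5) as resp:
        models = json.load(resp)

    for entry in models.get("data", []):
        print(entry.get("id"))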

Setup Instructions

Follow these steps to set up the project:

  1. Clone the Repository:

    git clone <repository-url>
    cd ollama-pydantic-project
    
  2. Create a Virtual Environment:

    python3 -m venv venv
    
  3. Activate the Virtual Environment:

    • On macOS/Linux:
      source venv/bin/activate
      
    • On Windows:
      venv\Scripts\activate
      
  4. Install Dependencies:

    pip install -r requirements.txt
    
  5. Ensure the Ollama Server is Running: Start the Ollama server as described in the prerequisites.

  6. Run the Application: Start the Streamlit application:

    streamlit run src/streamlit_app.py
    

Usage

Once the application is running, open the provided URL in your browser (usually http://localhost:8501). You can interact with the chatbot by typing your queries in the input box. The agent will process your queries using the Ollama model and tools provided by the MCP server.
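
Under the hood, src/streamlit_app.py drives this loop. The sketch below is the generic Streamlit chat pattern (session-state history plus chat_input/chat_message), not the repository's actual file, and ask_agent is a placeholder for the call into the project's Ollama agent.

    import streamlit as st


    def ask_agent(prompt: str) -> str:
        # Placeholder: the real app would invoke the Ollama-backed pydantic agent here.
        return f"(echo) {prompt}"


    st.title("Ollama Pydantic Agent")

    # Keep the conversation across Streamlit reruns.
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Replay the history so the chat appears continuous.
    for msg in st.session_state.messages:
        with st.chat_message(msg["role"]):
            st.markdown(msg["content"])

    if prompt := st.chat_input("Ask the agent something"):
        st.session_state.messages.append({"role": "user", "content": prompt})
        with st.chat_message("user"):
            st.markdown(prompt)

        answer = ask_agent(prompt)

        st.session_state.messages.append({"role": "assistant", "content": answer})
        with st.chat_message("assistant"):
            st.markdown(answer)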

Example Interaction

An example of how the chatbot interface looks when interacting with the agent is shown in assets/ollama_agent_mcp_example.png.

Project Structure

The project is organized as follows:

ollama-pydantic-project/
├── src/
│   ├── streamlit_app.py        # Main Streamlit application
│   ├── agents/
│   │   ├── base_agent.py       # Abstract base class for agents
│   │   ├── ollama_agent.py     # Implementation of the Ollama agent
│   ├── utils/
│       ├── config.py           # Configuration settings
│       ├── logger.py           # Logger utility
├── requirements.txt            # Python dependencies
├── README.md                   # Project documentation
├── assets/
│   ├── ollama_agent_mcp_example.png  # Example interaction image
├── .gitignore                  # Git ignore file
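
The agents/ package separates an abstract interface (base_agent.py) from the concrete Ollama implementation (ollama_agent.py). The sketch below shows what such a split typically looks like; the method name respond and the class bodies are assumptions, not the repository's actual code.

    from abc import ABC, abstractmethod


    class BaseAgent(ABC):
        """Abstract interface every concrete agent implements."""

        @abstractmethod
        def respond(self, prompt: str) -> str:
            """Return the agent's reply to a single user prompt."""


    class OllamaAgent(BaseAgent):
        """Concrete agent backed by the local Ollama model (simplified)."""

        def respond(self, prompt: str) -> str:
            # The real implementation would call the pydantic agent / Ollama model here.
            return f"Model reply to: {prompt}"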

Features

  • Streamlit Chatbot: A user-friendly chatbot interface.
  • Ollama Model Integration: Uses a local Ollama model for generating responses.
  • MCP Server Tools: Connects to an MCP server to enhance agent capabilities.
  • Pydantic Framework: Ensures data validation and type safety.
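
On that last point, Pydantic models give the agent's inputs and outputs a validated, typed shape. The ChatMessage model below is a hypothetical illustration (written against Pydantic v2), not a model taken from this repository.

    from pydantic import BaseModel, Field, ValidationError


    class ChatMessage(BaseModel):
        """A validated chat message exchanged with the agent (illustrative)."""
        role: str = Field(pattern="^(user|assistant|system)$")
        content: str = Field(min_length=1)


    ChatMessage(role="user", content="What's the weather like?")  # passes validation

    try:
        ChatMessage(role="robot", content="")  # wrong role and empty content
    except ValidationError as err:
        print(err)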

Troubleshooting

  • If you encounter issues with the Ollama server, ensure it is running and that its OpenAI-compatible API is reachable at http://localhost:11434/v1 (the /v1/models check in the Prerequisites section is a quick test).
  • If dependencies fail to install, ensure you are using Python 3.8 or higher and that your virtual environment is activated.
  • For MCP server-related issues, refer to the MCP Server Sample.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Contributing

Contributions are welcome! Feel free to open issues or submit pull requests.
