
Llama MCP Streamlit

This project is an interactive AI assistant built with Streamlit, the NVIDIA NIM API (Llama 3.3 70B) or Ollama, and the Model Context Protocol (MCP). It provides a conversational interface where you can chat with an LLM that executes external tools in real time via MCP, retrieves data, and performs actions seamlessly.

The assistant supports:

  • Custom model selection (NVIDIA NIM / Ollama)
  • API configuration for different backends
  • Tool integration via MCP to enhance usability and real-time data processing
  • A user-friendly chat-based experience with Streamlit

📸 Screenshots

  • Homepage
  • Tools
  • Chat
  • Chat ("What can you do?")

📁 Project Structure

llama_mcp_streamlit/
├── ui/
│   ├── sidebar.py       # UI components for the Streamlit sidebar
│   └── chat_ui.py       # Chat interface components
├── utils/
│   ├── agent.py         # Handles interaction with the LLM and tools
│   ├── mcp_client.py    # MCP client for connecting to external tools
│   └── mcp_server.py    # Configuration for MCP server selection
├── config.py            # Configuration settings
└── main.py              # Entry point for the Streamlit app
.env                      # Environment variables
Dockerfile                # Docker configuration
pyproject.toml            # Poetry dependency management
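
For orientation, here is a minimal sketch of the kind of chat loop main.py implements (the widget labels and session-state keys are illustrative assumptions, not the project's actual code):

import streamlit as st

st.set_page_config(page_title="Llama MCP Assistant")

# Sidebar: backend/model selection (labels are illustrative)
backend = st.sidebar.selectbox("Backend", ["NVIDIA NIM", "Ollama"])

# Chat history survives Streamlit reruns via session state
if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask something..."):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    # ... pass the prompt to the selected backend and MCP tools here ...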

🔧 Environment Variables

Before running the project, configure the .env file with the endpoint and API key for the backend you want to use. Set only one API_ENDPOINT/API_KEY pair and comment out the other:

# NVIDIA NIM backend
API_ENDPOINT=https://integrate.api.nvidia.com/v1
API_KEY=your_api_key_here

# Ollama backend (use instead of the above)
# API_ENDPOINT=http://localhost:11434/v1/
# API_KEY=ollama
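
For reference, a minimal sketch of how config.py might read these values (assumes python-dotenv; the project's actual loading code may differ):

import os

from dotenv import load_dotenv

load_dotenv()  # read key/value pairs from .env into the process environment

API_ENDPOINT = os.getenv("API_ENDPOINT", "http://localhost:11434/v1/")
API_KEY = os.getenv("API_KEY", "ollama")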

🚀 Running the Project

Using Poetry

  1. Install dependencies:
    poetry install
    
  2. Run the Streamlit app:
    poetry run streamlit run llama_mcp_streamlit/main.py
    

Using Docker

  1. Build the Docker image:
    docker build -t llama-mcp-assistant .
    
  2. Run the container:
    docker compose up
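
Note that docker compose up expects a compose file alongside the Dockerfile. If your checkout has only the Dockerfile, an equivalent docker run invocation (Streamlit listens on port 8501 by default) would be:

docker run --rm -p 8501:8501 --env-file .env llama-mcp-assistant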
    

🔄 Changing MCP Server Configuration

To change which MCP server is used, edit the utils/mcp_server.py file. The filesystem MCP server can be launched via either NPX or Docker:

NPX Server

from mcp import StdioServerParameters  # import needed for this snippet

server_params = StdioServerParameters(
    command="npx",
    args=[
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/username/Desktop",
        "/path/to/other/allowed/dir",
    ],
    env=None,
)

Docker Server

server_params = StdioServerParameters(
    command="docker",
    args=[
        "run",
        "-i",
        "--rm",
        "--mount", "type=bind,src=/Users/username/Desktop,dst=/projects/Desktop",
        "--mount", "type=bind,src=/path/to/other/allowed/dir,dst=/projects/other/allowed/dir,ro",
        "--mount", "type=bind,src=/path/to/file.txt,dst=/projects/path/to/file.txt",
        "mcp/filesystem",
        "/projects"
    ],
    env=None,
)

Modify the server_params configuration as needed to fit your setup.
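
For context, here is a minimal sketch of how mcp_client.py might consume server_params using the MCP Python SDK's stdio client (the helper name list_available_tools is illustrative; the project's actual wiring may differ):

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "/Users/username/Desktop"],
    env=None,
)

async def list_available_tools() -> None:
    # Spawn the MCP server as a subprocess and talk to it over stdio
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # discover the server's tools
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(list_available_tools())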


📌 Features

  • Real-time tool execution via MCP
  • LLM-powered chat interface
  • Streamlit UI with interactive chat elements
  • Support for multiple LLM backends (NVIDIA NIM & Ollama)
  • Docker support for easy deployment

🛠 Dependencies

  • Python 3.11+
  • Streamlit
  • OpenAI Python client (for the OpenAI-compatible NVIDIA NIM and Ollama endpoints; see the sketch below)
  • MCP (Model Context Protocol)
  • Poetry (for dependency management)
  • Docker (optional, for containerized deployment)
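
Both NVIDIA NIM and Ollama expose OpenAI-compatible endpoints, which is why the OpenAI client appears as a dependency. A minimal sketch of a chat call (the model id below is an assumption; check the model list of your NIM deployment or local Ollama):

from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # or http://localhost:11434/v1/ for Ollama
    api_key="your_api_key_here",                     # or "ollama" for Ollama
)

response = client.chat.completions.create(
    model="meta/llama-3.3-70b-instruct",  # assumed model id; adjust for your backend
    messages=[{"role": "user", "content": "What can you do?"}],
)
print(response.choices[0].message.content)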

📜 License

This project is licensed under the MIT License.


🤝 Contributing

Feel free to submit pull requests or report issues!


📬 Contact

For any questions, reach out via GitHub Issues.

