Installation and Setup Guide

This document provides step-by-step instructions for setting up the development environment and running the application: an LLM chat app for integration tests built with llama-stack-client, Llama models, Ollama, MCP, and tools.


Prerequisites

Before starting, ensure the following are installed on your system: a recent Python 3 interpreter, git, the uv package manager, and either Ollama (for local inference) or a Together.ai account with an API key (for hosted inference).

Installation Steps

1. Installing and Running Ollama (skip this step if you will use the Together.ai API)

Ollama is required to provide model inference capabilities.

  1. Download and install Ollama from https://ollama.com/
  2. Start the Ollama service:
    ollama serve
    
  3. Pull the model used by the application (a quick Python check is sketched right after this list):
    ollama pull llama3.2:3b
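
To confirm the service is up and the model was pulled, a short script like the one below can query Ollama's local HTTP API. This is a sanity-check sketch: it assumes Ollama's default address (http://localhost:11434) and its /api/tags endpoint for listing local models.

    # Sanity check: is the Ollama service reachable, and is llama3.2:3b pulled?
    # Assumes Ollama's default local endpoint; adjust the URL if yours differs.
    import json
    import urllib.request

    OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # lists locally pulled models
    MODEL = "llama3.2:3b"

    try:
        with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=5) as resp:
            tags = json.load(resp)
        names = [m["name"] for m in tags.get("models", [])]
        print("Ollama is running. Local models:", names)
        print(f"{MODEL} available:", any(n.startswith(MODEL) for n in names))
    except OSError as exc:
        print("Could not reach Ollama at localhost:11434:", exc)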
    

2. Setting Up Llama Stack

Llama Stack will be used to manage the inference environment.

  1. Install the uv package manager
  2. Set up a virtual environment (venv)
  3. Run one of the following commands inside the virtual environment (a quick verification sketch follows this list):
    # local Ollama backend
    INFERENCE_MODEL=llama3.2:3b llama stack build --template ollama --image-type venv --run
    
    # hosted Together.ai backend
    INFERENCE_MODEL=meta-llama/Llama-3.3-70B-Instruct llama stack build --template together --image-type venv --run
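
Once the stack is running, you can confirm it is serving by listing its registered models with the llama-stack-client library. This is a verification sketch: the base URL assumes the stack's default local port (8321 in recent llama-stack releases); use whatever address the `llama stack ... --run` output actually reports.

    # Sketch: confirm the llama-stack server is reachable by listing its models.
    # The port is an assumption; check the server's startup output for the real one.
    from llama_stack_client import LlamaStackClient

    client = LlamaStackClient(base_url="http://localhost:8321")

    for model in client.models.list():
        print(model.identifier)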
    

3. Project Setup

Clone this repository and install the necessary dependencies:

  1. Clone the repository:

    git clone https://github.com/ricardoborges/chatlab.git
    cd chatlab
    
  2. Create a virtual environment, activate it, and install the dependencies (an import smoke test is sketched after this list):

    uv venv
    source .venv/bin/activate   # on Windows: .venv\Scripts\activate
    uv pip install -r myproject.toml
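
After installing, a quick import check inside the activated environment can confirm the core libraries are present. The module names below are assumptions based on the tools this guide mentions; adjust them to match the project file.

    # Smoke test: confirm key dependencies are importable from the virtual env.
    # The module names are assumptions drawn from the tools this guide mentions.
    import importlib

    for module in ("gradio", "llama_stack_client", "dotenv"):
        try:
            importlib.import_module(module)
            print(f"{module}: OK")
        except ImportError as exc:
            print(f"{module}: missing ({exc})")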
    

4. Running the Application

If you will not run a local Ollama service, create a Together.ai account and get an API key (if you do not have one already).

How to get your API key: https://docs.google.com/document/d/1Vg998IjRW_uujAPnHdQ9jQWvtmkZFt74FldW2MblxPY/edit?tab=t.0

You will need these environment variables in your .env file:

    TAVILY_SEARCH_API_KEY=
    TOGETHER_API_KEY=

Or skip the .env setup and set DEFAULT_STACK="Ollama" in main.py (if you will run the local Ollama service).
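
For illustration only, the snippet below shows one common way such a switch is wired with python-dotenv. It is not the actual contents of main.py, and everything beyond the variable names listed above is an assumption.

    # Illustrative sketch only (not the project's real main.py): load the .env
    # file and fail fast if a hosted backend is selected without a key.
    import os
    from dotenv import load_dotenv  # python-dotenv

    load_dotenv()  # reads TAVILY_SEARCH_API_KEY / TOGETHER_API_KEY from .env

    DEFAULT_STACK = "Ollama"  # value mentioned in this guide; the alternative is defined in main.py

    if DEFAULT_STACK != "Ollama" and not os.getenv("TOGETHER_API_KEY"):
        raise RuntimeError("TOGETHER_API_KEY is required when not using local Ollama")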

Start the Gradio application with the following command:

gradio main.py

After running this command, the application interface will be available in your browser (by default, Gradio serves at http://localhost:7860).
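
If the UI loads but responses fail, you can test inference directly against the running stack with a short script, bypassing Gradio. This is a sketch, not part of the app: the port and model_id are assumptions (use the model listing from the earlier sketch to find the exact registered id), and the call follows the llama-stack-client inference API, which may differ between releases.

    # Sketch: send one chat message straight to the llama-stack server.
    # Port and model_id are assumptions; match them to your running stack.
    from llama_stack_client import LlamaStackClient

    client = LlamaStackClient(base_url="http://localhost:8321")

    response = client.inference.chat_completion(
        model_id="llama3.2:3b",  # check client.models.list() for the exact id
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(response.completion_message.content)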

Troubleshooting

If you encounter any issues during installation, check the following (a small connectivity check is sketched after this list):

  • That the Ollama service is running
  • That the virtual environment was activated correctly
  • That all dependencies were successfully installed
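
The first check can be automated with a small script like the one below, which also probes the llama-stack port; the dependency check is covered by the import smoke test in the setup section. The ports are assumptions (11434 is Ollama's default, 8321 a common llama-stack default); adjust them to your setup.

    # Connectivity check for the local services this guide relies on.
    # Ports are assumptions; adjust them to your Ollama / llama-stack setup.
    import socket

    def port_open(host: str, port: int) -> bool:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            return False

    print("Ollama reachable:      ", port_open("localhost", 11434))
    print("llama-stack reachable: ", port_open("localhost", 8321))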

Additional Resources

For more information about Llama Stack, refer to the official documentation.
