LLM Sentinel (NeMo + LangChain + Ollama + Gradio)

LLM Sentinel showcases NVIDIA's NeMo Guardrails and LangChain for improving LLM safety. It was built for the NVIDIA GenAI Contest (Generative AI Developer Contest with LangChain):

https://www.nvidia.com/en-us/ai-data-science/generative-ai/developer-contest-with-langchain/terms-and-conditions/

Required Libraries

  • nemoguardrails==0.9.0
  • langchain-community==0.0.38
  • ollama==0.2.1
  • gradio==4.36.1
  • python-dotenv==1.0.1

It has been tested on Python 3.9 and above on macOS and Ubuntu Linux.
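
To show how these libraries fit together, here is a minimal, hypothetical sketch of wrapping a LangChain Ollama chat model with NeMo Guardrails. The actual wiring lives in chatbot3.py; the ./config path, model choice, and example prompt below are assumptions:

    # Minimal sketch (not the repository's chatbot3.py): wrap a LangChain
    # Ollama chat model with NeMo Guardrails so prompts and answers pass
    # through the guardrails before reaching the user.
    from langchain_community.chat_models import ChatOllama
    from nemoguardrails import LLMRails, RailsConfig

    # Assumption: a guardrails config (config.yml plus rail definitions)
    # lives in ./config; adjust the path to the repository layout.
    config = RailsConfig.from_path("./config")

    # llama3:8b for development; the same code works with llama3:70b.
    llm = ChatOllama(model="llama3:8b", base_url="http://127.0.0.1:11434")
    rails = LLMRails(config, llm=llm)

    response = rails.generate(messages=[
        {"role": "user", "content": "How do I pick a strong password?"}
    ])
    print(response["content"])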

Step 1: Set Up Your Environment

  1. Hardware Requirements: Ensure you have access to NVIDIA GPUs, ideally an A100 with 80 GB of VRAM, to run the model (llama3:70b) efficiently. In my case I rented an A100 GPU from DigitalOcean Paperspace; please see the screenshot below. OS: Ubuntu 22.04. Disk size: at least 200 GB (llama3:70b takes about 40 GB, llama3:8b about 5 GB).

    (screenshot: create_a_new_machine)

  2. Clone the repository:

    cd ~
    git clone https://github.com/aidatatools/LLM_Sentinel.git
    cd LLM_Sentinel
  3. Create a Python virtual environment (venv):

    Ensure you have Python 3.10 or later installed.

    cd ~/LLM_Sentinel
    python3.10 -m venv venv
    source venv/bin/activate
  4. Install the dependencies from requirements.txt:

    pip install -r requirements.txt
  5. Check that the Ollama backend service is running and that the model llama3:8b (for development) or llama3:70b (for production) exists. If you are not familiar with Ollama, please visit https://ollama.com

    ollama list
    curl http://127.0.0.1:11434
  6. Copy .env.example to .env and set the variable ENV_PROD to True or False (a sketch of how this flag might be combined with the model check from step 5 follows this list):

    echo 'ENV_PROD=False' > .env
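
The following sketch ties steps 5 and 6 together: it reads ENV_PROD from .env and checks that the matching Ollama model is available locally. The mapping of ENV_PROD to model names is an assumption, not taken from chatbot3.py:

    # Sketch: pick the model from ENV_PROD and make sure Ollama has it.
    import os

    import ollama
    from dotenv import load_dotenv

    load_dotenv()  # reads .env from the current directory

    # Assumption: ENV_PROD=True selects the production model.
    use_prod = os.getenv("ENV_PROD", "False") == "True"
    model = "llama3:70b" if use_prod else "llama3:8b"

    installed = [m["name"] for m in ollama.list()["models"]]
    if not any(name.startswith(model) for name in installed):
        ollama.pull(model)  # llama3:70b needs roughly 40 GB of disk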

Step 2: Start the Web UI to Interact with Chatbot

  1. Start the WebUI in a terminal (a rough sketch of what such a script might contain follows this list):

    python chatbot3.py
  2. Open a browser and visit the site on port 7860:

    http://127.0.0.1:7860
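
For orientation, here is a minimal, hypothetical sketch of a Gradio chat UI in the spirit of chatbot3.py; the guardrails config path, model choice, and title are assumptions. Gradio serves on port 7860 by default:

    # Sketch: expose the guardrailed model through a Gradio chat UI.
    # This is an illustration, not the repository's chatbot3.py.
    import gradio as gr
    from langchain_community.chat_models import ChatOllama
    from nemoguardrails import LLMRails, RailsConfig

    rails = LLMRails(RailsConfig.from_path("./config"),
                     llm=ChatOllama(model="llama3:8b"))

    def chat(message, history):
        # gr.ChatInterface passes the latest message and the chat history.
        result = rails.generate(messages=[{"role": "user", "content": message}])
        return result["content"]

    demo = gr.ChatInterface(chat, title="LLM Sentinel")
    demo.launch(server_name="127.0.0.1", server_port=7860)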

Reference
