
RAG-Agent Chat with Streamlit

A modern UI for the RAG-Agent application, built with Streamlit. It provides richer UI components and a more user-friendly experience than the Chainlit version.

Features

  • πŸ” Document Retrieval: Upload documents and get answers based on their content
  • πŸ’¬ Chat Interface: Clean, modern chat UI with message bubbles and streaming text
  • πŸ€– Model Selection: Proper dropdown menu for Ollama model selection
  • πŸ”„ Persistent Sidebar: Always-visible controls for all features
  • πŸ“± Responsive Design: Works well on desktop and mobile
  • πŸ“Š Context Display: Expandable panels to view retrieved context
  • πŸŽ›οΈ Toggle Controls: Clear toggles for enabling/disabling features

Installation

  1. Make sure you have Ollama installed and running:

    ollama serve
  2. Install the requirements:

    pip install -r requirements_streamlit.txt
  3. Run the application:

    ./start_streamlit.sh

    Or directly:

    streamlit run app_streamlit.py
  4. Open your browser to http://localhost:8501

Usage

Upload Documents

  1. Use the file upload section in the sidebar to upload one or more documents
  2. Supported formats: PDF, TXT, CSV, MD, DOC, DOCX

Chat with Your Documents

  1. Type your question in the chat input at the bottom
  2. The application will:
    • Retrieve relevant context from your documents
    • Display this context in an expandable panel
    • Generate a response based on the retrieved information

Toggle Features

  • Use the "Database Retrieval" toggle in the sidebar to enable/disable RAG functionality
  • When turned off, the model will respond without using document context

Change Models

  • Select a different Ollama model from the dropdown menu
  • The change is applied immediately without restarting

Clear History

  • Click the "Clear Chat History" button to start fresh
  • This clears both the conversation and document references

Troubleshooting

  • Ollama not connecting: Make sure Ollama is running with ollama serve
  • Model not found: Check if the model is installed in Ollama with ollama list
  • Documents not loading: Verify that the file format is supported, or try a different document

Advantages over Chainlit Version

  • Better UI Components: Native sidebars, dropdowns, and toggles
  • Improved Context Handling: Cleaner display of retrieved document chunks
  • More Responsive: UI elements react instantly to changes
  • Visual Clarity: Better visual organization of components
  • Error Handling: More robust error handling and feedback
  • Persistent Settings: UI state persists across page reloads