
Pre-historic Knowledge Assistant Web App

A web application that serves a retrieval-augmented generation (RAG) knowledge assistant.

Features

  • Multiple LLM provider support (Azure OpenAI, OpenAI, Ollama, vLLM, custom endpoints)
  • Flexible embedding configuration
  • Web interface with real-time responses
  • REST API endpoints
  • Health check endpoint

Setup

  1. Install dependencies:

    pip install -r requirements.txt
    
  2. Create a .env file from .env.example:

    cp .env.example .env
    # Edit .env with your configuration
    
  3. Run the application:

    python app.py
    # or
    ./run.sh
    

Configuration

LLM Providers

The application supports multiple LLM providers:

  • Azure OpenAI: Set LLM_PROVIDER=azure_openai and configure Azure credentials
  • OpenAI: Set LLM_PROVIDER=openai and provide API key
  • Ollama: Set LLM_PROVIDER=ollama and configure host URL
  • vLLM: Set LLM_PROVIDER=vllm and configure vLLM host
  • Custom: Set LLM_PROVIDER=custom for any OpenAI-compatible endpoint
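As a sketch, switching providers is just a matter of editing .env. Variable names other than LLM_PROVIDER below are illustrative assumptions; .env.example is the authoritative reference:

```
# Hypothetical .env sketch — names besides LLM_PROVIDER are assumptions;
# consult .env.example for the actual variables.
LLM_PROVIDER=ollama
OLLAMA_HOST=http://localhost:11434
```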

Environment Variables

See .env.example for all available configuration options.

API Endpoints

  • GET /: Web interface
  • POST /ask: Processes a question submitted via the web form
  • POST /api/ask: REST API endpoint for asking questions
  • GET /api/health: Health check endpoint
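A minimal Python client for the REST endpoint could look like the sketch below. The {"question": ...} request shape and the JSON response format are assumptions; check app.py for the actual schema:

```python
import json
import urllib.request


def ask(question: str, base_url: str = "http://localhost:8000") -> dict:
    """POST a question to /api/ask and return the parsed JSON response."""
    # The payload shape is an assumption; verify against app.py.
    payload = json.dumps({"question": question}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/api/ask",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Calling `ask("What is flint knapping?")` against a running server would return the assistant's JSON answer.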

Development

To run in development mode with auto-reload:

    python app.py

The application will be available at http://localhost:8000.
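A quick readiness probe against the health endpoint can be sketched as follows (the endpoint path is from the list above; treating any HTTP 200 as healthy is an assumption):

```python
import urllib.request
from urllib.error import URLError


def healthy(base_url: str = "http://localhost:8000") -> bool:
    """Return True if GET /api/health answers with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/health", timeout=5) as resp:
            # Assumption: any 200 response means the service is up.
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused, timeout, DNS failure, etc.
        return False
```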
