# Pre-historic Knowledge Assistant Web App

A web application for the RAG-based knowledge assistant.
## Features
- Multiple LLM provider support (Azure OpenAI, OpenAI, Ollama, vLLM, custom endpoints)
- Flexible embedding configuration
- Web interface with real-time responses
- REST API endpoints
- Health check endpoint
## Setup

1. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

2. Create a `.env` file from `.env.example`:

   ```bash
   cp .env.example .env
   # Edit .env with your configuration
   ```

3. Run the application:

   ```bash
   python app.py
   # or: ./run.sh
   ```
## Configuration

### LLM Providers
The application supports multiple LLM providers:
- **Azure OpenAI**: Set `LLM_PROVIDER=azure_openai` and configure Azure credentials
- **OpenAI**: Set `LLM_PROVIDER=openai` and provide an API key
- **Ollama**: Set `LLM_PROVIDER=ollama` and configure the host URL
- **vLLM**: Set `LLM_PROVIDER=vllm` and configure the vLLM host
- **Custom**: Set `LLM_PROVIDER=custom` for any OpenAI-compatible endpoint
### Environment Variables
See `.env.example` for all available configuration options.
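As an illustration, a minimal `.env` for the Ollama provider might look like the sketch below. Only `LLM_PROVIDER` and its values come from this README; the other variable names are hypothetical placeholders, so check `.env.example` for the actual names:

```ini
# Select the LLM backend (values documented in "LLM Providers" above)
LLM_PROVIDER=ollama

# Hypothetical placeholders -- the real variable names live in .env.example
OLLAMA_HOST=http://localhost:11434
EMBEDDING_MODEL=nomic-embed-text
```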
## API Endpoints

- `GET /`: Web interface
- `POST /ask`: Process a question via the web form
- `POST /api/ask`: REST API endpoint for questions
- `GET /api/health`: Health check endpoint
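As a sketch of how a client might call the REST endpoint: the `/api/ask` path comes from the list above, but the JSON payload shape (a `question` field) is an assumption, so adjust it to match the actual API.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"


def build_request(question: str, base_url: str = BASE_URL) -> urllib.request.Request:
    """Build a POST request for /api/ask; the {"question": ...} body shape is assumed."""
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/ask",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(question: str) -> dict:
    """Send a question to a running instance and return the parsed JSON response."""
    with urllib.request.urlopen(build_request(question)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With the app running locally, `ask("What did Neanderthals eat?")` would return the decoded JSON reply.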
## Development

To run in development mode with auto-reload:

```bash
python app.py
```
The application will be available at http://localhost:8000.