Fast-Whisper-MCP-Server/docker-entrypoint.sh
Alihan fb1e5dceba Upgrade to PyTorch 2.6.0 and enhance GPU reset script with Ollama management
- Upgrade PyTorch and torchaudio to 2.6.0 with CUDA 12.4 support
- Update GPU reset script to gracefully stop/start Ollama via supervisorctl
- Add Docker Compose configuration for both API and MCP server modes
- Implement comprehensive Docker entrypoint for multi-mode deployment
- Add GPU health check cleanup to prevent memory leaks
- Fix transcription memory management with proper resource cleanup
- Add filename security validation to prevent path traversal attacks
- Include .dockerignore for optimized Docker builds
- Remove deprecated supervisor configuration

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 23:01:22 +03:00
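The commit mentions a Docker Compose configuration covering both server modes. A minimal sketch of what such a configuration might look like, driven by the `SERVER_MODE` variable the entrypoint script below dispatches on; the service names, image tag, and port mappings here are assumptions for illustration, not taken from the repository:

```yaml
services:
  transcriptor-api:
    image: fast-whisper-mcp-server:latest   # hypothetical image tag
    environment:
      SERVER_MODE: api
      API_HOST: 127.0.0.1
      API_PORT: "33767"
    ports:
      - "80:80"            # nginx reverse proxy listens on 80
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]

  transcriptor-mcp:
    image: fast-whisper-mcp-server:latest   # hypothetical image tag
    environment:
      SERVER_MODE: mcp
    stdin_open: true       # MCP mode communicates over stdio
```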


#!/bin/bash
set -e
# Docker Entrypoint Script for Whisper Transcriptor
# Supports both MCP and API server modes
datetime_prefix() {
    date "+[%Y-%m-%d %H:%M:%S]"
}
echo "$(datetime_prefix) Starting Whisper Transcriptor in ${SERVER_MODE} mode..."
# Ensure required directories exist
mkdir -p "$WHISPER_MODEL_DIR"
mkdir -p "$TRANSCRIPTION_OUTPUT_DIR"
mkdir -p "$TRANSCRIPTION_BATCH_OUTPUT_DIR"
mkdir -p "$JOB_METADATA_DIR"
mkdir -p /app/outputs/uploads
# Display GPU information
if command -v nvidia-smi &> /dev/null; then
    echo "$(datetime_prefix) GPU Information:"
    nvidia-smi --query-gpu=name,driver_version,memory.total --format=csv,noheader
else
    echo "$(datetime_prefix) Warning: nvidia-smi not found. GPU may not be available."
fi
# Check server mode and start appropriate service
case "${SERVER_MODE}" in
    "api")
        echo "$(datetime_prefix) Starting API Server mode with nginx reverse proxy"

        # Update nginx configuration to use correct backend
        sed -i "s/server 127.0.0.1:33767;/server ${API_HOST}:${API_PORT};/" /etc/nginx/sites-available/transcriptor.conf

        # Enable nginx site
        ln -sf /etc/nginx/sites-available/transcriptor.conf /etc/nginx/sites-enabled/
        rm -f /etc/nginx/sites-enabled/default

        # Test nginx configuration
        echo "$(datetime_prefix) Testing nginx configuration..."
        nginx -t

        # Start nginx in background
        echo "$(datetime_prefix) Starting nginx..."
        nginx

        # Start API server (foreground - this keeps container running)
        echo "$(datetime_prefix) Starting API server on ${API_HOST}:${API_PORT}"
        echo "$(datetime_prefix) API accessible via nginx on port 80"
        exec python -u /app/src/servers/api_server.py
        ;;
    "mcp")
        echo "$(datetime_prefix) Starting MCP Server mode (stdio)"
        echo "$(datetime_prefix) Model directory: $WHISPER_MODEL_DIR"

        # Start MCP server in stdio mode
        exec python -u /app/src/servers/whisper_server.py
        ;;
    *)
        echo "$(datetime_prefix) ERROR: Invalid SERVER_MODE: ${SERVER_MODE}"
        echo "$(datetime_prefix) Valid modes: 'api' or 'mcp'"
        exit 1
        ;;
esac
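The `case` dispatch above accepts only `api` or `mcp` and exits with an error otherwise. A minimal sketch of that validation pattern, runnable outside the container; the `validate_mode` helper is hypothetical and not part of the actual script:

```shell
#!/bin/bash
# Hypothetical helper mirroring the entrypoint's SERVER_MODE dispatch
validate_mode() {
    case "$1" in
        api|mcp) echo "valid" ;;
        *)       echo "invalid"; return 1 ;;
    esac
}

validate_mode api           # prints "valid"
validate_mode mcp           # prints "valid"
validate_mode foo || true   # prints "invalid"
```

In the container itself the mode would typically be selected at run time, e.g. `docker run -e SERVER_MODE=api ...` for the nginx-proxied API server or `docker run -i -e SERVER_MODE=mcp ...` for stdio-based MCP operation.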