Fast-Whisper-MCP-Server/.dockerignore
Commit fb1e5dceba by Alihan: Upgrade to PyTorch 2.6.0 and enhance GPU reset script with Ollama management
- Upgrade PyTorch and torchaudio to 2.6.0 with CUDA 12.4 support
- Update GPU reset script to gracefully stop/start Ollama via supervisorctl
- Add Docker Compose configuration for both API and MCP server modes
- Implement comprehensive Docker entrypoint for multi-mode deployment
- Add GPU health check cleanup to prevent memory leaks
- Fix transcription memory management with proper resource cleanup
- Add filename security validation to prevent path traversal attacks (this and the cleanup fix above are sketched below the commit message)
- Include .dockerignore for optimized Docker builds
- Remove deprecated supervisor configuration

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 23:01:22 +03:00
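
Neither the filename validation nor the transcription cleanup from the commit above is visible in this file. A minimal Python sketch of what those two changes might look like, assuming a faster-whisper/PyTorch transcription path; the names safe_resolve, transcribe_once, and the outputs/ base directory are hypothetical, not the repository's actual API:

    import gc
    from pathlib import Path

    import torch
    from faster_whisper import WhisperModel

    OUTPUT_DIR = Path("outputs").resolve()  # hypothetical base directory for audio files


    def safe_resolve(filename: str, base_dir: Path = OUTPUT_DIR) -> Path:
        """Reject path-traversal attempts such as '../../etc/passwd'."""
        candidate = (base_dir / filename).resolve()
        if not candidate.is_relative_to(base_dir):  # requires Python 3.9+
            raise ValueError(f"unsafe filename: {filename!r}")
        return candidate


    def transcribe_once(filename: str) -> str:
        """Transcribe one file and always release GPU memory afterwards."""
        audio_path = safe_resolve(filename)
        model = WhisperModel("large-v3", device="cuda", compute_type="float16")
        try:
            segments, _info = model.transcribe(str(audio_path))
            return " ".join(segment.text for segment in segments)
        finally:
            # Explicit cleanup so repeated calls do not leak GPU memory.
            del model
            gc.collect()
            if torch.cuda.is_available():
                torch.cuda.empty_cache()

The try/finally pattern reflects the commit's intent: the model reference is dropped and the CUDA cache emptied even when transcription raises, so repeated calls do not accumulate GPU memory.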

61 lines · 675 B · Plaintext

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg-info/
dist/
build/

# Virtual environments
venv/
env/
ENV/
.venv

# Project specific
logs/
outputs/
models/
*.log
*.logs
mcp.logs
api.logs

# Git
.git/
.gitignore
.github/

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# Docker
.dockerignore
docker-compose.yml
docker-compose.*.yml

# Temporary files
*.tmp
*.temp
.DS_Store
Thumbs.db

# Documentation (optional - uncomment if you want to exclude)
# README.md
# CLAUDE.md
# IMPLEMENTATION_PLAN.md

# Scripts (already in container)
# reset_gpu.sh - NEEDED for GPU health checks
run_api_server.sh
run_mcp_server.sh

# Supervisor config (not needed in container)
supervisor/