# Environment Variables Configuration

## 🎯 Global Configuration

Claude Context supports a global configuration file at `~/.context/.env` to simplify MCP setup across different MCP clients.

**Benefits:**

- Configure once, use everywhere
- No need to specify environment variables in each MCP client
- Cleaner MCP configurations

## 📋 Environment Variable Priority

1. **Process environment variables** (highest)
2. **Global configuration file** (`~/.context/.env`)
3. **Default values** (lowest)
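The practical effect of this order: a variable exported in the MCP client's process environment always wins over the same variable in the global file. A minimal shell sketch of the behavior, using a temporary file as a stand-in for `~/.context/.env` (this only illustrates the precedence rule, not how the server itself loads the file):

```shell
# Stand-in for ~/.context/.env
echo 'EMBEDDING_MODEL=text-embedding-3-small' > /tmp/context.env

# Load the "global config" into the environment
set -a; . /tmp/context.env; set +a

# A process-level variable overrides the value from the file
EMBEDDING_MODEL=text-embedding-3-large sh -c 'echo "$EMBEDDING_MODEL"'
# prints: text-embedding-3-large
```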

## 🔧 Required Environment Variables

### Embedding Provider

| Variable | Description | Default |
|----------|-------------|---------|
| `EMBEDDING_PROVIDER` | Embedding provider: `OpenAI`, `VoyageAI`, `Gemini`, or `Ollama` | `OpenAI` |
| `EMBEDDING_MODEL` | Embedding model name (works for all providers) | Provider-specific default |
| `OPENAI_API_KEY` | OpenAI API key | Required for OpenAI |
| `OPENAI_BASE_URL` | OpenAI API base URL (optional, for custom endpoints) | `https://api.openai.com/v1` |
| `VOYAGEAI_API_KEY` | VoyageAI API key | Required for VoyageAI |
| `GEMINI_API_KEY` | Gemini API key | Required for Gemini |
| `GEMINI_BASE_URL` | Gemini API base URL (optional, for custom endpoints) | `https://generativelanguage.googleapis.com/v1beta` |

> 💡 **Note**: `EMBEDDING_MODEL` is a universal environment variable that works with all embedding providers. Set it to the model name you want to use (e.g., `text-embedding-3-large` for OpenAI, `voyage-code-3` for VoyageAI).
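As a sketch of how provider and model pair up in the global file, here are some hypothetical `~/.context/.env` fragments (one active pair at a time; the model names shown are examples, not an exhaustive or guaranteed list):

```shell
# OpenAI
EMBEDDING_PROVIDER=OpenAI
EMBEDDING_MODEL=text-embedding-3-large

# VoyageAI (commented alternative)
# EMBEDDING_PROVIDER=VoyageAI
# EMBEDDING_MODEL=voyage-code-3
```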

**Supported Model Names:**

- **OpenAI models**: see `getSupportedModels` in `openai-embedding.ts` for the full list of supported models.
- **VoyageAI models**: see `getSupportedModels` in `voyageai-embedding.ts` for the full list of supported models.
- **Gemini models**: see `getSupportedModels` in `gemini-embedding.ts` for the full list of supported models.
- **Ollama models**: depends on the models you have installed locally.

> 📖 For detailed provider-specific configuration examples and setup instructions, see the MCP Configuration Guide.

### Vector Database

| Variable | Description | Default |
|----------|-------------|---------|
| `MILVUS_TOKEN` | Milvus authentication token (get a Zilliz Cloud Personal API Key) | Recommended |
| `MILVUS_ADDRESS` | Milvus server address; optional when using a Zilliz Cloud Personal API Key | Auto-resolved from token |
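A sketch of the two typical setups in `~/.context/.env` (the address shown for self-hosted Milvus is its conventional default port, used here as an illustrative assumption):

```shell
# Zilliz Cloud: the Personal API Key alone is enough;
# the server address is auto-resolved from the token
MILVUS_TOKEN=your-zilliz-personal-api-key

# Self-hosted Milvus: set the address explicitly instead
# MILVUS_ADDRESS=localhost:19530
```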

### Ollama (Optional)

| Variable | Description | Default |
|----------|-------------|---------|
| `OLLAMA_HOST` | Ollama server URL | `http://127.0.0.1:11434` |
| `OLLAMA_MODEL` | Model name (alternative to `EMBEDDING_MODEL`) | None |
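A hypothetical `~/.context/.env` fragment for a local Ollama setup (`nomic-embed-text` is just one example of an embedding model you might have pulled; substitute whatever model is installed locally):

```shell
EMBEDDING_PROVIDER=Ollama
OLLAMA_HOST=http://127.0.0.1:11434
EMBEDDING_MODEL=nomic-embed-text   # any locally installed embedding model
```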

### Advanced Configuration

| Variable | Description | Default |
|----------|-------------|---------|
| `HYBRID_MODE` | Enable hybrid search (BM25 + dense vector); set to `false` for dense-only search | `true` |
| `EMBEDDING_BATCH_SIZE` | Batch size for embedding; a larger batch size reduces indexing time | `100` |
| `SPLITTER_TYPE` | Code splitter type: `ast` or `langchain` | `ast` |
| `CUSTOM_EXTENSIONS` | Additional file extensions to include (comma-separated, e.g., `.vue,.svelte,.astro`) | None |
| `CUSTOM_IGNORE_PATTERNS` | Additional ignore patterns (comma-separated, e.g., `temp/**,*.backup,private/**`) | None |
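Putting the advanced options together, a hypothetical `~/.context/.env` fragment (the values here are illustrative choices, not recommendations):

```shell
# Dense-only search instead of hybrid BM25 + dense
HYBRID_MODE=false

# Larger batches to speed up indexing
EMBEDDING_BATCH_SIZE=200

# Index extra frontend file types
CUSTOM_EXTENSIONS=.vue,.svelte,.astro

# Skip scratch and backup files
CUSTOM_IGNORE_PATTERNS=temp/**,*.backup,private/**
```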

## 🚀 Quick Setup

### 1. Create the Global Config

```bash
mkdir -p ~/.context
cat > ~/.context/.env << 'EOF'
EMBEDDING_PROVIDER=OpenAI
OPENAI_API_KEY=sk-your-openai-api-key
EMBEDDING_MODEL=text-embedding-3-small
MILVUS_TOKEN=your-zilliz-cloud-api-key
EOF
```

See the Example File for more details.

### 2. Simplified MCP Configuration

**Claude Code:**

```bash
claude mcp add claude-context -- npx @zilliz/claude-context-mcp@latest
```

**Cursor/Windsurf/others:**

```json
{
  "mcpServers": {
    "claude-context": {
      "command": "npx",
      "args": ["-y", "@zilliz/claude-context-mcp@latest"]
    }
  }
}
```

## 📚 Additional Information

For detailed information about file processing rules and how custom patterns work, see: