add detailed docs and environment variables guide

Signed-off-by: ChengZi <chen.zhang@zilliz.com>
ChengZi
2025-07-29 20:38:31 +08:00
committed by Cheney Zhang
parent 581c5a16a7
commit 06492d009f
11 changed files with 705 additions and 9 deletions


@@ -15,6 +15,7 @@ EMBEDDING_PROVIDER=OpenAI
EMBEDDING_MODEL=text-embedding-3-small
# Embedding batch size for processing (default: 100)
# You can customize it according to the throughput of your embedding model. Generally, larger batch size means less indexing time.
EMBEDDING_BATCH_SIZE=100
# =============================================================================
@@ -55,14 +56,13 @@ OPENAI_API_KEY=your-openai-api-key-here
# Vector Database Configuration (Milvus/Zilliz)
# =============================================================================
# Milvus server address
# Milvus server address. It's optional when you use a Zilliz Personal API Key.
MILVUS_ADDRESS=your-zilliz-cloud-public-endpoint
# Milvus authentication token
# MILVUS_TOKEN=your-zilliz-cloud-api-key
# Milvus authentication token. You can refer to this guide to get a Zilliz Personal API Key as your Milvus token.
# https://github.com/zilliztech/code-context/blob/master/assets/signup_and_get_apikey.png
MILVUS_TOKEN=your-zilliz-cloud-api-key
# Zilliz Cloud base URL (optional, default: https://api.cloud.zilliz.com)
# ZILLIZ_BASE_URL=https://api.cloud.zilliz.com
# =============================================================================
# Code Splitter Configuration

.gitignore

@@ -52,4 +52,9 @@ Thumbs.db
# Extension specific
*.vsix
*.crx
*.pem
*.pem
.claude/*
CLAUDE.md
.cursor/*


@@ -9,6 +9,7 @@
[![License](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
[![Node.js](https://img.shields.io/badge/Node.js-20%2B-green.svg)](https://nodejs.org/)
[![Documentation](https://img.shields.io/badge/Documentation-📚-orange.svg)](docs/)
[![VS Code Marketplace](https://img.shields.io/visual-studio-marketplace/v/zilliz.semanticcodesearch?label=VS%20Code%20Extension&logo=visual-studio-code)](https://marketplace.visualstudio.com/items?itemName=zilliz.semanticcodesearch)
[![npm - core](https://img.shields.io/npm/v/@zilliz/code-context-core?label=%40zilliz%2Fcode-context-core&logo=npm)](https://www.npmjs.com/package/@zilliz/code-context-core)
[![npm - mcp](https://img.shields.io/npm/v/@zilliz/code-context-mcp?label=%40zilliz%2Fcode-context-mcp&logo=npm)](https://www.npmjs.com/package/@zilliz/code-context-mcp)
@@ -43,7 +44,7 @@ Model Context Protocol (MCP) allows you to integrate Code Context with your favo
<details>
<summary>Get a free vector database on Zilliz Cloud</summary>
Code Context needs a vector database. You can [sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get a API key.
Code Context needs a vector database. You can [sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get an API key.
![](assets/signup_and_get_apikey.png)
@@ -364,8 +365,9 @@ npx @zilliz/code-context-mcp@latest
</details>
</details>
For more detailed MCP environment variable configuration, see our [Environment Variables Guide](docs/getting-started/environment-variables.md).
📚 **Need more help?** Check out our [complete documentation](docs/) for detailed guides and troubleshooting tips.
---
@@ -509,6 +511,12 @@ Check the `/examples` directory for complete usage examples:
---
## ❓ FAQ
For frequently asked questions and troubleshooting tips, see our [FAQ Guide](docs/troubleshooting/faq.md).
---
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details on how to get started.

docs/README.md

@@ -0,0 +1,33 @@
# Code Context Documentation
![](../assets/code_context_logo_dark.png)
Welcome to the Code Context documentation! Code Context is a powerful tool that adds semantic code search capabilities to AI coding assistants through MCP.
## 🚀 Quick Navigation
### Getting Started
- [📋 Project Overview](getting-started/overview.md) - What is Code Context and how it works
- [🛠️ Prerequisites](getting-started/prerequisites.md) - What you need before starting
- [⚡ Quick Start Guide](getting-started/quick-start.md) - Get up and running in minutes
### Components
- [MCP Server](../packages/mcp/README.md) - The MCP server of Code Context
- [VSCode Extension](../packages/vscode-extension/README.md) - The VSCode extension of Code Context
- [Core Package](../packages/core/README.md) - The core package of Code Context
### Troubleshooting
- [❓ FAQ](troubleshooting/faq.md) - Frequently asked questions
## 🔗 External Resources
- [GitHub Repository](https://github.com/zilliztech/code-context)
- [VSCode Marketplace](https://marketplace.visualstudio.com/items?itemName=zilliz.semanticcodesearch)
- [npm - Core Package](https://www.npmjs.com/package/@zilliz/code-context-core)
- [npm - MCP Server](https://www.npmjs.com/package/@zilliz/code-context-mcp)
- [Zilliz Cloud](https://cloud.zilliz.com)
## 💬 Support
- **Issues**: [GitHub Issues](https://github.com/zilliztech/code-context/issues)
- **Discord**: [Join our Discord](https://discord.gg/mKc3R95yE5)


@@ -0,0 +1,76 @@
# Environment Variables Configuration
## 🎯 Global Configuration
Code Context supports a global configuration file at `~/.codecontext/.env` to simplify MCP setup across different MCP clients.
**Benefits:**
- Configure once, use everywhere
- No need to specify environment variables in each MCP client
- Cleaner MCP configurations
## 📋 Environment Variable Priority
1. **Process Environment Variables** (highest)
2. **Global Configuration File** (`~/.codecontext/.env`)
3. **Default Values** (lowest)
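For example, a value exported in the shell overrides the same variable in the global config file (a minimal sketch; the batch-size values are illustrative):
```bash
# Suppose ~/.codecontext/.env sets EMBEDDING_BATCH_SIZE=100 (illustrative value).
# Exporting the variable in the shell before launching the server takes precedence:
export EMBEDDING_BATCH_SIZE=200
npx @zilliz/code-context-mcp@latest   # the server sees EMBEDDING_BATCH_SIZE=200
```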
## 🔧 Required Environment Variables
### Embedding Provider
| Variable | Description | Default |
|----------|-------------|---------|
| `EMBEDDING_PROVIDER` | Provider: `OpenAI`, `VoyageAI`, `Gemini`, `Ollama` | `OpenAI` |
| `OPENAI_API_KEY` | OpenAI API key | Required for OpenAI |
| `VOYAGEAI_API_KEY` | VoyageAI API key | Required for VoyageAI |
| `GEMINI_API_KEY` | Gemini API key | Required for Gemini |
### Vector Database
| Variable | Description | Default |
|----------|-------------|---------|
| `MILVUS_TOKEN` | Milvus authentication token. Get [Zilliz Personal API Key](https://github.com/zilliztech/code-context/blob/master/assets/signup_and_get_apikey.png) | Recommended |
| `MILVUS_ADDRESS` | Milvus server address. Optional when using Zilliz Personal API Key | Auto-resolved from token |
### Ollama (Local)
| Variable | Description | Default |
|----------|-------------|---------|
| `OLLAMA_HOST` | Ollama server URL | `http://127.0.0.1:11434` |
| `OLLAMA_MODEL` | Model name | `nomic-embed-text` |
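For an Ollama-based setup, the same values can be kept in the global config file. A minimal sketch using the variables from the table above:
```bash
# Append Ollama settings to the global config (the file is created if it doesn't exist yet)
cat >> ~/.codecontext/.env << 'EOF'
EMBEDDING_PROVIDER=Ollama
OLLAMA_HOST=http://127.0.0.1:11434
OLLAMA_MODEL=nomic-embed-text
EOF
```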
### Advanced Configuration
| Variable | Description | Default |
|----------|-------------|---------|
| `EMBEDDING_BATCH_SIZE` | Batch size for processing. Larger batch size means less indexing time | `100` |
| `SPLITTER_TYPE` | Code splitter type: `ast`, `langchain` | `ast` |
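Both settings are optional. If you prefer to pin them explicitly, they can be appended to the same global config file (shown here with their default values):
```bash
cat >> ~/.codecontext/.env << 'EOF'
EMBEDDING_BATCH_SIZE=100
SPLITTER_TYPE=ast
EOF
```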
## 🚀 Quick Setup
### 1. Create Global Config
```bash
mkdir -p ~/.codecontext
cat > ~/.codecontext/.env << 'EOF'
EMBEDDING_PROVIDER=OpenAI
OPENAI_API_KEY=sk-your-openai-api-key
MILVUS_TOKEN=your-zilliz-cloud-api-key
EOF
```
### 2. Simplified MCP Configuration
**Claude Code:**
```bash
claude mcp add code-context -- npx @zilliz/code-context-mcp@latest
```
**Cursor/Windsurf/Others:**
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"]
}
}
}
```


@@ -0,0 +1,114 @@
# Project Overview
## What is Code Context?
Code Context is a powerful semantic code search tool that gives AI coding assistants deep understanding of your entire codebase. Instead of traditional keyword-based search, Code Context uses vector embeddings and AI to understand the meaning and context of your code.
## Key Features
### 🔍 Semantic Code Search
Ask natural language questions like "find functions that handle user authentication" and get relevant code snippets from across your entire codebase.
### 🧠 Context-Aware Understanding
Discover relationships between different parts of your code, even across millions of lines. The system understands code structure, patterns, and dependencies.
### ⚡ Incremental Indexing
Efficiently re-index only changed files using Merkle trees, making it fast to keep your search index up-to-date.
### 🧩 Intelligent Code Chunking
Uses Abstract Syntax Trees (AST) to intelligently split code into meaningful chunks that preserve context and structure.
### 🗄️ Scalable Architecture
Integrates with Zilliz Cloud for scalable vector search, handling codebases of any size.
### 🛠️ Highly Customizable
Configure file extensions, ignore patterns, embedding models, and search parameters to fit your specific needs.
## How It Works
### 1. Code Analysis
Code Context analyzes your codebase using AST parsers to understand code structure and semantics.
### 2. Intelligent Chunking
Code is split into meaningful chunks that preserve context, function boundaries, and logical groupings.
### 3. Vector Embeddings
Each code chunk is converted into high-dimensional vectors using state-of-the-art embedding models.
### 4. Vector Storage
Embeddings are stored in a vector database (Milvus/Zilliz Cloud) for efficient similarity search.
### 5. Semantic Search
Natural language queries are converted to vectors and matched against stored code embeddings.
## Architecture Components
### Core Engine (`@zilliz/code-context-core`)
The foundational indexing engine that handles:
- Code parsing and analysis
- Embedding generation
- Vector database operations
- Search algorithms
### MCP Server (`@zilliz/code-context-mcp`)
Model Context Protocol server that enables integration with AI assistants:
- Standardized tool interface
- Compatible with Claude Code, Cursor, Windsurf, and more
- Real-time indexing and search capabilities
### VSCode Extension
Native Visual Studio Code integration:
- Semantic search sidebar
- Context-aware code navigation
- Progressive indexing with visual feedback
### Chrome Extension
GitHub integration for web-based development:
- Semantic search on GitHub repositories
- Context-aware code browsing
- Cross-repository search capabilities
## Supported Technologies
### Programming Languages
- **General Purpose**: TypeScript, JavaScript, Java, C++, C#, Go, Rust
- **Scripting Languages**: Python, PHP, Ruby
- **Mobile & JVM**: Swift, Kotlin, Scala, Objective-C
- **Documentation**: Markdown
### Embedding Providers
- **OpenAI**: `text-embedding-3-small`, `text-embedding-3-large`
- **VoyageAI**: `voyage-code-3`, specialized for code understanding
- **Gemini**: Google's embedding models with Matryoshka representation
- **Ollama**: Local embedding models for privacy-focused development
### Vector Databases
- **Milvus**: Open-source vector database
- **Zilliz Cloud**: Fully managed vector database service
### AI Assistant Integration
- **Claude Code**: Native MCP integration
- **Cursor**: MCP configuration support
- **Windsurf**: JSON-based MCP setup
- **VSCode**: Direct extension + MCP support
- **And more**: Any MCP-compatible AI assistant
## Use Cases
### Large Codebase Navigation
Quickly find relevant code patterns, implementations, and examples across massive codebases.
### Code Review Assistance
Identify similar code patterns, potential duplications, and related functionality during reviews.
### Learning and Onboarding
Help new team members understand codebase structure and find relevant examples.
### Refactoring Support
Locate all instances of specific patterns or implementations that need updating.
### API Discovery
Find usage examples and implementations of specific APIs or libraries.
### Cross-Language Development
Search for similar functionality across different programming languages in polyglot codebases.


@@ -0,0 +1,51 @@
# Prerequisites
Before setting up Code Context, ensure you have the following requirements met.
## Required Services
### Embedding Provider (Choose One)
#### Option 1: OpenAI (Recommended)
- **API Key**: Get from [OpenAI Platform](https://platform.openai.com/api-keys)
- **Billing**: Active billing account required
- **Models**: `text-embedding-3-small` or `text-embedding-3-large`
- **Rate Limits**: Check current limits on your OpenAI account
#### Option 2: VoyageAI
- **API Key**: Get from [VoyageAI Console](https://dash.voyageai.com/)
- **Models**: `voyage-code-3` (optimized for code)
- **Billing**: Pay-per-use pricing
#### Option 3: Gemini
- **API Key**: Get from [Google AI Studio](https://aistudio.google.com/)
- **Models**: `gemini-embedding-001`
- **Quota**: Check current quotas and limits
#### Option 4: Ollama (Local)
- **Installation**: Download from [ollama.ai](https://ollama.ai/)
- **Models**: Pull embedding models like `nomic-embed-text` (see the sketch below)
- **Hardware**: Sufficient RAM for model loading (varies by model)
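A minimal local sketch, assuming Ollama is already installed from [ollama.ai](https://ollama.ai/):
```bash
ollama pull nomic-embed-text   # download the default embedding model
ollama serve                   # start the local server (defaults to http://127.0.0.1:11434)
```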
### Vector Database
#### Zilliz Cloud (Recommended)
![](../../assets/signup_and_get_apikey.png)
- **Account**: [Sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get an API key.
- **Convenience**: Fully managed Milvus vector database service, so there's no need to install or manage it yourself.
#### Local Milvus (Advanced)
- **Docker**: Install Milvus by following [this guide](https://milvus.io/docs/install_standalone-docker-compose.md)
- **Resources**: Requires local compute resources and more hands-on configuration
## Development Tools (Optional)
### For VSCode Extension
- **VSCode**: Version 1.74.0 or higher
- **Extensions**: Code Context extension from marketplace
### For Development Contributions
- **Git**: For version control
- **pnpm**: Package manager (preferred over npm)
- **TypeScript**: Understanding of TypeScript development


@@ -0,0 +1,369 @@
# Quick Start Guide
Get Code Context running with AI assistants in under 5 minutes! This guide covers the most common setup using MCP (Model Context Protocol) with Claude Code.
## 🚀 1-Minute Setup for Claude Code
### Step 1: Get API Keys
You'll need two API keys:
1. **OpenAI API Key**: Get from [OpenAI Platform](https://platform.openai.com/api-keys)
2. **Zilliz Cloud API Key**: [Sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get an API key.
![](../../assets/signup_and_get_apikey.png)
### Step 2: Configure Claude Code
Run this single command to add Code Context to Claude Code:
```bash
claude mcp add code-context \
-e OPENAI_API_KEY=sk-your-openai-api-key \
-e MILVUS_TOKEN=your-zilliz-cloud-api-key \
-- npx @zilliz/code-context-mcp@latest
```
Replace the API keys with your actual keys.
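To confirm the server was registered, you can list the MCP servers known to Claude Code:
```bash
claude mcp list
```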
### Step 3: Start Using Code Context
1. **Open Claude Code** in your project directory
2. **Index your codebase**:
```
Index this codebase
```
3. **Start searching**:
```
Find functions that handle user authentication
```
🎉 **That's it!** You now have semantic code search in Claude Code.
## Alternative Quick Setups
<details>
<summary><strong>Qwen Code</strong></summary>
Create or edit the `~/.qwen/settings.json` file and add the following configuration:
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
</details>
<details>
<summary><strong>Cursor</strong></summary>
Go to: `Settings` -> `Cursor Settings` -> `MCP` -> `Add new global MCP server`
Pasting the following configuration into your Cursor `~/.cursor/mcp.json` file is the recommended approach. You may also install in a specific project by creating `.cursor/mcp.json` in your project folder. See [Cursor MCP docs](https://docs.cursor.com/context/model-context-protocol) for more info.
**OpenAI Configuration (Default):**
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "OpenAI",
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
**VoyageAI Configuration:**
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "VoyageAI",
"VOYAGEAI_API_KEY": "your-voyageai-api-key",
"EMBEDDING_MODEL": "voyage-code-3",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
**Gemini Configuration:**
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "Gemini",
"GEMINI_API_KEY": "your-gemini-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
**Ollama Configuration:**
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"EMBEDDING_PROVIDER": "Ollama",
"EMBEDDING_MODEL": "nomic-embed-text",
"OLLAMA_HOST": "http://127.0.0.1:11434",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
</details>
<details>
<summary><strong>Claude Desktop</strong></summary>
Add to your Claude Desktop configuration:
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
</details>
<details>
<summary><strong>Claude Code</strong></summary>
Use the command line interface to add the CodeContext MCP server:
```bash
# Add the CodeContext MCP server
claude mcp add code-context -e OPENAI_API_KEY=your-openai-api-key -e MILVUS_TOKEN=your-zilliz-cloud-api-key -- npx @zilliz/code-context-mcp@latest
```
See the [Claude Code MCP documentation](https://docs.anthropic.com/en/docs/claude-code/mcp) for more details about MCP server management.
</details>
<details>
<summary><strong>Windsurf</strong></summary>
Windsurf supports MCP configuration through a JSON file. Add the following configuration to your Windsurf MCP settings:
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
</details>
<details>
<summary><strong>VS Code</strong></summary>
The CodeContext MCP server can be used with VS Code through MCP-compatible extensions. Add the following configuration to your VS Code MCP settings:
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
</details>
<details>
<summary><strong>Cherry Studio</strong></summary>
Cherry Studio allows for visual MCP server configuration through its settings interface. While it doesn't directly support manual JSON configuration, you can add a new server via the GUI:
1. Navigate to **Settings → MCP Servers → Add Server**.
2. Fill in the server details:
- **Name**: `code-context`
- **Type**: `STDIO`
- **Command**: `npx`
- **Arguments**: `["@zilliz/code-context-mcp@latest"]`
- **Environment Variables**:
- `OPENAI_API_KEY`: `your-openai-api-key`
- `MILVUS_TOKEN`: `your-zilliz-cloud-api-key`
3. Save the configuration to activate the server.
</details>
<details>
<summary><strong>Cline</strong></summary>
Cline uses a JSON configuration file to manage MCP servers. To integrate the provided MCP server configuration:
1. Open Cline and click on the **MCP Servers** icon in the top navigation bar.
2. Select the **Installed** tab, then click **Advanced MCP Settings**.
3. In the `cline_mcp_settings.json` file, add the following configuration:
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
4. Save the file.
</details>
<details>
<summary><strong>Augment</strong></summary>
To configure Code Context MCP in Augment Code, you can use either the graphical interface or manual configuration.
#### **A. Using the Augment Code UI**
1. Click the hamburger menu.
2. Select **Settings**.
3. Navigate to the **Tools** section.
4. Click the **+ Add MCP** button.
5. Enter the following command:
```
npx @zilliz/code-context-mcp@latest
```
6. Name the MCP: **Code Context**.
7. Click the **Add** button.
------
#### **B. Manual Configuration**
1. Press `Cmd/Ctrl+Shift+P` or go to the hamburger menu in the Augment panel.
2. Select **Edit Settings**.
3. Under **Advanced**, click **Edit in settings.json**.
4. Add the server configuration to the `mcpServers` array in the `augment.advanced` object:
```json
"augment.advanced": {
"mcpServers": [
{
"name": "code-context",
"command": "npx",
"args": ["-y", "@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
]
}
```
</details>
<details>
<summary><strong>Roo Code</strong></summary>
Roo Code utilizes a JSON configuration file for MCP servers:
1. Open Roo Code and navigate to **Settings → MCP Servers → Edit Global Config**.
2. In the `mcp_settings.json` file, add the following configuration:
```json
{
"mcpServers": {
"code-context": {
"command": "npx",
"args": ["@zilliz/code-context-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key",
"MILVUS_ADDRESS": "your-zilliz-cloud-public-endpoint",
"MILVUS_TOKEN": "your-zilliz-cloud-api-key"
}
}
}
}
```
3. Save the file to activate the server.
</details>
<details>
<summary><strong>Other MCP Clients</strong></summary>
The server uses stdio transport and follows the standard MCP protocol. It can be integrated with any MCP-compatible client by running:
```bash
npx @zilliz/code-context-mcp@latest
```
</details>
> 💡 **Tip**: For easier configuration management, you can use [global environment variables](environment-variables.md) instead of specifying them in each MCP client configuration.


@@ -0,0 +1,36 @@
# Frequently Asked Questions (FAQ)
## Q: What files does Code Context decide to embed?
**A:** Code Context embeds files based on the following rules:
**Files that are included:**
- Files with supported extensions (DEFAULT_SUPPORTED_EXTENSIONS)
**Files that are excluded:**
- Files matching DEFAULT_IGNORE_PATTERNS
- Files matching patterns in .gitignore
The final rule is: `DEFAULT_SUPPORTED_EXTENSIONS - DEFAULT_IGNORE_PATTERNS - .gitignore patterns`
Supported extensions include common programming languages (.ts, .js, .py, .java, .cpp, etc.) and documentation files (.md, .markdown). Default ignore patterns cover build outputs, dependencies (node_modules), IDE files, and temporary files.
**See the `DEFAULT_SUPPORTED_EXTENSIONS` and `DEFAULT_IGNORE_PATTERNS` definition:** [`packages/core/src/context.ts`](../../packages/core/src/context.ts)
## Q: Can I use a fully local deployment setup?
**A:** Yes, you can deploy Code Context entirely on your local infrastructure. While we recommend using the fully managed [Zilliz Cloud](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) service for ease of use, you can also set up your own private local deployment.
**For local deployment:**
1. **Vector Database (Milvus)**: Deploy Milvus locally using Docker Compose by following the [official Milvus installation guide](https://milvus.io/docs/install_standalone-docker-compose.md). Configure the following environment variables:
- `MILVUS_ADDRESS=127.0.0.1:19530` (or your Milvus server address)
- `MILVUS_TOKEN=your-optional-token` (if authentication is enabled)
2. **Embedding Service (Ollama)**: Install and run [Ollama](https://ollama.com/) locally for embedding generation. Configure:
- `EMBEDDING_PROVIDER=Ollama`
- `OLLAMA_HOST=http://127.0.0.1:11434` (or your Ollama server URL)
- `OLLAMA_MODEL=nomic-embed-text` (or your preferred embedding model)
This setup gives you complete control over your data while maintaining full functionality. See our [environment variables guide](../getting-started/environment-variables.md) for detailed configuration options.
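As a sketch, a fully local global config (`~/.codecontext/.env`) built from the values above might look like this (adjust the host and port to match your deployment):
```bash
mkdir -p ~/.codecontext
cat > ~/.codecontext/.env << 'EOF'
EMBEDDING_PROVIDER=Ollama
OLLAMA_HOST=http://127.0.0.1:11434
OLLAMA_MODEL=nomic-embed-text
MILVUS_ADDRESS=127.0.0.1:19530
EOF
```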


@@ -42,6 +42,8 @@ MILVUS_ADDRESS=your-zilliz-cloud-public-endpoint
MILVUS_TOKEN=your-zilliz-cloud-api-key
```
> 💡 **Tip**: For easier configuration management across different usage scenarios, consider using [global environment variables](../../docs/getting-started/environment-variables.md).
## Quick Start
```typescript


@@ -31,6 +31,8 @@ Before using the MCP server, make sure you have:
Code Context MCP supports multiple embedding providers. Choose the one that best fits your needs:
> 💡 **Tip**: You can also use [global environment variables](../../docs/getting-started/environment-variables.md) for easier configuration management across different MCP clients.
```bash
# Supported providers: OpenAI, VoyageAI, Gemini, Ollama
EMBEDDING_PROVIDER=OpenAI
@@ -148,7 +150,7 @@ OLLAMA_HOST=http://127.0.0.1:11434
#### Get a free vector database on Zilliz Cloud
Code Context needs a vector database. You can [sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get a API key.
Code Context needs a vector database. You can [sign up](https://cloud.zilliz.com/signup?utm_source=github&utm_medium=referral&utm_campaign=2507-codecontext-readme) on Zilliz Cloud to get an API key.
![](../../assets/signup_and_get_apikey.png)