mirror of https://github.com/charmbracelet/crush.git (synced 2025-08-02 05:20:46 +03:00)
docs(readme): simplify and cleanup readme
This commit is contained in:

README.md (81)
@@ -65,11 +65,9 @@ Crush can use LSPs for additional context to help inform its decisions, just lik

```json
{
  "lsp": {
    "go": {
      "disabled": false,
      "command": "gopls"
    },
    "typescript": {
      "disabled": false,
      "command": "typescript-language-server",
      "args": ["--stdio"]
    }
  }
}
```

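Each LSP entry's `command` must be an executable on your `PATH`. A quick sanity check, sketched for the two servers configured above:

```shell
# Check that the configured LSP servers are installed. The server names
# mirror the "command" fields in the config above.
for server in gopls typescript-language-server; do
  if command -v "$server" >/dev/null 2>&1; then
    echo "$server: found"
  else
    echo "$server: not installed"
  fi
done
```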
### Amazon Bedrock

To use AWS Bedrock with Claude models, configure your AWS credentials and region:

```json
{
  "providers": [
    {
      "id": "bedrock",
      "provider_type": "bedrock",
      "extra_params": {
        "region": "us-east-1"
      }
    }
  ]
}
```

Bedrock uses your AWS credentials from environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`) or AWS credential profiles. The region can be specified in the configuration or via the `AWS_REGION` environment variable.

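A minimal sketch of that region precedence (hypothetical variable names; a region set in the config file, when present, wins over the environment):

```shell
# Resolve the region as described above: an explicitly configured region
# takes precedence, otherwise AWS_REGION is used.
export AWS_REGION="us-east-1"   # placeholder value
CONFIGURED_REGION=""            # empty: no region set in the config file
REGION="${CONFIGURED_REGION:-$AWS_REGION}"
echo "resolved region: $REGION"
```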
### Google Vertex AI

For Google Cloud Vertex AI with Gemini models, configure your project and location:

```json
{
  "providers": [
    {
      "id": "vertexai",
      "provider_type": "vertexai",
      "extra_headers": {
        "project": "your-gcp-project-id",
        "location": "us-central1"
      }
    }
  ]
}
```

Vertex AI uses Google Cloud authentication. Ensure you have the `GOOGLE_APPLICATION_CREDENTIALS` environment variable set or are authenticated via `gcloud auth application-default login`.

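The fallback between those two credential sources can be sketched as a small shell function (hypothetical; exercised with placeholder inputs rather than your real environment):

```shell
# Sketch of the credential fallback described above. Arguments stand in for
# the real environment: $1 = GOOGLE_APPLICATION_CREDENTIALS value,
# $2 = whether gcloud application-default credentials exist (yes/no).
pick_credentials() {
  if [ -n "$1" ]; then
    echo "service-account key: $1"
  elif [ "$2" = "yes" ]; then
    echo "gcloud application-default credentials"
  else
    echo "no credentials"
  fi
}

pick_credentials "/path/to/key.json" "no"   # key file wins when set
pick_credentials "" "yes"                   # falls back to gcloud ADC
```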
### OpenAI-Compatible APIs

Crush supports all OpenAI-compatible APIs. Here's an example configuration for Deepseek, which uses an OpenAI-compatible API. Don't forget to set `DEEPSEEK_API_KEY` in your environment.

```json
{
  "providers": {
    "deepseek": {
      "provider_type": "openai",
      "base_url": "https://api.deepseek.com/v1",
      "models": [
        {
          "id": "deepseek-chat",
          "name": "Deepseek V3",
          "cost_per_1m_in": 0.27,
          "cost_per_1m_out": 1.1,
          "cost_per_1m_in_cached": 0.07,
          "cost_per_1m_out_cached": 1.1,
          "context_window": 64000,
          "default_max_tokens": 5000
        }
      ]
    }
  }
}
```

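To read the pricing fields: the `cost_per_1m_*` values are per-million-token rates (presumably USD), so a request costs tokens × rate ÷ 1,000,000. A rough sketch using the Deepseek rates above, for a hypothetical request with 10,000 input and 2,000 output tokens and no cache hits:

```shell
# Estimate request cost from per-1M-token rates (values from the config above).
awk 'BEGIN {
  in_rate  = 0.27   # cost_per_1m_in:  rate per 1M input tokens
  out_rate = 1.1    # cost_per_1m_out: rate per 1M output tokens
  in_tok   = 10000
  out_tok  = 2000
  printf "estimated cost: $%.4f\n", (in_tok * in_rate + out_tok * out_rate) / 1e6
}'
# prints: estimated cost: $0.0049
```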
For other OpenAI-compatible providers, adjust the `base_url` and provide an `api_key` if required:

```json
{
  "providers": [
    {
      "id": "custom-openai",
      "provider_type": "openai",
      "base_url": "https://api.example.com/v1",
      "api_key": "your-api-key"
    }
  ]
}
```

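Before pointing Crush at a custom endpoint, it can help to confirm the server actually speaks the OpenAI API; compatible servers generally expose a model listing at `/v1/models`. A sketch that just assembles the check (the URL and key are the placeholders from the config above):

```shell
# Build a connectivity check for an OpenAI-compatible endpoint.
# BASE_URL and API_KEY are placeholders; substitute your own values.
BASE_URL="https://api.example.com/v1"
API_KEY="your-api-key"
CHECK="curl -sf -H \"Authorization: Bearer $API_KEY\" $BASE_URL/models"
echo "$CHECK"   # run this; a JSON model list back means the endpoint is compatible
```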