mirror of
https://github.com/kardolus/chatgpt-cli.git
synced 2024-09-08 23:15:00 +03:00
Add thread-based context management
README.md (83 lines changed)
@@ -29,17 +29,19 @@ environment, demonstrating its practicality and effectiveness.
* **Query mode**: Single input-output interactions with the GPT model.
* **Interactive mode**: The interactive mode allows for a more conversational experience with the model. Exit
  interactive mode by simply typing 'exit'.
* **Context management**: Seamless conversations with the GPT model by maintaining message history across CLI calls.
* **Sliding window history**: Automatically trims conversation history while maintaining context to stay within token
  limits.
* **Custom context from local files**: Provide a custom context for the GPT model to reference during the conversation
  by piping it in.
* **Model listing**: Get a list of available models by using the `-l` or `--list-models` flag.
* **Advanced configuration options**: The application supports a layered configuration system, allowing you to specify
  settings through default values, a `config.yaml` file, and environment variables. In addition, you can quickly modify
  specific configuration parameters (`model` and `max_tokens`) through `--set-model` and `--set-max-tokens` flags.
  The `--config` or `-c` flag lets you check your current settings with ease. New configuration options such
  as `omit_history` allow for further customization of your user experience.
* **Thread-based context management**: Enjoy seamless conversations with the GPT model with individualized context for
  each thread, much like your experience on the OpenAI website. Each unique thread has its own history, ensuring
  relevant and coherent responses across different chat instances.
* **Sliding window history**: To stay within token limits, the chat history automatically trims while still preserving
  the necessary context.
* **Custom context from any source**: You can provide the GPT model with a custom context during conversation. This
  context can be piped in from any source, such as local files, standard input, or even another program. This
  flexibility allows the model to adapt to a wide range of conversational scenarios.
* **Model listing**: Access a list of available models using the `-l` or `--list-models` flag.
* **Advanced configuration options**: The CLI supports a layered configuration system where settings can be specified
  through default values, a `config.yaml` file, and environment variables. For quick adjustments, use the `--set-model`
  and `--set-max-tokens` flags. To verify your current settings, use the `--config` or `-c` flag. The newly
  added `omit_history` configuration option adds another layer of customization to your user experience.
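The sliding-window behaviour described in the feature list can be sketched in Go. This is a simplified illustration, not the CLI's actual implementation: `Message`, `estimateTokens`, and `trimHistory` are hypothetical names, and the four-characters-per-token estimate is a crude stand-in for real tokenization.

```go
package main

import "fmt"

// Message mirrors the role/content pairs stored in the history file.
type Message struct {
	Role    string
	Content string
}

// estimateTokens is a crude stand-in for real tokenization:
// roughly one token per four characters of content.
func estimateTokens(m Message) int {
	return len(m.Content)/4 + 1
}

// trimHistory drops the oldest non-system messages until the
// estimated token count fits the budget, keeping the system
// prompt (index 0) and the most recent messages.
func trimHistory(history []Message, maxTokens int) []Message {
	total := 0
	for _, m := range history {
		total += estimateTokens(m)
	}
	for total > maxTokens && len(history) > 2 {
		// Remove the oldest message after the system prompt.
		total -= estimateTokens(history[1])
		history = append(history[:1], history[2:]...)
	}
	return history
}

func main() {
	history := []Message{
		{Role: "system", Content: "You are a helpful assistant."},
		{Role: "user", Content: "first question ..."},
		{Role: "assistant", Content: "first answer ..."},
		{Role: "user", Content: "latest question"},
	}
	trimmed := trimHistory(history, 12)
	fmt.Println(len(trimmed)) // the oldest exchanges are dropped first
}
```

The key design point the README describes is that the window slides from the oldest messages forward, so the most recent turns (and the system prompt) always survive the trim.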

## Installation

@@ -106,9 +108,14 @@ export OPENAI_API_KEY="your_api_key"
mkdir -p ~/.chatgpt-cli
```

With this directory in place, the CLI will automatically manage message history for seamless conversations with the GPT
model. The history acts as a sliding window, maintaining a maximum of `4096` tokens to ensure optimal performance and
interaction quality.
Once this directory is in place, the CLI automatically manages the message history for each "thread" you converse
with. The history operates like a sliding window, maintaining context up to a configurable token maximum. This
ensures a balance between maintaining conversation context and achieving optimal performance.

By default, if a specific thread is not provided by the user, the CLI uses the default thread and stores the history
at `~/.chatgpt-cli/history/default.json`. You can find more details about how to configure the `thread` parameter in
the [Configuration](#configuration) section of this document.
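The thread-to-file mapping described above can be sketched as follows; `historyPath` is a hypothetical helper for illustration, not the CLI's actual function.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// historyPath shows how a thread name maps onto its history file:
// every thread gets its own JSON file under ~/.chatgpt-cli/history.
func historyPath(home, thread string) string {
	return filepath.Join(home, ".chatgpt-cli", "history", thread+".json")
}

func main() {
	fmt.Println(historyPath("/home/alice", "default"))
	fmt.Println(historyPath("/home/alice", "work"))
}
```

Because each thread resolves to its own file, switching threads switches context without the histories bleeding into each other.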

3. Try it out:

@@ -141,25 +148,32 @@ cat context.txt | chatgpt "What kind of toy would Kya enjoy?"
chatgpt --list-models
```

7. For more options, see:

```shell
chatgpt --help
```

## Configuration

The ChatGPT CLI adopts a three-tier configuration strategy: environment variables take precedence over the
`config.yaml` file, which in turn takes precedence over the built-in default values.

The default configuration:
Configuration variables:

```yaml
name: openai
api_key:
model: gpt-3.5-turbo
max_tokens: 4096
url: https://api.openai.com
completions_path: /v1/chat/completions
models_path: /v1/models
omit_history: false
```

| Variable | Description | Default |
|--------------------|-----------------------------------------------------------------------------------|--------------------------|
| `name` | The prefix for environment variable overrides. | 'openai' |
| `api_key` | Your OpenAI API key. | (none for security) |
| `model` | The GPT model used by the application. | 'gpt-3.5-turbo' |
| `max_tokens` | The maximum number of tokens that can be used in a single API call. | 4096 |
| `thread` | The name of the current chat thread. Each unique thread name has its own context. | 'default' |
| `omit_history` | If true, the chat history will not be used to provide context for the GPT model. | false |
| `url` | The base URL for the OpenAI API. | 'https://api.openai.com' |
| `completions_path` | The API endpoint for completions. | '/v1/chat/completions' |
| `models_path` | The API endpoint for accessing model information. | '/v1/models' |

These defaults can be overridden by providing your own values in the user configuration file,
The defaults can be overridden by providing your own values in the user configuration file,
named `.chatgpt-cli/config.yaml`, located in your home directory.

The structure of the user configuration file mirrors that of the default configuration. For instance, to override
@@ -249,23 +263,16 @@ To run all tests, use:
./bin/chatgpt what type of dog is a Jack Russel?
```

4. As mentioned before, to enable history tracking across CLI calls, create a ~/.chatgpt-cli directory using the
command:
4. As mentioned previously, the ChatGPT CLI supports tracking conversation history across CLI calls. This feature
creates a seamless and conversational experience with the GPT model, as the history is utilized as context in
subsequent interactions.

To enable this feature, you need to create a `~/.chatgpt-cli` directory using the command:

```shell
mkdir -p ~/.chatgpt-cli
```

With this directory in place, the CLI will automatically manage message history for seamless conversations with the GPT
model. The history acts as a sliding window, maintaining a maximum of 4096 tokens to ensure optimal performance and
interaction quality.

For more options, see:

```shell
./bin/chatgpt --help
```

## Reporting Issues and Contributing

If you encounter any issues or have suggestions for improvements,
@@ -39,6 +39,7 @@ func New(caller http.Caller, cs config.ConfigStore, hs history.HistoryStore) (*C
}

caller.SetAPIKey(configuration.APIKey)
hs.SetThread(configuration.Thread)

return &Client{
Config: configuration,

@@ -23,11 +23,12 @@ import (

const (
defaultMaxTokens = 4096
defaultURL = "https://api.openai.com"
defaultURL = "https://default.openai.com"
defaultName = "default-name"
defaultModel = "gpt-3.5-turbo"
defaultCompletionsPath = "/v1/chat/completions"
defaultModelsPath = "/v1/models"
defaultCompletionsPath = "/default/completions"
defaultModelsPath = "/default/models"
defaultThread = "default-thread"
envApiKey = "api-key"
)

@@ -230,6 +231,7 @@ func testClient(t *testing.T, when spec.G, it spec.S) {
})
it("ignores history when configured to do so", func() {
mockCaller.EXPECT().SetAPIKey(envApiKey).Times(1)
mockHistoryStore.EXPECT().SetThread(defaultThread).Times(1)
mockConfigStore.EXPECT().Read().Return(types.Config{OmitHistory: true}, nil).Times(1)

subject, err := client.New(mockCaller, mockConfigStore, mockHistoryStore)

@@ -484,6 +486,7 @@ func newClientFactory(mc *MockCaller, mcs *MockConfigStore, mhs *MockHistoryStor
URL: defaultURL,
CompletionsPath: defaultCompletionsPath,
ModelsPath: defaultModelsPath,
Thread: defaultThread,
}).Times(1)

return &clientFactory{

@@ -495,6 +498,7 @@ func newClientFactory(mc *MockCaller, mcs *MockConfigStore, mhs *MockHistoryStor

func (f *clientFactory) buildClientWithoutConfig() *client.Client {
f.mockCaller.EXPECT().SetAPIKey(envApiKey).Times(1)
f.mockHistoryStore.EXPECT().SetThread(defaultThread).Times(1)
f.mockConfigStore.EXPECT().Read().Return(types.Config{}, nil).Times(1)

c, err := client.New(f.mockCaller, f.mockConfigStore, f.mockHistoryStore)

@@ -505,6 +509,7 @@ func (f *clientFactory) buildClientWithoutConfig() *client.Client {

func (f *clientFactory) buildClientWithConfig(config types.Config) *client.Client {
f.mockCaller.EXPECT().SetAPIKey(envApiKey).Times(1)
f.mockHistoryStore.EXPECT().SetThread(defaultThread).Times(1)
f.mockConfigStore.EXPECT().Read().Return(config, nil).Times(1)

c, err := client.New(f.mockCaller, f.mockConfigStore, f.mockHistoryStore)
@@ -63,6 +63,18 @@ func (mr *MockHistoryStoreMockRecorder) Read() *gomock.Call {
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "Read", reflect.TypeOf((*MockHistoryStore)(nil).Read))
}

// SetThread mocks base method.
func (m *MockHistoryStore) SetThread(arg0 string) {
m.ctrl.T.Helper()
m.ctrl.Call(m, "SetThread", arg0)
}

// SetThread indicates an expected call of SetThread.
func (mr *MockHistoryStoreMockRecorder) SetThread(arg0 interface{}) *gomock.Call {
mr.mock.ctrl.T.Helper()
return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "SetThread", reflect.TypeOf((*MockHistoryStore)(nil).SetThread), arg0)
}

// Write mocks base method.
func (m *MockHistoryStore) Write(arg0 []types.Message) error {
m.ctrl.T.Helper()
@@ -42,7 +42,7 @@ func main() {

rootCmd.PersistentFlags().BoolVarP(&interactiveMode, "interactive", "i", false, "Use interactive mode")
rootCmd.PersistentFlags().BoolVarP(&queryMode, "query", "q", false, "Use query mode instead of stream mode")
rootCmd.PersistentFlags().BoolVar(&clearHistory, "clear-history", false, "Clear the history of ChatGPT CLI")
rootCmd.PersistentFlags().BoolVar(&clearHistory, "clear-history", false, "Clear all prior conversation context for the current thread")
rootCmd.PersistentFlags().BoolVarP(&showConfig, "config", "c", false, "Display the configuration")
rootCmd.PersistentFlags().BoolVarP(&showVersion, "version", "v", false, "Display the version information")
rootCmd.PersistentFlags().BoolVarP(&listModels, "list-models", "l", false, "List available models")

@@ -85,11 +85,18 @@ func run(cmd *cobra.Command, args []string) error {
}

if clearHistory {
historyHandler := history.New()
err := historyHandler.Delete()
historyHandler, err := history.New()
if err != nil {
return err
}

cm := configmanager.New(config.New())
historyHandler.SetThread(cm.Config.Thread)

if err := historyHandler.Delete(); err != nil {
return err
}

fmt.Println("History successfully cleared.")
return nil
}

@@ -105,7 +112,8 @@ func run(cmd *cobra.Command, args []string) error {
return nil
}

client, err := client.New(http.New(), config.New(), history.New())
hs, _ := history.New() // do not error out
client, err := client.New(http.New(), config.New(), hs)
if err != nil {
return err
}
@@ -15,6 +15,7 @@ const (
openAIURL = "https://api.openai.com"
openAICompletionsPath = "/v1/chat/completions"
openAIModelsPath = "/v1/models"
openAIThread = "default"
)

type ConfigStore interface {

@@ -54,6 +55,7 @@ func (f *FileIO) ReadDefaults() types.Config {
URL: openAIURL,
CompletionsPath: openAICompletionsPath,
ModelsPath: openAIModelsPath,
Thread: openAIThread,
}
}
@@ -28,6 +28,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
defaultURL = "default-url"
defaultModel = "default-model"
defaultApiKey = "default-api-key"
defaultThread = "default-thread"
defaultCompletionsPath = "default-completions-path"
defaultModelsPath = "default-models-path"
defaultOmitHistory = false

@@ -54,6 +55,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
CompletionsPath: defaultCompletionsPath,
ModelsPath: defaultModelsPath,
OmitHistory: defaultOmitHistory,
Thread: defaultThread,
}

envPrefix = strings.ToUpper(defaultConfig.Name) + "_"

@@ -85,6 +87,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.Name).To(Equal(defaultName))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided model", func() {
userModel := "the-model"

@@ -102,6 +105,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.Name).To(Equal(defaultName))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided name", func() {
userName := "the-name"

@@ -119,6 +123,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided max-tokens", func() {
userMaxTokens := 42

@@ -136,6 +141,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))

})
it("gives precedence to the user provided URL", func() {

@@ -154,6 +160,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided completions-path", func() {
completionsPath := "the-completions-path"

@@ -171,6 +178,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided models-path", func() {
modelsPath := "the-models-path"

@@ -188,6 +196,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(modelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided api-key", func() {
apiKey := "new-api-key"

@@ -205,6 +214,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(apiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the user provided omit-history", func() {
omitHistory := true

@@ -221,8 +231,27 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.Thread).To(Equal(defaultThread))
Expect(subject.Config.OmitHistory).To(Equal(omitHistory))
})
it("gives precedence to the user provided thread", func() {
userThread := "user-thread"

mockConfigStore.EXPECT().ReadDefaults().Return(defaultConfig).Times(1)
mockConfigStore.EXPECT().Read().Return(types.Config{Thread: userThread}, nil).Times(1)

subject := configmanager.New(mockConfigStore).WithEnvironment()

Expect(subject.Config.Name).To(Equal(defaultName))
Expect(subject.Config.Model).To(Equal(defaultModel))
Expect(subject.Config.MaxTokens).To(Equal(defaultMaxTokens))
Expect(subject.Config.URL).To(Equal(defaultURL))
Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(userThread))
})
it("gives precedence to the OMIT_HISTORY environment variable", func() {
var (
environmentValue = true

@@ -243,8 +272,32 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.Thread).To(Equal(defaultThread))
Expect(subject.Config.OmitHistory).To(Equal(environmentValue))
})
it("gives precedence to the THREAD environment variable", func() {
var (
environmentValue = "env-thread"
configValue = "conf-thread"
)

Expect(os.Setenv(envPrefix+"THREAD", environmentValue)).To(Succeed())

mockConfigStore.EXPECT().ReadDefaults().Return(defaultConfig).Times(1)
mockConfigStore.EXPECT().Read().Return(types.Config{Thread: configValue}, nil).Times(1)

subject := configmanager.New(mockConfigStore).WithEnvironment()

Expect(subject.Config.Name).To(Equal(defaultName))
Expect(subject.Config.Model).To(Equal(defaultModel))
Expect(subject.Config.MaxTokens).To(Equal(defaultMaxTokens))
Expect(subject.Config.URL).To(Equal(defaultURL))
Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(environmentValue))
})
it("gives precedence to the API_KEY environment variable", func() {
var (
environmentKey = "environment-api-key"

@@ -266,6 +319,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(environmentKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the MODEL environment variable", func() {
var (

@@ -288,6 +342,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the MAX_TOKENS environment variable", func() {
var (

@@ -310,6 +365,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the URL environment variable", func() {
var (

@@ -332,6 +388,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the COMPLETIONS_PATH environment variable", func() {
var (

@@ -354,6 +411,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
it("gives precedence to the MODELS_PATH environment variable", func() {
var (

@@ -376,6 +434,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(subject.Config.ModelsPath).To(Equal(envModelsPath))
Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
})

@@ -396,6 +455,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
Expect(result).To(ContainSubstring(defaultModelsPath))
Expect(result).To(ContainSubstring(fmt.Sprintf("%d", defaultMaxTokens)))
Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
Expect(subject.Config.Thread).To(Equal(defaultThread))
})
})

@@ -438,4 +498,5 @@ func cleanEnv(envPrefix string) {
Expect(os.Unsetenv(envPrefix + "COMPLETIONS_PATH")).To(Succeed())
Expect(os.Unsetenv(envPrefix + "MODELS_PATH")).To(Succeed())
Expect(os.Unsetenv(envPrefix + "OMIT_HISTORY")).To(Succeed())
Expect(os.Unsetenv(envPrefix + "THREAD")).To(Succeed())
}
@@ -2,46 +2,77 @@ package history

import (
"encoding/json"
"github.com/kardolus/chatgpt-cli/config"
"github.com/kardolus/chatgpt-cli/types"
"github.com/kardolus/chatgpt-cli/utils"
"os"
"path"
"path/filepath"
)

const (
historyDirName = "history"
jsonExtension = ".json"
)

type HistoryStore interface {
Delete() error
Read() ([]types.Message, error)
Write([]types.Message) error
SetThread(thread string)
}

// Ensure FileIO implements the HistoryStore interface
var _ HistoryStore = &FileIO{}

type FileIO struct {
historyFilePath string
historyDir string
thread string
}

func New() (*FileIO, error) {
_ = migrate()

dir, err := getHistoryDir()
if err != nil {
return nil, err
}

chatGPTDir, err := utils.GetChatGPTDirectory()
if err != nil {
return nil, err
}

fileInfo, err := os.Stat(chatGPTDir)
if err == nil {
if fileInfo.IsDir() {
err = os.MkdirAll(dir, 0755)
}
}

func New() *FileIO {
path, _ := getPath()
return &FileIO{
historyFilePath: path,
}
historyDir: dir,
}, err
}

func (f *FileIO) WithFilePath(historyFilePath string) *FileIO {
f.historyFilePath = historyFilePath
func (f *FileIO) SetThread(thread string) {
f.thread = thread
}

func (f *FileIO) WithDirectory(historyDir string) *FileIO {
f.historyDir = historyDir
return f
}

func (f *FileIO) Delete() error {
if _, err := os.Stat(f.historyFilePath); err == nil {
return os.Remove(f.historyFilePath)
if _, err := os.Stat(f.getPath()); err == nil {
return os.Remove(f.getPath())
}
return nil
}

func (f *FileIO) Read() ([]types.Message, error) {
return parseFile(f.historyFilePath)
return parseFile(f.getPath())
}

func (f *FileIO) Write(messages []types.Message) error {

@@ -50,16 +81,56 @@ func (f *FileIO) Write(messages []types.Message) error {
return err
}

return os.WriteFile(f.historyFilePath, data, 0644)
return os.WriteFile(f.getPath(), data, 0644)
}

func getPath() (string, error) {
func (f *FileIO) getPath() string {
return filepath.Join(f.historyDir, f.thread+jsonExtension)
}

func getHistoryDir() (string, error) {
homeDir, err := utils.GetChatGPTDirectory()
if err != nil {
return "", err
}

return filepath.Join(homeDir, "history"), nil
return filepath.Join(homeDir, historyDirName), nil
}

// migrate moves the legacy "history" file in ~/.chatgpt-cli to "history/default.json"
func migrate() error {
hiddenDir, err := utils.GetChatGPTDirectory()
if err != nil {
return err
}

historyFile := path.Join(hiddenDir, historyDirName)

fileInfo, err := os.Stat(historyFile)
if err != nil {
return err
}

if !fileInfo.IsDir() {
defaults := config.New().ReadDefaults()

// move the legacy "history" file to "default.json"
if err := os.Rename(historyFile, path.Join(hiddenDir, defaults.Thread+jsonExtension)); err != nil {
return err
}

// create the "history" directory
if err := os.Mkdir(historyFile, 0755); err != nil {
return err
}

// move default.json to the "history" directory
if err := os.Rename(path.Join(hiddenDir, defaults.Thread+jsonExtension), path.Join(historyFile, defaults.Thread+jsonExtension)); err != nil {
return err
}
}

return nil
}

func parseFile(fileName string) ([]types.Message, error) {
@@ -6,6 +6,7 @@ import (
"github.com/kardolus/chatgpt-cli/configmanager"
"github.com/kardolus/chatgpt-cli/history"
"github.com/kardolus/chatgpt-cli/types"
"github.com/kardolus/chatgpt-cli/utils"
"github.com/onsi/gomega/gexec"
"github.com/sclevine/spec"
"github.com/sclevine/spec/report"

@@ -38,9 +39,10 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
})

when("Read, Write and Delete History", func() {
const threadName = "default-thread"

var (
tmpDir string
tmpFile *os.File
fileIO *history.FileIO
messages []types.Message
err error

@@ -50,12 +52,9 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
tmpDir, err = os.MkdirTemp("", "chatgpt-cli-test")
Expect(err).NotTo(HaveOccurred())

tmpFile, err = os.CreateTemp(tmpDir, "history.json")
Expect(err).NotTo(HaveOccurred())

Expect(tmpFile.Close()).To(Succeed())

fileIO = history.New().WithFilePath(tmpFile.Name())
fileIO, _ = history.New()
fileIO = fileIO.WithDirectory(tmpDir)
fileIO.SetThread(threadName)

messages = []types.Message{
{

@@ -91,7 +90,7 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
err = fileIO.Delete()
Expect(err).NotTo(HaveOccurred())

_, err = os.Stat(tmpFile.Name())
_, err = os.Stat(threadName + ".json")
Expect(os.IsNotExist(err)).To(BeTrue())
})
})

@@ -170,7 +169,6 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
SetDefaultEventuallyTimeout(5 * time.Second)

Expect(buildBinary()).To(Succeed())

Expect(runMockServer()).To(Succeed())

Eventually(func() (string, error) {

@@ -214,14 +212,17 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
Eventually(session).Should(gexec.Exit(exitSuccess))
})

it("should not require an API key for the --clear-history flag", func() {
it("should require a hidden folder for the --clear-history flag", func() {
Expect(os.Unsetenv(apiKeyEnvVar)).To(Succeed())

command := exec.Command(binaryPath, "--clear-history")
session, err := gexec.Start(command, io.Discard, io.Discard)
Expect(err).NotTo(HaveOccurred())

Eventually(session).Should(gexec.Exit(exitSuccess))
Eventually(session).Should(gexec.Exit(exitFailure))

output := string(session.Out.Contents())
Expect(output).To(ContainSubstring(".chatgpt-cli: no such file or directory"))
})

it("should require an argument for the --set-model flag", func() {

@@ -325,9 +326,63 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
Expect(os.RemoveAll(filePath)).To(Succeed())
})

it("migrates the legacy history as expected", func() {
// Legacy history file should not exist
legacyFile := path.Join(filePath, "history")
Expect(legacyFile).NotTo(BeAnExistingFile())

// History should not exist yet
historyFile := path.Join(filePath, "history", "default.json")
Expect(historyFile).NotTo(BeAnExistingFile())

bytes, err := utils.FileToBytes("history.json")
Expect(err).NotTo(HaveOccurred())

Expect(os.WriteFile(legacyFile, bytes, 0644)).To(Succeed())
Expect(legacyFile).To(BeARegularFile())

// Perform a query
command := exec.Command(binaryPath, "--query", "some-query")
session, err := gexec.Start(command, io.Discard, io.Discard)
Expect(err).NotTo(HaveOccurred())

// The CLI response should be as expected
Eventually(session).Should(gexec.Exit(exitSuccess))

output := string(session.Out.Contents())

response := `I don't have personal opinions about bars, but here are some popular bars in Red Hook, Brooklyn:`
Expect(output).To(ContainSubstring(response))

// The history file should have the expected content
Expect(path.Dir(historyFile)).To(BeADirectory())
content, err := os.ReadFile(historyFile)

Expect(err).NotTo(HaveOccurred())
Expect(content).NotTo(BeEmpty())
Expect(string(content)).To(ContainSubstring(response))

// The legacy file should now be a directory
Expect(legacyFile).To(BeADirectory())
Expect(legacyFile).NotTo(BeARegularFile())

// The content was moved to the new file
Expect(string(content)).To(ContainSubstring("Of course! Which city are you referring to?"))
})

it("should not require an API key for the --clear-history flag", func() {
Expect(os.Unsetenv(apiKeyEnvVar)).To(Succeed())

command := exec.Command(binaryPath, "--clear-history")
session, err := gexec.Start(command, io.Discard, io.Discard)
Expect(err).NotTo(HaveOccurred())

Eventually(session).Should(gexec.Exit(exitSuccess))
})

it("keeps track of history", func() {
// History should not exist yet
historyFile := path.Join(filePath, "history")
historyFile := path.Join(filePath, "history", "default.json")
Expect(historyFile).NotTo(BeAnExistingFile())

// Perform a query

@@ -344,7 +399,7 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
Expect(output).To(ContainSubstring(response))

// The history file should have the expected content
Expect(historyFile).To(BeAnExistingFile())
Expect(path.Dir(historyFile)).To(BeADirectory())
content, err := os.ReadFile(historyFile)

Expect(err).NotTo(HaveOccurred())
resources/testdata/history.json (14 lines added, vendored, new file)
@@ -0,0 +1,14 @@
[
{
"role": "system",
"content": "You are a helpful assistant."
},
{
"role": "user",
"content": "can you tell me more about this city?"
},
{
"role": "assistant",
"content": "Of course! Which city are you referring to?\n"
}
]
@@ -5,8 +5,9 @@ type Config struct {
APIKey string `yaml:"api_key"`
Model string `yaml:"model"`
MaxTokens int `yaml:"max_tokens"`
Thread string `yaml:"thread"`
OmitHistory bool `yaml:"omit_history"`
URL string `yaml:"url"`
CompletionsPath string `yaml:"completions_path"`
ModelsPath string `yaml:"models_path"`
OmitHistory bool `yaml:"omit_history"`
}