Add thread-based context management

kardolus
2023-06-22 14:43:13 -04:00
parent 4845ef836c
commit 37c6e68029
11 changed files with 355 additions and 118 deletions

README.md

@@ -29,17 +29,19 @@ environment, demonstrating its practicality and effectiveness.
* **Query mode**: Single input-output interactions with the GPT model.
* **Interactive mode**: The interactive mode allows for a more conversational experience with the model. Exit interactive mode by simply typing 'exit'.
* **Thread-based context management**: Enjoy seamless conversations with the GPT model with individualized context for each thread, much like your experience on the OpenAI website. Each unique thread has its own history, ensuring relevant and coherent responses across different chat instances.
* **Sliding window history**: To stay within token limits, the chat history automatically trims while still preserving the necessary context.
* **Custom context from any source**: You can provide the GPT model with a custom context during conversation. This context can be piped in from any source, such as local files, standard input, or even another program. This flexibility allows the model to adapt to a wide range of conversational scenarios.
* **Model listing**: Access a list of available models using the `-l` or `--list-models` flag.
* **Advanced configuration options**: The CLI supports a layered configuration system where settings can be specified through default values, a `config.yaml` file, and environment variables. For quick adjustments, use the `--set-model` and `--set-max-tokens` flags. To verify your current settings, use the `--config` or `-c` flag. The newly added `omit_history` configuration option adds another layer of customization to your user experience.
## Installation
@@ -96,70 +98,82 @@ Choose the appropriate command for your system, which will download the binary,
the following line to your shell profile (e.g., ~/.bashrc, ~/.zshrc, or ~/.bash_profile), replacing your_api_key with your actual key:

```shell
export OPENAI_API_KEY="your_api_key"
```

2. To enable history tracking across CLI calls, create a ~/.chatgpt-cli directory using the command:

```shell
mkdir -p ~/.chatgpt-cli
```

Once this directory is in place, the CLI automatically manages the message history for each "thread" you converse with. The history operates like a sliding window, maintaining context up to a configurable token maximum. This ensures a balance between maintaining conversation context and achieving optimal performance.

By default, if a specific thread is not provided by the user, the CLI uses the default thread and stores the history at `~/.chatgpt-cli/history/default.json`. You can find more details about how to configure the `thread` parameter in the [Configuration](#configuration) section of this document.
3. Try it out:

```shell
chatgpt what is the capital of the Netherlands
```

4. To start interactive mode, use the `-i` or `--interactive` flag:

```shell
chatgpt --interactive
```

5. To use the pipe feature, create a text file containing some context. For example, create a file named context.txt with the following content:

```shell
Kya is a playful dog who loves swimming and playing fetch.
```

Then, use the pipe feature to provide this context to ChatGPT:

```shell
cat context.txt | chatgpt "What kind of toy would Kya enjoy?"
```

6. To list all available models, use the `-l` or `--list-models` flag:

```shell
chatgpt --list-models
```

7. For more options, see:

```shell
chatgpt --help
```
## Configuration

The ChatGPT CLI adopts a three-tier configuration strategy, with different levels of precedence assigned to default values, the `config.yaml` file, and environment variables, in that respective order.

Configuration variables:

| Variable           | Description                                                                       | Default                  |
|--------------------|-----------------------------------------------------------------------------------|--------------------------|
| `name`             | The prefix for environment variable overrides.                                    | 'openai'                 |
| `api_key`          | Your OpenAI API key.                                                              | (none for security)      |
| `model`            | The GPT model used by the application.                                            | 'gpt-3.5-turbo'          |
| `max_tokens`       | The maximum number of tokens that can be used in a single API call.               | 4096                     |
| `thread`           | The name of the current chat thread. Each unique thread name has its own context. | 'default'                |
| `omit_history`     | If true, the chat history will not be used to provide context for the GPT model.  | false                    |
| `url`              | The base URL for the OpenAI API.                                                  | 'https://api.openai.com' |
| `completions_path` | The API endpoint for completions.                                                 | '/v1/chat/completions'   |
| `models_path`      | The API endpoint for accessing model information.                                 | '/v1/models'             |
The defaults can be overridden by providing your own values in the user configuration file, named `.chatgpt-cli/config.yaml`, located in your home directory.

The structure of the user configuration file mirrors that of the default configuration. For instance, to override
@@ -213,58 +227,51 @@ building the application:
1. Run the tests using the following scripts:

For unit tests, run:

```shell
./scripts/unit.sh
```

For integration tests, run:

```shell
./scripts/integration.sh
```

For contract tests, run:

```shell
./scripts/contract.sh
```

To run all tests, use:

```shell
./scripts/all-tests.sh
```

2. Build the app using the installation script:

```shell
./scripts/install.sh
```

3. After a successful build, test the application with the following command:

```shell
./bin/chatgpt what type of dog is a Jack Russel?
```

4. As mentioned previously, the ChatGPT CLI supports tracking conversation history across CLI calls. This feature creates a seamless and conversational experience with the GPT model, as the history is utilized as context in subsequent interactions.

To enable this feature, you need to create a `~/.chatgpt-cli` directory using the command:

```shell
mkdir -p ~/.chatgpt-cli
```

For more options, see:

```shell
./bin/chatgpt --help
```

## Reporting Issues and Contributing


@@ -39,6 +39,7 @@ func New(caller http.Caller, cs config.ConfigStore, hs history.HistoryStore) (*C
}

caller.SetAPIKey(configuration.APIKey)
hs.SetThread(configuration.Thread)

return &Client{
	Config: configuration,


@@ -23,11 +23,12 @@ import (
const (
	defaultMaxTokens       = 4096
	defaultURL             = "https://default.openai.com"
	defaultName            = "default-name"
	defaultModel           = "gpt-3.5-turbo"
	defaultCompletionsPath = "/default/completions"
	defaultModelsPath      = "/default/models"
	defaultThread          = "default-thread"

	envApiKey = "api-key"
)
@@ -230,6 +231,7 @@ func testClient(t *testing.T, when spec.G, it spec.S) {
})

it("ignores history when configured to do so", func() {
	mockCaller.EXPECT().SetAPIKey(envApiKey).Times(1)
	mockHistoryStore.EXPECT().SetThread(defaultThread).Times(1)
	mockConfigStore.EXPECT().Read().Return(types.Config{OmitHistory: true}, nil).Times(1)

	subject, err := client.New(mockCaller, mockConfigStore, mockHistoryStore)
@@ -484,6 +486,7 @@ func newClientFactory(mc *MockCaller, mcs *MockConfigStore, mhs *MockHistoryStor
	URL:             defaultURL,
	CompletionsPath: defaultCompletionsPath,
	ModelsPath:      defaultModelsPath,
	Thread:          defaultThread,
}).Times(1)

return &clientFactory{
@@ -495,6 +498,7 @@ func newClientFactory(mc *MockCaller, mcs *MockConfigStore, mhs *MockHistoryStor
func (f *clientFactory) buildClientWithoutConfig() *client.Client {
	f.mockCaller.EXPECT().SetAPIKey(envApiKey).Times(1)
	f.mockHistoryStore.EXPECT().SetThread(defaultThread).Times(1)
	f.mockConfigStore.EXPECT().Read().Return(types.Config{}, nil).Times(1)

	c, err := client.New(f.mockCaller, f.mockConfigStore, f.mockHistoryStore)
@@ -505,6 +509,7 @@ func (f *clientFactory) buildClientWithoutConfig() *client.Client {
func (f *clientFactory) buildClientWithConfig(config types.Config) *client.Client {
	f.mockCaller.EXPECT().SetAPIKey(envApiKey).Times(1)
	f.mockHistoryStore.EXPECT().SetThread(defaultThread).Times(1)
	f.mockConfigStore.EXPECT().Read().Return(config, nil).Times(1)

	c, err := client.New(f.mockCaller, f.mockConfigStore, f.mockHistoryStore)


@@ -63,6 +63,18 @@ func (mr *MockHistoryStoreMockRecorder) Read() *gomock.Call {
	return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "Read", reflect.TypeOf((*MockHistoryStore)(nil).Read))
}

// SetThread mocks base method.
func (m *MockHistoryStore) SetThread(arg0 string) {
	m.ctrl.T.Helper()
	m.ctrl.Call(m, "SetThread", arg0)
}

// SetThread indicates an expected call of SetThread.
func (mr *MockHistoryStoreMockRecorder) SetThread(arg0 interface{}) *gomock.Call {
	mr.mock.ctrl.T.Helper()
	return mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "SetThread", reflect.TypeOf((*MockHistoryStore)(nil).SetThread), arg0)
}

// Write mocks base method.
func (m *MockHistoryStore) Write(arg0 []types.Message) error {
	m.ctrl.T.Helper()


@@ -42,7 +42,7 @@ func main() {
rootCmd.PersistentFlags().BoolVarP(&interactiveMode, "interactive", "i", false, "Use interactive mode")
rootCmd.PersistentFlags().BoolVarP(&queryMode, "query", "q", false, "Use query mode instead of stream mode")
rootCmd.PersistentFlags().BoolVar(&clearHistory, "clear-history", false, "Clear all prior conversation context for the current thread")
rootCmd.PersistentFlags().BoolVarP(&showConfig, "config", "c", false, "Display the configuration")
rootCmd.PersistentFlags().BoolVarP(&showVersion, "version", "v", false, "Display the version information")
rootCmd.PersistentFlags().BoolVarP(&listModels, "list-models", "l", false, "List available models")
@@ -85,11 +85,18 @@ func run(cmd *cobra.Command, args []string) error {
}

if clearHistory {
	historyHandler, err := history.New()
	if err != nil {
		return err
	}

	cm := configmanager.New(config.New())
	historyHandler.SetThread(cm.Config.Thread)

	if err := historyHandler.Delete(); err != nil {
		return err
	}

	fmt.Println("History successfully cleared.")
	return nil
}
@@ -105,7 +112,8 @@ func run(cmd *cobra.Command, args []string) error {
	return nil
}

hs, _ := history.New() // do not error out

client, err := client.New(http.New(), config.New(), hs)
if err != nil {
	return err
}


@@ -15,6 +15,7 @@ const (
const (
	openAIURL             = "https://api.openai.com"
	openAICompletionsPath = "/v1/chat/completions"
	openAIModelsPath      = "/v1/models"
	openAIThread          = "default"
)

type ConfigStore interface {
@@ -54,6 +55,7 @@ func (f *FileIO) ReadDefaults() types.Config {
	URL:             openAIURL,
	CompletionsPath: openAICompletionsPath,
	ModelsPath:      openAIModelsPath,
	Thread:          openAIThread,
}
}


@@ -28,6 +28,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
defaultURL             = "default-url"
defaultModel           = "default-model"
defaultApiKey          = "default-api-key"
defaultThread          = "default-thread"
defaultCompletionsPath = "default-completions-path"
defaultModelsPath      = "default-models-path"
defaultOmitHistory     = false
@@ -54,6 +55,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	CompletionsPath: defaultCompletionsPath,
	ModelsPath:      defaultModelsPath,
	OmitHistory:     defaultOmitHistory,
	Thread:          defaultThread,
}

envPrefix = strings.ToUpper(defaultConfig.Name) + "_"
@@ -85,6 +87,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.Name).To(Equal(defaultName))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided model", func() {
	userModel := "the-model"
@@ -102,6 +105,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.Name).To(Equal(defaultName))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided name", func() {
	userName := "the-name"
@@ -119,6 +123,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided max-tokens", func() {
	userMaxTokens := 42
@@ -136,6 +141,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided URL", func() {
@@ -154,6 +160,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided completions-path", func() {
	completionsPath := "the-completions-path"
@@ -171,6 +178,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided models-path", func() {
	modelsPath := "the-models-path"
@@ -188,6 +196,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(modelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided api-key", func() {
	apiKey := "new-api-key"
@@ -205,6 +214,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(apiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the user provided omit-history", func() {
	omitHistory := true
@@ -221,8 +231,27 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
	Expect(subject.Config.OmitHistory).To(Equal(omitHistory))
})

it("gives precedence to the user provided thread", func() {
	userThread := "user-thread"

	mockConfigStore.EXPECT().ReadDefaults().Return(defaultConfig).Times(1)
	mockConfigStore.EXPECT().Read().Return(types.Config{Thread: userThread}, nil).Times(1)

	subject := configmanager.New(mockConfigStore).WithEnvironment()

	Expect(subject.Config.Name).To(Equal(defaultName))
	Expect(subject.Config.Model).To(Equal(defaultModel))
	Expect(subject.Config.MaxTokens).To(Equal(defaultMaxTokens))
	Expect(subject.Config.URL).To(Equal(defaultURL))
	Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(userThread))
})

it("gives precedence to the OMIT_HISTORY environment variable", func() {
	var (
		environmentValue = true
@@ -243,8 +272,32 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
	Expect(subject.Config.OmitHistory).To(Equal(environmentValue))
})

it("gives precedence to the THREAD environment variable", func() {
	var (
		environmentValue = "env-thread"
		configValue      = "conf-thread"
	)

	Expect(os.Setenv(envPrefix+"THREAD", environmentValue)).To(Succeed())

	mockConfigStore.EXPECT().ReadDefaults().Return(defaultConfig).Times(1)
	mockConfigStore.EXPECT().Read().Return(types.Config{Thread: configValue}, nil).Times(1)

	subject := configmanager.New(mockConfigStore).WithEnvironment()

	Expect(subject.Config.Name).To(Equal(defaultName))
	Expect(subject.Config.Model).To(Equal(defaultModel))
	Expect(subject.Config.MaxTokens).To(Equal(defaultMaxTokens))
	Expect(subject.Config.URL).To(Equal(defaultURL))
	Expect(subject.Config.CompletionsPath).To(Equal(defaultCompletionsPath))
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(environmentValue))
})

it("gives precedence to the API_KEY environment variable", func() {
	var (
		environmentKey = "environment-api-key"
@@ -266,6 +319,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(environmentKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the MODEL environment variable", func() {
	var (
@@ -288,6 +342,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the MAX_TOKENS environment variable", func() {
	var (
@@ -310,6 +365,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the URL environment variable", func() {
	var (
@@ -332,6 +388,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the COMPLETIONS_PATH environment variable", func() {
	var (
@@ -354,6 +411,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(defaultModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})

it("gives precedence to the MODELS_PATH environment variable", func() {
	var (
@@ -376,6 +434,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(subject.Config.ModelsPath).To(Equal(envModelsPath))
	Expect(subject.Config.APIKey).To(Equal(defaultApiKey))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})
})
@@ -396,6 +455,7 @@ func testConfig(t *testing.T, when spec.G, it spec.S) {
	Expect(result).To(ContainSubstring(defaultModelsPath))
	Expect(result).To(ContainSubstring(fmt.Sprintf("%d", defaultMaxTokens)))
	Expect(subject.Config.OmitHistory).To(Equal(defaultOmitHistory))
	Expect(subject.Config.Thread).To(Equal(defaultThread))
})
})
@@ -438,4 +498,5 @@ func cleanEnv(envPrefix string) {
	Expect(os.Unsetenv(envPrefix + "COMPLETIONS_PATH")).To(Succeed())
	Expect(os.Unsetenv(envPrefix + "MODELS_PATH")).To(Succeed())
	Expect(os.Unsetenv(envPrefix + "OMIT_HISTORY")).To(Succeed())
	Expect(os.Unsetenv(envPrefix + "THREAD")).To(Succeed())
}


@@ -2,46 +2,77 @@ package history
import (
	"encoding/json"
	"github.com/kardolus/chatgpt-cli/config"
	"github.com/kardolus/chatgpt-cli/types"
	"github.com/kardolus/chatgpt-cli/utils"
	"os"
	"path"
	"path/filepath"
)

const (
	historyDirName = "history"
	jsonExtension  = ".json"
)

type HistoryStore interface {
	Delete() error
	Read() ([]types.Message, error)
	Write([]types.Message) error
	SetThread(thread string)
}

// Ensure FileIO implements the HistoryStore interface
var _ HistoryStore = &FileIO{}

type FileIO struct {
	historyDir string
	thread     string
}

func New() (*FileIO, error) {
	_ = migrate()

	dir, err := getHistoryDir()
	if err != nil {
		return nil, err
	}

	chatGPTDir, err := utils.GetChatGPTDirectory()
	if err != nil {
		return nil, err
	}

	fileInfo, err := os.Stat(chatGPTDir)
	if err == nil {
		if fileInfo.IsDir() {
			err = os.MkdirAll(dir, 0755)
		}
	}

	return &FileIO{
		historyDir: dir,
	}, err
}

func (f *FileIO) SetThread(thread string) {
	f.thread = thread
}

func (f *FileIO) WithDirectory(historyDir string) *FileIO {
	f.historyDir = historyDir
	return f
}

func (f *FileIO) Delete() error {
	if _, err := os.Stat(f.getPath()); err == nil {
		return os.Remove(f.getPath())
	}

	return nil
}

func (f *FileIO) Read() ([]types.Message, error) {
	return parseFile(f.getPath())
}

func (f *FileIO) Write(messages []types.Message) error {
@@ -50,16 +81,56 @@ func (f *FileIO) Write(messages []types.Message) error {
return err return err
} }
return os.WriteFile(f.historyFilePath, data, 0644) return os.WriteFile(f.getPath(), data, 0644)
} }
func getPath() (string, error) { func (f *FileIO) getPath() string {
return filepath.Join(f.historyDir, f.thread+jsonExtension)
}
func getHistoryDir() (string, error) {
homeDir, err := utils.GetChatGPTDirectory() homeDir, err := utils.GetChatGPTDirectory()
if err != nil { if err != nil {
return "", err return "", err
} }
return filepath.Join(homeDir, "history"), nil return filepath.Join(homeDir, historyDirName), nil
}
// migrate moves the legacy "history" file in ~/.chatgpt-cli to "history/default.json"
func migrate() error {
hiddenDir, err := utils.GetChatGPTDirectory()
if err != nil {
return err
}
historyFile := path.Join(hiddenDir, historyDirName)
fileInfo, err := os.Stat(historyFile)
if err != nil {
return err
}
if !fileInfo.IsDir() {
defaults := config.New().ReadDefaults()
// move the legacy "history" file to "default.json"
if err := os.Rename(historyFile, path.Join(hiddenDir, defaults.Thread+jsonExtension)); err != nil {
return err
}
// create the "history" directory
if err := os.Mkdir(historyFile, 0755); err != nil {
return err
}
// move default.json to the "history" directory
if err := os.Rename(path.Join(hiddenDir, defaults.Thread+jsonExtension), path.Join(historyFile, defaults.Thread+jsonExtension)); err != nil {
return err
}
}
return nil
} }
func parseFile(fileName string) ([]types.Message, error) { func parseFile(fileName string) ([]types.Message, error) {

View File

@@ -6,6 +6,7 @@ import (
 	"github.com/kardolus/chatgpt-cli/configmanager"
 	"github.com/kardolus/chatgpt-cli/history"
 	"github.com/kardolus/chatgpt-cli/types"
+	"github.com/kardolus/chatgpt-cli/utils"
 	"github.com/onsi/gomega/gexec"
 	"github.com/sclevine/spec"
 	"github.com/sclevine/spec/report"
@@ -38,9 +39,10 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 	})

 	when("Read, Write and Delete History", func() {
+		const threadName = "default-thread"
+
 		var (
 			tmpDir   string
-			tmpFile  *os.File
 			fileIO   *history.FileIO
 			messages []types.Message
 			err      error
@@ -50,12 +52,9 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 			tmpDir, err = os.MkdirTemp("", "chatgpt-cli-test")
 			Expect(err).NotTo(HaveOccurred())

-			tmpFile, err = os.CreateTemp(tmpDir, "history.json")
-			Expect(err).NotTo(HaveOccurred())
-
-			Expect(tmpFile.Close()).To(Succeed())
-
-			fileIO = history.New().WithFilePath(tmpFile.Name())
+			fileIO, _ = history.New()
+			fileIO = fileIO.WithDirectory(tmpDir)
+			fileIO.SetThread(threadName)

 			messages = []types.Message{
 				{
@@ -91,7 +90,7 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 			err = fileIO.Delete()
 			Expect(err).NotTo(HaveOccurred())

-			_, err = os.Stat(tmpFile.Name())
+			_, err = os.Stat(threadName + ".json")
 			Expect(os.IsNotExist(err)).To(BeTrue())
 		})
 	})
@@ -170,7 +169,6 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 		SetDefaultEventuallyTimeout(5 * time.Second)

 		Expect(buildBinary()).To(Succeed())
 		Expect(runMockServer()).To(Succeed())
-
 		Eventually(func() (string, error) {
@@ -214,14 +212,17 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 			Eventually(session).Should(gexec.Exit(exitSuccess))
 		})

-		it("should not require an API key for the --clear-history flag", func() {
+		it("should require a hidden folder for the --clear-history flag", func() {
 			Expect(os.Unsetenv(apiKeyEnvVar)).To(Succeed())

 			command := exec.Command(binaryPath, "--clear-history")
 			session, err := gexec.Start(command, io.Discard, io.Discard)
 			Expect(err).NotTo(HaveOccurred())

-			Eventually(session).Should(gexec.Exit(exitSuccess))
+			Eventually(session).Should(gexec.Exit(exitFailure))
+
+			output := string(session.Out.Contents())
+			Expect(output).To(ContainSubstring(".chatgpt-cli: no such file or directory"))
 		})

 		it("should require an argument for the --set-model flag", func() {
@@ -325,9 +326,63 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 			Expect(os.RemoveAll(filePath)).To(Succeed())
 		})

+		it("migrates the legacy history as expected", func() {
+			// Legacy history file should not exist
+			legacyFile := path.Join(filePath, "history")
+			Expect(legacyFile).NotTo(BeAnExistingFile())
+
+			// History should not exist yet
+			historyFile := path.Join(filePath, "history", "default.json")
+			Expect(historyFile).NotTo(BeAnExistingFile())
+
+			bytes, err := utils.FileToBytes("history.json")
+			Expect(err).NotTo(HaveOccurred())
+
+			Expect(os.WriteFile(legacyFile, bytes, 0644)).To(Succeed())
+			Expect(legacyFile).To(BeARegularFile())
+
+			// Perform a query
+			command := exec.Command(binaryPath, "--query", "some-query")
+			session, err := gexec.Start(command, io.Discard, io.Discard)
+			Expect(err).NotTo(HaveOccurred())
+
+			// The CLI response should be as expected
+			Eventually(session).Should(gexec.Exit(exitSuccess))
+
+			output := string(session.Out.Contents())
+			response := `I don't have personal opinions about bars, but here are some popular bars in Red Hook, Brooklyn:`
+			Expect(output).To(ContainSubstring(response))
+
+			// The history file should have the expected content
+			Expect(path.Dir(historyFile)).To(BeADirectory())
+			content, err := os.ReadFile(historyFile)
+			Expect(err).NotTo(HaveOccurred())
+			Expect(content).NotTo(BeEmpty())
+			Expect(string(content)).To(ContainSubstring(response))
+
+			// The legacy file should now be a directory
+			Expect(legacyFile).To(BeADirectory())
+			Expect(legacyFile).NotTo(BeARegularFile())
+
+			// The content was moved to the new file
+			Expect(string(content)).To(ContainSubstring("Of course! Which city are you referring to?"))
+		})
+
+		it("should not require an API key for the --clear-history flag", func() {
+			Expect(os.Unsetenv(apiKeyEnvVar)).To(Succeed())
+
+			command := exec.Command(binaryPath, "--clear-history")
+			session, err := gexec.Start(command, io.Discard, io.Discard)
+			Expect(err).NotTo(HaveOccurred())
+
+			Eventually(session).Should(gexec.Exit(exitSuccess))
+		})
+
 		it("keeps track of history", func() {
 			// History should not exist yet
-			historyFile := path.Join(filePath, "history")
+			historyFile := path.Join(filePath, "history", "default.json")
 			Expect(historyFile).NotTo(BeAnExistingFile())

 			// Perform a query
@@ -344,7 +399,7 @@ func testIntegration(t *testing.T, when spec.G, it spec.S) {
 			Expect(output).To(ContainSubstring(response))

 			// The history file should have the expected content
-			Expect(historyFile).To(BeAnExistingFile())
+			Expect(path.Dir(historyFile)).To(BeADirectory())
 			content, err := os.ReadFile(historyFile)
 			Expect(err).NotTo(HaveOccurred())
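The migration exercised by the test above boils down to three file moves, performed by `migrate` in the history package. Sketched in plain shell, with a scratch directory standing in for `~/.chatgpt-cli`:

```shell
set -eu

# Stand-in for ~/.chatgpt-cli; the real code resolves this from the home directory.
base="$(mktemp -d)"

# Legacy layout: "history" is a plain JSON file, not a directory.
echo '[]' > "$base/history"

mv "$base/history" "$base/default.json"               # 1. move the legacy file aside
mkdir "$base/history"                                 # 2. create the history directory
mv "$base/default.json" "$base/history/default.json"  # 3. move the file into it

ls "$base/history"  # default.json
```

After these steps the old single-file history becomes the default thread, which is why the test asserts the legacy path is now a directory containing the original content.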

resources/testdata/history.json (vendored, new file, 14 lines)
View File

@@ -0,0 +1,14 @@
+[
+  {
+    "role": "system",
+    "content": "You are a helpful assistant."
+  },
+  {
+    "role": "user",
+    "content": "can you tell me more about this city?"
+  },
+  {
+    "role": "assistant",
+    "content": "Of course! Which city are you referring to?\n"
+  }
+]
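The fixture is the same JSON array of role/content messages that the history package's `parseFile` decodes. A minimal standalone sketch of that decoding, with the `Message` type mirroring `types.Message` and the data inlined rather than read from disk:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Message mirrors the role/content shape of types.Message in this repo.
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// decode parses a history file's JSON array into messages.
func decode(data []byte) ([]Message, error) {
	var messages []Message
	err := json.Unmarshal(data, &messages)
	return messages, err
}

func main() {
	data := []byte(`[
	  {"role": "system", "content": "You are a helpful assistant."},
	  {"role": "user", "content": "can you tell me more about this city?"}
	]`)

	messages, err := decode(data)
	if err != nil {
		panic(err)
	}
	fmt.Println(len(messages), messages[1].Role) // prints "2 user"
}
```

Because the format is a flat array, appending a turn is just append-and-rewrite, which is what `Write` does with `os.WriteFile`.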

View File

@@ -5,8 +5,9 @@ type Config struct {
 	APIKey          string `yaml:"api_key"`
 	Model           string `yaml:"model"`
 	MaxTokens       int    `yaml:"max_tokens"`
+	Thread          string `yaml:"thread"`
+	OmitHistory     bool   `yaml:"omit_history"`
 	URL             string `yaml:"url"`
 	CompletionsPath string `yaml:"completions_path"`
 	ModelsPath      string `yaml:"models_path"`
-	OmitHistory     bool   `yaml:"omit_history"`
 }
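With the new `Thread` field, a user config selecting a custom thread might look like the fragment below. All values are illustrative; the keys come from the struct's yaml tags:

```yaml
# ~/.chatgpt-cli/config.yaml (illustrative values)
api_key: sk-...
model: gpt-3.5-turbo
max_tokens: 4096
thread: work          # history is kept in ~/.chatgpt-cli/history/work.json
omit_history: false
```

Setting `omit_history: true` skips history reads and writes entirely, regardless of which thread is configured.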