Adds a `modelProvider` field to `promptVariants`; for now it is set to "openai/ChatCompletion" for all variants.
Adds a `modelProviders/` directory where pluggable model providers can be defined and stored. Currently only OpenAI is implemented, and not everything is pluggable yet -- notably, the code that actually generates completions hasn't been migrated to this setup.
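A pluggable provider setup like this might be sketched as a simple registry; the interface and function names below are illustrative assumptions, not the repo's actual API:

```typescript
// Hypothetical sketch of a pluggable model-provider registry.
// All names here are illustrative, not the actual OpenPipe types.
interface ModelProvider {
  id: string; // e.g. "openai/ChatCompletion"
  models: string[];
}

const providers: Record<string, ModelProvider> = {};

function registerProvider(provider: ModelProvider): void {
  providers[provider.id] = provider;
}

function getProvider(id: string): ModelProvider {
  const provider = providers[id];
  if (!provider) throw new Error(`Unknown model provider: ${id}`);
  return provider;
}

// Register the one provider that exists so far.
registerProvider({
  id: "openai/ChatCompletion",
  models: ["gpt-3.5-turbo", "gpt-4"],
});
```

Each file under `modelProviders/` would then export one such provider, and variant code would look providers up by the `modelProvider` field instead of hard-coding OpenAI.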
Reworks the types so that prompts are defined with a function call, `definePrompt(modelProvider, config)`, instead of a bare `prompt = config` assignment. Adds a script to migrate old prompt definitions.
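A minimal sketch of what the `definePrompt(modelProvider, config)` shape could look like, assuming hypothetical type names (the repo's real types may differ):

```typescript
// Hypothetical sketch of the definePrompt pattern; names are assumptions.
type ProviderId = "openai/ChatCompletion";

interface PromptConfig {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

interface DefinedPrompt {
  modelProvider: ProviderId;
  config: PromptConfig;
}

// Attaching the provider at definition time lets the type system tie
// the config shape to the provider, which a bare `prompt = config`
// assignment could not express.
function definePrompt(
  modelProvider: ProviderId,
  config: PromptConfig
): DefinedPrompt {
  return { modelProvider, config };
}

// Old style: prompt = config (no provider information attached)
// New style:
const prompt = definePrompt("openai/ChatCompletion", {
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
});
```

The migration script would then rewrite each old `prompt = config` definition into the equivalent `definePrompt(...)` call.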
This is still partial work, but the diff is large enough that I want to get it in. I don't think anything is broken, but I haven't tested thoroughly.
* Add dropdown header for model switching
* Allow variant duplication
* Fix prettier
* Use env variable to restrict prisma logs
* Fix env.mjs
* Remove unnecessary scroll bar from function call output
* Properly record when 404 error occurs in queryLLM task
* Add SelectedModelInfo in SelectModelModal
* Add react-select
* Calculate new prompt after switching model
* Send newly selected model with creation request
* Get new prompt construction function back from GPT-4
* Fix prettier
* Fix prettier