Kyle Corbitt
0c37506975
Bugfix: show prompts for view-only experiments
...
Accidentally put the Monaco loading code in the wrong place! This fixes it.
2023-07-24 14:57:33 -07:00
arcticfly
2b2e0ab8ee
Define refinement actions in the model providers (#87)
...
* Add descriptions of fields in llama 2 input schema
* Let GPT-4 know when the provider stays the same
* Allow refetching in the event of any errors
* Define refinement actions in model providers
* Fix prettier
2023-07-23 17:37:08 -07:00
arcticfly
3dbb06ec00
Give negative margin to account for border (#86)
2023-07-23 16:49:12 -07:00
David Corbitt
85d42a014b
Fix padding on AddVariant button
2023-07-22 16:27:59 -07:00
arcticfly
7d1ded3b18
Improve menu styling (#85)
2023-07-22 16:22:00 -07:00
Kyle Corbitt
b00f6dd04b
Merge pull request #84 from OpenPipe/paginated-scenarios
...
Paginate scenarios
2023-07-22 16:12:02 -07:00
Kyle Corbitt
2e395e4d39
Paginate scenarios
...
Show 10 scenarios at a time and let the user paginate through them, keeping the interface responsive even with potentially thousands of scenarios.
2023-07-22 16:10:16 -07:00
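A minimal sketch of the pagination approach described in the commit above, assuming a tRPC + Prisma setup; the router name, model (`testScenario`), and fields (`experimentId`, `sortIndex`) are illustrative assumptions, not taken from the actual codebase:

```ts
import { z } from "zod";
import { createTRPCRouter, publicProcedure } from "~/server/api/trpc";
import { prisma } from "~/server/db";

const PAGE_SIZE = 10; // show 10 scenarios at a time

export const scenariosRouter = createTRPCRouter({
  list: publicProcedure
    .input(z.object({ experimentId: z.string(), page: z.number().int().min(1) }))
    .query(async ({ input }) => {
      const where = { experimentId: input.experimentId };
      // Fetch one page plus the total count so the UI can render page controls.
      const [scenarios, count] = await Promise.all([
        prisma.testScenario.findMany({
          where,
          orderBy: { sortIndex: "asc" },
          skip: (input.page - 1) * PAGE_SIZE,
          take: PAGE_SIZE,
        }),
        prisma.testScenario.count({ where }),
      ]);
      return { scenarios, count, pageCount: Math.ceil(count / PAGE_SIZE) };
    }),
});
```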
Kyle Corbitt
4b06d05908
Merge pull request #82 from OpenPipe/space-out-scenarios
...
Separate scenarios from prompts in outputs table
2023-07-22 14:44:51 -07:00
Kyle Corbitt
aabf355b81
Merge pull request #81 from OpenPipe/fullscreen-editor
...
Fullscreen editor
2023-07-22 14:44:42 -07:00
Kyle Corbitt
61e5f0775d
separate scenarios from prompts in outputs table
2023-07-22 07:38:19 -07:00
Kyle Corbitt
cc1d1178da
Fullscreen editor
2023-07-21 22:19:38 -07:00
David Corbitt
7466db63df
Make REPLICATE_API_TOKEN optional
2023-07-21 20:23:38 -07:00
David Corbitt
79a0b03bf8
Add another function call example
2023-07-21 20:16:36 -07:00
arcticfly
6fb7a82d72
Add support for switching to Llama models (#80)
...
* Add support for switching to Llama models
* Fix prettier
2023-07-21 20:10:59 -07:00
Kyle Corbitt
4ea30a3ba3
Merge pull request #79 from OpenPipe/copy-evals
...
Copy over evals when new cell created
2023-07-21 18:43:44 -07:00
Kyle Corbitt
52d1d5c7ee
Copy over evals when new cell created
...
Fixes a bug where new cells generated as clones of existing cells didn't get the eval results cloned as well.
2023-07-21 18:40:40 -07:00
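One way the fix described above could look, as a hedged sketch: when a cell is created as a clone of an existing one, copy the source cell's eval results to the new cell. The model and field names (`outputEvaluation`, `cellId`) are assumptions.

```ts
import { prisma } from "~/server/db";

// Hypothetical helper: clone eval results from the source cell to the new cell
// so cloned cells don't show up with empty evals.
export async function copyEvalsToClonedCell(sourceCellId: string, clonedCellId: string) {
  const evals = await prisma.outputEvaluation.findMany({
    where: { cellId: sourceCellId },
  });
  if (evals.length === 0) return;

  await prisma.outputEvaluation.createMany({
    data: evals.map(({ id: _id, cellId: _cellId, ...rest }) => ({
      ...rest,
      cellId: clonedCellId, // re-attach each eval to the cloned cell
    })),
  });
}
```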
Kyle Corbitt
46036a44d2
small README update
2023-07-21 14:32:07 -07:00
Kyle Corbitt
3753fe5c16
Merge pull request #78 from OpenPipe/bugfix-max-tokens
...
Fix typescript hints for max_tokens
2023-07-21 12:10:00 -07:00
Kyle Corbitt
213a00a8e6
Fix typescript hints for max_tokens
2023-07-21 12:04:58 -07:00
Kyle Corbitt
af9943eefc
Merge pull request #77 from OpenPipe/provider-types
...
Slightly better typings for ModelProviders
2023-07-21 11:51:25 -07:00
Kyle Corbitt
741128e0f4
Better division of labor between frontend and backend model providers
...
A bit more careful thinking about which types go where.
2023-07-21 11:49:35 -07:00
David Corbitt
aff14539d8
Add comment to .env.example
2023-07-21 11:29:21 -07:00
David Corbitt
1af81a50a9
Add REPLICATE_API_TOKEN to .env.example
2023-07-21 11:28:14 -07:00
Kyle Corbitt
7e1fbb3767
Slightly better typings for ModelProviders
...
Still not great because the `any`s loosen some call sites up more than I'd like, but better than the broken types before.
2023-07-21 06:50:05 -07:00
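A hedged sketch of what generically-typed model providers can look like, in the spirit of the commit above; every name and field here is an assumption, not the project's real API.

```ts
// A provider is parameterized by its model names, input, and output types.
interface ModelProvider<TModel extends string, TInput, TOutput> {
  models: readonly TModel[];
  getCompletion: (input: TInput) => Promise<TOutput>;
  calculateCost?: (output: TOutput) => number | undefined;
}

// A concrete provider keeps precise types at its own call sites...
declare const openaiChatCompletion: ModelProvider<
  "gpt-3.5-turbo" | "gpt-4",
  { messages: { role: string; content: string }[] },
  { choices: { message: { content: string } }[] }
>;

// ...but a registry of heterogeneous providers tends to erase them again,
// which is where the `any`s mentioned above loosen the call sites.
const modelProviders: Record<string, ModelProvider<string, any, any>> = {
  "openai/ChatCompletion": openaiChatCompletion,
};
```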
David Corbitt
a5d972005e
Add user's current prompt to prompt derivation
2023-07-21 00:43:39 -07:00
arcticfly
a180b5bef2
Show prompt diff when changing models (#76)
...
* Make CompareFunctions more configurable
* Change RefinePromptModal styles
* Accept newModel in getModifiedPromptFn
* Show prompt comparison in SelectModelModal
* Pass variant to SelectModelModal
* Update instructions
* Properly use isDisabled
2023-07-20 23:26:49 -07:00
Kyle Corbitt
55c697223e
Merge pull request #74 from OpenPipe/model-providers
...
replicate/llama2 provider
2023-07-20 23:21:42 -07:00
arcticfly
9978075867
Fix auth flicker (#75)
...
* Remove experiments flicker for unauthenticated users
* Decrease size of NewScenarioButton spinner
2023-07-20 20:46:31 -07:00
Kyle Corbitt
847753c32b
replicate/llama2 provider
...
Still need to fix the types, but it runs.
2023-07-20 19:55:03 -07:00
Kyle Corbitt
372c2512c9
Merge pull request #73 from OpenPipe/model-providers
...
More work on modelProviders
2023-07-20 18:56:14 -07:00
Kyle Corbitt
332a2101c0
More work on modelProviders
...
I think everything that's OpenAI-specific is inside modelProviders at this point, so we can get started adding more providers.
2023-07-20 18:54:26 -07:00
arcticfly
1822fe198e
Initially render AutoResizeTextArea without overflow (#72)
...
* Rerender resized text area with scroll
* Remove default hidden overflow
2023-07-20 15:00:09 -07:00
Kyle Corbitt
f06e1db3db
Merge pull request #71 from OpenPipe/model-providers
...
Prep for more model providers
2023-07-20 14:55:31 -07:00
Kyle Corbitt
ded6678e97
Prep for more model providers
...
Adds a `modelProvider` field to `promptVariants`, currently set to "openai/ChatCompletion" for all variants.
Adds a `modelProviders/` directory where we can define and store pluggable model providers. Currently just OpenAI. Not everything is pluggable yet -- notably the code to actually generate completions hasn't been migrated to this setup yet.
Does a lot of work to get the types working. Prompts are now defined with a function `definePrompt(modelProvider, config)` instead of `prompt = config`. Adds a script to migrate old prompt definitions.
This is still partial work, but the diff is large enough that I want to get it in. I don't think anything is broken but I haven't tested thoroughly.
2023-07-20 14:49:22 -07:00
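A minimal sketch of the `definePrompt(modelProvider, config)` shape described in the commit above, assuming an OpenAI chat-completion config; the provider key, config fields, and example prompt are illustrative assumptions.

```ts
// Hypothetical config type for the "openai/ChatCompletion" provider.
type OpenAIChatConfig = {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  max_tokens?: number;
};

function definePrompt(modelProvider: "openai/ChatCompletion", config: OpenAIChatConfig) {
  // Tagging the config with its provider records which provider should run each
  // variant, instead of assuming OpenAI everywhere.
  return { modelProvider, config };
}

// Old style:  prompt = { model: "gpt-4", messages: [...] }
// New style:
const prompt = definePrompt("openai/ChatCompletion", {
  model: "gpt-4",
  messages: [{ role: "user", content: "Summarize the scenario in one sentence." }],
});
```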
arcticfly
9314a86857
Use translation in initial scenarios (#70)
2023-07-20 14:28:48 -07:00
David Corbitt
54dcb4a567
Prevent text input labels from overlaying scenarios header
2023-07-20 14:28:36 -07:00
David Corbitt
2c8c8d07cf
Merge branch 'main' of github.com:corbt/prompt-lab
2023-07-20 13:38:58 -07:00
David Corbitt
e885bdd365
Fix ScenarioEditor padding
2023-07-20 13:38:46 -07:00
arcticfly
86dc36a656
Improve refinement (#69)
...
* Format construction function on return
* Add more refinement examples
* Treat 503 like 429
* Define prompt as object
* Fix prettier
2023-07-20 13:05:27 -07:00
arcticfly
55c077d604
Create FloatingLabelInput for scenario variables (#68)
...
* Create FloatingLabelInput
* Fix prettier
* Simplify changes
2023-07-20 12:20:12 -07:00
arcticfly
e598e454d0
Add new predefined refinement options (#67)
...
* Add new predefined refinement options
* Fix prettier
* Add icon to SelectModelModal title
2023-07-19 20:10:08 -07:00
David Corbitt
6e3f90cd2f
Add more info to refinement
2023-07-19 18:10:23 -07:00
David Corbitt
eec894e101
Allow multiline instructions
2023-07-19 18:10:04 -07:00
David Corbitt
f797fc3fa4
Eliminate spinner flicker in OutputCell
2023-07-19 18:09:47 -07:00
David Corbitt
335dc0357f
Fix CompareFunctions for mobile
2023-07-19 17:24:19 -07:00
arcticfly
e6e2c706c2
Change up refinement UI (#66)
...
* Remove unused ScenarioVariantCell fields
* Refine deriveNewConstructFn
* Fix prettier
* Remove migration script
* Add refine modal
* Fix prettier
* Fix diff checker overflow
* Decrease diff height
* Add more context to prompt refining
* Auto-expand prompt when refining
2023-07-19 17:19:45 -07:00
Kyle Corbitt
7d2166b305
Merge pull request #65 from OpenPipe/no-model
...
Cache cost on ModelOutput
2023-07-19 16:22:35 -07:00
Kyle Corbitt
60765e51ac
Remove model from promptVariant and add cost
...
Storing the model on promptVariant is problematic because it isn't always in sync with the actual prompt definition. I'm removing it for now to see if we can get away with that -- might have to add it back in later if this causes trouble.
Added `cost` to modelOutput as well so we can cache that, which is important given that the cost calculations won't be the same between different API providers.
2023-07-19 16:20:53 -07:00
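A hedged sketch of the kind of cost caching described in the commit above: compute a per-output cost from token usage at generation time and store it on the output, since pricing differs by provider. The table below uses OpenAI's published mid-2023 per-1K-token rates, but treat the names and numbers as illustrative.

```ts
const PRICES_PER_1K_TOKENS: Record<string, { prompt: number; completion: number }> = {
  "gpt-3.5-turbo": { prompt: 0.0015, completion: 0.002 },
  "gpt-4": { prompt: 0.03, completion: 0.06 },
};

function calculateCost(model: string, promptTokens: number, completionTokens: number) {
  const price = PRICES_PER_1K_TOKENS[model];
  if (!price) return undefined; // unknown model: leave the cost uncached
  return (promptTokens * price.prompt + completionTokens * price.completion) / 1000;
}

// Example: a gpt-4 call with 1,000 prompt tokens and 500 completion tokens costs
// 1000 * 0.03 / 1000 + 500 * 0.06 / 1000 = $0.06, stored once on the model output
// so the UI never has to re-derive provider-specific pricing later.
```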
arcticfly
2c4ba6eb9b
Update README.md (#64)
2023-07-19 15:39:21 -07:00
arcticfly
4c97b9f147
Refine prompt (#63)
...
* Remove unused ScenarioVariantCell fields
* Refine deriveNewConstructFn
* Fix prettier
* Remove migration script
* Add refine modal
* Fix prettier
* Fix diff checker overflow
* Decrease diff height
2023-07-19 15:31:40 -07:00