Compare commits


104 Commits

Author SHA1 Message Date
arcticfly
092b48552d Update README.md 2023-08-24 20:58:45 -07:00
arcticfly
33ca98b267 Enlarge fine-tune gif 2023-08-24 14:14:38 -07:00
arcticfly
39c943f2ec Change layout of README.md 2023-08-24 14:13:01 -07:00
arcticfly
2aa4ac1594 Update opening gif in README.md 2023-08-24 12:46:25 -07:00
arcticfly
42ade01f22 Update README.md 2023-08-24 11:14:25 -07:00
David Corbitt
59b79049c1 Move license to top level 2023-08-24 10:41:23 -07:00
arcticfly
0d7433cb7e Update README.md
Include more models
2023-08-24 00:13:24 -07:00
arcticfly
ec59252010 Wrap in Portal (#191) 2023-08-23 23:48:53 -07:00
arcticfly
87e2339df2 Remove openpipe object from config (#190)
* Remove openpipe object from config

* Remove comment
2023-08-23 23:31:39 -07:00
arcticfly
75ad6619a5 Add InfoCircles (#189) 2023-08-23 22:06:37 -07:00
Kyle Corbitt
4b8941d53a Merge pull request #188 from OpenPipe/export-fixes
Export fixes
2023-08-23 21:30:36 -07:00
David Corbitt
0d691d17cc Rename input to instruction in alpaca format 2023-08-23 21:26:28 -07:00
David Corbitt
815d4faad2 Fix mobile export styles 2023-08-23 21:21:09 -07:00
arcticfly
9632ccbc71 Export formatted logged calls (#187)
* Export formatted data

* Properly update inputMessageHashMap

* Hide remove duplicates checkbox in advanced options

* Remove unused import
2023-08-23 20:45:27 -07:00
Kyle Corbitt
a4131e4a10 Merge pull request #186 from OpenPipe/python-client
Python client
2023-08-23 19:37:55 -07:00
Kyle Corbitt
db1c8f171d Python client published 2023-08-23 19:37:05 -07:00
David Corbitt
678392ef17 Wait until flags are loaded to show beta modal 2023-08-23 18:27:39 -07:00
arcticfly
af722128e8 Use feature flags to control beta features (#185)
* Use feature flags to control beta features

* Remove references to beta env variable
2023-08-23 18:18:56 -07:00
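The two commits above move beta gating from an environment variable to runtime feature flags, and only show the beta modal once those flags have loaded. A minimal sketch of that pattern, assuming PostHog (which appears in the app's dependencies) as the flag provider; the hook and flag key below are illustrative, not the repo's actual identifiers:

```ts
import posthog from "posthog-js";
import { useEffect, useState } from "react";

// Illustrative hook: resolve a flag only after PostHog has loaded flags,
// so beta-only UI (e.g. the beta modal) is never flashed prematurely.
export function useBetaFlag(flagKey = "beta-features") {
  const [state, setState] = useState({ loaded: false, enabled: false });

  useEffect(() => {
    // onFeatureFlags fires once flags are available.
    posthog.onFeatureFlags(() => {
      setState({ loaded: true, enabled: posthog.isFeatureEnabled(flagKey) ?? false });
    });
  }, [flagKey]);

  return state;
}

// Usage: render nothing until flags are loaded, then gate the beta UI.
// const { loaded, enabled } = useBetaFlag();
// if (!loaded) return null;
```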
Kyle Corbitt
50a79b6e3a python compat fixes 2023-08-23 17:14:19 -07:00
arcticfly
f59150ff5b Add flow for fine-tuning (#183)
* Remove unnecessary dataset code

* Fix jump on row selection

* Add FineTuneButton

* Add model slug to modal

* Add fine tunes to schema

* Remove dataset routers

* Remove more dataset-specific code

* Remove more data code

* Fix horizontal scroll bar jumping

* Add fine tunes page

* Actually create the fine tune entry

* Add beta modal

* Require beta for fine tunes and request logs

* Send user to waitlist link

* control beta features in .env variable

* Combine migration files

* Show beta features in app shell

* Clear selected log ids last when closing fine tune modal

* Remove ModalCloseButton from BetaModal

* Remove unused import

* Change timestamps to camelCase
2023-08-23 16:13:21 -07:00
David Corbitt
b58e0a8d54 Merge branch 'main' of github.com:corbt/prompt-lab 2023-08-23 03:29:23 -07:00
David Corbitt
dc82a3fa82 Add variant editor shadow 2023-08-23 03:29:07 -07:00
arcticfly
fedbf5784e Fix padding for mobile sign in (#184) 2023-08-23 01:07:55 -07:00
arcticfly
888c04af50 Allow user to toggle visible columns (#182)
* Maintain tag casing

* Persist column visibility in zustand

* Persist only visibleColumns key

* merge persisted state

* Only show ColumnVisibilityDropdown after rehydration

* Record storage rehydrated

* Add useIsClientRehydrated hook

* Hide ActionButton text on mobile

* Condense Paginator on mobile

---------

Co-authored-by: Kyle Corbitt <kyle@corbt.com>
2023-08-21 23:13:29 -07:00
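The column-visibility commit above persists part of a zustand store and waits for client-side rehydration before rendering the dropdown. A rough sketch of that setup using zustand's `persist` middleware; the store shape and storage key are illustrative:

```ts
import { create } from "zustand";
import { persist } from "zustand/middleware";

type ColumnVisibilityState = {
  visibleColumns: Record<string, boolean>;
  toggleColumn: (name: string) => void;
};

// Persist only the `visibleColumns` key, merging it over fresh defaults on load.
export const useColumnVisibility = create<ColumnVisibilityState>()(
  persist(
    (set) => ({
      visibleColumns: {},
      toggleColumn: (name) =>
        set((s) => ({
          visibleColumns: { ...s.visibleColumns, [name]: !s.visibleColumns[name] },
        })),
    }),
    {
      name: "column-visibility",
      partialize: (state) => ({ visibleColumns: state.visibleColumns }),
      merge: (persisted, current) => ({
        ...current,
        ...(persisted as Partial<ColumnVisibilityState>),
      }),
    },
  ),
);

// A component can additionally wait for rehydration (e.g. via a
// useIsClientRehydrated-style hook) before rendering the dropdown.
```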
arcticfly
1b36453051 Update README.md
Comment out most gifs
2023-08-21 13:54:23 -07:00
Kyle Corbitt
2f37b3ed87 Merge pull request #181 from OpenPipe/catch-rejections
Catch unhandled rejections in background worker
2023-08-18 22:58:31 -07:00
Kyle Corbitt
8fa7b691db make max pool size configurable 2023-08-18 22:56:24 -07:00
David Corbitt
17866a5249 Fix typo in newConstructionFn 2023-08-18 21:45:43 -07:00
Kyle Corbitt
947eba3216 Catch unhandled rejections in background worker
Previously, an unhandled promise rejection in the background worker would crash the process. This way we log it and don't crash.
2023-08-18 19:03:54 -07:00
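A standalone sketch of the handler described above (the actual code added to the Sentry server config appears in the diff further down):

```ts
import { isError } from "lodash-es";

// Log unhandled promise rejections instead of letting them crash the worker.
process.on("unhandledRejection", (reason) => {
  const details = isError(reason) ? reason.stack : reason;
  console.log("Unhandled Rejection at:", details);
});
```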
arcticfly
ef1f9458f4 Add prompt ids (#177)
* Add prompt ids

* Add prompt ids
2023-08-18 16:56:17 -07:00
Kyle Corbitt
c6c7e746ee Merge pull request #180 from OpenPipe/priorities
Prioritize job execution
2023-08-18 13:46:31 -07:00
Kyle Corbitt
3be0a90960 Prioritize job execution
Makes it so our most critical jobs go through first. Priority order:

1. Force-refetched cells
2. Cells visible on the current page
3. All other cells
4. Retries
5. Evaluations
2023-08-18 13:44:33 -07:00
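In graphile-worker, jobs with numerically smaller priorities run first, so the ordering above can be expressed as a small lookup table passed to `addJob`. A sketch; the task name and numeric values are illustrative:

```ts
import type { WorkerUtils } from "graphile-worker";

// Smaller numbers run first in graphile-worker.
const PRIORITY = {
  forceRefetchedCell: 0,
  visibleCell: 10,
  otherCell: 20,
  retry: 30,
  evaluation: 40,
} as const;

export async function enqueueCell(
  workerUtils: WorkerUtils,
  cellId: string,
  kind: keyof typeof PRIORITY,
) {
  await workerUtils.addJob("queryModel", { cellId }, { priority: PRIORITY[kind] });
}
```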
Kyle Corbitt
9b1f2ac30a new script to run workers 2023-08-18 13:01:01 -07:00
Kyle Corbitt
1b394cc72b more resources 2023-08-18 12:14:28 -07:00
Kyle Corbitt
26b9731bab worker env 2023-08-18 11:45:54 -07:00
Kyle Corbitt
7c8ec8f6a7 Merge pull request #179 from OpenPipe/job-dedupe
Run workers in a separate Docker container
2023-08-18 11:26:32 -07:00
Kyle Corbitt
10dd53e7f6 Run workers in a separate Docker container
We've outgrown the run-everything-on-one-machine setup. This change moves background jobs to a different Docker image in production. It also adds a `jobKey` to certain jobs so if we try to process the same cell multiple times it'll only actually run the job once.
2023-08-18 11:16:00 -07:00
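The deduplication mentioned above relies on graphile-worker's `jobKey`: enqueueing a job with an existing key updates the pending job rather than creating a second one. A sketch with an illustrative task name and key format:

```ts
import { quickAddJob } from "graphile-worker";

// Re-enqueueing the same cell updates the existing pending job instead of
// adding a duplicate, so each cell is only processed once.
export async function enqueueOnce(cellId: string) {
  await quickAddJob(
    { connectionString: process.env.DATABASE_URL },
    "queryModel",
    { cellId },
    { jobKey: `queryModel:${cellId}` },
  );
}
```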
Kyle Corbitt
b1802fc04b Merge pull request #176 from OpenPipe/more-js
Streaming + logging works in Typescript SDK
2023-08-18 08:56:56 -07:00
Kyle Corbitt
f2135ddc72 Streaming + logging works in Typescript SDK
Also added some high-level tests to minimize the chances that we're breaking anything.

The typescript SDK is mostly functional at this point, with the exception that we don't have a build process or way to import it when deployed as an NPM package.
2023-08-18 08:53:08 -07:00
arcticfly
ca89eafb0b Create new uiId for forked variants and scenarios (#175)
* Create new uiIds for forked variants and scenarios

* Add replaceVariant.mutateAsync to onSave dependencies
2023-08-18 08:09:07 -07:00
arcticfly
b50d47beaf Square header border when scrolled down (#174)
* Square header border when scrolled down

* Remove unused import
2023-08-18 01:41:47 -07:00
arcticfly
733d53625b Add Gryphe/MythoMax-L2-13b (#173) 2023-08-18 00:37:16 -07:00
arcticfly
a5e59e4235 Allow user to delete scenario without variables (#172)
* Allow user to delete scenario without variables

* Hide expand button for empty scenario editor

* Add header to scenario modal
2023-08-18 00:08:32 -07:00
Kyle Corbitt
d0102e3202 Merge pull request #171 from OpenPipe/experiment-slug
Use shorter experiment IDs
2023-08-17 23:33:30 -07:00
Kyle Corbitt
bd571c4c4e Merge pull request #170 from OpenPipe/jobs-log
Enqueue tasks more efficiently
2023-08-17 23:33:20 -07:00
Kyle Corbitt
296eb23d97 Use shorter experiment IDs
Because https://app.openpipe.ai/experiments/B1EtN6oHeXMele2 is a cooler URL than https://app.openpipe.ai/experiments/3692942c-6f1b-4bef-83b1-c11f00a3fbdd
2023-08-17 23:28:56 -07:00
Kyle Corbitt
4e2ae7a441 Enqueue tasks more efficiently
Previously we were opening a new database connection for each task we added. Not a problem at small scale but kinda overwhelming for Postgres now that we have more usage.
2023-08-17 22:42:46 -07:00
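The fix amounts to creating one shared enqueue helper (and its underlying connection pool) and reusing it for every task, rather than opening a fresh connection per `addJob`. A sketch assuming graphile-worker's `makeWorkerUtils`:

```ts
import { makeWorkerUtils, type WorkerUtils } from "graphile-worker";

// Create the worker utils (and its pg pool) once and reuse them for every
// enqueue, instead of opening a new database connection per task.
let workerUtilsPromise: Promise<WorkerUtils> | undefined;

function getWorkerUtils() {
  workerUtilsPromise ??= makeWorkerUtils({
    connectionString: process.env.DATABASE_URL,
  });
  return workerUtilsPromise;
}

export async function addTask(identifier: string, payload: unknown) {
  const utils = await getWorkerUtils();
  await utils.addJob(identifier, payload);
}
```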
Kyle Corbitt
072dcee376 Merge pull request #168 from OpenPipe/jobs-log
Admin dashboard for jobs
2023-08-17 22:26:10 -07:00
Kyle Corbitt
94464c0617 Admin dashboard for jobs
Extremely simple jobs dashboard to sanity-check what we've got going on in the job queue.
2023-08-17 22:20:39 -07:00
arcticfly
980644f13c Support vicuna system message (#167)
* Support vicuna system message

* Change tags to USER and ASSISTANT
2023-08-17 21:02:27 -07:00
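Vicuna-style models take a single prompt string with `USER:`/`ASSISTANT:` turns rather than OpenAI-style message objects, which is what the commit above accommodates. A hedged sketch of such a conversion; the exact template the repo uses may differ:

```ts
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Flatten OpenAI-style messages into a Vicuna-style prompt string.
function toVicunaPrompt(messages: ChatMessage[]): string {
  const system = messages.find((m) => m.role === "system")?.content ?? "";
  const turns = messages
    .filter((m) => m.role !== "system")
    .map((m) => (m.role === "user" ? `USER: ${m.content}` : `ASSISTANT: ${m.content}`))
    .join("\n");
  // Trailing "ASSISTANT:" asks the model to produce the next assistant turn.
  return [system, turns, "ASSISTANT:"].filter(Boolean).join("\n");
}
```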
arcticfly
6a56250001 Add platypus 13b, vicuna 13b, and nous hermes 7b (#166)
* Add platypus

* Add vicuna 13b and nous hermes 7b
2023-08-17 20:01:10 -07:00
Kyle Corbitt
b1c7bbbd4a Merge pull request #165 from OpenPipe/better-output
Don't define CellWrapper inline
2023-08-17 19:07:32 -07:00
Kyle Corbitt
3e20fa31ca Don't define CellWrapper inline
This way we don't re-render the entire cell every time a variable changes. Better performance and handles modals correctly.

OutputCell is still a pretty messy component, which we'll have to address at some point, but the complexity is still manageable for now.
2023-08-17 17:52:45 -07:00
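Defining the wrapper inside the parent's render creates a new component type on every render, so React remounts the whole subtree (losing modal state and re-rendering everything). Hoisting it to module scope, as the diff below does, keeps the component identity stable. A condensed illustration:

```tsx
import { type PropsWithChildren } from "react";

// Hoisted to module scope: the component identity is stable across renders,
// so React updates the existing subtree instead of remounting it, which is
// what happens when the wrapper is re-defined inside the parent's render.
type CellWrapperProps = PropsWithChildren<{ refetching: boolean }>;

const CellWrapper = ({ refetching, children }: CellWrapperProps) => (
  <div data-refetching={refetching || undefined}>{children}</div>
);

export default CellWrapper;
```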
Kyle Corbitt
48a8e64be1 Merge pull request #164 from OpenPipe/more-models
Add Nous-Hermes and Airoboros models
2023-08-17 17:51:28 -07:00
David Corbitt
f3a5f11195 Temporarily remove platypus and stableBeluga models 2023-08-17 16:58:52 -07:00
David Corbitt
da5cbaf4dc Remove console.log 2023-08-17 16:16:22 -07:00
David Corbitt
acf74909c9 Ensure ending newline is displayed 2023-08-17 03:37:32 -07:00
David Corbitt
edac8da4a8 Convert system to user prompt for airoboros 2023-08-17 03:10:55 -07:00
David Corbitt
687f3dd85f Rename prompt modal 2023-08-17 02:34:26 -07:00
David Corbitt
0cef3ab5bd Only enable getTemplatedPromptMessage when modal open 2023-08-17 02:32:02 -07:00
David Corbitt
756b3185de Rename CellOptions 2023-08-17 01:44:18 -07:00
David Corbitt
3776ffc4c3 Change ScenarioRow background color 2023-08-17 01:44:06 -07:00
David Corbitt
82549122e1 Add 4 more models 2023-08-17 01:40:05 -07:00
David Corbitt
56a96a7db6 Use different color for row highlight style 2023-08-16 22:46:22 -07:00
David Corbitt
1596b15727 Fix warning from useLayoutEffect 2023-08-16 22:44:18 -07:00
David Corbitt
70d4a5bd9a Fix project settings padding on desktop 2023-08-16 22:40:27 -07:00
arcticfly
c6ec901374 Add openpipe/Chat provider with Open-Orca/OpenOrcaxOpenChat-Preview2-13B model (#163)
* Display 4 decimal points in ModelStatsCard

* Add openpipe-chat provider
2023-08-16 22:37:37 -07:00
David Corbitt
ad7665664a Update 7b-chat version 2023-08-16 19:23:01 -07:00
David Corbitt
108e3d1e85 Revert email to table-cell display on md screens 2023-08-16 18:49:14 -07:00
David Corbitt
76f600722a Sort project members by role 2023-08-16 18:30:27 -07:00
David Corbitt
d9a0e4581f Add bgColor behind selected project in menu 2023-08-16 18:16:44 -07:00
arcticfly
b9251ad93c Fix members table mobile styles (#162) 2023-08-16 17:52:25 -07:00
arcticfly
809ef04dc1 Invite members (#161)
* Allow user invitations

* Restyle inviting members

* Remove annoying comment

* Add page for accepting an invitation

* Send invitation email with Brevo

* Prevent admins from removing personal project users

* Mark access control for cancelProjectInvitation

* Make RadioGroup controlled

* Shorten form helper text

* Use nodemailer to send emails

* Update .env.example
2023-08-16 17:25:31 -07:00
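The invitation flow above sends the invite email over SMTP via nodemailer, configured from the `SMTP_*` and `SENDER_EMAIL` variables added to `.env.example` in this range. A minimal sketch; the helper name, subject line, and body are illustrative:

```ts
import nodemailer from "nodemailer";

// Transport built from the SMTP_* variables added to .env.example.
const transporter = nodemailer.createTransport({
  host: process.env.SMTP_HOST,
  port: Number(process.env.SMTP_PORT),
  auth: { user: process.env.SMTP_LOGIN, pass: process.env.SMTP_PASSWORD },
});

export async function sendInvitationEmail(to: string, invitationToken: string) {
  await transporter.sendMail({
    from: process.env.SENDER_EMAIL,
    to,
    subject: "You've been invited to an OpenPipe project",
    text: `Accept your invitation: https://app.openpipe.ai/invitations/${invitationToken}`,
  });
}
```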
arcticfly
0fba2c9ee7 Add NOT_CONTAINS, fix bugs (#160)
* Fix null case for tag comparisons

* Change debounce time to 500ms

* Add NOT_CONTAINS

* Avoid sql injection

* Store filters by id

* Fix chained NOT_CONTAINS
2023-08-15 16:43:59 -07:00
Kyle Corbitt
ac2ca0f617 Merge pull request #158 from OpenPipe/log-filters
Filter logged calls
2023-08-15 10:16:59 -07:00
David Corbitt
73b9e40ced Give LoggedCallsTable scrollbar 2023-08-15 03:12:59 -07:00
David Corbitt
3447e863cc Prevent model name from wrapping 2023-08-15 02:53:24 -07:00
David Corbitt
897e77b054 Prevent logged calls table flashes 2023-08-15 02:49:46 -07:00
David Corbitt
b22a4cd93b Combine migrations 2023-08-15 02:34:27 -07:00
David Corbitt
3547c85c86 Display tag values 2023-08-15 02:32:05 -07:00
David Corbitt
9636fa033e Add second tag to seed 2023-08-15 02:31:24 -07:00
David Corbitt
890a738568 Filter by tags 2023-08-15 01:50:48 -07:00
David Corbitt
7003595e76 Install lodash-es in client-libs for omit function 2023-08-15 00:59:23 -07:00
David Corbitt
00df4453d3 Remove old prettier files 2023-08-15 00:55:05 -07:00
David Corbitt
4c325fc1cc Move prettier files to top directory 2023-08-15 00:54:52 -07:00
David Corbitt
dfee8a0ed7 Merge branch 'main' into log-filters 2023-08-15 00:41:28 -07:00
David Corbitt
0b4e116783 Undo changes in client-libs 2023-08-15 00:30:35 -07:00
David Corbitt
2bcb1d16a3 Autoresize InputDropdown 2023-08-15 00:27:12 -07:00
David Corbitt
6e7efee21e Seed with tags 2023-08-15 00:26:11 -07:00
David Corbitt
bb9c3a9e61 Condense table 2023-08-15 00:26:05 -07:00
David Corbitt
11bfb5d5e4 Start server with timezone 2023-08-14 23:37:23 -07:00
Kyle Corbitt
b00ab933b3 Merge pull request #157 from OpenPipe/more-js-api
TypeScript SDK mostly working
2023-08-14 23:25:33 -07:00
Kyle Corbitt
8f4e7f7e2e TypeScript SDK mostly working
Ok so this is still pretty rough, and notably there's no reporting for streaming. But for non-streaming requests I've verified that this does in fact report requests locally.
2023-08-14 23:22:27 -07:00
David Corbitt
634739c045 Add InputDropdown 2023-08-14 23:02:08 -07:00
David Corbitt
9a9cbe8fd4 Hide paginators for empty lists 2023-08-14 21:17:03 -07:00
David Corbitt
649dc3376b Debounce filter value updates 2023-08-14 21:00:42 -07:00
David Corbitt
05e774d021 Style filters title 2023-08-14 20:47:18 -07:00
David Corbitt
0e328b13dc Style add filter button 2023-08-14 20:42:51 -07:00
David Corbitt
0a18ca9cd6 Allow filtering by response, model, and status code 2023-08-14 20:16:44 -07:00
David Corbitt
a5fe35912e Allow filter by request contains 2023-08-14 20:01:17 -07:00
David Corbitt
3d3ddbe7a9 Show number of rows in table header 2023-08-14 19:56:15 -07:00
David Corbitt
d8a5617dee Increase button radius 2023-08-14 19:51:06 -07:00
Kyle Corbitt
5da62fdc29 Merge pull request #156 from OpenPipe/move-api
Python package improvements
2023-08-14 19:45:14 -07:00
Kyle Corbitt
2863dc2f89 Merge pull request #155 from OpenPipe/move-api
Move the external API into its own router
2023-08-14 17:02:34 -07:00
176 changed files with 6973 additions and 2722 deletions

1
.gitignore vendored
View File

@@ -3,3 +3,4 @@
*.pyc
node_modules/
*.tsbuildinfo
dist/

2
.prettierignore Normal file
View File

@@ -0,0 +1,2 @@
*.schema.json
app/pnpm-lock.yaml

106
README.md
View File

@@ -1,16 +1,52 @@
<!-- <img src="https://github.com/openpipe/openpipe/assets/41524992/ca59596e-eb80-40f9-921f-6d67f6e6d8fa" width="72px" /> -->
<p align="center">
<a href="https://openpipe.ai">
<img height="70" src="https://github.com/openpipe/openpipe/assets/41524992/70af25fb-1f90-42d9-8a20-3606e3b5aaba" alt="logo">
</a>
</p>
<h1 align="center">
OpenPipe
</h1>
# OpenPipe
<p align="center">
<i>Turn expensive prompts into cheap fine-tuned models.</i>
</p>
OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test and compare candidate prompts, and can automatically [translate](#-translate-between-model-apis) those prompts between models.
<p align="center">
<a href="/LICENSE"><img alt="License Apache-2.0" src="https://img.shields.io/github/license/openpipe/openpipe?style=flat-square"></a>
<a href='http://makeapullrequest.com'><img alt='PRs Welcome' src='https://img.shields.io/badge/PRs-welcome-brightgreen.svg?style=flat-square'/></a>
<a href="https://github.com/openpipe/openpipe/graphs/commit-activity"><img alt="GitHub commit activity" src="https://img.shields.io/github/commit-activity/m/openpipe/openpipe?style=flat-square"/></a>
<a href="https://github.com/openpipe/openpipe/issues"><img alt="GitHub closed issues" src="https://img.shields.io/github/issues-closed/openpipe/openpipe?style=flat-square"/></a>
</p>
<img src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="demo">
<p align="center">
<a href="https://app.openpipe.ai/">Hosted App</a> - <a href="#running-locally">Running Locally</a> - <a href="#sample-experiments">Experiments</a>
</p>
<br>
Use powerful but expensive LLMs to fine-tune smaller and cheaper models suited to your exact needs. Evaluate model and prompt combinations in the playground. Query your past requests and export optimized training data.
<br>
## 🪛 Features
* <b>Fine-Tune</b>
* Easy integration with OpenPipe's SDK in both Python and JS.
* Swiftly query logs using intuitive built-in filters.
* Export data in multiple training formats, including Alpaca and ChatGPT, with deduplication.
* <b>Experiment</b>
* Bulk-test wide-reaching scenarios using code templating.
* Seamlessly translate prompts across different model APIs.
* Tap into autogenerated scenarios for fresh test perspectives.
<img src="https://github.com/openpipe/openpipe/assets/41524992/eaa8b92d-4536-4f63-bbef-4b0b1a60f6b5" alt="fine-tune demo">
<!-- <img height="400px" src="https://github.com/openpipe/openpipe/assets/41524992/66bb1843-cb72-4130-a369-eec2df3b8201" alt="playground demo"> -->
You can use our hosted version of OpenPipe at https://openpipe.ai. You can also clone this repository and [run it locally](#running-locally).
## Sample Experiments
These are simple experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.
These are sample experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.
- [Twitter Sentiment Analysis](https://app.openpipe.ai/experiments/62c20a73-2012-4a64-973c-4b665ad46a57)
- [Reddit User Needs](https://app.openpipe.ai/experiments/22222222-2222-2222-2222-222222222222)
@@ -19,43 +55,25 @@ These are simple experiments users have created that show how OpenPipe works. Fe
## Supported Models
- All models available through the OpenAI [chat completion API](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
- Llama2 [7b chat](https://replicate.com/a16z-infra/llama7b-v2-chat), [13b chat](https://replicate.com/a16z-infra/llama13b-v2-chat), [70b chat](https://replicate.com/replicate/llama70b-v2-chat).
- Anthropic's [Claude 1 Instant](https://www.anthropic.com/index/introducing-claude) and [Claude 2](https://www.anthropic.com/index/claude-2)
## Features
### 🔍 Visualize Responses
Inspect prompt completions side-by-side.
### 🧪 Bulk-Test
OpenPipe lets you _template_ a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broad coverage of your problem space.
### 📟 Translate between Model APIs
Write your prompt in one format and automatically convert it to work with any other model.
<img width="480" alt="Screenshot 2023-08-01 at 11 55 38 PM" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/1e19ccf2-96b6-4e93-a3a5-1449710d1b5b" alt="translate between models">
<br><br>
### 🛠️ Refine Your Prompts Automatically
Use a growing database of best-practice refinements to improve your prompts automatically.
<img width="480" alt="Screenshot 2023-08-01 at 11 55 38 PM" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/87a27fe7-daef-445c-a5e2-1c82b23f9f99" alt="add function call">
<br><br>
### 🪄 Auto-generate Test Scenarios
OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!
<img width="600" src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="auto-generate">
<br><br>
#### OpenAI
- [GPT 3.5 Turbo](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
- [GPT 3.5 Turbo 16k](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
- [GPT 4](https://openai.com/gpt-4)
#### Llama2
- [7b chat](https://replicate.com/a16z-infra/llama7b-v2-chat)
- [13b chat](https://replicate.com/a16z-infra/llama13b-v2-chat)
- [70b chat](https://replicate.com/replicate/llama70b-v2-chat)
#### Llama2 Fine-Tunes
- [Open-Orca/OpenOrcaxOpenChat-Preview2-13B](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B)
- [Open-Orca/OpenOrca-Platypus2-13B](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B)
- [NousResearch/Nous-Hermes-Llama2-13b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b)
- [jondurbin/airoboros-l2-13b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0)
- [lmsys/vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5)
- [Gryphe/MythoMax-L2-13b](https://huggingface.co/Gryphe/MythoMax-L2-13b)
- [NousResearch/Nous-Hermes-llama-2-7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b)
#### Anthropic
- [Claude 1 Instant](https://www.anthropic.com/index/introducing-claude)
- [Claude 2](https://www.anthropic.com/index/claude-2)
## Running Locally
@@ -75,4 +93,4 @@ OpenPipe includes a tool to generate new test scenarios based on your existing p
1. Copy your `.env` file to `.env.test`.
2. Update the `DATABASE_URL` to have a different database name than your development one
3. Run `DATABASE_URL=[your new datatase url] pnpm prisma migrate dev --skip-seed --skip-generate`
4. Run `pnpm test`
4. Run `pnpm test`

View File

@@ -34,3 +34,9 @@ GITHUB_CLIENT_SECRET="your_secret"
OPENPIPE_BASE_URL="http://localhost:3000/api/v1"
OPENPIPE_API_KEY="your_key"
SENDER_EMAIL="placeholder"
SMTP_HOST="placeholder"
SMTP_PORT="placeholder"
SMTP_LOGIN="placeholder"
SMTP_PASSWORD="placeholder"

View File

@@ -1,2 +0,0 @@
*.schema.json
pnpm-lock.yaml

View File

@@ -12,17 +12,18 @@ declare module "nextjs-routes" {
export type Route =
| StaticRoute<"/account/signin">
| StaticRoute<"/admin/jobs">
| DynamicRoute<"/api/auth/[...nextauth]", { "nextauth": string[] }>
| StaticRoute<"/api/experiments/og-image">
| DynamicRoute<"/api/trpc/[trpc]", { "trpc": string }>
| DynamicRoute<"/api/v1/[...trpc]", { "trpc": string[] }>
| StaticRoute<"/api/v1/openapi">
| StaticRoute<"/dashboard">
| DynamicRoute<"/data/[id]", { "id": string }>
| StaticRoute<"/data">
| DynamicRoute<"/experiments/[id]", { "id": string }>
| DynamicRoute<"/experiments/[experimentSlug]", { "experimentSlug": string }>
| StaticRoute<"/experiments">
| StaticRoute<"/fine-tunes">
| StaticRoute<"/">
| DynamicRoute<"/invitations/[invitationToken]", { "invitationToken": string }>
| StaticRoute<"/project/settings">
| StaticRoute<"/request-logs">
| StaticRoute<"/sentry-example-page">

View File

@@ -23,7 +23,6 @@ ARG NEXT_PUBLIC_SOCKET_URL
ARG NEXT_PUBLIC_HOST
ARG NEXT_PUBLIC_SENTRY_DSN
ARG SENTRY_AUTH_TOKEN
ARG NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS
WORKDIR /code
COPY --from=deps /code/node_modules ./node_modules
@@ -45,4 +44,4 @@ EXPOSE 3000
ENV PORT 3000
# Run the "run-prod.sh" script
CMD /code/app/run-prod.sh
CMD /code/app/scripts/run-prod.sh

View File

@@ -10,14 +10,15 @@
},
"scripts": {
"build": "next build",
"dev:next": "next dev",
"dev:next": "TZ=UTC next dev",
"dev:wss": "pnpm tsx --watch src/wss-server.ts",
"dev:worker": "NODE_ENV='development' pnpm tsx --watch src/server/tasks/worker.ts",
"dev": "concurrently --kill-others 'pnpm dev:next' 'pnpm dev:wss' 'pnpm dev:worker'",
"worker": "NODE_ENV='development' pnpm tsx --watch src/server/tasks/worker.ts",
"dev": "concurrently --kill-others 'pnpm dev:next' 'pnpm dev:wss' 'pnpm worker --watch'",
"postinstall": "prisma generate",
"lint": "next lint",
"start": "next start",
"start": "TZ=UTC next start",
"codegen:clients": "tsx src/server/scripts/client-codegen.ts",
"codegen:db": "prisma generate && kysely-codegen --dialect postgres --out-file src/server/db.types.ts",
"seed": "tsx prisma/seed.ts",
"check": "concurrently 'pnpm lint' 'pnpm tsc' 'pnpm prettier . --check'",
"test": "pnpm vitest"
@@ -37,6 +38,7 @@
"@monaco-editor/loader": "^1.3.3",
"@next-auth/prisma-adapter": "^1.0.5",
"@prisma/client": "^4.14.0",
"@sendinblue/client": "^3.3.1",
"@sentry/nextjs": "^7.61.0",
"@t3-oss/env-nextjs": "^0.3.1",
"@tabler/icons-react": "^2.22.0",
@@ -46,6 +48,7 @@
"@trpc/react-query": "^10.26.0",
"@trpc/server": "^10.26.0",
"@vercel/og": "^0.5.9",
"archiver": "^6.0.0",
"ast-types": "^0.14.2",
"chroma-js": "^2.4.2",
"concurrently": "^8.2.0",
@@ -58,20 +61,25 @@
"framer-motion": "^10.12.17",
"gpt-tokens": "^1.0.10",
"graphile-worker": "^0.13.0",
"human-id": "^4.0.0",
"immer": "^10.0.2",
"isolated-vm": "^4.5.0",
"json-schema-to-typescript": "^13.0.2",
"json-stringify-pretty-compact": "^4.0.0",
"jsonschema": "^1.4.1",
"kysely": "^0.26.1",
"kysely-codegen": "^0.10.1",
"lodash-es": "^4.17.21",
"lucide-react": "^0.265.0",
"marked": "^7.0.3",
"next": "^13.4.2",
"next-auth": "^4.22.1",
"next-query-params": "^4.2.3",
"nextjs-cors": "^2.1.2",
"nextjs-routes": "^2.0.1",
"nodemailer": "^6.9.4",
"openai": "4.0.0-beta.7",
"openpipe": "workspace:*",
"pg": "^8.11.2",
"pluralize": "^8.0.0",
"posthog-js": "^1.75.3",
@@ -92,6 +100,7 @@
"replicate": "^0.12.3",
"socket.io": "^4.7.1",
"socket.io-client": "^4.7.1",
"stream-buffers": "^3.0.2",
"superjson": "1.12.2",
"trpc-openapi": "^1.2.0",
"tsx": "^3.12.7",
@@ -100,11 +109,11 @@
"uuid": "^9.0.0",
"vite-tsconfig-paths": "^4.2.0",
"zod": "^3.21.4",
"zustand": "^4.3.9",
"openpipe": "workspace:*"
"zustand": "^4.3.9"
},
"devDependencies": {
"@openapi-contrib/openapi-schema-to-json-schema": "^4.0.5",
"@types/archiver": "^5.3.2",
"@types/babel__core": "^7.20.1",
"@types/babel__standalone": "^7.1.4",
"@types/chroma-js": "^2.4.0",
@@ -114,12 +123,14 @@
"@types/json-schema": "^7.0.12",
"@types/lodash-es": "^4.17.8",
"@types/node": "^18.16.0",
"@types/nodemailer": "^6.4.9",
"@types/pg": "^8.10.2",
"@types/pluralize": "^0.0.30",
"@types/prismjs": "^1.26.0",
"@types/react": "^18.2.6",
"@types/react-dom": "^18.2.4",
"@types/react-syntax-highlighter": "^15.5.7",
"@types/stream-buffers": "^3.0.4",
"@types/uuid": "^9.0.2",
"@typescript-eslint/eslint-plugin": "^5.59.6",
"@typescript-eslint/parser": "^5.59.6",
@@ -129,6 +140,7 @@
"eslint-plugin-unused-imports": "^2.0.0",
"monaco-editor": "^0.40.0",
"openapi-typescript": "^6.3.4",
"openapi-typescript-codegen": "^0.25.0",
"prisma": "^4.14.0",
"raw-loader": "^4.0.2",
"typescript": "^5.0.4",

View File

@@ -0,0 +1,22 @@
-- DropIndex
DROP INDEX "LoggedCallTag_name_idx";
DROP INDEX "LoggedCallTag_name_value_idx";
-- AlterTable: Add projectId column without NOT NULL constraint for now
ALTER TABLE "LoggedCallTag" ADD COLUMN "projectId" UUID;
-- Set the default value
UPDATE "LoggedCallTag" lct
SET "projectId" = lc."projectId"
FROM "LoggedCall" lc
WHERE lct."loggedCallId" = lc.id;
-- Now set the NOT NULL constraint
ALTER TABLE "LoggedCallTag" ALTER COLUMN "projectId" SET NOT NULL;
-- CreateIndex
CREATE INDEX "LoggedCallTag_projectId_name_idx" ON "LoggedCallTag"("projectId", "name");
CREATE INDEX "LoggedCallTag_projectId_name_value_idx" ON "LoggedCallTag"("projectId", "name", "value");
-- CreateIndex
CREATE UNIQUE INDEX "LoggedCallTag_loggedCallId_name_key" ON "LoggedCallTag"("loggedCallId", "name");

View File

@@ -0,0 +1,25 @@
-- CreateTable
CREATE TABLE "UserInvitation" (
"id" UUID NOT NULL,
"projectId" UUID NOT NULL,
"email" TEXT NOT NULL,
"role" "ProjectUserRole" NOT NULL,
"invitationToken" TEXT NOT NULL,
"senderId" UUID NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "UserInvitation_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "UserInvitation_invitationToken_key" ON "UserInvitation"("invitationToken");
-- CreateIndex
CREATE UNIQUE INDEX "UserInvitation_projectId_email_key" ON "UserInvitation"("projectId", "email");
-- AddForeignKey
ALTER TABLE "UserInvitation" ADD CONSTRAINT "UserInvitation_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "UserInvitation" ADD CONSTRAINT "UserInvitation_senderId_fkey" FOREIGN KEY ("senderId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;

View File

@@ -0,0 +1,88 @@
/*
* Copyright 2023 Viascom Ltd liab. Co
*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
CREATE EXTENSION IF NOT EXISTS pgcrypto;
CREATE OR REPLACE FUNCTION nanoid(
size int DEFAULT 21,
alphabet text DEFAULT '_-0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
)
RETURNS text
LANGUAGE plpgsql
volatile
AS
$$
DECLARE
idBuilder text := '';
counter int := 0;
bytes bytea;
alphabetIndex int;
alphabetArray text[];
alphabetLength int;
mask int;
step int;
BEGIN
alphabetArray := regexp_split_to_array(alphabet, '');
alphabetLength := array_length(alphabetArray, 1);
mask := (2 << cast(floor(log(alphabetLength - 1) / log(2)) as int)) - 1;
step := cast(ceil(1.6 * mask * size / alphabetLength) AS int);
while true
loop
bytes := gen_random_bytes(step);
while counter < step
loop
alphabetIndex := (get_byte(bytes, counter) & mask) + 1;
if alphabetIndex <= alphabetLength then
idBuilder := idBuilder || alphabetArray[alphabetIndex];
if length(idBuilder) = size then
return idBuilder;
end if;
end if;
counter := counter + 1;
end loop;
counter := 0;
end loop;
END
$$;
-- Make a short_nanoid function that uses the default alphabet and length of 15
CREATE OR REPLACE FUNCTION short_nanoid()
RETURNS text
LANGUAGE plpgsql
volatile
AS
$$
BEGIN
RETURN nanoid(15, '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ');
END
$$;
-- AlterTable
ALTER TABLE "Experiment" ADD COLUMN "slug" TEXT NOT NULL DEFAULT short_nanoid();
-- For existing experiments, keep the existing id as the slug for backwards compatibility
UPDATE "Experiment" SET "slug" = "id";
-- CreateIndex
CREATE UNIQUE INDEX "Experiment_slug_key" ON "Experiment"("slug");

View File

@@ -0,0 +1,48 @@
/*
Warnings:
- You are about to drop the column `input` on the `DatasetEntry` table. All the data in the column will be lost.
- You are about to drop the column `output` on the `DatasetEntry` table. All the data in the column will be lost.
- Added the required column `loggedCallId` to the `DatasetEntry` table without a default value. This is not possible if the table is not empty.
*/
-- AlterTable
ALTER TABLE "DatasetEntry" DROP COLUMN "input",
DROP COLUMN "output",
ADD COLUMN "loggedCallId" UUID NOT NULL;
-- AddForeignKey
ALTER TABLE "DatasetEntry" ADD CONSTRAINT "DatasetEntry_loggedCallId_fkey" FOREIGN KEY ("loggedCallId") REFERENCES "LoggedCall"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AlterTable
ALTER TABLE "LoggedCallModelResponse" ALTER COLUMN "cost" SET DATA TYPE DOUBLE PRECISION;
-- CreateEnum
CREATE TYPE "FineTuneStatus" AS ENUM ('PENDING', 'TRAINING', 'AWAITING_DEPLOYMENT', 'DEPLOYING', 'DEPLOYED', 'ERROR');
-- CreateTable
CREATE TABLE "FineTune" (
"id" UUID NOT NULL,
"slug" TEXT NOT NULL,
"baseModel" TEXT NOT NULL,
"status" "FineTuneStatus" NOT NULL DEFAULT 'PENDING',
"trainingStartedAt" TIMESTAMP(3),
"trainingFinishedAt" TIMESTAMP(3),
"deploymentStartedAt" TIMESTAMP(3),
"deploymentFinishedAt" TIMESTAMP(3),
"datasetId" UUID NOT NULL,
"projectId" UUID NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
CONSTRAINT "FineTune_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "FineTune_slug_key" ON "FineTune"("slug");
-- AddForeignKey
ALTER TABLE "FineTune" ADD CONSTRAINT "FineTune_datasetId_fkey" FOREIGN KEY ("datasetId") REFERENCES "Dataset"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "FineTune" ADD CONSTRAINT "FineTune_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;

View File

@@ -11,7 +11,9 @@ datasource db {
}
model Experiment {
id String @id @default(uuid()) @db.Uuid
id String @id @default(uuid()) @db.Uuid
slug String @unique @default(dbgenerated("short_nanoid()"))
label String
sortIndex Int @default(0)
@@ -179,6 +181,7 @@ model Dataset {
name String
datasetEntries DatasetEntry[]
fineTunes FineTune[]
projectId String @db.Uuid
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
@@ -190,8 +193,8 @@ model Dataset {
model DatasetEntry {
id String @id @default(uuid()) @db.Uuid
input String
output String?
loggedCallId String @db.Uuid
loggedCall LoggedCall @relation(fields: [loggedCallId], references: [id], onDelete: Cascade)
datasetId String @db.Uuid
dataset Dataset? @relation(fields: [datasetId], references: [id], onDelete: Cascade)
@@ -207,13 +210,15 @@ model Project {
personalProjectUserId String? @unique @db.Uuid
personalProjectUser User? @relation(fields: [personalProjectUserId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
projectUsers ProjectUser[]
experiments Experiment[]
datasets Dataset[]
loggedCalls LoggedCall[]
apiKeys ApiKey[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
projectUsers ProjectUser[]
projectUserInvitations UserInvitation[]
experiments Experiment[]
datasets Dataset[]
loggedCalls LoggedCall[]
fineTunes FineTune[]
apiKeys ApiKey[]
}
enum ProjectUserRole {
@@ -273,8 +278,9 @@ model LoggedCall {
projectId String @db.Uuid
project Project? @relation(fields: [projectId], references: [id], onDelete: Cascade)
model String?
tags LoggedCallTag[]
model String?
tags LoggedCallTag[]
datasetEntries DatasetEntry[]
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@ -309,7 +315,7 @@ model LoggedCallModelResponse {
outputTokens Int?
finishReason String?
completionId String?
cost Decimal? @db.Decimal(18, 12)
cost Float?
// The LoggedCall that created this LoggedCallModelResponse
originalLoggedCallId String @unique @db.Uuid
@@ -323,15 +329,17 @@ model LoggedCallModelResponse {
}
model LoggedCallTag {
id String @id @default(uuid()) @db.Uuid
name String
value String?
id String @id @default(uuid()) @db.Uuid
name String
value String?
projectId String @db.Uuid
loggedCallId String @db.Uuid
loggedCall LoggedCall @relation(fields: [loggedCallId], references: [id], onDelete: Cascade)
@@index([name])
@@index([name, value])
@@unique([loggedCallId, name])
@@index([projectId, name])
@@index([projectId, name, value])
}
model ApiKey {
@@ -388,16 +396,33 @@ model User {
role UserRole @default(USER)
accounts Account[]
sessions Session[]
projectUsers ProjectUser[]
projects Project[]
worldChampEntrant WorldChampEntrant?
accounts Account[]
sessions Session[]
projectUsers ProjectUser[]
projects Project[]
worldChampEntrant WorldChampEntrant?
sentUserInvitations UserInvitation[]
createdAt DateTime @default(now())
updatedAt DateTime @default(now()) @updatedAt
}
model UserInvitation {
id String @id @default(uuid()) @db.Uuid
projectId String @db.Uuid
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
email String
role ProjectUserRole
invitationToken String @unique
senderId String @db.Uuid
sender User @relation(fields: [senderId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@unique([projectId, email])
}
model VerificationToken {
identifier String
token String @unique
@@ -405,3 +430,33 @@ model VerificationToken {
@@unique([identifier, token])
}
enum FineTuneStatus {
PENDING
TRAINING
AWAITING_DEPLOYMENT
DEPLOYING
DEPLOYED
ERROR
}
model FineTune {
id String @id @default(uuid()) @db.Uuid
slug String @unique
baseModel String
status FineTuneStatus @default(PENDING)
trainingStartedAt DateTime?
trainingFinishedAt DateTime?
deploymentStartedAt DateTime?
deploymentFinishedAt DateTime?
datasetId String @db.Uuid
dataset Dataset @relation(fields: [datasetId], references: [id], onDelete: Cascade)
projectId String @db.Uuid
project Project @relation(fields: [projectId], references: [id], onDelete: Cascade)
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}

View File

@@ -10,6 +10,14 @@ await prisma.project.deleteMany({
where: { id: defaultId },
});
// Mark all users as admins
await prisma.user.updateMany({
where: {},
data: {
role: "ADMIN",
},
});
// If there's an existing project, just seed into it
const project =
(await prisma.project.findFirst({})) ??
@@ -18,12 +26,16 @@ const project =
}));
if (env.OPENPIPE_API_KEY) {
await prisma.apiKey.create({
data: {
await prisma.apiKey.upsert({
where: {
apiKey: env.OPENPIPE_API_KEY,
},
create: {
projectId: project.id,
name: "Default API Key",
apiKey: env.OPENPIPE_API_KEY,
},
update: {},
});
}

View File

@@ -13,6 +13,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: number;
outputTokens: number;
finishReason: string;
tags: { name: string; value: string }[];
}[] = [
{
reqPayload: {
@@ -107,6 +108,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 236,
outputTokens: 5,
finishReason: "stop",
tags: [],
},
{
reqPayload: {
@@ -193,6 +195,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 222,
outputTokens: 5,
finishReason: "stop",
tags: [],
},
{
reqPayload: {
@@ -231,6 +234,7 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 14,
outputTokens: 7,
finishReason: "stop",
tags: [{ name: "prompt_id", value: "id2" }],
},
{
reqPayload: {
@@ -306,6 +310,10 @@ const MODEL_RESPONSE_TEMPLATES: {
inputTokens: 2802,
outputTokens: 108,
finishReason: "stop",
tags: [
{ name: "prompt_id", value: "chatcmpl-7lQS3MktOT8BTgNEytl9dkyssCQqL" },
{ name: "some_other_tag", value: "some_other_value" },
],
},
];
@@ -349,6 +357,7 @@ for (let i = 0; i < 1437; i++) {
cacheHit: false,
requestedAt,
projectId: project.id,
model: template.reqPayload.model,
createdAt: requestedAt,
});
@@ -388,11 +397,14 @@ for (let i = 0; i < 1437; i++) {
modelResponseId: loggedCallModelResponseId,
},
});
loggedCallTagsToCreate.push({
loggedCallId,
name: "$model",
value: template.reqPayload.model,
});
for (const tag of template.tags) {
loggedCallTagsToCreate.push({
projectId: project.id,
loggedCallId,
name: tag.name,
value: tag.value,
});
}
}
await prisma.$transaction([

View File

@@ -0,0 +1,6 @@
#! /bin/bash
set -e
cd "$(dirname "$0")/.."
apt-get update
apt-get install -y htop psql

View File

@@ -10,6 +10,4 @@ pnpm tsx src/promptConstructor/migrate.ts
echo "Starting the server"
pnpm concurrently --kill-others \
"pnpm start" \
"pnpm tsx src/server/tasks/worker.ts"
pnpm start

10
app/scripts/run-workers-prod.sh Executable file
View File

@@ -0,0 +1,10 @@
#! /bin/bash
set -e
echo "Migrating the database"
pnpm prisma migrate deploy
echo "Starting 4 workers"
pnpm concurrently "pnpm worker" "pnpm worker" "pnpm worker" "pnpm worker"

13
app/scripts/test-docker.sh Executable file
View File

@@ -0,0 +1,13 @@
#! /bin/bash
set -e
cd "$(dirname "$0")/../.."
echo "Env is"
echo $ENVIRONMENT
docker build . --file app/Dockerfile --tag "openpipe-prod"
# Run the image
docker run --env-file app/.env -it --entrypoint "/bin/bash" "openpipe-prod"

View File

@@ -3,6 +3,7 @@
// https://docs.sentry.io/platforms/javascript/guides/nextjs/
import * as Sentry from "@sentry/nextjs";
import { isError } from "lodash-es";
import { env } from "~/env.mjs";
if (env.NEXT_PUBLIC_SENTRY_DSN) {
@@ -15,4 +16,10 @@ if (env.NEXT_PUBLIC_SENTRY_DSN) {
// Setting this option to true will print useful information to the console while you're setting up Sentry.
debug: false,
});
} else {
// Install local debug exception handler for rejected promises
process.on("unhandledRejection", (reason) => {
const reasonDetails = isError(reason) ? reason?.stack : reason;
console.log("Unhandled Rejection at:", reasonDetails);
});
}

View File

@@ -1,13 +1,13 @@
import { Textarea, type TextareaProps } from "@chakra-ui/react";
import ResizeTextarea from "react-textarea-autosize";
import React, { useLayoutEffect, useState } from "react";
import React, { useEffect, useState } from "react";
export const AutoResizeTextarea: React.ForwardRefRenderFunction<
HTMLTextAreaElement,
TextareaProps & { minRows?: number }
> = ({ minRows = 1, overflowY = "hidden", ...props }, ref) => {
const [isRerendered, setIsRerendered] = useState(false);
useLayoutEffect(() => setIsRerendered(true), []);
useEffect(() => setIsRerendered(true), []);
return (
<Textarea

View File

@@ -87,7 +87,7 @@ export const ModelStatsCard = ({
label="Price"
info={
<Text>
${model.pricePerSecond.toFixed(3)}
${model.pricePerSecond.toFixed(4)}
<Text color="gray.500"> / second</Text>
</Text>
}

View File

@@ -1,15 +1,16 @@
import { HStack, Icon, IconButton, Tooltip, Text } from "@chakra-ui/react";
import { HStack, Icon, IconButton, Tooltip, Text, type StackProps } from "@chakra-ui/react";
import { useState } from "react";
import { MdContentCopy } from "react-icons/md";
import { useHandledAsyncCallback } from "~/utils/hooks";
const CopiableCode = ({ code }: { code: string }) => {
const CopiableCode = ({ code, ...rest }: { code: string } & StackProps) => {
const [copied, setCopied] = useState(false);
const [copyToClipboard] = useHandledAsyncCallback(async () => {
await navigator.clipboard.writeText(code);
setCopied(true);
}, [code]);
return (
<HStack
backgroundColor="blackAlpha.800"
@@ -18,9 +19,19 @@ const CopiableCode = ({ code }: { code: string }) => {
padding={3}
w="full"
justifyContent="space-between"
alignItems="flex-start"
{...rest}
>
<Text fontFamily="inconsolata" fontWeight="bold" letterSpacing={0.5} overflowX="auto">
<Text
fontFamily="inconsolata"
fontWeight="bold"
letterSpacing={0.5}
overflowX="auto"
whiteSpace="pre-wrap"
>
{code}
{/* Necessary for trailing newline to actually be displayed */}
{code.endsWith("\n") ? "\n" : ""}
</Text>
<Tooltip closeOnClick={false} label={copied ? "Copied!" : "Copy to clipboard"}>
<IconButton

View File

@@ -0,0 +1,14 @@
import { Tooltip, Icon, VStack } from "@chakra-ui/react";
import { RiInformationFill } from "react-icons/ri";
const InfoCircle = ({ tooltipText }: { tooltipText: string }) => {
return (
<Tooltip label={tooltipText} fontSize="sm" shouldWrapChildren maxW={80}>
<VStack>
<Icon as={RiInformationFill} boxSize={5} color="gray.500" />
</VStack>
</Tooltip>
);
};
export default InfoCircle;

View File

@@ -0,0 +1,91 @@
import {
Input,
InputGroup,
InputRightElement,
Icon,
Popover,
PopoverTrigger,
PopoverContent,
VStack,
HStack,
Button,
Text,
useDisclosure,
type InputGroupProps,
} from "@chakra-ui/react";
import { FiChevronDown } from "react-icons/fi";
import { BiCheck } from "react-icons/bi";
type InputDropdownProps<T> = {
options: ReadonlyArray<T>;
selectedOption: T;
onSelect: (option: T) => void;
inputGroupProps?: InputGroupProps;
};
const InputDropdown = <T,>({
options,
selectedOption,
onSelect,
inputGroupProps,
}: InputDropdownProps<T>) => {
const popover = useDisclosure();
return (
<Popover placement="bottom-start" {...popover}>
<PopoverTrigger>
<InputGroup
cursor="pointer"
w={(selectedOption as string).length * 14 + 180}
{...inputGroupProps}
>
<Input
value={selectedOption as string}
// eslint-disable-next-line @typescript-eslint/no-empty-function -- controlled input requires onChange
onChange={() => {}}
cursor="pointer"
borderColor={popover.isOpen ? "blue.500" : undefined}
_hover={popover.isOpen ? { borderColor: "blue.500" } : undefined}
contentEditable={false}
// disable focus
onFocus={(e) => {
e.target.blur();
}}
/>
<InputRightElement>
<Icon as={FiChevronDown} />
</InputRightElement>
</InputGroup>
</PopoverTrigger>
<PopoverContent boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);" minW={0} w="auto">
<VStack spacing={0}>
{options?.map((option, index) => (
<HStack
key={index}
as={Button}
onClick={() => {
onSelect(option);
popover.onClose();
}}
w="full"
variant="ghost"
justifyContent="space-between"
fontWeight="semibold"
borderRadius={0}
colorScheme="blue"
color="black"
fontSize="sm"
borderBottomWidth={1}
>
<Text mr={16}>{option as string}</Text>
{option === selectedOption && <Icon as={BiCheck} color="blue.500" boxSize={5} />}
</HStack>
))}
</VStack>
</PopoverContent>
</Popover>
);
};
export default InputDropdown;

View File

@@ -8,7 +8,7 @@ import {
useHandledAsyncCallback,
useVisibleScenarioIds,
} from "~/utils/hooks";
import { cellPadding } from "../constants";
import { cellPadding } from "./constants";
import { ActionButton } from "./ScenariosHeader";
export default function AddVariantButton() {

View File

@@ -1,7 +1,7 @@
import { HStack, Icon, IconButton, Spinner, Tooltip, useDisclosure } from "@chakra-ui/react";
import { BsArrowClockwise, BsInfoCircle } from "react-icons/bs";
import { useExperimentAccess } from "~/utils/hooks";
import ExpandedModal from "./PromptModal";
import PromptModal from "./PromptModal";
import { type RouterOutputs } from "~/utils/api";
export const CellOptions = ({
@@ -32,7 +32,7 @@ export const CellOptions = ({
variant="ghost"
/>
</Tooltip>
<ExpandedModal cell={cell} disclosure={modalDisclosure} />
<PromptModal cell={cell} disclosure={modalDisclosure} />
</>
)}
{canModify && (

View File

@@ -0,0 +1,29 @@
import { type StackProps, VStack } from "@chakra-ui/react";
import { type RouterOutputs } from "~/utils/api";
import { type Scenario } from "../types";
import { CellOptions } from "./CellOptions";
import { OutputStats } from "./OutputStats";
const CellWrapper: React.FC<
StackProps & {
cell: RouterOutputs["scenarioVariantCells"]["get"] | undefined;
hardRefetching: boolean;
hardRefetch: () => void;
mostRecentResponse:
| NonNullable<RouterOutputs["scenarioVariantCells"]["get"]>["modelResponses"][0]
| undefined;
scenario: Scenario;
}
> = ({ children, cell, hardRefetching, hardRefetch, mostRecentResponse, scenario, ...props }) => (
<VStack w="full" alignItems="flex-start" {...props} px={2} py={2} h="100%">
{cell && (
<CellOptions refetchingOutput={hardRefetching} refetchOutput={hardRefetch} cell={cell} />
)}
<VStack w="full" alignItems="flex-start" maxH={500} overflowY="auto" flex={1}>
{children}
</VStack>
{mostRecentResponse && <OutputStats modelResponse={mostRecentResponse} scenario={scenario} />}
</VStack>
);
export default CellWrapper;

View File

@@ -1,17 +1,16 @@
import { api } from "~/utils/api";
import { type PromptVariant, type Scenario } from "../types";
import { type StackProps, Text, VStack } from "@chakra-ui/react";
import { useScenarioVars, useHandledAsyncCallback } from "~/utils/hooks";
import { Text } from "@chakra-ui/react";
import stringify from "json-stringify-pretty-compact";
import { Fragment, useEffect, useState, type ReactElement } from "react";
import SyntaxHighlighter from "react-syntax-highlighter";
import { docco } from "react-syntax-highlighter/dist/cjs/styles/hljs";
import stringify from "json-stringify-pretty-compact";
import { type ReactElement, useState, useEffect, Fragment, useCallback } from "react";
import useSocket from "~/utils/useSocket";
import { OutputStats } from "./OutputStats";
import { RetryCountdown } from "./RetryCountdown";
import frontendModelProviders from "~/modelProviders/frontendModelProviders";
import { api } from "~/utils/api";
import { useHandledAsyncCallback, useScenarioVars } from "~/utils/hooks";
import useSocket from "~/utils/useSocket";
import { type PromptVariant, type Scenario } from "../types";
import CellWrapper from "./CellWrapper";
import { ResponseLog } from "./ResponseLog";
import { CellOptions } from "./TopActions";
import { RetryCountdown } from "./RetryCountdown";
const WAITING_MESSAGE_INTERVAL = 20000;
@@ -44,7 +43,7 @@ export default function OutputCell({
type OutputSchema = Parameters<typeof provider.normalizeOutput>[0];
const { mutateAsync: hardRefetchMutate } = api.scenarioVariantCells.forceRefetch.useMutation();
const { mutateAsync: hardRefetchMutate } = api.scenarioVariantCells.hardRefetch.useMutation();
const [hardRefetch, hardRefetching] = useHandledAsyncCallback(async () => {
await hardRefetchMutate({ scenarioId: scenario.id, variantId: variant.id });
await utils.scenarioVariantCells.get.invalidate({
@@ -72,35 +71,26 @@ export default function OutputCell({
const mostRecentResponse = cell?.modelResponses[cell.modelResponses.length - 1];
const CellWrapper = useCallback(
({ children, ...props }: StackProps) => (
<VStack w="full" alignItems="flex-start" {...props} px={2} py={2} h="100%">
{cell && (
<CellOptions refetchingOutput={hardRefetching} refetchOutput={hardRefetch} cell={cell} />
)}
<VStack w="full" alignItems="flex-start" maxH={500} overflowY="auto" flex={1}>
{children}
</VStack>
{mostRecentResponse && (
<OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
)}
</VStack>
),
[hardRefetching, hardRefetch, mostRecentResponse, scenario, cell],
);
const wrapperProps: Parameters<typeof CellWrapper>[0] = {
cell,
hardRefetching,
hardRefetch,
mostRecentResponse,
scenario,
};
if (!vars) return null;
if (!cell && !fetchingOutput)
return (
<CellWrapper>
<CellWrapper {...wrapperProps}>
<Text color="gray.500">Error retrieving output</Text>
</CellWrapper>
);
if (cell && cell.errorMessage) {
return (
<CellWrapper>
<CellWrapper {...wrapperProps}>
<Text color="red.500">{cell.errorMessage}</Text>
</CellWrapper>
);
@@ -112,7 +102,12 @@ export default function OutputCell({
if (showLogs)
return (
<CellWrapper alignItems="flex-start" fontFamily="inconsolata, monospace" spacing={0}>
<CellWrapper
{...wrapperProps}
alignItems="flex-start"
fontFamily="inconsolata, monospace"
spacing={0}
>
{cell?.jobQueuedAt && <ResponseLog time={cell.jobQueuedAt} title="Job queued" />}
{cell?.jobStartedAt && <ResponseLog time={cell.jobStartedAt} title="Job started" />}
{cell?.modelResponses?.map((response) => {
@@ -174,7 +169,7 @@ export default function OutputCell({
if (mostRecentResponse?.respPayload && normalizedOutput?.type === "json") {
return (
<CellWrapper>
<CellWrapper {...wrapperProps}>
<SyntaxHighlighter
customStyle={{ overflowX: "unset", width: "100%", flex: 1 }}
language="json"
@@ -193,7 +188,7 @@ export default function OutputCell({
const contentToDisplay = (normalizedOutput?.type === "text" && normalizedOutput.value) || "";
return (
<CellWrapper>
<CellWrapper {...wrapperProps}>
<Text whiteSpace="pre-wrap">{contentToDisplay}</Text>
</CellWrapper>
);

View File

@@ -5,30 +5,103 @@ import {
ModalContent,
ModalHeader,
ModalOverlay,
VStack,
Text,
Box,
type UseDisclosureReturn,
Link,
} from "@chakra-ui/react";
import { type RouterOutputs } from "~/utils/api";
import { api, type RouterOutputs } from "~/utils/api";
import { JSONTree } from "react-json-tree";
import CopiableCode from "~/components/CopiableCode";
export default function ExpandedModal(props: {
const theme = {
scheme: "chalk",
author: "chris kempson (http://chriskempson.com)",
base00: "transparent",
base01: "#202020",
base02: "#303030",
base03: "#505050",
base04: "#b0b0b0",
base05: "#d0d0d0",
base06: "#e0e0e0",
base07: "#f5f5f5",
base08: "#fb9fb1",
base09: "#eda987",
base0A: "#ddb26f",
base0B: "#acc267",
base0C: "#12cfc0",
base0D: "#6fc2ef",
base0E: "#e1a3ee",
base0F: "#deaf8f",
};
export default function PromptModal(props: {
cell: NonNullable<RouterOutputs["scenarioVariantCells"]["get"]>;
disclosure: UseDisclosureReturn;
}) {
const { data } = api.scenarioVariantCells.getTemplatedPromptMessage.useQuery(
{
cellId: props.cell.id,
},
{
enabled: props.disclosure.isOpen,
},
);
return (
<Modal isOpen={props.disclosure.isOpen} onClose={props.disclosure.onClose} size="2xl">
<Modal isOpen={props.disclosure.isOpen} onClose={props.disclosure.onClose} size="xl">
<ModalOverlay />
<ModalContent>
<ModalHeader>Prompt</ModalHeader>
<ModalHeader>Prompt Details</ModalHeader>
<ModalCloseButton />
<ModalBody>
<JSONTree
data={props.cell.prompt}
invertTheme={true}
theme="chalk"
shouldExpandNodeInitially={() => true}
getItemString={() => ""}
hideRoot
/>
<VStack py={4} w="">
<VStack w="full" alignItems="flex-start">
<Text fontWeight="bold">Full Prompt</Text>
<Box
w="full"
p={4}
alignItems="flex-start"
backgroundColor="blackAlpha.800"
borderRadius={4}
>
<JSONTree
data={props.cell.prompt}
theme={theme}
shouldExpandNodeInitially={() => true}
getItemString={() => ""}
hideRoot
/>
</Box>
</VStack>
{data?.templatedPrompt && (
<VStack w="full" mt={4} alignItems="flex-start">
<Text fontWeight="bold">Templated prompt message:</Text>
<CopiableCode
w="full"
// bgColor="gray.100"
p={4}
borderWidth={1}
whiteSpace="pre-wrap"
code={data.templatedPrompt}
/>
</VStack>
)}
{data?.learnMoreUrl && (
<Link
href={data.learnMoreUrl}
isExternal
color="blue.500"
fontWeight="bold"
fontSize="sm"
mt={4}
alignSelf="flex-end"
>
Learn More
</Link>
)}
</VStack>
</ModalBody>
</ModalContent>
</Modal>

View File

@@ -16,7 +16,7 @@ import {
VStack,
} from "@chakra-ui/react";
import { BsArrowsAngleExpand, BsX } from "react-icons/bs";
import { cellPadding } from "../constants";
import { cellPadding } from "./constants";
import { FloatingLabelInput } from "./FloatingLabelInput";
import { ScenarioEditorModal } from "./ScenarioEditorModal";
@@ -111,25 +111,23 @@ export default function ScenarioEditor({
onDrop={onReorder}
backgroundColor={isDragTarget ? "gray.100" : "transparent"}
>
{variableLabels.length === 0 ? (
<Box color="gray.500">
{vars.data ? "No scenario variables configured" : "Loading..."}
</Box>
) : (
{
<VStack spacing={4} flex={1} py={2}>
<HStack justifyContent="space-between" w="100%" align="center" spacing={0}>
<Text flex={1}>Scenario</Text>
<Tooltip label="Expand" hasArrow>
<IconButton
aria-label="Expand"
icon={<Icon as={BsArrowsAngleExpand} boxSize={3} />}
onClick={() => setScenarioEditorModalOpen(true)}
size="xs"
colorScheme="gray"
color="gray.500"
variant="ghost"
/>
</Tooltip>
{variableLabels.length && (
<Tooltip label="Expand" hasArrow>
<IconButton
aria-label="Expand"
icon={<Icon as={BsArrowsAngleExpand} boxSize={3} />}
onClick={() => setScenarioEditorModalOpen(true)}
size="xs"
colorScheme="gray"
color="gray.500"
variant="ghost"
/>
</Tooltip>
)}
{canModify && props.canHide && (
<Tooltip label="Delete" hasArrow>
<IconButton
@@ -150,31 +148,38 @@ export default function ScenarioEditor({
</Tooltip>
)}
</HStack>
{variableLabels.map((key) => {
const value = values[key] ?? "";
return (
<FloatingLabelInput
key={key}
label={key}
isDisabled={!canModify}
style={{ width: "100%" }}
maxHeight={32}
value={value}
onChange={(e) => {
setValues((prev) => ({ ...prev, [key]: e.target.value }));
}}
onKeyDown={(e) => {
if (e.key === "Enter" && (e.metaKey || e.ctrlKey)) {
e.preventDefault();
e.currentTarget.blur();
onSave();
}
}}
onMouseEnter={() => setVariableInputHovered(true)}
onMouseLeave={() => setVariableInputHovered(false)}
/>
);
})}
{variableLabels.length === 0 ? (
<Box color="gray.500">
{vars.data ? "No scenario variables configured" : "Loading..."}
</Box>
) : (
variableLabels.map((key) => {
const value = values[key] ?? "";
return (
<FloatingLabelInput
key={key}
label={key}
isDisabled={!canModify}
style={{ width: "100%" }}
maxHeight={32}
value={value}
onChange={(e) => {
setValues((prev) => ({ ...prev, [key]: e.target.value }));
}}
onKeyDown={(e) => {
if (e.key === "Enter" && (e.metaKey || e.ctrlKey)) {
e.preventDefault();
e.currentTarget.blur();
onSave();
}
}}
onMouseEnter={() => setVariableInputHovered(true)}
onMouseLeave={() => setVariableInputHovered(false)}
/>
);
})
)}
{hasChanged && (
<HStack justify="right">
<Button
@@ -192,7 +197,7 @@ export default function ScenarioEditor({
</HStack>
)}
</VStack>
)}
}
</HStack>
{scenarioEditorModalOpen && (
<ScenarioEditorModal

View File

@@ -65,11 +65,11 @@ export const ScenarioEditorModal = ({
<Modal
isOpen
onClose={onClose}
size={{ base: "xl", sm: "2xl", md: "3xl", lg: "5xl", xl: "7xl" }}
size={{ base: "xl", sm: "2xl", md: "3xl", lg: "4xl", xl: "5xl" }}
>
<ModalOverlay />
<ModalContent w={1200}>
<ModalHeader />
<ModalHeader>Edit Scenario</ModalHeader>
<ModalCloseButton />
<ModalBody maxW="unset">
<VStack spacing={8}>

View File

@@ -11,7 +11,7 @@ import {
IconButton,
Spinner,
} from "@chakra-ui/react";
import { cellPadding } from "../constants";
import { cellPadding } from "./constants";
import {
useExperiment,
useExperimentAccess,

View File

@@ -110,7 +110,7 @@ export default function VariantEditor(props: { variant: PromptVariant }) {
setIsChanged(false);
await utils.promptVariants.list.invalidate();
}, [checkForChanges]);
}, [checkForChanges, replaceVariant.mutateAsync]);
useEffect(() => {
if (monaco) {

View File

@@ -1,11 +1,11 @@
import { useState, type DragEvent } from "react";
import { type PromptVariant } from "../OutputsTable/types";
import { type PromptVariant } from "../types";
import { api } from "~/utils/api";
import { RiDraggable } from "react-icons/ri";
import { useExperimentAccess, useHandledAsyncCallback } from "~/utils/hooks";
import { HStack, Icon, Text, GridItem, type GridItemProps } from "@chakra-ui/react"; // Changed here
import { cellPadding, headerMinHeight } from "../constants";
import AutoResizeTextArea from "../AutoResizeTextArea";
import AutoResizeTextArea from "../../AutoResizeTextArea";
import VariantHeaderMenuButton from "./VariantHeaderMenuButton";
export default function VariantHeader(
@@ -75,7 +75,7 @@ export default function VariantHeader(
padding={0}
sx={{
position: "sticky",
top: "-2",
top: "0",
// Ensure that the menu always appears above the sticky header of other variants
zIndex: menuOpen ? "dropdown" : 10,
}}

View File

@@ -1,6 +1,4 @@
import { type PromptVariant } from "../OutputsTable/types";
import { api } from "~/utils/api";
import { useHandledAsyncCallback, useVisibleScenarioIds } from "~/utils/hooks";
import { useState } from "react";
import {
Icon,
Menu,
@@ -14,10 +12,13 @@ import {
} from "@chakra-ui/react";
import { BsFillTrashFill, BsGear, BsStars } from "react-icons/bs";
import { FaRegClone } from "react-icons/fa";
import { useState } from "react";
import { RefinePromptModal } from "../RefinePromptModal/RefinePromptModal";
import { RiExchangeFundsFill } from "react-icons/ri";
import { ChangeModelModal } from "../ChangeModelModal/ChangeModelModal";
import { api } from "~/utils/api";
import { useHandledAsyncCallback, useVisibleScenarioIds } from "~/utils/hooks";
import { type PromptVariant } from "../types";
import { RefinePromptModal } from "../../RefinePromptModal/RefinePromptModal";
import { ChangeModelModal } from "../../ChangeModelModal/ChangeModelModal";
export default function VariantHeaderMenuButton({
variant,

View File

@@ -1,6 +1,6 @@
import { HStack, Icon, Text, useToken } from "@chakra-ui/react";
import { type PromptVariant } from "./types";
import { cellPadding } from "../constants";
import { cellPadding } from "./constants";
import { api } from "~/utils/api";
import chroma from "chroma-js";
import { BsCurrencyDollar } from "react-icons/bs";

View File

@@ -3,13 +3,14 @@ import { api } from "~/utils/api";
import AddVariantButton from "./AddVariantButton";
import ScenarioRow from "./ScenarioRow";
import VariantEditor from "./VariantEditor";
import VariantHeader from "../VariantHeader/VariantHeader";
import VariantHeader from "./VariantHeader/VariantHeader";
import VariantStats from "./VariantStats";
import { ScenariosHeader } from "./ScenariosHeader";
import { borders } from "./styles";
import { useScenarios } from "~/utils/hooks";
import ScenarioPaginator from "./ScenarioPaginator";
import { Fragment } from "react";
import useScrolledPast from "./useHasScrolledPast";
export default function OutputsTable({ experimentId }: { experimentId: string | undefined }) {
const variants = api.promptVariants.list.useQuery(
@@ -18,6 +19,7 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
);
const scenarios = useScenarios();
const shouldFlattenHeader = useScrolledPast(50);
if (!variants.data || !scenarios.data) return null;
@@ -63,8 +65,8 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
variant={variant}
canHide={variants.data.length > 1}
rowStart={1}
borderTopLeftRadius={isFirst ? 8 : 0}
borderTopRightRadius={isLast ? 8 : 0}
borderTopLeftRadius={isFirst && !shouldFlattenHeader ? 8 : 0}
borderTopRightRadius={isLast && !shouldFlattenHeader ? 8 : 0}
{...sharedProps}
/>
<GridItem rowStart={2} {...sharedProps}>
@@ -75,6 +77,7 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
{...sharedProps}
borderBottomLeftRadius={isFirst ? 8 : 0}
borderBottomRightRadius={isLast ? 8 : 0}
boxShadow="5px 5px 15px 1px rgba(0, 0, 0, 0.1);"
>
<VariantStats variant={variant} />
</GridItem>

View File

@@ -0,0 +1,34 @@
import { useState, useEffect } from "react";
const useScrolledPast = (scrollThreshold: number) => {
const [hasScrolledPast, setHasScrolledPast] = useState(true);
useEffect(() => {
const container = document.getElementById("output-container");
if (!container) {
console.warn('Element with id "outputs-container" not found.');
return;
}
const checkScroll = () => {
const { scrollTop } = container;
// Check if scrollTop is greater than scrollThreshold
setHasScrolledPast(scrollTop > scrollThreshold);
};
checkScroll();
container.addEventListener("scroll", checkScroll);
// Cleanup
return () => {
container.removeEventListener("scroll", checkScroll);
};
}, [scrollThreshold]);
return hasScrolledPast;
};
export default useScrolledPast;
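A minimal usage sketch of the hook (illustrative only — in this diff the real consumer is OutputsTable, and the markup that assigns the container id is not shown):
import useScrolledPast from "./useHasScrolledPast";
// Hypothetical consumer: flatten a sticky header once the user scrolls past 50px.
// The hook looks the element up by id, so the id below must match what it queries.
const StickyHeaderDemo = () => {
  const shouldFlattenHeader = useScrolledPast(50);
  return (
    <div id="output-container" style={{ height: 400, overflowY: "auto" }}>
      <header style={{ position: "sticky", top: 0, borderRadius: shouldFlattenHeader ? 0 : 8 }}>
        Variants
      </header>
      {/* scrollable rows */}
    </div>
  );
};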

View File

@@ -1,15 +1,19 @@
import { HStack, IconButton, Text, Select, type StackProps, Icon } from "@chakra-ui/react";
import {
HStack,
IconButton,
Text,
Select,
type StackProps,
Icon,
useBreakpointValue,
} from "@chakra-ui/react";
import React, { useCallback } from "react";
import { FiChevronsLeft, FiChevronsRight, FiChevronLeft, FiChevronRight } from "react-icons/fi";
import { usePageParams } from "~/utils/hooks";
const pageSizeOptions = [10, 25, 50, 100];
const Paginator = ({
count,
condense,
...props
}: { count: number; condense?: boolean } & StackProps) => {
const Paginator = ({ count, ...props }: { count: number; condense?: boolean } & StackProps) => {
const { page, pageSize, setPageParams } = usePageParams();
const lastPage = Math.ceil(count / pageSize);
@@ -37,6 +41,11 @@ const Paginator = ({
const goToLastPage = () => setPageParams({ page: lastPage }, "replace");
const goToFirstPage = () => setPageParams({ page: 1 }, "replace");
const isMobile = useBreakpointValue({ base: true, md: false });
const condense = isMobile || props.condense;
if (count === 0) return null;
return (
<HStack
pt={4}

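With this change the condensed layout is applied automatically below Chakra's md breakpoint, and the condense prop only needs to be passed to force it on wider screens. A usage sketch (totalCount is illustrative):
// Condenses automatically on mobile widths
<Paginator count={totalCount} />
// Always condensed, even on desktop
<Paginator count={totalCount} condense />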
View File

@@ -1,112 +0,0 @@
import {
HStack,
Icon,
VStack,
Text,
Divider,
Spinner,
AspectRatio,
SkeletonText,
} from "@chakra-ui/react";
import { RiDatabase2Line } from "react-icons/ri";
import { formatTimePast } from "~/utils/dayjs";
import Link from "next/link";
import { useRouter } from "next/router";
import { BsPlusSquare } from "react-icons/bs";
import { api } from "~/utils/api";
import { useHandledAsyncCallback } from "~/utils/hooks";
import { useAppStore } from "~/state/store";
type DatasetData = {
name: string;
numEntries: number;
id: string;
createdAt: Date;
updatedAt: Date;
};
export const DatasetCard = ({ dataset }: { dataset: DatasetData }) => {
return (
<AspectRatio ratio={1.2} w="full">
<VStack
as={Link}
href={{ pathname: "/data/[id]", query: { id: dataset.id } }}
bg="gray.50"
_hover={{ bg: "gray.100" }}
transition="background 0.2s"
cursor="pointer"
borderColor="gray.200"
borderWidth={1}
p={4}
justify="space-between"
>
<HStack w="full" color="gray.700" justify="center">
<Icon as={RiDatabase2Line} boxSize={4} />
<Text fontWeight="bold">{dataset.name}</Text>
</HStack>
<HStack h="full" spacing={4} flex={1} align="center">
<CountLabel label="Rows" count={dataset.numEntries} />
</HStack>
<HStack w="full" color="gray.500" fontSize="xs" textAlign="center">
<Text flex={1}>Created {formatTimePast(dataset.createdAt)}</Text>
<Divider h={4} orientation="vertical" />
<Text flex={1}>Updated {formatTimePast(dataset.updatedAt)}</Text>
</HStack>
</VStack>
</AspectRatio>
);
};
const CountLabel = ({ label, count }: { label: string; count: number }) => {
return (
<VStack alignItems="center" flex={1}>
<Text color="gray.500" fontWeight="bold">
{label}
</Text>
<Text fontSize="sm" color="gray.500">
{count}
</Text>
</VStack>
);
};
export const NewDatasetCard = () => {
const router = useRouter();
const selectedProjectId = useAppStore((s) => s.selectedProjectId);
const createMutation = api.datasets.create.useMutation();
const [createDataset, isLoading] = useHandledAsyncCallback(async () => {
const newDataset = await createMutation.mutateAsync({ projectId: selectedProjectId ?? "" });
await router.push({ pathname: "/data/[id]", query: { id: newDataset.id } });
}, [createMutation, router, selectedProjectId]);
return (
<AspectRatio ratio={1.2} w="full">
<VStack
align="center"
justify="center"
_hover={{ cursor: "pointer", bg: "gray.50" }}
transition="background 0.2s"
cursor="pointer"
borderColor="gray.200"
borderWidth={1}
p={4}
onClick={createDataset}
>
<Icon as={isLoading ? Spinner : BsPlusSquare} boxSize={8} />
<Text display={{ base: "none", md: "block" }} ml={2}>
New Dataset
</Text>
</VStack>
</AspectRatio>
);
};
export const DatasetCardSkeleton = () => (
<AspectRatio ratio={1.2} w="full">
<VStack align="center" borderColor="gray.200" borderWidth={1} p={4} bg="gray.50">
<SkeletonText noOfLines={1} w="80%" />
<SkeletonText noOfLines={2} w="60%" />
<SkeletonText noOfLines={1} w="80%" />
</VStack>
</AspectRatio>
);

View File

@@ -1,16 +0,0 @@
import { type StackProps } from "@chakra-ui/react";
import { useDatasetEntries } from "~/utils/hooks";
import Paginator from "../Paginator";
const DatasetEntriesPaginator = (props: StackProps) => {
const { data } = useDatasetEntries();
if (!data) return null;
const { count } = data;
return <Paginator count={count} {...props} />;
};
export default DatasetEntriesPaginator;

View File

@@ -1,31 +0,0 @@
import { type StackProps, VStack, Table, Th, Tr, Thead, Tbody, Text } from "@chakra-ui/react";
import { useDatasetEntries } from "~/utils/hooks";
import TableRow from "./TableRow";
import DatasetEntriesPaginator from "./DatasetEntriesPaginator";
const DatasetEntriesTable = (props: StackProps) => {
const { data } = useDatasetEntries();
return (
<VStack justifyContent="space-between" {...props}>
<Table variant="simple" sx={{ "table-layout": "fixed", width: "full" }}>
<Thead>
<Tr>
<Th>Input</Th>
<Th>Output</Th>
</Tr>
</Thead>
<Tbody>{data?.entries.map((entry) => <TableRow key={entry.id} entry={entry} />)}</Tbody>
</Table>
{!data || data.entries.length === 0 ? (
<Text alignSelf="flex-start" pl={6} color="gray.500">
No entries found
</Text>
) : (
<DatasetEntriesPaginator />
)}
</VStack>
);
};
export default DatasetEntriesTable;

View File

@@ -1,26 +0,0 @@
import { Button, HStack, useDisclosure } from "@chakra-ui/react";
import { BiImport } from "react-icons/bi";
import { BsStars } from "react-icons/bs";
import { GenerateDataModal } from "./GenerateDataModal";
export const DatasetHeaderButtons = () => {
const generateModalDisclosure = useDisclosure();
return (
<>
<HStack>
<Button leftIcon={<BiImport />} colorScheme="blue" variant="ghost">
Import Data
</Button>
<Button leftIcon={<BsStars />} colorScheme="blue" onClick={generateModalDisclosure.onOpen}>
Generate Data
</Button>
</HStack>
<GenerateDataModal
isOpen={generateModalDisclosure.isOpen}
onClose={generateModalDisclosure.onClose}
/>
</>
);
};

View File

@@ -1,128 +0,0 @@
import {
Modal,
ModalBody,
ModalCloseButton,
ModalContent,
ModalHeader,
ModalOverlay,
ModalFooter,
Text,
HStack,
VStack,
Icon,
NumberInput,
NumberInputField,
NumberInputStepper,
NumberIncrementStepper,
NumberDecrementStepper,
Button,
} from "@chakra-ui/react";
import { BsStars } from "react-icons/bs";
import { useState } from "react";
import { useDataset, useHandledAsyncCallback } from "~/utils/hooks";
import { api } from "~/utils/api";
import AutoResizeTextArea from "~/components/AutoResizeTextArea";
export const GenerateDataModal = ({
isOpen,
onClose,
}: {
isOpen: boolean;
onClose: () => void;
}) => {
const utils = api.useContext();
const datasetId = useDataset().data?.id;
const [numToGenerate, setNumToGenerate] = useState<number>(20);
const [inputDescription, setInputDescription] = useState<string>(
"Each input should contain an email body. Half of the emails should contain event details, and the other half should not.",
);
const [outputDescription, setOutputDescription] = useState<string>(
`Each output should contain "true" or "false", where "true" indicates that the email contains event details.`,
);
const generateEntriesMutation = api.datasetEntries.autogenerateEntries.useMutation();
const [generateEntries, generateEntriesInProgress] = useHandledAsyncCallback(async () => {
if (!inputDescription || !outputDescription || !numToGenerate || !datasetId) return;
await generateEntriesMutation.mutateAsync({
datasetId,
inputDescription,
outputDescription,
numToGenerate,
});
await utils.datasetEntries.list.invalidate();
onClose();
}, [
generateEntriesMutation,
onClose,
inputDescription,
outputDescription,
numToGenerate,
datasetId,
]);
return (
<Modal isOpen={isOpen} onClose={onClose} size={{ base: "xl", sm: "2xl", md: "3xl" }}>
<ModalOverlay />
<ModalContent w={1200}>
<ModalHeader>
<HStack>
<Icon as={BsStars} />
<Text>Generate Data</Text>
</HStack>
</ModalHeader>
<ModalCloseButton />
<ModalBody maxW="unset">
<VStack w="full" spacing={8} padding={8} alignItems="flex-start">
<VStack alignItems="flex-start" spacing={2}>
<Text fontWeight="bold">Number of Rows:</Text>
<NumberInput
step={5}
defaultValue={15}
min={0}
max={100}
onChange={(valueString) => setNumToGenerate(parseInt(valueString) || 0)}
value={numToGenerate}
w="24"
>
<NumberInputField />
<NumberInputStepper>
<NumberIncrementStepper />
<NumberDecrementStepper />
</NumberInputStepper>
</NumberInput>
</VStack>
<VStack alignItems="flex-start" w="full" spacing={2}>
<Text fontWeight="bold">Input Description:</Text>
<AutoResizeTextArea
value={inputDescription}
onChange={(e) => setInputDescription(e.target.value)}
placeholder="Each input should contain..."
/>
</VStack>
<VStack alignItems="flex-start" w="full" spacing={2}>
<Text fontWeight="bold">Output Description (optional):</Text>
<AutoResizeTextArea
value={outputDescription}
onChange={(e) => setOutputDescription(e.target.value)}
placeholder="The output should contain..."
/>
</VStack>
</VStack>
</ModalBody>
<ModalFooter>
<Button
colorScheme="blue"
isLoading={generateEntriesInProgress}
isDisabled={!numToGenerate || !inputDescription || !outputDescription}
onClick={generateEntries}
>
Generate
</Button>
</ModalFooter>
</ModalContent>
</Modal>
);
};

View File

@@ -1,13 +0,0 @@
import { Td, Tr } from "@chakra-ui/react";
import { type DatasetEntry } from "@prisma/client";
const TableRow = ({ entry }: { entry: DatasetEntry }) => {
return (
<Tr key={entry.id}>
<Td>{entry.input}</Td>
<Td>{entry.output}</Td>
</Tr>
);
};
export default TableRow;

View File

@@ -14,21 +14,11 @@ import { formatTimePast } from "~/utils/dayjs";
import Link from "next/link";
import { useRouter } from "next/router";
import { BsPlusSquare } from "react-icons/bs";
import { api } from "~/utils/api";
import { type RouterOutputs, api } from "~/utils/api";
import { useHandledAsyncCallback } from "~/utils/hooks";
import { useAppStore } from "~/state/store";
type ExperimentData = {
testScenarioCount: number;
promptVariantCount: number;
id: string;
label: string;
sortIndex: number;
createdAt: Date;
updatedAt: Date;
};
export const ExperimentCard = ({ exp }: { exp: ExperimentData }) => {
export const ExperimentCard = ({ exp }: { exp: RouterOutputs["experiments"]["list"][0] }) => {
return (
<Card
w="full"
@@ -45,7 +35,7 @@ export const ExperimentCard = ({ exp }: { exp: ExperimentData }) => {
as={Link}
w="full"
h="full"
href={{ pathname: "/experiments/[id]", query: { id: exp.id } }}
href={{ pathname: "/experiments/[experimentSlug]", query: { experimentSlug: exp.slug } }}
justify="space-between"
>
<HStack w="full" color="gray.700" justify="center">
@@ -89,8 +79,8 @@ export const NewExperimentCard = () => {
projectId: selectedProjectId ?? "",
});
await router.push({
pathname: "/experiments/[id]",
query: { id: newExperiment.id },
pathname: "/experiments/[experimentSlug]",
query: { experimentSlug: newExperiment.slug },
});
}, [createMutation, router, selectedProjectId]);

View File

@@ -16,11 +16,14 @@ export const useOnForkButtonPressed = () => {
const [onFork, isForking] = useHandledAsyncCallback(async () => {
if (!experiment.data?.id || !selectedProjectId) return;
const forkedExperimentId = await forkMutation.mutateAsync({
const newExperiment = await forkMutation.mutateAsync({
id: experiment.data.id,
projectId: selectedProjectId,
});
await router.push({ pathname: "/experiments/[id]", query: { id: forkedExperimentId } });
await router.push({
pathname: "/experiments/[experimentSlug]",
query: { experimentSlug: newExperiment.slug },
});
}, [forkMutation, experiment.data?.id, router]);
const onForkButtonPressed = useCallback(() => {

View File

@@ -0,0 +1,65 @@
import { Card, Table, Thead, Tr, Th, Tbody, Td, VStack, Icon, Text } from "@chakra-ui/react";
import { FaTable } from "react-icons/fa";
import { type FineTuneStatus } from "@prisma/client";
import dayjs from "~/utils/dayjs";
import { useFineTunes } from "~/utils/hooks";
const FineTunesTable = ({}) => {
const { data } = useFineTunes();
const fineTunes = data?.fineTunes || [];
return (
<Card width="100%" overflowX="auto">
{fineTunes.length ? (
<Table>
<Thead>
<Tr>
<Th>ID</Th>
<Th>Created At</Th>
<Th>Base Model</Th>
<Th>Dataset Size</Th>
<Th>Status</Th>
</Tr>
</Thead>
<Tbody>
{fineTunes.map((fineTune) => {
return (
<Tr key={fineTune.id}>
<Td>{fineTune.slug}</Td>
<Td>{dayjs(fineTune.createdAt).format("MMMM D h:mm A")}</Td>
<Td>{fineTune.baseModel}</Td>
<Td>{fineTune.dataset._count.datasetEntries}</Td>
<Td fontSize="sm" fontWeight="bold">
<Text color={getStatusColor(fineTune.status)}>{fineTune.status}</Text>
</Td>
</Tr>
);
})}
</Tbody>
</Table>
) : (
<VStack py={8}>
<Icon as={FaTable} boxSize={16} color="gray.300" />
<Text color="gray.400" fontSize="lg" fontWeight="bold">
No Fine Tunes Found
</Text>
</VStack>
)}
</Card>
);
};
export default FineTunesTable;
const getStatusColor = (status: FineTuneStatus) => {
switch (status) {
case "DEPLOYED":
return "green.500";
case "ERROR":
return "red.500";
default:
return "yellow.500";
}
};

View File

@@ -15,12 +15,14 @@ import Head from "next/head";
import Link from "next/link";
import { BsGearFill, BsGithub, BsPersonCircle } from "react-icons/bs";
import { IoStatsChartOutline } from "react-icons/io5";
import { RiHome3Line, RiDatabase2Line, RiFlaskLine } from "react-icons/ri";
import { RiHome3Line, RiFlaskLine } from "react-icons/ri";
import { FaRobot } from "react-icons/fa";
import { signIn, useSession } from "next-auth/react";
import { env } from "~/env.mjs";
import ProjectMenu from "./ProjectMenu";
import NavSidebarOption from "./NavSidebarOption";
import IconLink from "./IconLink";
import { BetaModal } from "./BetaModal";
import { useAppStore } from "~/state/store";
const Divider = () => <Box h="1px" bgColor="gray.300" w="full" />;
@@ -71,21 +73,10 @@ const NavSidebar = () => {
<ProjectMenu />
<Divider />
{env.NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS && (
<>
<IconLink icon={RiHome3Line} label="Dashboard" href="/dashboard" beta />
<IconLink
icon={IoStatsChartOutline}
label="Request Logs"
href="/request-logs"
beta
/>
</>
)}
<IconLink icon={RiHome3Line} label="Dashboard" href="/dashboard" beta />
<IconLink icon={IoStatsChartOutline} label="Request Logs" href="/request-logs" beta />
<IconLink icon={FaRobot} label="Fine Tunes" href="/fine-tunes" beta />
<IconLink icon={RiFlaskLine} label="Experiments" href="/experiments" />
{env.NEXT_PUBLIC_SHOW_DATA && (
<IconLink icon={RiDatabase2Line} label="Data" href="/data" />
)}
<VStack w="full" alignItems="flex-start" spacing={0} pt={8}>
<Text
pl={2}
@@ -105,7 +96,7 @@ const NavSidebar = () => {
<NavSidebarOption>
<HStack
w="full"
p={4}
p={{ base: 2, md: 4 }}
as={ChakraLink}
justifyContent="start"
onClick={() => {
@@ -141,10 +132,12 @@ export default function AppShell({
children,
title,
requireAuth,
requireBeta,
}: {
children: React.ReactNode;
title?: string;
requireAuth?: boolean;
requireBeta?: boolean;
}) {
const [vh, setVh] = useState("100vh"); // Default height to prevent flicker on initial render
@@ -174,15 +167,21 @@ export default function AppShell({
}
}, [requireAuth, user, authLoading]);
const flags = useAppStore((s) => s.featureFlags.featureFlags);
const flagsLoaded = useAppStore((s) => s.featureFlags.flagsLoaded);
return (
<Flex h={vh} w="100vw">
<Head>
<title>{title ? `${title} | OpenPipe` : "OpenPipe"}</title>
</Head>
<NavSidebar />
<Box h="100%" flex={1} overflowY="auto" bgColor="gray.50">
{children}
</Box>
</Flex>
<>
<Flex h={vh} w="100vw">
<Head>
<title>{title ? `${title} | OpenPipe` : "OpenPipe"}</title>
</Head>
<NavSidebar />
<Box h="100%" flex={1} overflowY="auto" bgColor="gray.50">
{children}
</Box>
</Flex>
{requireBeta && flagsLoaded && !flags.betaAccess && <BetaModal />}
</>
);
}
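A sketch of how a beta-gated page could opt in to the new prop (the file path and page body are illustrative, not taken from this diff):
// Hypothetical page component; the import path is an assumption.
import AppShell from "~/components/nav/AppShell";
export default function FineTunesPage() {
  return (
    <AppShell title="Fine Tunes" requireAuth requireBeta>
      {/* If the betaAccess flag is missing once flags load, BetaModal renders over this content */}
    </AppShell>
  );
}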

View File

@@ -0,0 +1,67 @@
import {
Button,
Modal,
ModalBody,
ModalContent,
ModalFooter,
ModalHeader,
ModalOverlay,
VStack,
Text,
HStack,
Icon,
Link,
} from "@chakra-ui/react";
import { BsStars } from "react-icons/bs";
import { useRouter } from "next/router";
import { useSession } from "next-auth/react";
export const BetaModal = () => {
const router = useRouter();
const session = useSession();
const email = session.data?.user.email ?? "";
return (
<Modal
isOpen
onClose={router.back}
closeOnOverlayClick={false}
size={{ base: "xl", md: "2xl" }}
>
<ModalOverlay />
<ModalContent w={1200}>
<ModalHeader>
<HStack>
<Icon as={BsStars} />
<Text>Beta-Only Feature</Text>
</HStack>
</ModalHeader>
<ModalBody maxW="unset">
<VStack spacing={8} py={4} alignItems="flex-start">
<Text fontSize="md">
This feature is currently in beta. To receive early access to beta-only features, join
the waitlist. You'll receive an email at <b>{email}</b> when you're approved.
</Text>
</VStack>
</ModalBody>
<ModalFooter>
<HStack spacing={4}>
<Button
as={Link}
textDecoration="none !important"
colorScheme="orange"
target="_blank"
href={`https://ax3nafkw0jp.typeform.com/to/ZNpYqvAc#email=${email}`}
>
Join Waitlist
</Button>
<Button colorScheme="blue" onClick={router.back}>
Done
</Button>
</HStack>
</ModalFooter>
</ModalContent>
</Modal>
);
};

View File

@@ -14,8 +14,9 @@ import {
Link as ChakraLink,
Image,
Box,
Portal,
} from "@chakra-ui/react";
import React, { useEffect, useState } from "react";
import { useEffect } from "react";
import Link from "next/link";
import { BsPlus, BsPersonCircle } from "react-icons/bs";
import { type Project } from "@prisma/client";
@@ -67,7 +68,13 @@ export default function ProjectMenu() {
);
return (
<VStack w="full" alignItems="flex-start" spacing={0} py={1}>
<VStack
w="full"
alignItems="flex-start"
spacing={0}
py={1}
zIndex={popover.isOpen ? "dropdown" : undefined}
>
<Popover
placement="bottom"
isOpen={popover.isOpen}
@@ -103,64 +110,66 @@ export default function ProjectMenu() {
</HStack>
</NavSidebarOption>
</PopoverTrigger>
<PopoverContent
_focusVisible={{ outline: "unset" }}
ml={-1}
w={224}
boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);"
fontSize="sm"
>
<VStack alignItems="flex-start" spacing={1} py={1}>
<Text px={3} py={2}>
{user?.user.email}
</Text>
<Divider />
<Text alignSelf="flex-start" fontWeight="bold" px={3} pt={2}>
Your Projects
</Text>
<VStack spacing={0} w="full" px={1}>
{projects?.map((proj) => (
<ProjectOption
key={proj.id}
proj={proj}
isActive={proj.id === selectedProjectId}
onClose={popover.onClose}
/>
))}
<HStack
as={Button}
variant="ghost"
colorScheme="blue"
color="blue.400"
fontSize="sm"
justifyContent="flex-start"
onClick={createProject}
w="full"
borderRadius={4}
spacing={0}
>
<Text>Add project</Text>
<Icon as={isLoading ? Spinner : BsPlus} boxSize={4} strokeWidth={0.5} />
</HStack>
</VStack>
<Portal>
<PopoverContent
_focusVisible={{ outline: "unset" }}
w={220}
ml={{ base: 2, md: 0 }}
boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);"
fontSize="sm"
>
<VStack alignItems="flex-start" spacing={1} py={1}>
<Text px={3} py={2}>
{user?.user.email}
</Text>
<Divider />
<Text alignSelf="flex-start" fontWeight="bold" px={3} pt={2}>
Your Projects
</Text>
<VStack spacing={0} w="full" px={1}>
{projects?.map((proj) => (
<ProjectOption
key={proj.id}
proj={proj}
isActive={proj.id === selectedProjectId}
onClose={popover.onClose}
/>
))}
<HStack
as={Button}
variant="ghost"
colorScheme="blue"
color="blue.400"
fontSize="sm"
justifyContent="flex-start"
onClick={createProject}
w="full"
borderRadius={4}
spacing={0}
>
<Text>Add project</Text>
<Icon as={isLoading ? Spinner : BsPlus} boxSize={4} strokeWidth={0.5} />
</HStack>
</VStack>
<Divider />
<VStack w="full" px={1}>
<ChakraLink
onClick={() => {
signOut().catch(console.error);
}}
_hover={{ bgColor: "gray.200", textDecoration: "none" }}
w="full"
py={2}
px={2}
borderRadius={4}
>
<Text>Sign out</Text>
</ChakraLink>
<Divider />
<VStack w="full" px={1}>
<ChakraLink
onClick={() => {
signOut().catch(console.error);
}}
_hover={{ bgColor: "gray.200", textDecoration: "none" }}
w="full"
py={2}
px={2}
borderRadius={4}
>
<Text>Sign out</Text>
</ChakraLink>
</VStack>
</VStack>
</VStack>
</PopoverContent>
</PopoverContent>
</Portal>
</Popover>
</VStack>
);
@@ -176,7 +185,6 @@ const ProjectOption = ({
onClose: () => void;
}) => {
const setSelectedProjectId = useAppStore((s) => s.setSelectedProjectId);
const [gearHovered, setGearHovered] = useState(false);
return (
<HStack
@@ -188,8 +196,8 @@ const ProjectOption = ({
}}
w="full"
justifyContent="space-between"
_hover={gearHovered ? undefined : { bgColor: "gray.200", textDecoration: "none" }}
color={isActive ? "blue.400" : undefined}
_hover={{ bgColor: "gray.200", textDecoration: "none" }}
bgColor={isActive ? "gray.100" : undefined}
py={2}
px={4}
borderRadius={4}

View File

@@ -23,50 +23,48 @@ export default function UserMenu({ user, ...rest }: { user: Session } & StackPro
);
return (
<>
<Popover placement="right">
<PopoverTrigger>
<NavSidebarOption>
<HStack
// Weird values to make mobile look right; can clean up when we make the sidebar disappear on mobile
py={2}
px={1}
spacing={3}
{...rest}
>
{profileImage}
<VStack spacing={0} align="start" flex={1} flexShrink={1}>
<Text fontWeight="bold" fontSize="sm">
{user.user.name}
</Text>
<Text color="gray.500" fontSize="xs">
{/* {user.user.email} */}
</Text>
</VStack>
<Icon as={BsChevronRight} boxSize={4} color="gray.500" />
</HStack>
</NavSidebarOption>
</PopoverTrigger>
<PopoverContent _focusVisible={{ outline: "unset" }} ml={-1} minW={48} w="full">
<VStack align="stretch" spacing={0}>
{/* sign out */}
<HStack
as={Link}
onClick={() => {
signOut().catch(console.error);
}}
px={4}
py={2}
spacing={4}
color="gray.500"
fontSize="sm"
>
<Icon as={BsBoxArrowRight} boxSize={6} />
<Text>Sign out</Text>
</HStack>
</VStack>
</PopoverContent>
</Popover>
</>
<Popover placement="right">
<PopoverTrigger>
<NavSidebarOption>
<HStack
// Weird values to make mobile look right; can clean up when we make the sidebar disappear on mobile
py={2}
px={1}
spacing={3}
{...rest}
>
{profileImage}
<VStack spacing={0} align="start" flex={1} flexShrink={1}>
<Text fontWeight="bold" fontSize="sm">
{user.user.name}
</Text>
<Text color="gray.500" fontSize="xs">
{/* {user.user.email} */}
</Text>
</VStack>
<Icon as={BsChevronRight} boxSize={4} color="gray.500" />
</HStack>
</NavSidebarOption>
</PopoverTrigger>
<PopoverContent _focusVisible={{ outline: "unset" }} ml={-1} minW={48} w="full">
<VStack align="stretch" spacing={0}>
{/* sign out */}
<HStack
as={Link}
onClick={() => {
signOut().catch(console.error);
}}
px={4}
py={2}
spacing={4}
color="gray.500"
fontSize="sm"
>
<Icon as={BsBoxArrowRight} boxSize={6} />
<Text>Sign out</Text>
</HStack>
</VStack>
</PopoverContent>
</Popover>
);
}

View File

@@ -0,0 +1,128 @@
import {
Button,
FormControl,
FormLabel,
Input,
FormHelperText,
HStack,
Modal,
ModalBody,
ModalCloseButton,
ModalContent,
ModalFooter,
ModalHeader,
ModalOverlay,
Spinner,
Text,
VStack,
RadioGroup,
Radio,
} from "@chakra-ui/react";
import { useState, useEffect } from "react";
import { api } from "~/utils/api";
import { useHandledAsyncCallback, useSelectedProject } from "~/utils/hooks";
import { maybeReportError } from "~/utils/errorHandling/maybeReportError";
import { type ProjectUserRole } from "@prisma/client";
export const InviteMemberModal = ({
isOpen,
onClose,
}: {
isOpen: boolean;
onClose: () => void;
}) => {
const selectedProject = useSelectedProject().data;
const utils = api.useContext();
const [email, setEmail] = useState("");
const [role, setRole] = useState<ProjectUserRole>("MEMBER");
useEffect(() => {
setEmail("");
setRole("MEMBER");
}, [isOpen]);
const emailIsInvalid = !email || !email.match(/.+@.+\..+/);
const inviteMemberMutation = api.users.inviteToProject.useMutation();
const [inviteMember, isInviting] = useHandledAsyncCallback(async () => {
if (!selectedProject?.id || !role) return;
const resp = await inviteMemberMutation.mutateAsync({
projectId: selectedProject.id,
email,
role,
});
if (maybeReportError(resp)) return;
await utils.projects.get.invalidate();
onClose();
}, [inviteMemberMutation, email, role, selectedProject?.id, onClose]);
return (
<Modal isOpen={isOpen} onClose={onClose}>
<ModalOverlay />
<ModalContent w={1200}>
<ModalHeader>
<HStack>
<Text>Invite Member</Text>
</HStack>
</ModalHeader>
<ModalCloseButton />
<ModalBody>
<VStack spacing={8} alignItems="flex-start">
<Text>
Invite a new member to <b>{selectedProject?.name}</b>.
</Text>
<RadioGroup
value={role}
onChange={(e) => setRole(e as ProjectUserRole)}
colorScheme="orange"
>
<VStack w="full" alignItems="flex-start">
<Radio value="MEMBER">
<Text fontSize="sm">MEMBER</Text>
</Radio>
<Radio value="ADMIN">
<Text fontSize="sm">ADMIN</Text>
</Radio>
</VStack>
</RadioGroup>
<FormControl>
<FormLabel>Email</FormLabel>
<Input
type="email"
value={email}
onChange={(e) => setEmail(e.target.value)}
onKeyDown={(e) => {
if (e.key === "Enter" && (e.metaKey || e.ctrlKey || e.shiftKey)) {
e.preventDefault();
e.currentTarget.blur();
inviteMember();
}
}}
/>
<FormHelperText>Enter the email of the person you want to invite.</FormHelperText>
</FormControl>
</VStack>
</ModalBody>
<ModalFooter mt={4}>
<HStack>
<Button colorScheme="gray" onClick={onClose} minW={24}>
<Text>Cancel</Text>
</Button>
<Button
colorScheme="orange"
onClick={inviteMember}
minW={24}
isDisabled={emailIsInvalid || isInviting}
>
{isInviting ? <Spinner boxSize={4} /> : <Text>Send Invitation</Text>}
</Button>
</HStack>
</ModalFooter>
</ModalContent>
</Modal>
);
};

View File

@@ -0,0 +1,145 @@
import { useMemo, useState } from "react";
import {
Table,
Thead,
Tr,
Th,
Tbody,
Td,
IconButton,
useDisclosure,
Text,
Button,
} from "@chakra-ui/react";
import { useSession } from "next-auth/react";
import { BsTrash } from "react-icons/bs";
import { type User } from "@prisma/client";
import { useHandledAsyncCallback, useSelectedProject } from "~/utils/hooks";
import { InviteMemberModal } from "./InviteMemberModal";
import { RemoveMemberDialog } from "./RemoveMemberDialog";
import { api } from "~/utils/api";
import { maybeReportError } from "~/utils/errorHandling/maybeReportError";
const MemberTable = () => {
const selectedProject = useSelectedProject().data;
const session = useSession().data;
const utils = api.useContext();
const [memberToRemove, setMemberToRemove] = useState<User | null>(null);
const inviteMemberModal = useDisclosure();
const cancelInvitationMutation = api.users.cancelProjectInvitation.useMutation();
const [cancelInvitation, isCancelling] = useHandledAsyncCallback(
async (invitationToken: string) => {
if (!selectedProject?.id) return;
const resp = await cancelInvitationMutation.mutateAsync({
invitationToken,
});
if (maybeReportError(resp)) return;
await utils.projects.get.invalidate();
},
[selectedProject?.id, cancelInvitationMutation],
);
const sortedMembers = useMemo(() => {
if (!selectedProject?.projectUsers) return [];
return [...selectedProject.projectUsers].sort((a, b) => {
if (a.role === b.role) return a.createdAt < b.createdAt ? -1 : 1;
// Take advantage of fact that ADMIN is alphabetically before MEMBER
return a.role < b.role ? -1 : 1;
});
}, [selectedProject?.projectUsers]);
return (
<>
<Table fontSize={{ base: "sm", md: "md" }}>
<Thead
sx={{
th: {
base: { px: 0 },
md: { px: 6 },
},
}}
>
<Tr>
<Th>Name</Th>
<Th display={{ base: "none", md: "table-cell" }}>Email</Th>
<Th>Role</Th>
{selectedProject?.role === "ADMIN" && <Th />}
</Tr>
</Thead>
<Tbody
sx={{
td: {
base: { px: 0 },
md: { px: 6 },
},
}}
>
{selectedProject &&
sortedMembers.map((member) => {
return (
<Tr key={member.id}>
<Td>
<Text fontWeight="bold">{member.user.name}</Text>
</Td>
<Td display={{ base: "none", md: "table-cell" }} h="full">
{member.user.email}
</Td>
<Td fontSize={{ base: "xs", md: "sm" }}>{member.role}</Td>
{selectedProject.role === "ADMIN" && (
<Td textAlign="end">
{member.user.id !== session?.user?.id &&
member.user.id !== selectedProject.personalProjectUserId && (
<IconButton
aria-label="Remove member"
colorScheme="red"
icon={<BsTrash />}
onClick={() => setMemberToRemove(member.user)}
/>
)}
</Td>
)}
</Tr>
);
})}
{selectedProject?.projectUserInvitations?.map((invitation) => {
return (
<Tr key={invitation.id}>
<Td>
<Text as="i">Invitation pending</Text>
</Td>
<Td>{invitation.email}</Td>
<Td fontSize="sm">{invitation.role}</Td>
{selectedProject.role === "ADMIN" && (
<Td textAlign="end">
<Button
size="sm"
colorScheme="red"
variant="ghost"
onClick={() => cancelInvitation(invitation.invitationToken)}
isLoading={isCancelling}
>
Cancel
</Button>
</Td>
)}
</Tr>
);
})}
</Tbody>
</Table>
<InviteMemberModal isOpen={inviteMemberModal.isOpen} onClose={inviteMemberModal.onClose} />
<RemoveMemberDialog
member={memberToRemove}
isOpen={!!memberToRemove}
onClose={() => setMemberToRemove(null)}
/>
</>
);
};
export default MemberTable;

View File

@@ -0,0 +1,71 @@
import {
Button,
AlertDialog,
AlertDialogBody,
AlertDialogFooter,
AlertDialogHeader,
AlertDialogContent,
AlertDialogOverlay,
Text,
VStack,
Spinner,
} from "@chakra-ui/react";
import { type User } from "@prisma/client";
import { useRouter } from "next/router";
import { useRef } from "react";
import { api } from "~/utils/api";
import { useHandledAsyncCallback, useSelectedProject } from "~/utils/hooks";
export const RemoveMemberDialog = ({
isOpen,
onClose,
member,
}: {
isOpen: boolean;
onClose: () => void;
member: User | null;
}) => {
const selectedProject = useSelectedProject();
const removeUserMutation = api.users.removeUserFromProject.useMutation();
const utils = api.useContext();
const router = useRouter();
const cancelRef = useRef<HTMLButtonElement>(null);
const [onRemoveConfirm, isRemoving] = useHandledAsyncCallback(async () => {
if (!selectedProject.data?.id || !member?.id) return;
await removeUserMutation.mutateAsync({ projectId: selectedProject.data.id, userId: member.id });
await utils.projects.get.invalidate();
onClose();
}, [removeUserMutation, selectedProject, router]);
return (
<AlertDialog isOpen={isOpen} leastDestructiveRef={cancelRef} onClose={onClose}>
<AlertDialogOverlay>
<AlertDialogContent>
<AlertDialogHeader fontSize="lg" fontWeight="bold">
Remove Member
</AlertDialogHeader>
<AlertDialogBody>
<VStack spacing={4} alignItems="flex-start">
<Text>
Are you sure you want to remove <b>{member?.name}</b> from the project?
</Text>
</VStack>
</AlertDialogBody>
<AlertDialogFooter>
<Button ref={cancelRef} onClick={onClose}>
Cancel
</Button>
<Button colorScheme="red" onClick={onRemoveConfirm} ml={3} w={20}>
{isRemoving ? <Spinner /> : "Remove"}
</Button>
</AlertDialogFooter>
</AlertDialogContent>
</AlertDialogOverlay>
</AlertDialog>
);
};

View File

@@ -21,7 +21,7 @@ const ActionButton = ({
>
<HStack spacing={1}>
{icon && <Icon as={icon} />}
<Text>{label}</Text>
<Text display={{ base: "none", md: "flex" }}>{label}</Text>
</HStack>
</Button>
);

View File

@@ -0,0 +1,117 @@
import {
Icon,
Popover,
PopoverTrigger,
PopoverContent,
VStack,
HStack,
Button,
Text,
useDisclosure,
Box,
} from "@chakra-ui/react";
import { BiCheck } from "react-icons/bi";
import { BsToggles } from "react-icons/bs";
import { useMemo } from "react";
import { useIsClientRehydrated, useTagNames } from "~/utils/hooks";
import { useAppStore } from "~/state/store";
import { StaticColumnKeys } from "~/state/columnVisiblitySlice";
import ActionButton from "./ActionButton";
const ColumnVisiblityDropdown = () => {
const tagNames = useTagNames().data;
const visibleColumns = useAppStore((s) => s.columnVisibility.visibleColumns);
const toggleColumnVisibility = useAppStore((s) => s.columnVisibility.toggleColumnVisibility);
const totalColumns = Object.keys(StaticColumnKeys).length + (tagNames?.length ?? 0);
const popover = useDisclosure();
const columnVisiblityOptions = useMemo(() => {
const options: { label: string; key: string }[] = [
{
label: "Sent At",
key: StaticColumnKeys.SENT_AT,
},
{
label: "Model",
key: StaticColumnKeys.MODEL,
},
{
label: "Duration",
key: StaticColumnKeys.DURATION,
},
{
label: "Input Tokens",
key: StaticColumnKeys.INPUT_TOKENS,
},
{
label: "Output Tokens",
key: StaticColumnKeys.OUTPUT_TOKENS,
},
{
label: "Status Code",
key: StaticColumnKeys.STATUS_CODE,
},
];
for (const tagName of tagNames ?? []) {
options.push({
label: tagName,
key: tagName,
});
}
return options;
}, [tagNames]);
const isClientRehydrated = useIsClientRehydrated();
if (!isClientRehydrated) return null;
return (
<Popover
placement="bottom-start"
isOpen={popover.isOpen}
onOpen={popover.onOpen}
onClose={popover.onClose}
>
<PopoverTrigger>
<Box>
<ActionButton
label={`Columns (${visibleColumns.size}/${totalColumns})`}
icon={BsToggles}
/>
</Box>
</PopoverTrigger>
<PopoverContent boxShadow="0 0 40px 4px rgba(0, 0, 0, 0.1);" minW={0} w="auto">
<VStack spacing={0} maxH={400} overflowY="auto">
{columnVisiblityOptions?.map((option, index) => (
<HStack
key={index}
as={Button}
onClick={() => toggleColumnVisibility(option.key)}
w="full"
minH={10}
variant="ghost"
justifyContent="space-between"
fontWeight="semibold"
borderRadius={0}
colorScheme="blue"
color="black"
fontSize="sm"
borderBottomWidth={1}
>
<Text mr={16}>{option.label}</Text>
<Box w={5}>
{visibleColumns.has(option.key) && (
<Icon as={BiCheck} color="blue.500" boxSize={5} />
)}
</Box>
</HStack>
))}
</VStack>
</PopoverContent>
</Popover>
);
};
export default ColumnVisiblityDropdown;
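The column-visibility slice itself is not part of this diff. Judging from how it is consumed here — a Set of keys plus a toggle action — its shape is roughly the following (hypothetical reconstruction, not the actual slice code):
type ColumnVisibilitySlice = {
  visibleColumns: Set<string>; // StaticColumnKeys values plus any tag names
  toggleColumnVisibility: (key: string) => void;
};
// The toggle is a copy-and-flip on the Set:
const toggleKey = (columns: Set<string>, key: string): Set<string> => {
  const next = new Set(columns);
  if (next.has(key)) next.delete(key);
  else next.add(key);
  return next;
};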

View File

@@ -0,0 +1,210 @@
import { useState, useEffect } from "react";
import {
Modal,
ModalOverlay,
ModalContent,
ModalHeader,
ModalCloseButton,
ModalBody,
ModalFooter,
HStack,
VStack,
Icon,
Text,
Button,
Checkbox,
NumberInput,
NumberInputField,
NumberInputStepper,
NumberIncrementStepper,
NumberDecrementStepper,
Collapse,
Flex,
useDisclosure,
type UseDisclosureReturn,
} from "@chakra-ui/react";
import { BiExport } from "react-icons/bi";
import { useHandledAsyncCallback } from "~/utils/hooks";
import { api } from "~/utils/api";
import { useAppStore } from "~/state/store";
import ActionButton from "./ActionButton";
import InputDropdown from "../InputDropdown";
import { FiChevronUp, FiChevronDown } from "react-icons/fi";
import InfoCircle from "../InfoCircle";
const SUPPORTED_EXPORT_FORMATS = ["alpaca-finetune", "openai-fine-tune", "unformatted"];
const ExportButton = () => {
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const disclosure = useDisclosure();
return (
<>
<ActionButton
onClick={disclosure.onOpen}
label="Export"
icon={BiExport}
isDisabled={selectedLogIds.size === 0}
/>
<ExportLogsModal disclosure={disclosure} />
</>
);
};
export default ExportButton;
const ExportLogsModal = ({ disclosure }: { disclosure: UseDisclosureReturn }) => {
const selectedProjectId = useAppStore((s) => s.selectedProjectId);
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const clearSelectedLogIds = useAppStore((s) => s.selectedLogs.clearSelectedLogIds);
const [selectedExportFormat, setSelectedExportFormat] = useState(SUPPORTED_EXPORT_FORMATS[0]);
const [testingSplit, setTestingSplit] = useState(10);
const [removeDuplicates, setRemoveDuplicates] = useState(true);
const [showAdvancedOptions, setShowAdvancedOptions] = useState(false);
useEffect(() => {
if (disclosure.isOpen) {
setSelectedExportFormat(SUPPORTED_EXPORT_FORMATS[0]);
setTestingSplit(10);
setRemoveDuplicates(true);
}
}, [disclosure.isOpen]);
const exportLogsMutation = api.loggedCalls.export.useMutation();
const [exportLogs, exportInProgress] = useHandledAsyncCallback(async () => {
if (!selectedProjectId || !selectedLogIds.size || !testingSplit || !selectedExportFormat)
return;
const response = await exportLogsMutation.mutateAsync({
projectId: selectedProjectId,
selectedLogIds: Array.from(selectedLogIds),
testingSplit,
selectedExportFormat,
removeDuplicates,
});
const dataUrl = `data:application/zip;base64,${response}`;
const blob = await fetch(dataUrl).then((res) => res.blob());
const url = URL.createObjectURL(blob);
const a = document.createElement("a");
a.href = url;
a.download = `data.zip`;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
disclosure.onClose();
clearSelectedLogIds();
}, [
exportLogsMutation,
selectedProjectId,
selectedLogIds,
testingSplit,
selectedExportFormat,
removeDuplicates,
]);
return (
<Modal size={{ base: "xl", md: "2xl" }} {...disclosure}>
<ModalOverlay />
<ModalContent w={1200}>
<ModalHeader>
<HStack>
<Icon as={BiExport} />
<Text>Export Logs</Text>
</HStack>
</ModalHeader>
<ModalCloseButton />
<ModalBody maxW="unset">
<VStack w="full" spacing={8} pt={4} alignItems="flex-start">
<Text>
We'll export the <b>{selectedLogIds.size}</b> logs you have selected in the format of
your choice.
</Text>
<VStack alignItems="flex-start" spacing={4}>
<Flex
flexDir={{ base: "column", md: "row" }}
alignItems={{ base: "flex-start", md: "center" }}
>
<HStack w={48} alignItems="center" spacing={1}>
<Text fontWeight="bold">Format:</Text>
<InfoCircle tooltipText="Format logs for for fine tuning or export them without formatting." />
</HStack>
<InputDropdown
options={SUPPORTED_EXPORT_FORMATS}
selectedOption={selectedExportFormat}
onSelect={(option) => setSelectedExportFormat(option)}
inputGroupProps={{ w: 48 }}
/>
</Flex>
<Flex
flexDir={{ base: "column", md: "row" }}
alignItems={{ base: "flex-start", md: "center" }}
>
<HStack w={48} alignItems="center" spacing={1}>
<Text fontWeight="bold">Testing Split:</Text>
<InfoCircle tooltipText="The percent of your logs that will be reserved for testing and saved in another file. Logs are split randomly." />
</HStack>
<HStack>
<NumberInput
defaultValue={10}
onChange={(_, num) => setTestingSplit(num)}
min={1}
max={100}
w={48}
>
<NumberInputField />
<NumberInputStepper>
<NumberIncrementStepper />
<NumberDecrementStepper />
</NumberInputStepper>
</NumberInput>
</HStack>
</Flex>
</VStack>
<VStack alignItems="flex-start" spacing={0}>
<Button
variant="unstyled"
color="blue.600"
onClick={() => setShowAdvancedOptions(!showAdvancedOptions)}
>
<HStack>
<Text>Advanced Options</Text>
<Icon as={showAdvancedOptions ? FiChevronUp : FiChevronDown} />
</HStack>
</Button>
<Collapse in={showAdvancedOptions} unmountOnExit={true}>
<VStack align="stretch" pt={4}>
<HStack>
<Checkbox
colorScheme="blue"
isChecked={removeDuplicates}
onChange={(e) => setRemoveDuplicates(e.target.checked)}
>
<Text>Remove duplicates</Text>
</Checkbox>
<InfoCircle tooltipText="To avoid overfitting and speed up training, automatically deduplicate logs with matching input and output." />
</HStack>
</VStack>
</Collapse>
</VStack>
</VStack>
</ModalBody>
<ModalFooter>
<HStack>
<Button colorScheme="gray" onClick={disclosure.onClose} minW={24}>
Cancel
</Button>
<Button colorScheme="blue" onClick={exportLogs} isLoading={exportInProgress} minW={24}>
Export
</Button>
</HStack>
</ModalFooter>
</ModalContent>
</Modal>
);
};

View File

@@ -0,0 +1,161 @@
import { useState, useEffect } from "react";
import {
Modal,
ModalOverlay,
ModalContent,
ModalHeader,
ModalCloseButton,
ModalBody,
ModalFooter,
HStack,
VStack,
Icon,
Text,
Button,
useDisclosure,
type UseDisclosureReturn,
Input,
} from "@chakra-ui/react";
import { FaRobot } from "react-icons/fa";
import humanId from "human-id";
import { useRouter } from "next/router";
import { useHandledAsyncCallback } from "~/utils/hooks";
import { api } from "~/utils/api";
import { useAppStore } from "~/state/store";
import ActionButton from "./ActionButton";
import InputDropdown from "../InputDropdown";
import { FiChevronDown } from "react-icons/fi";
const SUPPORTED_BASE_MODELS = ["llama2-7b", "llama2-13b", "llama2-70b", "gpt-3.5-turbo"];
const FineTuneButton = () => {
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const disclosure = useDisclosure();
return (
<>
<ActionButton
onClick={disclosure.onOpen}
label="Fine Tune"
icon={FaRobot}
isDisabled={selectedLogIds.size === 0}
/>
<FineTuneModal disclosure={disclosure} />
</>
);
};
export default FineTuneButton;
const FineTuneModal = ({ disclosure }: { disclosure: UseDisclosureReturn }) => {
const selectedProjectId = useAppStore((s) => s.selectedProjectId);
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const clearSelectedLogIds = useAppStore((s) => s.selectedLogs.clearSelectedLogIds);
const [selectedBaseModel, setSelectedBaseModel] = useState(SUPPORTED_BASE_MODELS[0]);
const [modelSlug, setModelSlug] = useState(humanId({ separator: "-", capitalize: false }));
useEffect(() => {
if (disclosure.isOpen) {
setSelectedBaseModel(SUPPORTED_BASE_MODELS[0]);
setModelSlug(humanId({ separator: "-", capitalize: false }));
}
}, [disclosure.isOpen]);
const utils = api.useContext();
const router = useRouter();
const createFineTuneMutation = api.fineTunes.create.useMutation();
const [createFineTune, creationInProgress] = useHandledAsyncCallback(async () => {
if (!selectedProjectId || !modelSlug || !selectedBaseModel || !selectedLogIds.size) return;
await createFineTuneMutation.mutateAsync({
projectId: selectedProjectId,
slug: modelSlug,
baseModel: selectedBaseModel,
selectedLogIds: Array.from(selectedLogIds),
});
await utils.fineTunes.list.invalidate();
await router.push({ pathname: "/fine-tunes" });
clearSelectedLogIds();
disclosure.onClose();
}, [createFineTuneMutation, selectedProjectId, selectedLogIds, modelSlug, selectedBaseModel]);
return (
<Modal size={{ base: "xl", md: "2xl" }} {...disclosure}>
<ModalOverlay />
<ModalContent w={1200}>
<ModalHeader>
<HStack>
<Icon as={FaRobot} />
<Text>Fine Tune</Text>
</HStack>
</ModalHeader>
<ModalCloseButton />
<ModalBody maxW="unset">
<VStack w="full" spacing={8} pt={4} alignItems="flex-start">
<Text>
We'll train on the <b>{selectedLogIds.size}</b> logs you've selected.
</Text>
<VStack>
<HStack spacing={2} w="full">
<Text fontWeight="bold" w={36}>
Model ID:
</Text>
<Input
value={modelSlug}
onChange={(e) => setModelSlug(e.target.value)}
w={48}
placeholder="unique-id"
onKeyDown={(e) => {
// If the user types anything other than a-z, A-Z, or 0-9, replace it with -
if (!/[a-zA-Z0-9]/.test(e.key)) {
e.preventDefault();
setModelSlug((s) => s && `${s}-`);
}
}}
/>
</HStack>
<HStack spacing={2}>
<Text fontWeight="bold" w={36}>
Base model:
</Text>
<InputDropdown
options={SUPPORTED_BASE_MODELS}
selectedOption={selectedBaseModel}
onSelect={(option) => setSelectedBaseModel(option)}
inputGroupProps={{ w: 48 }}
/>
</HStack>
</VStack>
<Button variant="unstyled" color="blue.600">
<HStack>
<Text>Advanced Options</Text>
<Icon as={FiChevronDown} />
</HStack>
</Button>
</VStack>
</ModalBody>
<ModalFooter>
<HStack>
<Button colorScheme="gray" onClick={disclosure.onClose} minW={24}>
Cancel
</Button>
<Button
colorScheme="blue"
onClick={createFineTune}
isLoading={creationInProgress}
minW={24}
isDisabled={!modelSlug}
>
Start Training
</Button>
</HStack>
</ModalFooter>
</ModalContent>
</Modal>
);
};
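The onKeyDown handler above keeps the slug URL-safe by intercepting any keystroke whose key string contains no alphanumeric character and appending a dash instead. A quick illustration of the predicate itself (plain regex results, not app behavior):
/[a-zA-Z0-9]/.test("a");         // true  → letters and digits pass through
/[a-zA-Z0-9]/.test(" ");         // false → prevented, a "-" is appended to the slug
/[a-zA-Z0-9]/.test("Backspace"); // true  → named keys still work, since their names contain letters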

View File

@@ -0,0 +1,30 @@
import { Button, HStack, Icon, Text } from "@chakra-ui/react";
import { BsPlus } from "react-icons/bs";
import { comparators, defaultFilterableFields } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
const AddFilterButton = () => {
const addFilter = useAppStore((s) => s.logFilters.addFilter);
return (
<HStack
as={Button}
variant="ghost"
onClick={() =>
addFilter({
id: Date.now().toString(),
field: defaultFilterableFields[0],
comparator: comparators[0],
value: "",
})
}
spacing={0}
fontSize="sm"
>
<Icon as={BsPlus} boxSize={5} />
<Text>Add Filter</Text>
</HStack>
);
};
export default AddFilterButton;

View File

@@ -0,0 +1,44 @@
import { useCallback, useState } from "react";
import { HStack, IconButton, Input } from "@chakra-ui/react";
import { BsTrash } from "react-icons/bs";
import { type LogFilter } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
import { debounce } from "lodash-es";
import SelectFieldDropdown from "./SelectFieldDropdown";
import SelectComparatorDropdown from "./SelectComparatorDropdown";
const LogFilter = ({ filter }: { filter: LogFilter }) => {
const updateFilter = useAppStore((s) => s.logFilters.updateFilter);
const deleteFilter = useAppStore((s) => s.logFilters.deleteFilter);
const [editedValue, setEditedValue] = useState(filter.value);
const debouncedUpdateFilter = useCallback(
debounce((filter: LogFilter) => updateFilter(filter), 500, {
leading: true,
}),
[updateFilter],
);
return (
<HStack>
<SelectFieldDropdown filter={filter} />
<SelectComparatorDropdown filter={filter} />
<Input
value={editedValue}
onChange={(e) => {
setEditedValue(e.target.value);
debouncedUpdateFilter({ ...filter, value: e.target.value });
}}
/>
<IconButton
aria-label="Delete Filter"
icon={<BsTrash />}
onClick={() => deleteFilter(filter.id)}
/>
</HStack>
);
};
export default LogFilter;

View File

@@ -0,0 +1,30 @@
import { VStack, Text } from "@chakra-ui/react";
import AddFilterButton from "./AddFilterButton";
import { useAppStore } from "~/state/store";
import LogFilter from "./LogFilter";
const LogFilters = () => {
const filters = useAppStore((s) => s.logFilters.filters);
return (
<VStack
bgColor="white"
borderRadius={8}
borderWidth={1}
w="full"
alignItems="flex-start"
p={4}
spacing={4}
>
<Text fontWeight="bold" color="gray.500">
Filters
</Text>
{filters.map((filter) => (
<LogFilter key={filter.id} filter={filter} />
))}
<AddFilterButton />
</VStack>
);
};
export default LogFilters;
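The logFiltersSlice itself is not included in this diff; from the call sites in these filter components, its types plausibly look like the following (hypothetical, inferred from usage):
export type LogFilter = {
  id: string;
  field: string;       // one of defaultFilterableFields or a tag name
  comparator: string;  // one of the exported comparators
  value: string;
};
export type LogFiltersSlice = {
  filters: LogFilter[];
  addFilter: (filter: LogFilter) => void;
  updateFilter: (filter: LogFilter) => void;
  deleteFilter: (id: string) => void;
};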

View File

@@ -0,0 +1,19 @@
import { comparators, type LogFilter } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
import InputDropdown from "~/components/InputDropdown";
const SelectComparatorDropdown = ({ filter }: { filter: LogFilter }) => {
const updateFilter = useAppStore((s) => s.logFilters.updateFilter);
const { comparator } = filter;
return (
<InputDropdown
options={comparators}
selectedOption={comparator}
onSelect={(option) => updateFilter({ ...filter, comparator: option })}
/>
);
};
export default SelectComparatorDropdown;

View File

@@ -0,0 +1,22 @@
import { defaultFilterableFields, type LogFilter } from "~/state/logFiltersSlice";
import { useAppStore } from "~/state/store";
import { useTagNames } from "~/utils/hooks";
import InputDropdown from "~/components/InputDropdown";
const SelectFieldDropdown = ({ filter }: { filter: LogFilter }) => {
const tagNames = useTagNames().data;
const updateFilter = useAppStore((s) => s.logFilters.updateFilter);
const { field } = filter;
return (
<InputDropdown
options={[...defaultFilterableFields, ...(tagNames || [])]}
selectedOption={field}
onSelect={(option) => updateFilter({ ...filter, field: option })}
/>
);
};
export default SelectFieldDropdown;

View File

@@ -5,14 +5,14 @@ import { TableHeader, TableRow } from "./TableRow";
export default function LoggedCallsTable() {
const [expandedRow, setExpandedRow] = useState<string | null>(null);
const { data: loggedCalls } = useLoggedCalls();
const loggedCalls = useLoggedCalls().data;
return (
<Card width="100%" overflow="hidden">
<Card width="100%" overflowX="auto">
<Table>
<TableHeader showCheckbox />
<TableHeader showOptions />
<Tbody>
{loggedCalls?.calls.map((loggedCall) => {
{loggedCalls?.calls?.map((loggedCall) => {
return (
<TableRow
key={loggedCall.id}
@@ -25,7 +25,7 @@ export default function LoggedCallsTable() {
setExpandedRow(loggedCall.id);
}
}}
showCheckbox
showOptions
/>
);
})}

View File

@@ -14,51 +14,64 @@ import {
Text,
Checkbox,
} from "@chakra-ui/react";
import dayjs from "dayjs";
import relativeTime from "dayjs/plugin/relativeTime";
import Link from "next/link";
import dayjs from "~/utils/dayjs";
import { type RouterOutputs } from "~/utils/api";
import { FormattedJson } from "./FormattedJson";
import { useAppStore } from "~/state/store";
import { useLoggedCalls } from "~/utils/hooks";
import { useIsClientRehydrated, useLoggedCalls, useTagNames } from "~/utils/hooks";
import { useMemo } from "react";
dayjs.extend(relativeTime);
import { StaticColumnKeys } from "~/state/columnVisiblitySlice";
type LoggedCall = RouterOutputs["loggedCalls"]["list"]["calls"][0];
export const TableHeader = ({ showCheckbox }: { showCheckbox?: boolean }) => {
export const TableHeader = ({ showOptions }: { showOptions?: boolean }) => {
const matchingLogIds = useLoggedCalls().data?.matchingLogIds;
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const addAll = useAppStore((s) => s.selectedLogs.addSelectedLogIds);
const clearAll = useAppStore((s) => s.selectedLogs.clearSelectedLogIds);
const allSelected = useMemo(() => {
if (!matchingLogIds) return false;
if (!matchingLogIds || !matchingLogIds.length) return false;
return matchingLogIds.every((id) => selectedLogIds.has(id));
}, [selectedLogIds, matchingLogIds]);
const tagNames = useTagNames().data;
const visibleColumns = useAppStore((s) => s.columnVisibility.visibleColumns);
const isClientRehydrated = useIsClientRehydrated();
if (!isClientRehydrated) return null;
return (
<Thead>
<Tr>
{showCheckbox && (
<Th>
<HStack w={8}>
{showOptions && (
<Th pr={0}>
<HStack minW={16}>
<Checkbox
isChecked={allSelected}
onChange={() => {
allSelected ? clearAll() : addAll(matchingLogIds || []);
}}
/>
<Text>({selectedLogIds.size})</Text>
<Text>
({selectedLogIds.size ? `${selectedLogIds.size}/` : ""}
{matchingLogIds?.length || 0})
</Text>
</HStack>
</Th>
)}
<Th>Time</Th>
<Th>Model</Th>
<Th isNumeric>Duration</Th>
<Th isNumeric>Input tokens</Th>
<Th isNumeric>Output tokens</Th>
<Th isNumeric>Status</Th>
{visibleColumns.has(StaticColumnKeys.SENT_AT) && <Th>Sent At</Th>}
{visibleColumns.has(StaticColumnKeys.MODEL) && <Th>Model</Th>}
{tagNames
?.filter((tagName) => visibleColumns.has(tagName))
.map((tagName) => (
<Th key={tagName} textTransform={"none"}>
{tagName}
</Th>
))}
{visibleColumns.has(StaticColumnKeys.DURATION) && <Th isNumeric>Duration</Th>}
{visibleColumns.has(StaticColumnKeys.INPUT_TOKENS) && <Th isNumeric>Input tokens</Th>}
{visibleColumns.has(StaticColumnKeys.OUTPUT_TOKENS) && <Th isNumeric>Output tokens</Th>}
{visibleColumns.has(StaticColumnKeys.STATUS_CODE) && <Th isNumeric>Status</Th>}
</Tr>
</Thead>
);
@@ -68,30 +81,30 @@ export const TableRow = ({
loggedCall,
isExpanded,
onToggle,
showCheckbox,
showOptions,
}: {
loggedCall: LoggedCall;
isExpanded: boolean;
onToggle: () => void;
showCheckbox?: boolean;
showOptions?: boolean;
}) => {
const isError = loggedCall.modelResponse?.statusCode !== 200;
const timeAgo = dayjs(loggedCall.requestedAt).fromNow();
const requestedAt = dayjs(loggedCall.requestedAt).format("MMMM D h:mm A");
const fullTime = dayjs(loggedCall.requestedAt).toString();
const durationCell = (
<Td isNumeric>
{loggedCall.cacheHit ? (
<Text color="gray.500">Cached</Text>
) : (
((loggedCall.modelResponse?.durationMs ?? 0) / 1000).toFixed(2) + "s"
)}
</Td>
);
const isChecked = useAppStore((s) => s.selectedLogs.selectedLogIds.has(loggedCall.id));
const toggleChecked = useAppStore((s) => s.selectedLogs.toggleSelectedLogId);
const tagNames = useTagNames().data;
const visibleColumns = useAppStore((s) => s.columnVisibility.visibleColumns);
const visibleTagNames = useMemo(() => {
return tagNames?.filter((tagName) => visibleColumns.has(tagName)) ?? [];
}, [tagNames, visibleColumns]);
const isClientRehydrated = useIsClientRehydrated();
if (!isClientRehydrated) return null;
return (
<>
<Tr
@@ -101,43 +114,66 @@ export const TableRow = ({
sx={{
"> td": { borderBottom: "none" },
}}
fontSize="sm"
>
{showCheckbox && (
{showOptions && (
<Td>
<Checkbox isChecked={isChecked} onChange={() => toggleChecked(loggedCall.id)} />
</Td>
)}
<Td>
<Tooltip label={fullTime} placement="top">
<Box whiteSpace="nowrap" minW="120px">
{timeAgo}
</Box>
</Tooltip>
</Td>
<Td width="100%">
<HStack justifyContent="flex-start">
<Text
colorScheme="purple"
color="purple.500"
borderColor="purple.500"
px={1}
borderRadius={4}
borderWidth={1}
fontSize="xs"
>
{loggedCall.model}
</Text>
</HStack>
</Td>
{durationCell}
<Td isNumeric>{loggedCall.modelResponse?.inputTokens}</Td>
<Td isNumeric>{loggedCall.modelResponse?.outputTokens}</Td>
<Td sx={{ color: isError ? "red.500" : "green.500", fontWeight: "semibold" }} isNumeric>
{loggedCall.modelResponse?.statusCode ?? "No response"}
</Td>
{visibleColumns.has(StaticColumnKeys.SENT_AT) && (
<Td>
<Tooltip label={fullTime} placement="top">
<Box whiteSpace="nowrap" minW="120px">
{requestedAt}
</Box>
</Tooltip>
</Td>
)}
{visibleColumns.has(StaticColumnKeys.MODEL) && (
<Td>
<HStack justifyContent="flex-start">
<Text
colorScheme="purple"
color="purple.500"
borderColor="purple.500"
px={1}
borderRadius={4}
borderWidth={1}
fontSize="xs"
whiteSpace="nowrap"
>
{loggedCall.model}
</Text>
</HStack>
</Td>
)}
{visibleTagNames.map((tagName) => (
<Td key={tagName}>{loggedCall.tags[tagName]}</Td>
))}
{visibleColumns.has(StaticColumnKeys.DURATION) && (
<Td isNumeric>
{loggedCall.cacheHit ? (
<Text color="gray.500">Cached</Text>
) : (
((loggedCall.modelResponse?.durationMs ?? 0) / 1000).toFixed(2) + "s"
)}
</Td>
)}
{visibleColumns.has(StaticColumnKeys.INPUT_TOKENS) && (
<Td isNumeric>{loggedCall.modelResponse?.inputTokens}</Td>
)}
{visibleColumns.has(StaticColumnKeys.OUTPUT_TOKENS) && (
<Td isNumeric>{loggedCall.modelResponse?.outputTokens}</Td>
)}
{visibleColumns.has(StaticColumnKeys.STATUS_CODE) && (
<Td sx={{ color: isError ? "red.500" : "green.500", fontWeight: "semibold" }} isNumeric>
{loggedCall.modelResponse?.statusCode ?? "No response"}
</Td>
)}
</Tr>
<Tr>
<Td colSpan={8} p={0}>
<Td colSpan={visibleColumns.size + 1} w="full" p={0}>
<Collapse in={isExpanded} unmountOnExit={true}>
<VStack p={4} align="stretch">
<HStack align="stretch">

View File

@@ -21,6 +21,19 @@ export const env = createEnv({
ANTHROPIC_API_KEY: z.string().default("placeholder"),
SENTRY_AUTH_TOKEN: z.string().optional(),
OPENPIPE_API_KEY: z.string().optional(),
SENDER_EMAIL: z.string().default("placeholder"),
SMTP_HOST: z.string().default("placeholder"),
SMTP_PORT: z.string().default("placeholder"),
SMTP_LOGIN: z.string().default("placeholder"),
SMTP_PASSWORD: z.string().default("placeholder"),
WORKER_CONCURRENCY: z
.string()
.default("10")
.transform((val) => parseInt(val)),
WORKER_MAX_POOL_SIZE: z
.string()
.default("10")
.transform((val) => parseInt(val)),
},
/**
@@ -33,8 +46,6 @@ export const env = createEnv({
NEXT_PUBLIC_SOCKET_URL: z.string().url().default("http://localhost:3318"),
NEXT_PUBLIC_HOST: z.string().url().default("http://localhost:3000"),
NEXT_PUBLIC_SENTRY_DSN: z.string().optional(),
NEXT_PUBLIC_SHOW_DATA: z.string().optional(),
NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS: z.string().optional(),
},
/**
@@ -49,7 +60,6 @@ export const env = createEnv({
NEXT_PUBLIC_POSTHOG_KEY: process.env.NEXT_PUBLIC_POSTHOG_KEY,
NEXT_PUBLIC_SOCKET_URL: process.env.NEXT_PUBLIC_SOCKET_URL,
NEXT_PUBLIC_HOST: process.env.NEXT_PUBLIC_HOST,
NEXT_PUBLIC_SHOW_DATA: process.env.NEXT_PUBLIC_SHOW_DATA,
GITHUB_CLIENT_ID: process.env.GITHUB_CLIENT_ID,
GITHUB_CLIENT_SECRET: process.env.GITHUB_CLIENT_SECRET,
REPLICATE_API_TOKEN: process.env.REPLICATE_API_TOKEN,
@@ -57,7 +67,13 @@ export const env = createEnv({
NEXT_PUBLIC_SENTRY_DSN: process.env.NEXT_PUBLIC_SENTRY_DSN,
SENTRY_AUTH_TOKEN: process.env.SENTRY_AUTH_TOKEN,
OPENPIPE_API_KEY: process.env.OPENPIPE_API_KEY,
NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS: process.env.NEXT_PUBLIC_FF_SHOW_LOGGED_CALLS,
SENDER_EMAIL: process.env.SENDER_EMAIL,
SMTP_HOST: process.env.SMTP_HOST,
SMTP_PORT: process.env.SMTP_PORT,
SMTP_LOGIN: process.env.SMTP_LOGIN,
SMTP_PASSWORD: process.env.SMTP_PASSWORD,
WORKER_CONCURRENCY: process.env.WORKER_CONCURRENCY,
WORKER_MAX_POOL_SIZE: process.env.WORKER_MAX_POOL_SIZE,
},
/**
* Run `build` or `dev` with `SKIP_ENV_VALIDATION` to skip env validation.
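A minimal sketch (not part of the diff) of the zod pattern used for the new WORKER_CONCURRENCY and WORKER_MAX_POOL_SIZE entries: the variable arrives as a string, falls back to "10", and is transformed into a number.

import { z } from "zod";

// Same shape as the schema entries added above; the variable name is illustrative.
const workerConcurrency = z
  .string()
  .default("10")
  .transform((val) => parseInt(val));

workerConcurrency.parse(undefined); // -> 10 (default applied before the transform)
workerConcurrency.parse("25"); // -> 25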

View File

@@ -1,6 +1,7 @@
import openaiChatCompletionFrontend from "./openai-ChatCompletion/frontend";
import replicateLlama2Frontend from "./replicate-llama2/frontend";
import anthropicFrontend from "./anthropic-completion/frontend";
import openpipeFrontend from "./openpipe-chat/frontend";
import { type SupportedProvider, type FrontendModelProvider } from "./types";
// Keep attributes here that need to be accessible from the frontend. We can't
@@ -10,6 +11,7 @@ const frontendModelProviders: Record<SupportedProvider, FrontendModelProvider<an
"openai/ChatCompletion": openaiChatCompletionFrontend,
"replicate/llama2": replicateLlama2Frontend,
"anthropic/completion": anthropicFrontend,
"openpipe/Chat": openpipeFrontend,
};
export default frontendModelProviders;

View File

@@ -1,12 +1,14 @@
import openaiChatCompletion from "./openai-ChatCompletion";
import replicateLlama2 from "./replicate-llama2";
import anthropicCompletion from "./anthropic-completion";
import openpipeChatCompletion from "./openpipe-chat";
import { type SupportedProvider, type ModelProvider } from "./types";
const modelProviders: Record<SupportedProvider, ModelProvider<any, any, any>> = {
"openai/ChatCompletion": openaiChatCompletion,
"replicate/llama2": replicateLlama2,
"anthropic/completion": anthropicCompletion,
"openpipe/Chat": openpipeChatCompletion,
};
export default modelProviders;

View File

@@ -1,54 +1,10 @@
/* eslint-disable @typescript-eslint/no-unsafe-call */
import {
type ChatCompletionChunk,
type ChatCompletion,
type CompletionCreateParams,
} from "openai/resources/chat";
import { type CompletionResponse } from "../types";
import { isArray, isString, omit } from "lodash-es";
import { openai } from "~/server/utils/openai";
import { isArray, isString } from "lodash-es";
import { APIError } from "openai";
const mergeStreamedChunks = (
base: ChatCompletion | null,
chunk: ChatCompletionChunk,
): ChatCompletion => {
if (base === null) {
return mergeStreamedChunks({ ...chunk, choices: [] }, chunk);
}
const choices = [...base.choices];
for (const choice of chunk.choices) {
const baseChoice = choices.find((c) => c.index === choice.index);
if (baseChoice) {
baseChoice.finish_reason = choice.finish_reason ?? baseChoice.finish_reason;
baseChoice.message = baseChoice.message ?? { role: "assistant" };
if (choice.delta?.content)
baseChoice.message.content =
((baseChoice.message.content as string) ?? "") + (choice.delta.content ?? "");
if (choice.delta?.function_call) {
const fnCall = baseChoice.message.function_call ?? {};
fnCall.name =
((fnCall.name as string) ?? "") + ((choice.delta.function_call.name as string) ?? "");
fnCall.arguments =
((fnCall.arguments as string) ?? "") +
((choice.delta.function_call.arguments as string) ?? "");
}
} else {
// @ts-expect-error the types are correctly telling us that finish_reason
// could be null, but don't want to fix it right now.
choices.push({ ...omit(choice, "delta"), message: { role: "assistant", ...choice.delta } });
}
}
const merged: ChatCompletion = {
...base,
choices,
};
return merged;
};
import { type ChatCompletion, type CompletionCreateParams } from "openai/resources/chat";
import mergeChunks from "openpipe/src/openai/mergeChunks";
import { openai } from "~/server/utils/openai";
import { type CompletionResponse } from "../types";
export async function getCompletion(
input: CompletionCreateParams,
@@ -59,19 +15,25 @@ export async function getCompletion(
try {
if (onStream) {
console.log("got started");
const resp = await openai.chat.completions.create(
{ ...input, stream: true },
{
...input,
stream: true,
openpipe: {
tags: {
prompt_id: "getCompletion",
stream: "true",
},
},
},
{
maxRetries: 0,
},
);
for await (const part of resp) {
console.log("got part", part);
finalCompletion = mergeStreamedChunks(finalCompletion, part);
finalCompletion = mergeChunks(finalCompletion, part);
onStream(finalCompletion);
}
console.log("got final", finalCompletion);
if (!finalCompletion) {
return {
type: "error",
@@ -81,7 +43,16 @@ export async function getCompletion(
}
} else {
const resp = await openai.chat.completions.create(
{ ...input, stream: false },
{
...input,
stream: false,
openpipe: {
tags: {
prompt_id: "getCompletion",
stream: "false",
},
},
},
{
maxRetries: 0,
},
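A hedged sketch (not part of the diff) of the pattern this file now relies on: the openpipe-wrapped client accepts an openpipe.tags block for request logging, and streamed chunks are accumulated with mergeChunks. The model, messages, and tag values below are illustrative.

import { type ChatCompletion } from "openai/resources/chat";
import mergeChunks from "openpipe/src/openai/mergeChunks";
import { openai } from "~/server/utils/openai";

async function streamWithTags(): Promise<ChatCompletion | null> {
  const stream = await openai.chat.completions.create(
    {
      model: "gpt-3.5-turbo",
      stream: true,
      messages: [{ role: "user", content: "Say hello" }],
      // Logging metadata consumed by the openpipe wrapper, mirroring getCompletion above.
      openpipe: { tags: { prompt_id: "streamWithTags", stream: "true" } },
    },
    { maxRetries: 0 },
  );
  let merged: ChatCompletion | null = null;
  for await (const chunk of stream) {
    merged = mergeChunks(merged, chunk); // same accumulation used in getCompletion
  }
  return merged;
}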

View File

@@ -12,7 +12,6 @@ export const refinementActions: Record<string, RefinementAction> = {
definePrompt("openai/ChatCompletion", {
model: "gpt-4",
stream: true,
messages: [
{
role: "system",
@@ -29,7 +28,6 @@ export const refinementActions: Record<string, RefinementAction> = {
definePrompt("openai/ChatCompletion", {
model: "gpt-4",
stream: true,
messages: [
{
role: "system",
@@ -126,7 +124,6 @@ export const refinementActions: Record<string, RefinementAction> = {
definePrompt("openai/ChatCompletion", {
model: "gpt-4",
stream: true,
messages: [
{
role: "system",
@@ -143,7 +140,6 @@ export const refinementActions: Record<string, RefinementAction> = {
definePrompt("openai/ChatCompletion", {
model: "gpt-4",
stream: true,
messages: [
{
role: "system",
@@ -237,7 +233,6 @@ export const refinementActions: Record<string, RefinementAction> = {
definePrompt("openai/ChatCompletion", {
model: "gpt-3.5-turbo",
stream: true,
messages: [
{
role: "system",

View File

@@ -0,0 +1,98 @@
import { type OpenpipeChatOutput, type SupportedModel } from ".";
import { type FrontendModelProvider } from "../types";
import { refinementActions } from "./refinementActions";
import {
templateOpenOrcaPrompt,
templateAlpacaInstructPrompt,
// templateSystemUserAssistantPrompt,
templateInstructionInputResponsePrompt,
templateAiroborosPrompt,
templateGryphePrompt,
templateVicunaPrompt,
} from "./templatePrompt";
const frontendModelProvider: FrontendModelProvider<SupportedModel, OpenpipeChatOutput> = {
name: "OpenAI ChatCompletion",
models: {
"Open-Orca/OpenOrcaxOpenChat-Preview2-13B": {
name: "OpenOrcaxOpenChat-Preview2-13B",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
templatePrompt: templateOpenOrcaPrompt,
},
"Open-Orca/OpenOrca-Platypus2-13B": {
name: "OpenOrca-Platypus2-13B",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B",
templatePrompt: templateAlpacaInstructPrompt,
defaultStopTokens: ["</s>"],
},
// "stabilityai/StableBeluga-13B": {
// name: "StableBeluga-13B",
// contextWindow: 4096,
// pricePerSecond: 0.0003,
// speed: "medium",
// provider: "openpipe/Chat",
// learnMoreUrl: "https://huggingface.co/stabilityai/StableBeluga-13B",
// templatePrompt: templateSystemUserAssistantPrompt,
// },
"NousResearch/Nous-Hermes-Llama2-13b": {
name: "Nous-Hermes-Llama2-13b",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b",
templatePrompt: templateInstructionInputResponsePrompt,
},
"jondurbin/airoboros-l2-13b-gpt4-2.0": {
name: "airoboros-l2-13b-gpt4-2.0",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-2.0",
templatePrompt: templateAiroborosPrompt,
},
"lmsys/vicuna-13b-v1.5": {
name: "vicuna-13b-v1.5",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/lmsys/vicuna-13b-v1.5",
templatePrompt: templateVicunaPrompt,
},
"Gryphe/MythoMax-L2-13b": {
name: "MythoMax-L2-13b",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/Gryphe/MythoMax-L2-13b",
templatePrompt: templateGryphePrompt,
},
"NousResearch/Nous-Hermes-llama-2-7b": {
name: "Nous-Hermes-llama-2-7b",
contextWindow: 4096,
pricePerSecond: 0.0003,
speed: "medium",
provider: "openpipe/Chat",
learnMoreUrl: "https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b",
templatePrompt: templateInstructionInputResponsePrompt,
},
},
refinementActions,
normalizeOutput: (output) => ({ type: "text", value: output }),
};
export default frontendModelProvider;

View File

@@ -0,0 +1,121 @@
/* eslint-disable @typescript-eslint/no-unsafe-call */
import { isArray, isString } from "lodash-es";
import OpenAI, { APIError } from "openai";
import { type CompletionResponse } from "../types";
import { type OpenpipeChatInput, type OpenpipeChatOutput } from ".";
import frontendModelProvider from "./frontend";
const modelEndpoints: Record<OpenpipeChatInput["model"], string> = {
"Open-Orca/OpenOrcaxOpenChat-Preview2-13B": "https://5ef82gjxk8kdys-8000.proxy.runpod.net/v1",
"Open-Orca/OpenOrca-Platypus2-13B": "https://lt5qlel6qcji8t-8000.proxy.runpod.net/v1",
// "stabilityai/StableBeluga-13B": "https://vcorl8mxni2ou1-8000.proxy.runpod.net/v1",
"NousResearch/Nous-Hermes-Llama2-13b": "https://ncv8pw3u0vb8j2-8000.proxy.runpod.net/v1",
"jondurbin/airoboros-l2-13b-gpt4-2.0": "https://9nrbx7oph4btou-8000.proxy.runpod.net/v1",
"lmsys/vicuna-13b-v1.5": "https://h88hkt3ux73rb7-8000.proxy.runpod.net/v1",
"Gryphe/MythoMax-L2-13b": "https://3l5jvhnxdgky3v-8000.proxy.runpod.net/v1",
"NousResearch/Nous-Hermes-llama-2-7b": "https://ua1bpc6kv3dgge-8000.proxy.runpod.net/v1",
};
export async function getCompletion(
input: OpenpipeChatInput,
onStream: ((partialOutput: OpenpipeChatOutput) => void) | null,
): Promise<CompletionResponse<OpenpipeChatOutput>> {
const { model, messages, ...rest } = input;
const templatedPrompt = frontendModelProvider.models[model].templatePrompt?.(messages);
if (!templatedPrompt) {
return {
type: "error",
message: "Failed to generate prompt",
autoRetry: false,
};
}
const openai = new OpenAI({
baseURL: modelEndpoints[model],
});
const start = Date.now();
let finalCompletion: OpenpipeChatOutput = "";
const completionParams = {
model,
prompt: templatedPrompt,
...rest,
};
if (!completionParams.stop && frontendModelProvider.models[model].defaultStopTokens) {
completionParams.stop = frontendModelProvider.models[model].defaultStopTokens;
}
try {
if (onStream) {
const resp = await openai.completions.create(
{ ...completionParams, stream: true },
{
maxRetries: 0,
},
);
for await (const part of resp) {
finalCompletion += part.choices[0]?.text;
onStream(finalCompletion);
}
if (!finalCompletion) {
return {
type: "error",
message: "Streaming failed to return a completion",
autoRetry: false,
};
}
} else {
const resp = await openai.completions.create(
{ ...completionParams, stream: false },
{
maxRetries: 0,
},
);
finalCompletion = resp.choices[0]?.text || "";
if (!finalCompletion) {
return {
type: "error",
message: "Failed to return a completion",
autoRetry: false,
};
}
}
const timeToComplete = Date.now() - start;
return {
type: "success",
statusCode: 200,
value: finalCompletion,
timeToComplete,
};
} catch (error: unknown) {
if (error instanceof APIError) {
// The types from the sdk are wrong
const rawMessage = error.message as string | string[];
// If the message is not a string, stringify it
const message = isString(rawMessage)
? rawMessage
: isArray(rawMessage)
? rawMessage.map((m) => m.toString()).join("\n")
: (rawMessage as any).toString();
return {
type: "error",
message,
autoRetry: error.status === 429 || error.status === 503,
statusCode: error.status,
};
} else {
console.error(error);
return {
type: "error",
message: (error as Error).message,
autoRetry: true,
};
}
}
}
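A hedged sketch (not part of the diff) of calling getCompletion directly with a streaming callback; when stop is omitted, the model's defaultStopTokens from the frontend provider (["</s>"] for OpenOrca-Platypus2-13B) are applied.

// Illustrative only; assumes an async context.
const result = await getCompletion(
  {
    model: "Open-Orca/OpenOrca-Platypus2-13B",
    messages: [{ role: "user", content: "Write a haiku about request logs." }],
    max_tokens: 64,
  },
  (partial) => console.log(partial), // receives the growing completion string
);
if (result.type === "success") {
  console.log(`finished in ${result.timeToComplete}ms`, result.value);
}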

View File

@@ -0,0 +1,54 @@
import { type JSONSchema4 } from "json-schema";
import { type ModelProvider } from "../types";
import inputSchema from "./input.schema.json";
import { getCompletion } from "./getCompletion";
import frontendModelProvider from "./frontend";
const supportedModels = [
"Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
"Open-Orca/OpenOrca-Platypus2-13B",
// "stabilityai/StableBeluga-13B",
"NousResearch/Nous-Hermes-Llama2-13b",
"jondurbin/airoboros-l2-13b-gpt4-2.0",
"lmsys/vicuna-13b-v1.5",
"Gryphe/MythoMax-L2-13b",
"NousResearch/Nous-Hermes-llama-2-7b",
] as const;
export type SupportedModel = (typeof supportedModels)[number];
export type OpenpipeChatInput = {
model: SupportedModel;
messages: {
role: "system" | "user" | "assistant";
content: string;
}[];
temperature?: number;
top_p?: number;
stop?: string[] | string;
max_tokens?: number;
presence_penalty?: number;
frequency_penalty?: number;
};
export type OpenpipeChatOutput = string;
export type OpenpipeChatModelProvider = ModelProvider<
SupportedModel,
OpenpipeChatInput,
OpenpipeChatOutput
>;
const modelProvider: OpenpipeChatModelProvider = {
getModel: (input) => input.model,
inputSchema: inputSchema as JSONSchema4,
canStream: true,
getCompletion,
getUsage: (input, output) => {
// TODO: Implement this
return null;
},
...frontendModelProvider,
};
export default modelProvider;
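A brief sketch (assumed wiring, not part of the diff) of reaching this provider through the modelProviders record registered earlier:

// "openpipe/Chat" is the key added to the modelProviders record above.
const provider = modelProviders["openpipe/Chat"];
provider.canStream; // true
provider.getModel({
  model: "lmsys/vicuna-13b-v1.5",
  messages: [{ role: "user", content: "hi" }],
}); // -> "lmsys/vicuna-13b-v1.5"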

View File

@@ -0,0 +1,96 @@
{
"type": "object",
"properties": {
"model": {
"description": "ID of the model to use.",
"example": "Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
"type": "string",
"enum": [
"Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
"Open-Orca/OpenOrca-Platypus2-13B",
"NousResearch/Nous-Hermes-Llama2-13b",
"jondurbin/airoboros-l2-13b-gpt4-2.0",
"lmsys/vicuna-13b-v1.5",
"Gryphe/MythoMax-L2-13b",
"NousResearch/Nous-Hermes-llama-2-7b"
]
},
"messages": {
"description": "A list of messages comprising the conversation so far.",
"type": "array",
"minItems": 1,
"items": {
"type": "object",
"properties": {
"role": {
"type": "string",
"enum": ["system", "user", "assistant"],
"description": "The role of the messages author. One of `system`, `user`, or `assistant`."
},
"content": {
"type": "string",
"description": "The contents of the message. `content` is required for all messages."
}
},
"required": ["role", "content"]
}
},
"temperature": {
"type": "number",
"minimum": 0,
"maximum": 2,
"default": 1,
"example": 1,
"nullable": true,
"description": "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.\n\nWe generally recommend altering this or `top_p` but not both.\n"
},
"top_p": {
"type": "number",
"minimum": 0,
"maximum": 1,
"default": 1,
"example": 1,
"nullable": true,
"description": "An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.\n\nWe generally recommend altering this or `temperature` but not both.\n"
},
"stop": {
"description": "Up to 4 sequences where the API will stop generating further tokens.\n",
"default": null,
"oneOf": [
{
"type": "string",
"nullable": true
},
{
"type": "array",
"minItems": 1,
"maxItems": 4,
"items": {
"type": "string"
}
}
]
},
"max_tokens": {
"description": "The maximum number of [tokens](/tokenizer) to generate in the chat completion.\n\nThe total length of input tokens and generated tokens is limited by the model's context length. [Example Python code](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb) for counting tokens.\n",
"type": "integer"
},
"presence_penalty": {
"type": "number",
"default": 0,
"minimum": -2,
"maximum": 2,
"nullable": true,
"description": "Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.\n\n[See more information about frequency and presence penalties.](/docs/api-reference/parameter-details)\n"
},
"frequency_penalty": {
"type": "number",
"default": 0,
"minimum": -2,
"maximum": 2,
"nullable": true,
"description": "Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.\n\n[See more information about frequency and presence penalties.](/docs/api-reference/parameter-details)\n"
}
},
"required": ["model", "messages"]
}
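An illustrative request body (values are examples only) that satisfies the schema above:

const exampleInput = {
  model: "Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
  messages: [
    { role: "system", content: "You are a concise assistant." },
    { role: "user", content: "List three uses for a paperclip." },
  ],
  temperature: 0.7,
  top_p: 1,
  max_tokens: 256,
  stop: ["</s>"],
};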

View File

@@ -0,0 +1,3 @@
import { type RefinementAction } from "../types";
export const refinementActions: Record<string, RefinementAction> = {};

View File

@@ -0,0 +1,274 @@
import { type OpenpipeChatInput } from ".";
// User: Hello<|end_of_turn|>Assistant: Hi<|end_of_turn|>User: How are you today?<|end_of_turn|>Assistant:
export const templateOpenOrcaPrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = "<|end_of_turn|>";
const formattedMessages = messages.map((message) => {
if (message.role === "system" || message.role === "user") {
return "User: " + message.content;
} else {
return "Assistant: " + message.content;
}
});
let prompt = formattedMessages.join(splitter);
// Ensure that the prompt ends with an assistant message
const lastUserIndex = prompt.lastIndexOf("User:");
const lastAssistantIndex = prompt.lastIndexOf("Assistant:");
if (lastUserIndex > lastAssistantIndex) {
prompt += splitter + "Assistant:";
}
return prompt;
};
// ### Instruction:
// <prompt> (without the <>)
// ### Response: (leave two newlines for model to respond)
export const templateAlpacaInstructPrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = "\n\n";
const userTag = "### Instruction:\n\n";
const assistantTag = "### Response:\n\n";
const formattedMessages = messages.map((message) => {
if (message.role === "system" || message.role === "user") {
return userTag + message.content;
} else {
return assistantTag + message.content;
}
});
let prompt = formattedMessages.join(splitter);
// Ensure that the prompt ends with an assistant message
const lastUserIndex = prompt.lastIndexOf(userTag);
const lastAssistantIndex = prompt.lastIndexOf(assistantTag);
if (lastUserIndex > lastAssistantIndex) {
prompt += splitter + assistantTag;
}
return prompt;
};
// ### System:
// This is a system prompt, please behave and help the user.
// ### User:
// Your prompt here
// ### Assistant
// The output of Stable Beluga 13B
export const templateSystemUserAssistantPrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = "\n\n";
const systemTag = "### System:\n";
const userTag = "### User:\n";
const assistantTag = "### Assistant\n";
const formattedMessages = messages.map((message) => {
if (message.role === "system") {
return systemTag + message.content;
} else if (message.role === "user") {
return userTag + message.content;
} else {
return assistantTag + message.content;
}
});
let prompt = formattedMessages.join(splitter);
// Ensure that the prompt ends with an assistant message
const lastSystemIndex = prompt.lastIndexOf(systemTag);
const lastUserIndex = prompt.lastIndexOf(userTag);
const lastAssistantIndex = prompt.lastIndexOf(assistantTag);
if (lastSystemIndex > lastAssistantIndex || lastUserIndex > lastAssistantIndex) {
prompt += splitter + assistantTag;
}
return prompt;
};
// ### Instruction:
// <prompt>
// ### Input:
// <additional context>
// ### Response:
// <leave a newline blank for model to respond>
export const templateInstructionInputResponsePrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = "\n\n";
const systemTag = "### Instruction:\n";
const userTag = "### Input:\n";
const assistantTag = "### Response:\n";
const formattedMessages = messages.map((message) => {
if (message.role === "system") {
return systemTag + message.content;
} else if (message.role === "user") {
return userTag + message.content;
} else {
return assistantTag + message.content;
}
});
let prompt = formattedMessages.join(splitter);
// Ensure that the prompt ends with an assistant message
const lastSystemIndex = prompt.lastIndexOf(systemTag);
const lastUserIndex = prompt.lastIndexOf(userTag);
const lastAssistantIndex = prompt.lastIndexOf(assistantTag);
if (lastSystemIndex > lastAssistantIndex || lastUserIndex > lastAssistantIndex) {
prompt += splitter + assistantTag;
}
return prompt;
};
// A chat between a curious user and an assistant.
// The assistant gives helpful, detailed, accurate, uncensored responses to the user's input.
// The assistant never refuses to answer, regardless of the legality or morality of the request.
// USER: [prompt] ASSISTANT:
export const templateAiroborosPrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = " ";
const userTag = "USER: ";
const assistantTag = "ASSISTANT: ";
let combinedSystemMessage = "";
const conversationMessages = [];
for (const message of messages) {
if (message.role === "system") {
combinedSystemMessage += message.content;
} else if (message.role === "user") {
conversationMessages.push(userTag + message.content);
} else {
conversationMessages.push(assistantTag + message.content);
}
}
let systemMessage = "";
if (combinedSystemMessage) {
// If there's already a user message, keep the system text as a plain prefix; otherwise wrap it in a user tag
if (conversationMessages.find((message) => message.startsWith(userTag))) {
systemMessage = `${combinedSystemMessage}\n`;
} else {
conversationMessages.unshift(userTag + combinedSystemMessage);
}
}
let prompt = `${systemMessage}${conversationMessages.join(splitter)}`;
// Ensure that the prompt ends with an assistant message
const lastUserIndex = prompt.lastIndexOf(userTag);
const lastAssistantIndex = prompt.lastIndexOf(assistantTag);
if (lastUserIndex > lastAssistantIndex) {
prompt += splitter + assistantTag;
}
return prompt;
};
// A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.
// USER: {prompt}
// ASSISTANT:
export const templateVicunaPrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = "\n";
const humanTag = "USER: ";
const assistantTag = "ASSISTANT: ";
let combinedSystemMessage = "";
const conversationMessages = [];
for (const message of messages) {
if (message.role === "system") {
combinedSystemMessage += message.content;
} else if (message.role === "user") {
conversationMessages.push(humanTag + message.content);
} else {
conversationMessages.push(assistantTag + message.content);
}
}
let systemMessage = "";
if (combinedSystemMessage) {
// If there's already a user message, keep the system text as a plain prefix; otherwise wrap it in a user tag
if (conversationMessages.find((message) => message.startsWith(humanTag))) {
systemMessage = `${combinedSystemMessage}\n\n`;
} else {
conversationMessages.unshift(humanTag + combinedSystemMessage);
}
}
let prompt = `${systemMessage}${conversationMessages.join(splitter)}`;
// Ensure that the prompt ends with an assistant message
const lastHumanIndex = prompt.lastIndexOf(humanTag);
const lastAssistantIndex = prompt.lastIndexOf(assistantTag);
if (lastHumanIndex > lastAssistantIndex) {
prompt += splitter + assistantTag;
}
return prompt.trim();
};
// <System prompt/Character Card>
// ### Instruction:
// Your instruction or question here.
// For roleplay purposes, I suggest the following - Write <CHAR NAME>'s next reply in a chat between <YOUR NAME> and <CHAR NAME>. Write a single reply only.
// ### Response:
export const templateGryphePrompt = (messages: OpenpipeChatInput["messages"]) => {
const splitter = "\n\n";
const instructionTag = "### Instruction:\n";
const responseTag = "### Response:\n";
let combinedSystemMessage = "";
const conversationMessages = [];
for (const message of messages) {
if (message.role === "system") {
combinedSystemMessage += message.content;
} else if (message.role === "user") {
conversationMessages.push(instructionTag + message.content);
} else {
conversationMessages.push(responseTag + message.content);
}
}
let systemMessage = "";
if (combinedSystemMessage) {
// If there's already a user message, keep the system text as a plain prefix; otherwise wrap it in an instruction tag
if (conversationMessages.find((message) => message.startsWith(instructionTag))) {
systemMessage = `${combinedSystemMessage}\n\n`;
} else {
conversationMessages.unshift(instructionTag + combinedSystemMessage);
}
}
let prompt = `${systemMessage}${conversationMessages.join(splitter)}`;
// Ensure that the prompt ends with an assistant message
const lastInstructionIndex = prompt.lastIndexOf(instructionTag);
const lastAssistantIndex = prompt.lastIndexOf(responseTag);
if (lastInstructionIndex > lastAssistantIndex) {
prompt += splitter + responseTag;
}
return prompt;
};
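A hedged usage sketch (not part of the diff) showing what templateAlpacaInstructPrompt produces for a short conversation:

// Illustrative messages; both system and user turns render as "### Instruction:" blocks.
const prompt = templateAlpacaInstructPrompt([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "What is 2 + 2?" },
]);
// Because the conversation ends on a user turn, a trailing "### Response:" block is
// appended so the model continues as the assistant:
//
// ### Instruction:
//
// You are a helpful assistant.
//
// ### Instruction:
//
// What is 2 + 2?
//
// ### Response:
//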

View File

@@ -8,7 +8,7 @@ const replicate = new Replicate({
});
const modelIds: Record<ReplicateLlama2Input["model"], string> = {
"7b-chat": "4f0b260b6a13eb53a6b1891f089d57c08f41003ae79458be5011303d81a394dc",
"7b-chat": "7b0bfc9aff140d5b75bacbed23e91fd3c34b01a1e958d32132de6e0a19796e2c",
"13b-chat": "2a7f981751ec7fdf87b5b91ad4db53683a98082e9ff7bfd12c8cd5ea85980a52",
"70b-chat": "2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1",
};

View File

@@ -2,11 +2,13 @@ import { type JSONSchema4 } from "json-schema";
import { type IconType } from "react-icons";
import { type JsonValue } from "type-fest";
import { z } from "zod";
import { type OpenpipeChatInput } from "./openpipe-chat";
export const ZodSupportedProvider = z.union([
z.literal("openai/ChatCompletion"),
z.literal("replicate/llama2"),
z.literal("anthropic/completion"),
z.literal("openpipe/Chat"),
]);
export type SupportedProvider = z.infer<typeof ZodSupportedProvider>;
@@ -22,6 +24,8 @@ export type Model = {
description?: string;
learnMoreUrl?: string;
apiDocsUrl?: string;
templatePrompt?: (initialPrompt: OpenpipeChatInput["messages"]) => string;
defaultStopTokens?: string[];
};
export type ProviderModel = { provider: z.infer<typeof ZodSupportedProvider>; model: string };

View File

@@ -0,0 +1,54 @@
import { Card, Table, Tbody, Td, Th, Thead, Tr } from "@chakra-ui/react";
import dayjs from "dayjs";
import { isDate, isObject, isString } from "lodash-es";
import AppShell from "~/components/nav/AppShell";
import { type RouterOutputs, api } from "~/utils/api";
const fieldsToShow: (keyof RouterOutputs["adminJobs"]["list"][0])[] = [
"id",
"queue_name",
"payload",
"priority",
"attempts",
"last_error",
"created_at",
"key",
"locked_at",
"run_at",
];
export default function Jobs() {
const jobs = api.adminJobs.list.useQuery({});
return (
<AppShell title="Admin Jobs">
<Card m={4} overflowX="auto">
<Table>
<Thead>
<Tr>
{fieldsToShow.map((field) => (
<Th key={field}>{field}</Th>
))}
</Tr>
</Thead>
<Tbody>
{jobs.data?.map((job) => (
<Tr key={job.id}>
{fieldsToShow.map((field) => {
// Format dates for readability and stringify any non-string objects
let value = job[field];
if (isDate(value)) {
value = dayjs(value).format("YYYY-MM-DD HH:mm:ss");
} else if (isObject(value) && !isString(value)) {
value = JSON.stringify(value);
}
return <Td key={field}>{value}</Td>;
})}
</Tr>
))}
</Tbody>
</Table>
</Card>
</AppShell>
);
}

View File

@@ -33,7 +33,7 @@ export default function Dashboard() {
);
return (
<AppShell title="Dashboard" requireAuth>
<AppShell title="Dashboard" requireAuth requireBeta>
<VStack px={8} py={8} alignItems="flex-start" spacing={4}>
<Text fontSize="2xl" fontWeight="bold">
Dashboard

View File

@@ -1,97 +0,0 @@
import {
Box,
Breadcrumb,
BreadcrumbItem,
Center,
Flex,
Icon,
Input,
VStack,
} from "@chakra-ui/react";
import Link from "next/link";
import { useRouter } from "next/router";
import { useState, useEffect } from "react";
import { RiDatabase2Line } from "react-icons/ri";
import AppShell from "~/components/nav/AppShell";
import { api } from "~/utils/api";
import { useDataset, useHandledAsyncCallback } from "~/utils/hooks";
import DatasetEntriesTable from "~/components/datasets/DatasetEntriesTable";
import { DatasetHeaderButtons } from "~/components/datasets/DatasetHeaderButtons/DatasetHeaderButtons";
import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContents";
export default function Dataset() {
const router = useRouter();
const utils = api.useContext();
const dataset = useDataset();
const datasetId = router.query.id as string;
const [name, setName] = useState(dataset.data?.name || "");
useEffect(() => {
setName(dataset.data?.name || "");
}, [dataset.data?.name]);
const updateMutation = api.datasets.update.useMutation();
const [onSaveName] = useHandledAsyncCallback(async () => {
if (name && name !== dataset.data?.name && dataset.data?.id) {
await updateMutation.mutateAsync({
id: dataset.data.id,
updates: { name: name },
});
await Promise.all([utils.datasets.list.invalidate(), utils.datasets.get.invalidate()]);
}
}, [updateMutation, dataset.data?.id, dataset.data?.name, name]);
if (!dataset.isLoading && !dataset.data) {
return (
<AppShell title="Dataset not found">
<Center h="100%">
<div>Dataset not found 😕</div>
</Center>
</AppShell>
);
}
return (
<AppShell title={dataset.data?.name}>
<VStack h="full">
<PageHeaderContainer>
<Breadcrumb>
<BreadcrumbItem>
<ProjectBreadcrumbContents projectName={dataset.data?.project?.name} />
</BreadcrumbItem>
<BreadcrumbItem>
<Link href="/data">
<Flex alignItems="center" _hover={{ textDecoration: "underline" }}>
<Icon as={RiDatabase2Line} boxSize={4} mr={2} /> Datasets
</Flex>
</Link>
</BreadcrumbItem>
<BreadcrumbItem isCurrentPage>
<Input
size="sm"
value={name}
onChange={(e) => setName(e.target.value)}
onBlur={onSaveName}
borderWidth={1}
borderColor="transparent"
fontSize={16}
px={0}
minW={{ base: 100, lg: 300 }}
flex={1}
_hover={{ borderColor: "gray.300" }}
_focus={{ borderColor: "blue.500", outline: "none" }}
/>
</BreadcrumbItem>
</Breadcrumb>
<DatasetHeaderButtons />
</PageHeaderContainer>
<Box w="full" overflowX="auto" flex={1} px={8} pt={8} pb={16}>
{datasetId && <DatasetEntriesTable />}
</Box>
</VStack>
</AppShell>
);
}

View File

@@ -1,49 +0,0 @@
import { SimpleGrid, Icon, Breadcrumb, BreadcrumbItem, Flex } from "@chakra-ui/react";
import AppShell from "~/components/nav/AppShell";
import { RiDatabase2Line } from "react-icons/ri";
import {
DatasetCard,
DatasetCardSkeleton,
NewDatasetCard,
} from "~/components/datasets/DatasetCard";
import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContents";
import { useDatasets } from "~/utils/hooks";
export default function DatasetsPage() {
const datasets = useDatasets();
return (
<AppShell title="Data" requireAuth>
<PageHeaderContainer>
<Breadcrumb>
<BreadcrumbItem>
<ProjectBreadcrumbContents />
</BreadcrumbItem>
<BreadcrumbItem minH={8}>
<Flex alignItems="center">
<Icon as={RiDatabase2Line} boxSize={4} mr={2} /> Datasets
</Flex>
</BreadcrumbItem>
</Breadcrumb>
</PageHeaderContainer>
<SimpleGrid w="full" columns={{ base: 1, md: 2, lg: 3, xl: 4 }} spacing={8} py={4} px={8}>
<NewDatasetCard />
{datasets.data && !datasets.isLoading ? (
datasets?.data?.map((dataset) => (
<DatasetCard
key={dataset.id}
dataset={{ ...dataset, numEntries: dataset._count.datasetEntries }}
/>
))
) : (
<>
<DatasetCardSkeleton />
<DatasetCardSkeleton />
<DatasetCardSkeleton />
</>
)}
</SimpleGrid>
</AppShell>
);
}

View File

@@ -26,26 +26,6 @@ import Head from "next/head";
import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContents";
// TODO: import less to fix deployment with server side props
// export const getServerSideProps = async (context: GetServerSidePropsContext<{ id: string }>) => {
// const experimentId = context.params?.id as string;
// const helpers = createServerSideHelpers({
// router: appRouter,
// ctx: createInnerTRPCContext({ session: null }),
// transformer: superjson, // optional - adds superjson serialization
// });
// // prefetch query
// await helpers.experiments.stats.prefetch({ id: experimentId });
// return {
// props: {
// trpcState: helpers.dehydrate(),
// },
// };
// };
export default function Experiment() {
const router = useRouter();
const utils = api.useContext();
@@ -53,9 +33,9 @@ export default function Experiment() {
const experiment = useExperiment();
const experimentStats = api.experiments.stats.useQuery(
{ id: router.query.id as string },
{ id: experiment.data?.id as string },
{
enabled: !!router.query.id,
enabled: !!experiment.data?.id,
},
);
const stats = experimentStats.data;
@@ -144,8 +124,8 @@ export default function Experiment() {
<ExperimentHeaderButtons />
</PageHeaderContainer>
<ExperimentSettingsDrawer />
<Box w="100%" overflowX="auto" flex={1}>
<OutputsTable experimentId={router.query.id as string | undefined} />
<Box w="100%" overflowX="auto" flex={1} id="output-container">
<OutputsTable experimentId={experiment.data?.id} />
</Box>
</VStack>
</AppShell>

View File

@@ -0,0 +1,18 @@
import { Text, VStack, Divider } from "@chakra-ui/react";
import FineTunesTable from "~/components/fineTunes/FineTunesTable";
import AppShell from "~/components/nav/AppShell";
export default function FineTunes() {
return (
<AppShell title="Fine Tunes" requireAuth requireBeta>
<VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
<Text fontSize="2xl" fontWeight="bold">
Fine Tunes
</Text>
<Divider />
<FineTunesTable />
</VStack>
</AppShell>
);
}

View File

@@ -0,0 +1,110 @@
import { Center, Text, VStack, HStack, Button, Card } from "@chakra-ui/react";
import { useRouter } from "next/router";
import AppShell from "~/components/nav/AppShell";
import { api } from "~/utils/api";
import { useHandledAsyncCallback } from "~/utils/hooks";
import { useAppStore } from "~/state/store";
import { useSyncVariantEditor } from "~/state/sync";
import { maybeReportError } from "~/utils/errorHandling/maybeReportError";
export default function Invitation() {
const router = useRouter();
const utils = api.useContext();
useSyncVariantEditor();
const setSelectedProjectId = useAppStore((state) => state.setSelectedProjectId);
const invitationToken = router.query.invitationToken as string | undefined;
const invitation = api.users.getProjectInvitation.useQuery(
{ invitationToken: invitationToken as string },
{ enabled: !!invitationToken },
);
const cancelMutation = api.users.cancelProjectInvitation.useMutation();
const [declineInvitation, isDeclining] = useHandledAsyncCallback(async () => {
if (invitationToken) {
await cancelMutation.mutateAsync({
invitationToken,
});
await router.replace("/");
}
}, [cancelMutation, invitationToken]);
const acceptMutation = api.users.acceptProjectInvitation.useMutation();
const [acceptInvitation, isAccepting] = useHandledAsyncCallback(async () => {
if (invitationToken) {
const resp = await acceptMutation.mutateAsync({
invitationToken,
});
if (!maybeReportError(resp) && resp) {
await utils.projects.list.invalidate();
setSelectedProjectId(resp.payload);
}
await router.replace("/");
}
}, [acceptMutation, invitationToken]);
if (invitation.isLoading) {
return (
<AppShell requireAuth title="Loading...">
<Center h="full">
<Text>Loading...</Text>
</Center>
</AppShell>
);
}
if (!invitationToken || !invitation.data) {
return (
<AppShell requireAuth title="Invalid invitation token">
<Center h="full">
<Text>
The invitation you've received is invalid or expired. Please ask your project admin for
a new token.
</Text>
</Center>
</AppShell>
);
}
return (
<>
<AppShell requireAuth title="Invitation">
<Center h="full">
<Card>
<VStack
spacing={8}
w="full"
maxW="2xl"
p={16}
borderWidth={1}
borderRadius={8}
bgColor="white"
>
<Text fontSize="lg" fontWeight="bold">
You're invited! 🎉
</Text>
<Text textAlign="center">
You've been invited to join <b>{invitation.data.project.name}</b> by{" "}
<b>
{invitation.data.sender.name} ({invitation.data.sender.email})
</b>
.
</Text>
<HStack spacing={4}>
<Button colorScheme="gray" isLoading={isDeclining} onClick={declineInvitation}>
Decline
</Button>
<Button colorScheme="orange" isLoading={isAccepting} onClick={acceptInvitation}>
Accept
</Button>
</HStack>
</VStack>
</Card>
</Center>
</AppShell>
</>
);
}

View File

@@ -9,9 +9,11 @@ import {
Divider,
Icon,
useDisclosure,
Box,
Tooltip,
} from "@chakra-ui/react";
import { useEffect, useState } from "react";
import { BsTrash } from "react-icons/bs";
import { BsPlus, BsTrash } from "react-icons/bs";
import AppShell from "~/components/nav/AppShell";
import PageHeaderContainer from "~/components/nav/PageHeaderContainer";
@@ -21,6 +23,8 @@ import ProjectBreadcrumbContents from "~/components/nav/ProjectBreadcrumbContent
import CopiableCode from "~/components/CopiableCode";
import { DeleteProjectDialog } from "~/components/projectSettings/DeleteProjectDialog";
import AutoResizeTextArea from "~/components/AutoResizeTextArea";
import MemberTable from "~/components/projectSettings/MemberTable";
import { InviteMemberModal } from "~/components/projectSettings/InviteMemberModal";
export default function Settings() {
const utils = api.useContext();
@@ -50,12 +54,13 @@ export default function Settings() {
setName(selectedProject?.name);
}, [selectedProject?.name]);
const deleteProjectOpen = useDisclosure();
const inviteMemberModal = useDisclosure();
const deleteProjectDialog = useDisclosure();
return (
<>
<AppShell>
<PageHeaderContainer>
<AppShell requireAuth>
<PageHeaderContainer px={{ base: 4, md: 8 }}>
<Breadcrumb>
<BreadcrumbItem>
<ProjectBreadcrumbContents />
@@ -65,7 +70,7 @@ export default function Settings() {
</BreadcrumbItem>
</Breadcrumb>
</PageHeaderContainer>
<VStack px={8} py={4} alignItems="flex-start" spacing={4}>
<VStack px={{ base: 4, md: 8 }} py={4} alignItems="flex-start" spacing={4}>
<VStack spacing={0} alignItems="flex-start">
<Text fontSize="2xl" fontWeight="bold">
Project Settings
@@ -109,6 +114,37 @@ export default function Settings() {
</Button>
</VStack>
<Divider backgroundColor="gray.300" />
<VStack w="full" alignItems="flex-start">
<Subtitle>Project Members</Subtitle>
<Text fontSize="sm">
Add members to your project to allow them to view and edit your project's data.
</Text>
<Box mt={4} w="full">
<MemberTable />
</Box>
<Tooltip
isDisabled={selectedProject?.role === "ADMIN"}
label="Only admins can invite new members"
hasArrow
>
<Button
variant="outline"
colorScheme="orange"
borderRadius={4}
onClick={inviteMemberModal.onOpen}
mt={2}
_disabled={{
opacity: 0.6,
}}
isDisabled={selectedProject?.role !== "ADMIN"}
>
<Icon as={BsPlus} boxSize={5} />
<Text>Invite New Member</Text>
</Button>
</Tooltip>
</VStack>
<Divider backgroundColor="gray.300" />
<VStack alignItems="flex-start">
<Subtitle>Project API Key</Subtitle>
<Text fontSize="sm">
@@ -141,7 +177,7 @@ export default function Settings() {
borderRadius={4}
mt={2}
height="auto"
onClick={deleteProjectOpen.onOpen}
onClick={deleteProjectDialog.onOpen}
>
<Icon as={BsTrash} />
<Text overflowWrap="break-word" whiteSpace="normal" py={2}>
@@ -153,7 +189,11 @@ export default function Settings() {
</VStack>
</VStack>
</AppShell>
<DeleteProjectDialog isOpen={deleteProjectOpen.isOpen} onClose={deleteProjectOpen.onClose} />
<InviteMemberModal isOpen={inviteMemberModal.isOpen} onClose={inviteMemberModal.onClose} />
<DeleteProjectDialog
isOpen={deleteProjectDialog.isOpen}
onClose={deleteProjectDialog.onClose}
/>
</>
);
}

View File

@@ -1,4 +1,5 @@
import { Text, VStack, Divider, HStack } from "@chakra-ui/react";
import { useState } from "react";
import { Text, VStack, Divider, HStack, Box } from "@chakra-ui/react";
import AppShell from "~/components/nav/AppShell";
import LoggedCallTable from "~/components/requestLogs/LoggedCallsTable";
@@ -6,29 +7,50 @@ import LoggedCallsPaginator from "~/components/requestLogs/LoggedCallsPaginator"
import ActionButton from "~/components/requestLogs/ActionButton";
import { useAppStore } from "~/state/store";
import { RiFlaskLine } from "react-icons/ri";
import { FiFilter } from "react-icons/fi";
import LogFilters from "~/components/requestLogs/LogFilters/LogFilters";
import ColumnVisiblityDropdown from "~/components/requestLogs/ColumnVisiblityDropdown";
import FineTuneButton from "~/components/requestLogs/FineTuneButton";
import ExportButton from "~/components/requestLogs/ExportButton";
export default function LoggedCalls() {
const selectedLogIds = useAppStore((s) => s.selectedLogs.selectedLogIds);
const [filtersShown, setFiltersShown] = useState(true);
return (
<AppShell title="Request Logs" requireAuth>
<VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
<Text fontSize="2xl" fontWeight="bold">
Request Logs
</Text>
<Divider />
<HStack w="full" justifyContent="flex-end">
<ActionButton
onClick={() => {
console.log("experimenting with these ids", selectedLogIds);
}}
label="Experiment"
icon={RiFlaskLine}
isDisabled={selectedLogIds.size === 0}
/>
</HStack>
<LoggedCallTable />
<LoggedCallsPaginator />
</VStack>
<AppShell title="Request Logs" requireAuth requireBeta>
<Box h="100vh" overflowY="scroll">
<VStack px={8} py={8} alignItems="flex-start" spacing={4} w="full">
<Text fontSize="2xl" fontWeight="bold">
Request Logs
</Text>
<Divider />
<HStack w="full" justifyContent="flex-end">
<FineTuneButton />
<ActionButton
onClick={() => {
console.log("experimenting with these ids", selectedLogIds);
}}
label="Experiment"
icon={RiFlaskLine}
isDisabled={selectedLogIds.size === 0}
/>
<ExportButton />
<ColumnVisiblityDropdown />
<ActionButton
onClick={() => {
setFiltersShown(!filtersShown);
}}
label={filtersShown ? "Hide Filters" : "Show Filters"}
icon={FiFilter}
/>
</HStack>
{filtersShown && <LogFilters />}
<LoggedCallTable />
<LoggedCallsPaginator />
</VStack>
</Box>
</AppShell>
);
}

View File

@@ -1,108 +0,0 @@
import { type ChatCompletion } from "openai/resources/chat";
import { openai } from "../../utils/openai";
import { isAxiosError } from "./utils";
import { type APIResponse } from "openai/core";
import { sleep } from "~/server/utils/sleep";
const MAX_AUTO_RETRIES = 50;
const MIN_DELAY = 500; // milliseconds
const MAX_DELAY = 15000; // milliseconds
function calculateDelay(numPreviousTries: number): number {
const baseDelay = Math.min(MAX_DELAY, MIN_DELAY * Math.pow(2, numPreviousTries));
const jitter = Math.random() * baseDelay;
return baseDelay + jitter;
}
const getCompletionWithBackoff = async (
getCompletion: () => Promise<APIResponse<ChatCompletion>>,
) => {
let completion;
let tries = 0;
while (tries < MAX_AUTO_RETRIES) {
try {
completion = await getCompletion();
break;
} catch (e) {
if (isAxiosError(e)) {
console.error(e?.response?.data?.error?.message);
} else {
await sleep(calculateDelay(tries));
console.error(e);
}
}
tries++;
}
return completion;
};
// TODO: Add seeds to ensure batches don't contain duplicate data
const MAX_BATCH_SIZE = 5;
export const autogenerateDatasetEntries = async (
numToGenerate: number,
inputDescription: string,
outputDescription: string,
): Promise<{ input: string; output: string }[]> => {
const batchSizes = Array.from({ length: Math.ceil(numToGenerate / MAX_BATCH_SIZE) }, (_, i) =>
i === Math.ceil(numToGenerate / MAX_BATCH_SIZE) - 1 && numToGenerate % MAX_BATCH_SIZE
? numToGenerate % MAX_BATCH_SIZE
: MAX_BATCH_SIZE,
);
const getCompletion = (batchSize: number) =>
openai.chat.completions.create({
model: "gpt-4",
messages: [
{
role: "system",
content: `The user needs ${batchSize} rows of data, each with an input and an output.\n---\n The input should follow these requirements: ${inputDescription}\n---\n The output should follow these requirements: ${outputDescription}`,
},
],
functions: [
{
name: "add_list_of_data",
description: "Add a list of data to the database",
parameters: {
type: "object",
properties: {
rows: {
type: "array",
description: "The rows of data that match the description",
items: {
type: "object",
properties: {
input: {
type: "string",
description: "The input for this row",
},
output: {
type: "string",
description: "The output for this row",
},
},
},
},
},
},
},
],
function_call: { name: "add_list_of_data" },
temperature: 0.5,
});
const completionCallbacks = batchSizes.map((batchSize) =>
getCompletionWithBackoff(() => getCompletion(batchSize)),
);
const completions = await Promise.all(completionCallbacks);
const rows = completions.flatMap((completion) => {
const parsed = JSON.parse(
completion?.choices[0]?.message?.function_call?.arguments ?? "{rows: []}",
) as { rows: { input: string; output: string }[] };
return parsed.rows;
});
return rows;
};

View File

@@ -98,6 +98,11 @@ export const autogenerateScenarioValues = async (
function_call: { name: "add_scenario" },
temperature: 0.5,
openpipe: {
tags: {
prompt_id: "autogenerateScenarioValues",
},
},
});
const parsed = JSON.parse(

View File

@@ -66,7 +66,7 @@ export const v1ApiRouter = createOpenApiRouter({
if (!existingResponse) return { respPayload: null };
await prisma.loggedCall.create({
const newCall = await prisma.loggedCall.create({
data: {
projectId: ctx.key.projectId,
requestedAt: new Date(input.requestedAt),
@@ -75,7 +75,7 @@ export const v1ApiRouter = createOpenApiRouter({
},
});
await createTags(existingResponse.originalLoggedCallId, input.tags);
await createTags(newCall.projectId, newCall.id, input.tags);
return {
respPayload: existingResponse.respPayload,
};
@@ -107,7 +107,7 @@ export const v1ApiRouter = createOpenApiRouter({
.default({}),
}),
)
.output(z.void())
.output(z.object({ status: z.union([z.literal("ok"), z.literal("error")]) }))
.mutation(async ({ input, ctx }) => {
const reqPayload = await reqValidator.spa(input.reqPayload);
const respPayload = await respValidator.spa(input.respPayload);
@@ -165,7 +165,8 @@ export const v1ApiRouter = createOpenApiRouter({
}),
]);
await createTags(newLoggedCallId, input.tags);
await createTags(ctx.key.projectId, newLoggedCallId, input.tags);
return { status: "ok" };
}),
localTestingOnlyGetLatestLoggedCall: openApiProtectedProc
.meta({
@@ -207,6 +208,7 @@ export const v1ApiRouter = createOpenApiRouter({
createdAt: true,
cacheHit: true,
tags: true,
id: true,
modelResponse: {
select: {
id: true,
@@ -228,10 +230,11 @@ export const v1ApiRouter = createOpenApiRouter({
}),
});
async function createTags(loggedCallId: string, tags: Record<string, string>) {
async function createTags(projectId: string, loggedCallId: string, tags: Record<string, string>) {
const tagsToCreate = Object.entries(tags).map(([name, value]) => ({
projectId,
loggedCallId,
name: name.replaceAll(/[^a-zA-Z0-9_$]/g, "_"),
name: name.replaceAll(/[^a-zA-Z0-9_$.]/g, "_"),
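// e.g. "$sdk.version" now stays "$sdk.version" (periods are allowed after this change),
// while "prompt id/v2" still becomes "prompt_id_v2".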
value,
}));
await prisma.loggedCallTag.createMany({

View File

@@ -6,11 +6,12 @@ import { scenarioVariantCellsRouter } from "./routers/scenarioVariantCells.route
import { scenarioVarsRouter } from "./routers/scenarioVariables.router";
import { evaluationsRouter } from "./routers/evaluations.router";
import { worldChampsRouter } from "./routers/worldChamps.router";
import { datasetsRouter } from "./routers/datasets.router";
import { datasetEntries } from "./routers/datasetEntries.router";
import { projectsRouter } from "./routers/projects.router";
import { dashboardRouter } from "./routers/dashboard.router";
import { loggedCallsRouter } from "./routers/loggedCalls.router";
import { fineTunesRouter } from "./routers/fineTunes.router";
import { usersRouter } from "./routers/users.router";
import { adminJobsRouter } from "./routers/adminJobs.router";
/**
* This is the primary router for your server.
@@ -25,11 +26,12 @@ export const appRouter = createTRPCRouter({
scenarioVars: scenarioVarsRouter,
evaluations: evaluationsRouter,
worldChamps: worldChampsRouter,
datasets: datasetsRouter,
datasetEntries: datasetEntries,
projects: projectsRouter,
dashboard: dashboardRouter,
loggedCalls: loggedCallsRouter,
fineTunes: fineTunesRouter,
users: usersRouter,
adminJobs: adminJobsRouter,
});
// export type definition of API

View File

@@ -0,0 +1,18 @@
import { z } from "zod";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
import { kysely } from "~/server/db";
import { requireIsAdmin } from "~/utils/accessControl";
export const adminJobsRouter = createTRPCRouter({
list: protectedProcedure.input(z.object({})).query(async ({ ctx }) => {
await requireIsAdmin(ctx);
return await kysely
.selectFrom("graphile_worker.jobs")
.limit(100)
.selectAll()
.orderBy("created_at", "desc")
.execute();
}),
});

View File

@@ -1,145 +0,0 @@
import { z } from "zod";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
import { prisma } from "~/server/db";
import { requireCanModifyDataset, requireCanViewDataset } from "~/utils/accessControl";
import { autogenerateDatasetEntries } from "../autogenerate/autogenerateDatasetEntries";
export const datasetEntries = createTRPCRouter({
list: protectedProcedure
.input(z.object({ datasetId: z.string(), page: z.number(), pageSize: z.number() }))
.query(async ({ input, ctx }) => {
await requireCanViewDataset(input.datasetId, ctx);
const { datasetId, page, pageSize } = input;
const entries = await prisma.datasetEntry.findMany({
where: {
datasetId,
},
orderBy: { createdAt: "desc" },
skip: (page - 1) * pageSize,
take: pageSize,
});
const count = await prisma.datasetEntry.count({
where: {
datasetId,
},
});
return {
entries,
count,
};
}),
createOne: protectedProcedure
.input(
z.object({
datasetId: z.string(),
input: z.string(),
output: z.string().optional(),
}),
)
.mutation(async ({ input, ctx }) => {
await requireCanModifyDataset(input.datasetId, ctx);
return await prisma.datasetEntry.create({
data: {
datasetId: input.datasetId,
input: input.input,
output: input.output,
},
});
}),
autogenerateEntries: protectedProcedure
.input(
z.object({
datasetId: z.string(),
numToGenerate: z.number(),
inputDescription: z.string(),
outputDescription: z.string(),
}),
)
.mutation(async ({ input, ctx }) => {
await requireCanModifyDataset(input.datasetId, ctx);
const dataset = await prisma.dataset.findUnique({
where: {
id: input.datasetId,
},
});
if (!dataset) {
throw new Error(`Dataset with id ${input.datasetId} does not exist`);
}
const entries = await autogenerateDatasetEntries(
input.numToGenerate,
input.inputDescription,
input.outputDescription,
);
const createdEntries = await prisma.datasetEntry.createMany({
data: entries.map((entry) => ({
datasetId: input.datasetId,
input: entry.input,
output: entry.output,
})),
});
return createdEntries;
}),
delete: protectedProcedure
.input(z.object({ id: z.string() }))
.mutation(async ({ input, ctx }) => {
const datasetId = (
await prisma.datasetEntry.findUniqueOrThrow({
where: { id: input.id },
})
).datasetId;
await requireCanModifyDataset(datasetId, ctx);
return await prisma.datasetEntry.delete({
where: {
id: input.id,
},
});
}),
update: protectedProcedure
.input(
z.object({
id: z.string(),
updates: z.object({
input: z.string(),
output: z.string().optional(),
}),
}),
)
.mutation(async ({ input, ctx }) => {
const existing = await prisma.datasetEntry.findUnique({
where: {
id: input.id,
},
});
if (!existing) {
throw new Error(`dataEntry with id ${input.id} does not exist`);
}
await requireCanModifyDataset(existing.datasetId, ctx);
return await prisma.datasetEntry.update({
where: {
id: input.id,
},
data: {
input: input.updates.input,
output: input.updates.output,
},
});
}),
});

View File

@@ -1,88 +0,0 @@
import { z } from "zod";
import { createTRPCRouter, protectedProcedure, publicProcedure } from "~/server/api/trpc";
import { prisma } from "~/server/db";
import {
requireCanModifyDataset,
requireCanModifyProject,
requireCanViewDataset,
requireCanViewProject,
} from "~/utils/accessControl";
export const datasetsRouter = createTRPCRouter({
list: protectedProcedure
.input(z.object({ projectId: z.string() }))
.query(async ({ input, ctx }) => {
await requireCanViewProject(input.projectId, ctx);
const datasets = await prisma.dataset.findMany({
where: {
projectId: input.projectId,
},
orderBy: {
createdAt: "desc",
},
include: {
_count: {
select: { datasetEntries: true },
},
},
});
return datasets;
}),
get: publicProcedure.input(z.object({ id: z.string() })).query(async ({ input, ctx }) => {
await requireCanViewDataset(input.id, ctx);
return await prisma.dataset.findFirstOrThrow({
where: { id: input.id },
include: {
project: true,
},
});
}),
create: protectedProcedure
.input(z.object({ projectId: z.string() }))
.mutation(async ({ input, ctx }) => {
await requireCanModifyProject(input.projectId, ctx);
const numDatasets = await prisma.dataset.count({
where: {
projectId: input.projectId,
},
});
return await prisma.dataset.create({
data: {
name: `Dataset ${numDatasets + 1}`,
projectId: input.projectId,
},
});
}),
update: protectedProcedure
.input(z.object({ id: z.string(), updates: z.object({ name: z.string() }) }))
.mutation(async ({ input, ctx }) => {
await requireCanModifyDataset(input.id, ctx);
return await prisma.dataset.update({
where: {
id: input.id,
},
data: {
name: input.updates.name,
},
});
}),
delete: protectedProcedure
.input(z.object({ id: z.string() }))
.mutation(async ({ input, ctx }) => {
await requireCanModifyDataset(input.id, ctx);
await prisma.dataset.delete({
where: {
id: input.id,
},
});
}),
});

View File

@@ -85,15 +85,16 @@ export const experimentsRouter = createTRPCRouter({
return experimentsWithCounts;
}),
get: publicProcedure.input(z.object({ id: z.string() })).query(async ({ input, ctx }) => {
await requireCanViewExperiment(input.id, ctx);
get: publicProcedure.input(z.object({ slug: z.string() })).query(async ({ input, ctx }) => {
const experiment = await prisma.experiment.findFirstOrThrow({
where: { id: input.id },
where: { slug: input.slug },
include: {
project: true,
},
});
await requireCanViewExperiment(experiment.id, ctx);
const canModify = ctx.session?.user.id
? await canModifyExperiment(experiment.id, ctx.session?.user.id)
: false;
@@ -177,6 +178,7 @@ export const experimentsRouter = createTRPCRouter({
existingToNewVariantIds.set(variant.id, newVariantId);
variantsToCreate.push({
...variant,
uiId: uuidv4(),
id: newVariantId,
experimentId: newExperimentId,
});
@@ -190,6 +192,7 @@ export const experimentsRouter = createTRPCRouter({
scenariosToCreate.push({
...scenario,
id: newScenarioId,
uiId: uuidv4(),
experimentId: newExperimentId,
variableValues: scenario.variableValues as Prisma.InputJsonValue,
});
@@ -290,7 +293,10 @@ export const experimentsRouter = createTRPCRouter({
}),
]);
return newExperimentId;
const newExperiment = await prisma.experiment.findUniqueOrThrow({
where: { id: newExperimentId },
});
return newExperiment;
}),
create: protectedProcedure
@@ -335,7 +341,6 @@ export const experimentsRouter = createTRPCRouter({
definePrompt("openai/ChatCompletion", {
model: "gpt-3.5-turbo-0613",
stream: true,
messages: [
{
role: "system",

Some files were not shown because too many files have changed in this diff.