Compare commits


54 Commits

Author SHA1 Message Date
Kyle Corbitt
9e859c199e two LoggedCall tables 2023-08-05 17:56:51 -07:00
Kyle Corbitt
7637b94ea7 schema changes 2023-08-05 13:49:03 -07:00
Kyle Corbitt
721f1726eb Merge pull request #122 from OpenPipe/app-dir
Move app to app/ subdir
2023-08-05 10:08:31 -07:00
Kyle Corbitt
cfeb4dfa92 run render and CI in app subdir 2023-08-05 10:06:06 -07:00
Kyle Corbitt
21ef67ed4c move app to app/ subdir 2023-08-05 10:00:10 -07:00
Kyle Corbitt
7707d451e0 Merge pull request #121 from OpenPipe/prompt-constructor
Rename constructFn to promptConstructor
2023-08-05 09:32:35 -07:00
Kyle Corbitt
d82782adb4 Number experiments based only on current org
Previously we were naming each new experiment based on the highest existing sort index globally, which doesn't make sense. Better to just use the local one.
2023-08-05 09:26:55 -07:00
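[Editor's note: a minimal sketch of the per-organization numbering this commit describes. The Prisma aggregate call mirrors the schema in this PR, but the exact field names (organizationId, sortIndex) and call site are assumptions, not code from the diff.]

// Hypothetical: find the highest sortIndex within the current org only,
// rather than globally, before naming the new experiment.
const { _max } = await prisma.experiment.aggregate({
  where: { organizationId },
  _max: { sortIndex: true },
});
const label = `Experiment ${(_max.sortIndex ?? 0) + 1}`;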
Kyle Corbitt
e10589abff Rename constructFn to promptConstructor
It's a clearer name. Also reorganize the filesystem so all the promptConstructor related files are colocated.
2023-08-04 23:09:39 -07:00
Kyle Corbitt
01dcbfc896 Rename 'anthropic' to 'anthropic/completion' (#120)
More consistency in the way we name our model providers.
2023-08-04 22:07:23 -07:00
Kyle Corbitt
50e0b34d30 newer replicate models 2023-08-04 21:18:52 -07:00
arcticfly
44bb9fc58d Add outputs to entry generation (#119) 2023-08-04 16:14:49 -07:00
David Corbitt
c0d3784f0c Merge branch 'main' of github.com:corbt/prompt-lab 2023-08-04 16:06:45 -07:00
David Corbitt
e522026b71 Embold star 2023-08-04 16:06:34 -07:00
arcticfly
46b13d85b7 Update README.md 2023-08-04 12:00:38 -07:00
arcticfly
c12aa82a3e Update README.md 2023-08-04 11:58:47 -07:00
arcticfly
b98bce8944 Add Datasets (#118)
* Add dataset (without entries)

* Fix dataset hook

* Add dataset rows

* Add buttons to import/generate data

* Add GenerateDataModal

* Autogenerate and save data

* Fix prettier

* Fix types

* Add dataset pagination

* Fix prettier

* Use useDisclosure

* Allow generate data modal fadeaway

* hide/show data in env var

* Fix prettier
2023-08-04 11:52:03 -07:00
arcticfly
f045c80dfd Update README.md 2023-08-03 18:31:24 -07:00
arcticfly
3b460dff2a Update README.md 2023-08-03 18:16:54 -07:00
David Corbitt
5fa5732804 Move demo up 2023-08-03 12:02:10 -07:00
arcticfly
28e6e2b9df Wrap evals (#117)
* Wrap eval outputs

* Fix prettier

* Decrease variant minWidth
2023-08-03 11:58:39 -07:00
Kyle Corbitt
54d1df4442 upload sourcemaps 2023-08-03 11:53:13 -07:00
David Corbitt
f69c2b5f23 Fix prettier 2023-08-03 11:48:05 -07:00
David Corbitt
51f0666f6a Add table of contents to README 2023-08-03 11:40:29 -07:00
Kyle Corbitt
b67d974f4c Merge pull request #116 from OpenPipe/sentry
Add Sentry
2023-08-03 10:23:22 -07:00
Kyle Corbitt
33fb2db981 Add Sentry
Visibility into errors in prod
2023-08-03 10:18:17 -07:00
Kyle Corbitt
e391379c3e Merge pull request #115 from OpenPipe/admin
Add admin role
2023-08-03 09:39:00 -07:00
Kyle Corbitt
8d1609dd52 Add admin role
Allow privileged users to administer the system.
2023-08-03 09:35:13 -07:00
David Corbitt
f3380f302d Simplify world champs screen 2023-08-02 23:57:44 -07:00
David Corbitt
3dba9c7ee1 Update posthog version 2023-08-02 23:30:15 -07:00
David Corbitt
e0e4f7a9d6 Fix mobile table padding 2023-08-02 23:08:49 -07:00
arcticfly
48293dc579 Add link to demo experiment (#114) 2023-08-02 22:50:09 -07:00
arcticfly
38ac6243a0 Add server posthog events (#113) 2023-08-02 14:21:07 -07:00
arcticfly
bd2f58e2a5 Improve posthog (#112)
* Add SessionIdentifier

* Identify by id

* Rewrite posthog events

* Add NEXT_PUBLIC_HOST to dockerfile

* Fix default url

* Move SessionIdentifier into analytics file
2023-08-02 13:30:25 -07:00
Kyle Corbitt
808e47c6b9 Merge pull request #111 from OpenPipe/gh-btn
Update TopNavbar component to include a GitHub button
2023-08-02 10:15:26 -07:00
Kyle Corbitt
5945f0ed6b Update TopNavbar component to include a GitHub button 2023-08-02 10:11:41 -07:00
arcticfly
6bc7d76d15 Update README.md 2023-08-02 00:59:05 -07:00
arcticfly
e9ed173e34 Update README.md 2023-08-02 00:57:24 -07:00
arcticfly
75d58d7021 Update README.md 2023-08-02 00:56:19 -07:00
arcticfly
896c8c5c57 Update README.md 2023-08-02 00:51:57 -07:00
arcticfly
ec5547d0b0 Update README.md with new features and gifs (#110) 2023-08-02 00:46:48 -07:00
Kyle Corbitt
77e4e3b8c3 mobile styles 2023-08-01 23:08:35 -07:00
Kyle Corbitt
a1b03ddad1 Merge pull request #109 from OpenPipe/debug-prompts
Add debug modal for output cells
2023-08-01 22:51:39 -07:00
Kyle Corbitt
6be32bea4c Add debug modal for output cells
See the actual input that a model got for a specific cell. The formatting isn't great right now; should probably iterate on that.
2023-08-01 22:49:38 -07:00
arcticfly
72c70e2a55 Improve conversion to/from Claude (#108)
* Increase min width of prompt variant

* Increase width of custom instructions input

* Start recording API docs

* Provide better instructions for converting to/from Claude

* Fix prettier
2023-08-01 21:03:23 -07:00
arcticfly
026532f2c2 Model selection styling changes (#107)
* Model selection styling changes

* Fix prettier
2023-08-01 18:45:15 -07:00
Kyle Corbitt
f88538336f fix types 2023-08-01 18:31:34 -07:00
Kyle Corbitt
3c7178115e Merge pull request #105 from OpenPipe/bump-models
Bump Replicate models
2023-08-01 18:26:16 -07:00
Kyle Corbitt
292aaf090a Merge pull request #106 from OpenPipe/dark-mode
Update global background color in ChakraThemeProvider
2023-08-01 18:25:57 -07:00
Kyle Corbitt
d9915dc41b Update global background color in ChakraThemeProvider 2023-08-01 18:25:29 -07:00
David Corbitt
3560bcff14 Correct time stamps on waiting message 2023-08-01 18:09:23 -07:00
Kyle Corbitt
6982339a1a Bump Replicate models 2023-08-01 18:08:02 -07:00
arcticfly
d348b130d5 Add navbar to world-champs (#104)
* Add navbar to world-champs

* Move TopNavbar to signup.tsx
2023-08-01 16:59:46 -07:00
Kyle Corbitt
bf67580991 Merge pull request #103 from OpenPipe/world-champs
add created_at and updated_at to users
2023-08-01 16:49:11 -07:00
Kyle Corbitt
6184498810 Merge pull request #102 from OpenPipe/world-champs
world champs signup
2023-08-01 13:09:06 -07:00
213 changed files with 2778 additions and 726 deletions

View File

@@ -6,6 +6,10 @@ on:
   push:
     branches: [main]
 
+defaults:
+  run:
+    working-directory: app
+
 jobs:
   run-checks:
     runs-on: ubuntu-latest

View File

@@ -1,52 +1,62 @@
-<img src="https://github.com/openpipe/openpipe/assets/41524992/ca59596e-eb80-40f9-921f-6d67f6e6d8fa" width="72px" />
+<!-- <img src="https://github.com/openpipe/openpipe/assets/41524992/ca59596e-eb80-40f9-921f-6d67f6e6d8fa" width="72px" /> -->
 
 # OpenPipe
 
-OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test and compare candidate prompts with realistic sample data.
+OpenPipe is a flexible playground for comparing and optimizing LLM prompts. It lets you quickly generate, test and compare candidate prompts, and can automatically [translate](#-translate-between-model-apis) those prompts between models.
+
+<img src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="demo">
+
+You can use our hosted version of OpenPipe at https://openpipe.ai. You can also clone this repository and [run it locally](#running-locally).
 
 ## Sample Experiments
 
-These are simple experiments users have created that show how OpenPipe works.
+These are simple experiments users have created that show how OpenPipe works. Feel free to fork them and start experimenting yourself.
 
-- [Country Capitals](https://app.openpipe.ai/experiments/11111111-1111-1111-1111-111111111111)
+- [Twitter Sentiment Analysis](https://app.openpipe.ai/experiments/62c20a73-2012-4a64-973c-4b665ad46a57)
 - [Reddit User Needs](https://app.openpipe.ai/experiments/22222222-2222-2222-2222-222222222222)
 - [OpenAI Function Calls](https://app.openpipe.ai/experiments/2ebbdcb3-ed51-456e-87dc-91f72eaf3e2b)
 - [Activity Classification](https://app.openpipe.ai/experiments/3950940f-ab6b-4b74-841d-7e9dbc4e4ff8)
 
-<img src="https://github.com/openpipe/openpipe/assets/176426/fc7624c6-5b65-4d4d-82b7-4a816f3e5678" alt="demo" height="400px">
-
-You can use our hosted version of OpenPipe at [https://openpipe.ai]. You can also clone this repository and [run it locally](#running-locally).
-
-## High-Level Features
-
-**Configure Multiple Prompts**
-
-Set up multiple prompt configurations and compare their output side-by-side. Each configuration can be configured independently.
-
-**Visualize Responses**
-
-Inspect prompt completions side-by-side.
-
-**Test Many Inputs**
-
-OpenPipe lets you _template_ a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broader coverage of your problem space than you'd get with manual testing.
-
-**🪄 Auto-generate Test Scenarios**
-
-OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!
-
-**Prompt Validation and Typeahead**
-
-We use OpenAI's OpenAPI spec to automatically provide typeahead and validate prompts.
-
-<img alt="typeahead" src="https://github.com/openpipe/openpipe/assets/176426/acc638f8-d851-4742-8d01-fe6f98890840" height="300px">
-
-**Function Call Support**
-
-Natively supports [OpenAI function calls](https://openai.com/blog/function-calling-and-other-api-updates) on supported models.
-
-<img height="300px" alt="function calls" src="https://github.com/openpipe/openpipe/assets/176426/48ad13fe-af2f-4294-bf32-62015597fd9b">
-
 ## Supported Models
 
 - All models available through the OpenAI [chat completion API](https://platform.openai.com/docs/guides/gpt/chat-completions-api)
 - Llama2 [7b chat](https://replicate.com/a16z-infra/llama7b-v2-chat), [13b chat](https://replicate.com/a16z-infra/llama13b-v2-chat), [70b chat](https://replicate.com/replicate/llama70b-v2-chat).
 - Anthropic's [Claude 1 Instant](https://www.anthropic.com/index/introducing-claude) and [Claude 2](https://www.anthropic.com/index/claude-2)
 
+## Features
+
+### 🔍 Visualize Responses
+
+Inspect prompt completions side-by-side.
+
+### 🧪 Bulk-Test
+
+OpenPipe lets you _template_ a prompt. Use the templating feature to run the prompts you're testing against many potential inputs for broad coverage of your problem space.
+
+### 📟 Translate between Model APIs
+
+Write your prompt in one format and automatically convert it to work with any other model.
+
+<img width="480" alt="Screenshot 2023-08-01 at 11 55 38 PM" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/1e19ccf2-96b6-4e93-a3a5-1449710d1b5b" alt="translate between models">
+<br><br>
+
+### 🛠️ Refine Your Prompts Automatically
+
+Use a growing database of best-practice refinements to improve your prompts automatically.
+
+<img width="480" alt="Screenshot 2023-08-01 at 11 55 38 PM" src="https://github.com/OpenPipe/OpenPipe/assets/41524992/87a27fe7-daef-445c-a5e2-1c82b23f9f99" alt="add function call">
+<br><br>
+
+### 🪄 Auto-generate Test Scenarios
+
+OpenPipe includes a tool to generate new test scenarios based on your existing prompts and scenarios. Just click "Autogenerate Scenario" to try it out!
+
+<img width="600" src="https://github.com/openpipe/openpipe/assets/41524992/219a844e-3f4e-4f6b-8066-41348b42977b" alt="auto-generate">
+<br><br>
+
 ## Running Locally
 
 1. Install [Postgresql](https://www.postgresql.org/download/).
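[Editor's note: the Bulk-Test section above relies on prompt templating. The seed files later in this diff show what a templated promptConstructor looks like; this sketch assumes a scenario variable named tweet, which is illustrative rather than taken from the PR.]

definePrompt("openai/ChatCompletion", {
  model: "gpt-3.5-turbo-0613",
  messages: [
    { role: "system", content: "Classify the sentiment of the tweet below." },
    { role: "user", content: scenario.tweet },
  ],
});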

View File

@@ -26,6 +26,8 @@ NEXT_PUBLIC_SOCKET_URL="http://localhost:3318"
 NEXTAUTH_SECRET="your_secret"
 NEXTAUTH_URL="http://localhost:3000"
+
+NEXT_PUBLIC_HOST="http://localhost:3000"
 
 # Next Auth Github Provider
 GITHUB_CLIENT_ID="your_client_id"
 GITHUB_CLIENT_SECRET="your_secret"

View File

@@ -40,3 +40,6 @@ yarn-error.log*
 # typescript
 *.tsbuildinfo
 
+# Sentry Auth Token
+.sentryclirc

View File

@@ -14,10 +14,14 @@ declare module "nextjs-routes" {
     | StaticRoute<"/account/signin">
     | DynamicRoute<"/api/auth/[...nextauth]", { "nextauth": string[] }>
     | StaticRoute<"/api/experiments/og-image">
+    | StaticRoute<"/api/sentry-example-api">
     | DynamicRoute<"/api/trpc/[trpc]", { "trpc": string }>
+    | DynamicRoute<"/data/[id]", { "id": string }>
+    | StaticRoute<"/data">
     | DynamicRoute<"/experiments/[id]", { "id": string }>
     | StaticRoute<"/experiments">
     | StaticRoute<"/">
+    | StaticRoute<"/sentry-example-page">
     | StaticRoute<"/world-champs">
     | StaticRoute<"/world-champs/signup">;

View File

@@ -20,6 +20,9 @@ FROM base as builder
 # Include all NEXT_PUBLIC_* env vars here
 ARG NEXT_PUBLIC_POSTHOG_KEY
 ARG NEXT_PUBLIC_SOCKET_URL
+ARG NEXT_PUBLIC_HOST
+ARG NEXT_PUBLIC_SENTRY_DSN
+ARG SENTRY_AUTH_TOKEN
 
 WORKDIR /app
 COPY --from=deps /app/node_modules ./node_modules

app/next.config.mjs Normal file
View File

@@ -0,0 +1,61 @@
+import nextRoutes from "nextjs-routes/config";
+import { withSentryConfig } from "@sentry/nextjs";
+
+/**
+ * Run `build` or `dev` with `SKIP_ENV_VALIDATION` to skip env validation. This is especially useful
+ * for Docker builds.
+ */
+const { env } = await import("./src/env.mjs");
+
+/** @type {import("next").NextConfig} */
+let config = {
+  reactStrictMode: true,
+
+  /**
+   * If you have `experimental: { appDir: true }` set, then you must comment the below `i18n` config
+   * out.
+   *
+   * @see https://github.com/vercel/next.js/issues/41980
+   */
+  i18n: {
+    locales: ["en"],
+    defaultLocale: "en",
+  },
+  rewrites: async () => [
+    {
+      source: "/ingest/:path*",
+      destination: "https://app.posthog.com/:path*",
+    },
+  ],
+  webpack: (config) => {
+    config.module.rules.push({
+      test: /\.txt$/,
+      use: "raw-loader",
+    });
+    return config;
+  },
+};
+
+config = nextRoutes()(config);
+
+if (env.NEXT_PUBLIC_SENTRY_DSN && env.SENTRY_AUTH_TOKEN) {
+  // @ts-expect-error - `withSentryConfig` is not typed correctly
+  config = withSentryConfig(
+    config,
+    {
+      authToken: env.SENTRY_AUTH_TOKEN,
+      silent: true,
+      org: "openpipe",
+      project: "openpipe",
+    },
+    {
+      widenClientFileUpload: true,
+      tunnelRoute: "/monitoring",
+      disableLogger: true,
+    },
+  );
+}
+
+export default config;

View File

@@ -18,15 +18,18 @@
     "start": "next start",
     "codegen": "tsx src/codegen/export-openai-types.ts",
     "seed": "tsx prisma/seed.ts",
-    "check": "concurrently 'pnpm lint' 'pnpm tsc' 'pnpm prettier . --check'"
+    "check": "concurrently 'pnpm lint' 'pnpm tsc' 'pnpm prettier . --check'",
+    "test": "pnpm vitest --no-threads"
   },
   "dependencies": {
     "@anthropic-ai/sdk": "^0.5.8",
     "@apidevtools/json-schema-ref-parser": "^10.1.0",
     "@babel/preset-typescript": "^7.22.5",
     "@babel/standalone": "^7.22.9",
+    "@chakra-ui/anatomy": "^2.2.0",
     "@chakra-ui/next-js": "^2.1.4",
     "@chakra-ui/react": "^2.7.1",
+    "@chakra-ui/styled-system": "^2.9.1",
     "@emotion/react": "^11.11.1",
     "@emotion/server": "^11.11.0",
     "@emotion/styled": "^11.11.0",
@@ -34,6 +37,7 @@
     "@monaco-editor/loader": "^1.3.3",
     "@next-auth/prisma-adapter": "^1.0.5",
     "@prisma/client": "^4.14.0",
+    "@sentry/nextjs": "^7.61.0",
     "@t3-oss/env-nextjs": "^0.3.1",
     "@tabler/icons-react": "^2.22.0",
     "@tanstack/react-query": "^4.29.7",
@@ -63,15 +67,18 @@
     "next-auth": "^4.22.1",
     "next-query-params": "^4.2.3",
     "nextjs-routes": "^2.0.1",
-    "openai": "4.0.0-beta.2",
+    "openai": "4.0.0-beta.7",
     "pluralize": "^8.0.0",
-    "posthog-js": "^1.68.4",
+    "posthog-js": "^1.75.3",
+    "posthog-node": "^3.1.1",
     "prettier": "^3.0.0",
     "prismjs": "^1.29.0",
     "react": "18.2.0",
     "react-diff-viewer": "^3.1.1",
     "react-dom": "18.2.0",
+    "react-github-btn": "^1.4.0",
     "react-icons": "^4.10.1",
+    "react-json-tree": "^0.18.0",
     "react-select": "^5.7.4",
     "react-syntax-highlighter": "^15.5.0",
     "react-textarea-autosize": "^8.5.0",

File diff suppressed because it is too large

View File

@@ -0,0 +1,5 @@
+-- CreateEnum
+CREATE TYPE "UserRole" AS ENUM ('ADMIN', 'USER');
+
+-- AlterTable
+ALTER TABLE "User" ADD COLUMN "role" "UserRole" NOT NULL DEFAULT 'USER';
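[Editor's note: the new role column presumably backs admin-only checks in the application layer. A minimal sketch of such a guard; the helper name and context shape are assumptions, not code from this PR.]

// Hypothetical guard: reject non-admins before running privileged logic.
const requireAdmin = (user: { role: "ADMIN" | "USER" }) => {
  if (user.role !== "ADMIN") throw new Error("FORBIDDEN");
};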

View File

@@ -0,0 +1,28 @@
+-- CreateTable
+CREATE TABLE "Dataset" (
+    "id" UUID NOT NULL,
+    "name" TEXT NOT NULL,
+    "organizationId" UUID NOT NULL,
+    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    "updatedAt" TIMESTAMP(3) NOT NULL,
+
+    CONSTRAINT "Dataset_pkey" PRIMARY KEY ("id")
+);
+
+-- CreateTable
+CREATE TABLE "DatasetEntry" (
+    "id" UUID NOT NULL,
+    "input" TEXT NOT NULL,
+    "output" TEXT,
+    "datasetId" UUID NOT NULL,
+    "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    "updatedAt" TIMESTAMP(3) NOT NULL,
+
+    CONSTRAINT "DatasetEntry_pkey" PRIMARY KEY ("id")
+);
+
+-- AddForeignKey
+ALTER TABLE "Dataset" ADD CONSTRAINT "Dataset_organizationId_fkey" FOREIGN KEY ("organizationId") REFERENCES "Organization"("id") ON DELETE CASCADE ON UPDATE CASCADE;
+
+-- AddForeignKey
+ALTER TABLE "DatasetEntry" ADD CONSTRAINT "DatasetEntry_datasetId_fkey" FOREIGN KEY ("datasetId") REFERENCES "Dataset"("id") ON DELETE CASCADE ON UPDATE CASCADE;

View File

@@ -0,0 +1,13 @@
+/*
+  Warnings:
+
+  - You are about to drop the column `constructFn` on the `PromptVariant` table. All the data in the column will be lost.
+  - You are about to drop the column `constructFnVersion` on the `PromptVariant` table. All the data in the column will be lost.
+  - Added the required column `promptConstructor` to the `PromptVariant` table without a default value. This is not possible if the table is not empty.
+  - Added the required column `promptConstructorVersion` to the `PromptVariant` table without a default value. This is not possible if the table is not empty.
+
+*/
+-- AlterTable
+ALTER TABLE "PromptVariant" RENAME COLUMN "constructFn" TO "promptConstructor";
+ALTER TABLE "PromptVariant" RENAME COLUMN "constructFnVersion" TO "promptConstructorVersion";

View File

@@ -31,11 +31,11 @@ model Experiment {
 
 model PromptVariant {
   id                       String @id @default(uuid()) @db.Uuid
   label                    String
-  constructFn              String
-  constructFnVersion       Int
+  promptConstructor        String
+  promptConstructorVersion Int
   model                    String
   modelProvider            String
 
   uiId    String  @default(uuid()) @db.Uuid
   visible Boolean @default(true)
@@ -174,6 +174,32 @@ model OutputEvaluation {
   @@unique([modelResponseId, evaluationId])
 }
 
+model Dataset {
+  id   String @id @default(uuid()) @db.Uuid
+  name String
+
+  datasetEntries DatasetEntry[]
+
+  organizationId String       @db.Uuid
+  organization   Organization @relation(fields: [organizationId], references: [id], onDelete: Cascade)
+
+  createdAt DateTime @default(now())
+  updatedAt DateTime @updatedAt
+}
+
+model DatasetEntry {
+  id     String  @id @default(uuid()) @db.Uuid
+  input  String
+  output String?
+
+  datasetId String   @db.Uuid
+  dataset   Dataset? @relation(fields: [datasetId], references: [id], onDelete: Cascade)
+
+  createdAt DateTime @default(now())
+  updatedAt DateTime @updatedAt
+}
+
 model Organization {
   id                String  @id @default(uuid()) @db.Uuid
   personalOrgUserId String? @unique @db.Uuid
@@ -183,6 +209,9 @@ model Organization {
   updatedAt DateTime @updatedAt
 
   organizationUsers OrganizationUser[]
   experiments       Experiment[]
+  datasets          Dataset[]
+  loggedCalls       LoggedCall[]
+  apiKeys           ApiKey[]
 }
 
 enum OrganizationUserRole {
@@ -222,6 +251,99 @@ model WorldChampEntrant {
   @@unique([userId])
 }
 
+model LoggedCall {
+  id        String   @id @default(uuid()) @db.Uuid
+  startTime DateTime
+
+  // True if this call was served from the cache, false otherwise
+  cacheHit Boolean
+
+  // A LoggedCall is always associated with a LoggedCallModelResponse. If this
+  // is a cache miss, it's a new LoggedCallModelResponse we created for this.
+  // If it's a cache hit, it's the existing LoggedCallModelResponse we served.
+  modelResponseId String                  @db.Uuid
+  modelResponse   LoggedCallModelResponse @relation(fields: [modelResponseId], references: [id], onDelete: Cascade)
+
+  // The response created by this LoggedCall. Will be null if this LoggedCall is a cache hit.
+  createdResponse LoggedCallModelResponse[] @relation(name: "ModelResponseCreatedBy")
+
+  organizationId String        @db.Uuid
+  organization   Organization? @relation(fields: [organizationId], references: [id], onDelete: Cascade)
+
+  tags LoggedCallTag[]
+
+  createdAt DateTime @default(now())
+  updatedAt DateTime @updatedAt
+
+  @@index([startTime])
+}
+
+model LoggedCallModelResponse {
+  id String @id @default(uuid()) @db.Uuid
+
+  reqPayload Json
+  // The HTTP status returned by the model provider
+  respStatus Int?
+  respPayload Json?
+  // Should be null if the request was successful, and some string if the request failed.
+  error String?
+
+  startTime DateTime
+  endTime   DateTime
+
+  // Note: the function to calculate the cacheKey should include the project
+  // ID so we don't share cached responses between projects, which could be an
+  // attack vector. Also, we should only set the cacheKey on the model if the
+  // request was successful.
+  cacheKey String?
+
+  // Derived fields
+  durationMs   Int?
+  inputTokens  Int?
+  outputTokens Int?
+  finishReason String?
+  completionId String?
+  totalCost    Decimal? @db.Decimal(18, 12)
+
+  // The LoggedCall that created this LoggedCallModelResponse
+  createdById String     @unique @db.Uuid
+  createdBy   LoggedCall @relation(name: "ModelResponseCreatedBy", fields: [createdById], references: [id], onDelete: Cascade)
+
+  createdAt DateTime @default(now())
+  updatedAt DateTime @updatedAt
+
+  loggedCalls LoggedCall[]
+
+  @@index([cacheKey])
+}
+
+model LoggedCallTag {
+  id    String  @id @default(cuid())
+  name  String
+  value String?
+
+  loggedCallId String
+  loggedCall   LoggedCall @relation(fields: [loggedCallId], references: [id], onDelete: Cascade)
+
+  @@index([name])
+  @@index([name, value])
+}
+
+model ApiKey {
+  id   String @id @default(uuid()) @db.Uuid
+  name String
+
+  apiKey String @unique
+
+  organizationId String        @db.Uuid
+  organization   Organization? @relation(fields: [organizationId], references: [id], onDelete: Cascade)
+
+  createdAt DateTime @default(now())
+  updatedAt DateTime @updatedAt
+}
+
 model Account {
   id     String @id @default(uuid()) @db.Uuid
   userId String @db.Uuid
@@ -249,12 +371,20 @@ model Session {
   user User @relation(fields: [userId], references: [id], onDelete: Cascade)
 }
 
+enum UserRole {
+  ADMIN
+  USER
+}
+
 model User {
   id            String    @id @default(uuid()) @db.Uuid
   name          String?
   email         String?   @unique
   emailVerified DateTime?
   image         String?
+  role          UserRole  @default(USER)
 
   accounts          Account[]
   sessions          Session[]
   organizationUsers OrganizationUser[]
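[Editor's note: a minimal sketch of the cacheKey derivation described in the comment on LoggedCallModelResponse above: hash the request payload together with the organization/project ID so cached responses are never shared across projects. The exact inputs and helper name are assumptions, not code from this PR.]

import { createHash } from "crypto";

// Hypothetical: scope the cache key to the organization to avoid the
// cross-project attack vector called out in the schema comment.
const deriveCacheKey = (organizationId: string, reqPayload: unknown) =>
  createHash("sha256")
    .update(organizationId)
    .update(JSON.stringify(reqPayload))
    .digest("hex");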

View File

@@ -1,6 +1,7 @@
 import { prisma } from "~/server/db";
 import dedent from "dedent";
 import { generateNewCell } from "~/server/utils/generateNewCell";
+import { promptConstructorVersion } from "~/promptConstructor/version";
 
 const defaultId = "11111111-1111-1111-1111-111111111111";
 
@@ -51,8 +52,8 @@ await prisma.promptVariant.createMany({
       sortIndex: 0,
       model: "gpt-3.5-turbo-0613",
       modelProvider: "openai/ChatCompletion",
-      constructFnVersion: 1,
-      constructFn: dedent`
+      promptConstructorVersion,
+      promptConstructor: dedent`
         definePrompt("openai/ChatCompletion", {
           model: "gpt-3.5-turbo-0613",
           messages: [
@@ -70,8 +71,8 @@ await prisma.promptVariant.createMany({
       sortIndex: 1,
       model: "gpt-3.5-turbo-0613",
       modelProvider: "openai/ChatCompletion",
-      constructFnVersion: 1,
-      constructFn: dedent`
+      promptConstructorVersion,
+      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [

View File

@@ -3,6 +3,7 @@ import { generateNewCell } from "~/server/utils/generateNewCell";
 import dedent from "dedent";
 import { execSync } from "child_process";
 import fs from "fs";
+import { promptConstructorVersion } from "~/promptConstructor/version";
 
 const defaultId = "11111111-1111-1111-1111-111111111112";
 
@@ -98,8 +99,8 @@ for (const dataset of datasets) {
       sortIndex: 0,
       model: "gpt-3.5-turbo-0613",
       modelProvider: "openai/ChatCompletion",
-      constructFnVersion: 1,
-      constructFn: dedent`
+      promptConstructorVersion,
+      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [

View File

@@ -2,6 +2,7 @@ import { prisma } from "~/server/db";
 import dedent from "dedent";
 import fs from "fs";
 import { parse } from "csv-parse/sync";
+import { promptConstructorVersion } from "~/promptConstructor/version";
 
 const defaultId = "11111111-1111-1111-1111-111111111112";
 
@@ -85,8 +86,8 @@ await prisma.promptVariant.createMany({
       sortIndex: 0,
       model: "gpt-3.5-turbo-0613",
       modelProvider: "openai/ChatCompletion",
-      constructFnVersion: 1,
-      constructFn: dedent`
+      promptConstructorVersion,
+      promptConstructor: dedent`
        definePrompt("openai/ChatCompletion", {
          model: "gpt-3.5-turbo-0613",
          messages: [

View File (×11)

[11 binary image diffs collapsed: each image has identical before/after dimensions and size (704 B to 62 KiB); image previews omitted]

View File

@@ -5,6 +5,9 @@ set -e
 echo "Migrating the database"
 pnpm prisma migrate deploy
 
+echo "Migrating promptConstructors"
+pnpm tsx src/promptConstructor/migrate.ts
+
 echo "Starting the server"
 pnpm concurrently --kill-others \

View File

@@ -0,0 +1,33 @@
+// This file configures the initialization of Sentry on the client.
+// The config you add here will be used whenever a user loads a page in their browser.
+// https://docs.sentry.io/platforms/javascript/guides/nextjs/
+
+import * as Sentry from "@sentry/nextjs";
+import { env } from "~/env.mjs";
+
+if (env.NEXT_PUBLIC_SENTRY_DSN) {
+  Sentry.init({
+    dsn: env.NEXT_PUBLIC_SENTRY_DSN,
+
+    // Adjust this value in production, or use tracesSampler for greater control
+    tracesSampleRate: 1,
+
+    // Setting this option to true will print useful information to the console while you're setting up Sentry.
+    debug: false,
+
+    replaysOnErrorSampleRate: 1.0,
+
+    // This sets the sample rate to be 10%. You may want this to be 100% while
+    // in development and sample at a lower rate in production
+    replaysSessionSampleRate: 0.1,
+
+    // You can remove this option if you're not planning to use the Sentry Session Replay feature:
+    integrations: [
+      new Sentry.Replay({
+        // Additional Replay configuration goes in here, for example:
+        maskAllText: true,
+        blockAllMedia: true,
+      }),
+    ],
+  });
+}

app/sentry.edge.config.ts Normal file
View File

@@ -0,0 +1,19 @@
+// This file configures the initialization of Sentry for edge features (middleware, edge routes, and so on).
+// The config you add here will be used whenever one of the edge features is loaded.
+// Note that this config is unrelated to the Vercel Edge Runtime and is also required when running locally.
+// https://docs.sentry.io/platforms/javascript/guides/nextjs/
+
+import * as Sentry from "@sentry/nextjs";
+import { env } from "~/env.mjs";
+
+if (env.NEXT_PUBLIC_SENTRY_DSN) {
+  Sentry.init({
+    dsn: env.NEXT_PUBLIC_SENTRY_DSN,
+
+    // Adjust this value in production, or use tracesSampler for greater control
+    tracesSampleRate: 1,
+
+    // Setting this option to true will print useful information to the console while you're setting up Sentry.
+    debug: false,
+  });
+}

View File

@@ -0,0 +1,18 @@
+// This file configures the initialization of Sentry on the server.
+// The config you add here will be used whenever the server handles a request.
+// https://docs.sentry.io/platforms/javascript/guides/nextjs/
+
+import * as Sentry from "@sentry/nextjs";
+import { env } from "~/env.mjs";
+
+if (env.NEXT_PUBLIC_SENTRY_DSN) {
+  Sentry.init({
+    dsn: env.NEXT_PUBLIC_SENTRY_DSN,
+
+    // Adjust this value in production, or use tracesSampler for greater control
+    tracesSampleRate: 1,
+
+    // Setting this option to true will print useful information to the console while you're setting up Sentry.
+    debug: false,
+  });
+}

View File

@@ -68,7 +68,7 @@ export const ChangeModelModal = ({
       return;
     await replaceVariantMutation.mutateAsync({
       id: variant.id,
-      constructFn: modifiedPromptFn,
+      promptConstructor: modifiedPromptFn,
       streamScenarios: visibleScenarios,
     });
     await utils.promptVariants.list.invalidate();
@@ -107,7 +107,7 @@ export const ChangeModelModal = ({
           <ModelSearch selectedModel={selectedModel} setSelectedModel={setSelectedModel} />
           {isString(modifiedPromptFn) && (
             <CompareFunctions
-              originalFunction={variant.constructFn}
+              originalFunction={variant.promptConstructor}
               newFunction={modifiedPromptFn}
               leftTitle={originalLabel}
               rightTitle={convertedLabel}

View File

@@ -22,8 +22,8 @@ export const ModelSearch = (props: {
   const [containerRef, containerDimensions] = useElementDimensions();
 
   return (
-    <VStack ref={containerRef as LegacyRef<HTMLDivElement>} w="full">
-      <Text>Browse Models</Text>
+    <VStack ref={containerRef as LegacyRef<HTMLDivElement>} w="full" fontFamily="inconsolata">
+      <Text fontWeight="bold">Browse Models</Text>
       <Select<ProviderModel>
         styles={{ control: (provided) => ({ ...provided, width: containerDimensions?.width }) }}
         getOptionLabel={(data) => modelLabel(data.provider, data.model)}

View File

@@ -23,16 +23,24 @@ export const ModelStatsCard = ({
         {label}
       </Text>
-      <VStack w="full" spacing={6} bgColor="gray.100" p={4} borderRadius={4}>
+      <VStack
+        w="full"
+        spacing={6}
+        borderWidth={1}
+        borderColor="gray.300"
+        p={4}
+        borderRadius={8}
+        fontFamily="inconsolata"
+      >
         <HStack w="full" align="flex-start">
-          <Text flex={1} fontSize="lg">
-            <Text as="span" color="gray.600">
-              {model.provider} /{" "}
-            </Text>
+          <VStack flex={1} fontSize="lg" alignItems="flex-start">
             <Text as="span" fontWeight="bold" color="gray.900">
               {model.name}
             </Text>
-          </Text>
+            <Text as="span" color="gray.600" fontSize="sm">
+              Provider: {model.provider}
+            </Text>
+          </VStack>
           <Link
             href={model.learnMoreUrl}
             isExternal

View File

@@ -1,18 +1,29 @@
-import { Button, Spinner, InputGroup, InputRightElement, Icon, HStack } from "@chakra-ui/react";
+import {
+  Button,
+  Spinner,
+  InputGroup,
+  InputRightElement,
+  Icon,
+  HStack,
+  type InputGroupProps,
+} from "@chakra-ui/react";
 import { IoMdSend } from "react-icons/io";
-import AutoResizeTextArea from "../AutoResizeTextArea";
+import AutoResizeTextArea from "./AutoResizeTextArea";
 
 export const CustomInstructionsInput = ({
   instructions,
   setInstructions,
   loading,
   onSubmit,
+  placeholder = "Send custom instructions",
+  ...props
 }: {
   instructions: string;
   setInstructions: (instructions: string) => void;
   loading: boolean;
   onSubmit: () => void;
-}) => {
+  placeholder?: string;
+} & InputGroupProps) => {
   return (
     <InputGroup
       size="md"
@@ -22,6 +33,7 @@ export const CustomInstructionsInput = ({
       borderRadius={8}
       alignItems="center"
       colorScheme="orange"
+      {...props}
     >
       <AutoResizeTextArea
         value={instructions}
@@ -33,7 +45,7 @@ export const CustomInstructionsInput = ({
             onSubmit();
           }
         }}
-        placeholder="Send custom instructions"
+        placeholder={placeholder}
         py={4}
         pl={4}
         pr={12}

View File

@@ -1,17 +1,17 @@
 import { api } from "~/utils/api";
 import { type PromptVariant, type Scenario } from "../types";
-import { Text, VStack } from "@chakra-ui/react";
+import { type StackProps, Text, VStack } from "@chakra-ui/react";
 import { useExperiment, useHandledAsyncCallback } from "~/utils/hooks";
 import SyntaxHighlighter from "react-syntax-highlighter";
 import { docco } from "react-syntax-highlighter/dist/cjs/styles/hljs";
 import stringify from "json-stringify-pretty-compact";
-import { type ReactElement, useState, useEffect, Fragment } from "react";
+import { type ReactElement, useState, useEffect, Fragment, useCallback } from "react";
 import useSocket from "~/utils/useSocket";
 import { OutputStats } from "./OutputStats";
 import { RetryCountdown } from "./RetryCountdown";
 import frontendModelProviders from "~/modelProviders/frontendModelProviders";
 import { ResponseLog } from "./ResponseLog";
-import { CellContent } from "./CellContent";
+import { CellOptions } from "./TopActions";
 
 const WAITING_MESSAGE_INTERVAL = 20000;
 
@@ -72,37 +72,49 @@ export default function OutputCell({
   // TODO: disconnect from socket if we're not streaming anymore
   const streamedMessage = useSocket<OutputSchema>(cell?.id);
 
+  const mostRecentResponse = cell?.modelResponses[cell.modelResponses.length - 1];
+
+  const CellWrapper = useCallback(
+    ({ children, ...props }: StackProps) => (
+      <VStack w="full" alignItems="flex-start" {...props} px={2} py={2} h="100%">
+        {cell && (
+          <CellOptions refetchingOutput={hardRefetching} refetchOutput={hardRefetch} cell={cell} />
+        )}
+        <VStack w="full" alignItems="flex-start" maxH={500} overflowY="auto" flex={1}>
+          {children}
+        </VStack>
+        {mostRecentResponse && (
+          <OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
+        )}
+      </VStack>
+    ),
+    [hardRefetching, hardRefetch, mostRecentResponse, scenario, cell],
+  );
+
   if (!vars) return null;
 
   if (!cell && !fetchingOutput)
     return (
-      <CellContent hardRefetching={hardRefetching} hardRefetch={hardRefetch}>
+      <CellWrapper>
         <Text color="gray.500">Error retrieving output</Text>
-      </CellContent>
+      </CellWrapper>
     );
 
   if (cell && cell.errorMessage) {
     return (
-      <CellContent hardRefetching={hardRefetching} hardRefetch={hardRefetch}>
+      <CellWrapper>
         <Text color="red.500">{cell.errorMessage}</Text>
-      </CellContent>
+      </CellWrapper>
     );
   }
 
   if (disabledReason) return <Text color="gray.500">{disabledReason}</Text>;
 
-  const mostRecentResponse = cell?.modelResponses[cell.modelResponses.length - 1];
   const showLogs = !streamedMessage && !mostRecentResponse?.output;
 
   if (showLogs)
     return (
-      <CellContent
-        hardRefetching={hardRefetching}
-        hardRefetch={hardRefetch}
-        alignItems="flex-start"
-        fontFamily="inconsolata, monospace"
-        spacing={0}
-      >
+      <CellWrapper alignItems="flex-start" fontFamily="inconsolata, monospace" spacing={0}>
         {cell?.jobQueuedAt && <ResponseLog time={cell.jobQueuedAt} title="Job queued" />}
         {cell?.jobStartedAt && <ResponseLog time={cell.jobStartedAt} title="Job started" />}
         {cell?.modelResponses?.map((response) => {
@@ -124,9 +136,13 @@ export default function OutputCell({
           Array.from({ length: numWaitingMessages }, (_, i) => (
             <ResponseLog
               key={`waiting-${i}`}
-              // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
-              time={new Date(response.requestedAt!.getTime() + i * WAITING_MESSAGE_INTERVAL)}
-              title="Waiting for response"
+              time={
+                new Date(
+                  (response.requestedAt?.getTime?.() ?? 0) +
+                    (i + 1) * WAITING_MESSAGE_INTERVAL,
+                )
+              }
+              title="Waiting for response..."
             />
           ))}
         {response.receivedAt && (
@@ -144,7 +160,7 @@ export default function OutputCell({
       {mostRecentResponse?.retryTime && (
         <RetryCountdown retryTime={mostRecentResponse.retryTime} />
       )}
-    </CellContent>
+    </CellWrapper>
   );
 
   const normalizedOutput = mostRecentResponse?.output
@@ -155,50 +171,27 @@ export default function OutputCell({
   if (mostRecentResponse?.output && normalizedOutput?.type === "json") {
     return (
-      <VStack
-        w="100%"
-        h="100%"
-        fontSize="xs"
-        flexWrap="wrap"
-        overflowX="hidden"
-        justifyContent="space-between"
-      >
-        <CellContent
-          hardRefetching={hardRefetching}
-          hardRefetch={hardRefetch}
-          w="full"
-          flex={1}
-          spacing={0}
-        >
-          <SyntaxHighlighter
-            customStyle={{ overflowX: "unset", width: "100%", flex: 1 }}
-            language="json"
-            style={docco}
-            lineProps={{
-              style: { wordBreak: "break-all", whiteSpace: "pre-wrap" },
-            }}
-            wrapLines
-          >
-            {stringify(normalizedOutput.value, { maxLength: 40 })}
-          </SyntaxHighlighter>
-        </CellContent>
-        <OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
-      </VStack>
+      <CellWrapper>
+        <SyntaxHighlighter
+          customStyle={{ overflowX: "unset", width: "100%", flex: 1 }}
+          language="json"
+          style={docco}
+          lineProps={{
+            style: { wordBreak: "break-all", whiteSpace: "pre-wrap" },
+          }}
+          wrapLines
+        >
+          {stringify(normalizedOutput.value, { maxLength: 40 })}
+        </SyntaxHighlighter>
+      </CellWrapper>
     );
   }
 
   const contentToDisplay = (normalizedOutput?.type === "text" && normalizedOutput.value) || "";
 
   return (
-    <VStack w="100%" h="100%" justifyContent="space-between" whiteSpace="pre-wrap">
-      <VStack w="full" alignItems="flex-start" spacing={0}>
-        <CellContent hardRefetching={hardRefetching} hardRefetch={hardRefetch}>
-          <Text>{contentToDisplay}</Text>
-        </CellContent>
-      </VStack>
-      {mostRecentResponse?.output && (
-        <OutputStats modelResponse={mostRecentResponse} scenario={scenario} />
-      )}
-    </VStack>
+    <CellWrapper>
+      <Text>{contentToDisplay}</Text>
+    </CellWrapper>
   );
 }

View File

@@ -23,8 +23,15 @@ export const OutputStats = ({
   const completionTokens = modelResponse.completionTokens;
 
   return (
-    <HStack w="full" align="center" color="gray.500" fontSize="2xs" mt={{ base: 0, md: 1 }}>
-      <HStack flex={1}>
+    <HStack
+      w="full"
+      align="center"
+      color="gray.500"
+      fontSize="2xs"
+      mt={{ base: 0, md: 1 }}
+      alignItems="flex-end"
+    >
+      <HStack flex={1} flexWrap="wrap">
       {modelResponse.outputEvaluations.map((evaluation) => {
         const passed = evaluation.result > 0.5;
         return (

View File

@@ -0,0 +1,36 @@
+import {
+  Modal,
+  ModalBody,
+  ModalCloseButton,
+  ModalContent,
+  ModalHeader,
+  ModalOverlay,
+  type UseDisclosureReturn,
+} from "@chakra-ui/react";
+import { type RouterOutputs } from "~/utils/api";
+import { JSONTree } from "react-json-tree";
+
+export default function ExpandedModal(props: {
+  cell: NonNullable<RouterOutputs["scenarioVariantCells"]["get"]>;
+  disclosure: UseDisclosureReturn;
+}) {
+  return (
+    <Modal isOpen={props.disclosure.isOpen} onClose={props.disclosure.onClose} size="2xl">
+      <ModalOverlay />
+      <ModalContent>
+        <ModalHeader>Prompt</ModalHeader>
+        <ModalCloseButton />
+        <ModalBody>
+          <JSONTree
+            data={props.cell.prompt}
+            invertTheme={true}
+            theme="chalk"
+            shouldExpandNodeInitially={() => true}
+            getItemString={() => ""}
+            hideRoot
+          />
+        </ModalBody>
+      </ModalContent>
+    </Modal>
+  );
+}

View File

@@ -0,0 +1,53 @@
+import { HStack, Icon, IconButton, Spinner, Tooltip, useDisclosure } from "@chakra-ui/react";
+import { BsArrowClockwise, BsInfoCircle } from "react-icons/bs";
+import { useExperimentAccess } from "~/utils/hooks";
+import ExpandedModal from "./PromptModal";
+import { type RouterOutputs } from "~/utils/api";
+
+export const CellOptions = ({
+  cell,
+  refetchingOutput,
+  refetchOutput,
+}: {
+  cell: RouterOutputs["scenarioVariantCells"]["get"];
+  refetchingOutput: boolean;
+  refetchOutput: () => void;
+}) => {
+  const { canModify } = useExperimentAccess();
+  const modalDisclosure = useDisclosure();
+
+  return (
+    <HStack justifyContent="flex-end" w="full">
+      {cell && (
+        <>
+          <Tooltip label="See Prompt">
+            <IconButton
+              aria-label="See Prompt"
+              icon={<Icon as={BsInfoCircle} boxSize={4} />}
+              onClick={modalDisclosure.onOpen}
+              size="xs"
+              colorScheme="gray"
+              color="gray.500"
+              variant="ghost"
+            />
+          </Tooltip>
+          <ExpandedModal cell={cell} disclosure={modalDisclosure} />
+        </>
+      )}
+      {canModify && (
+        <Tooltip label="Refetch output">
+          <IconButton
+            size="xs"
+            color="gray.500"
+            variant="ghost"
+            cursor="pointer"
+            onClick={refetchOutput}
+            aria-label="refetch output"
+            icon={<Icon as={refetchingOutput ? Spinner : BsArrowClockwise} boxSize={4} />}
+          />
+        </Tooltip>
+      )}
+    </HStack>
+  );
+};

View File

@@ -1,9 +1,8 @@
-import { useEffect, type DragEvent } from "react";
-import { api } from "~/utils/api";
 import { isEqual } from "lodash-es";
-import { type Scenario } from "./types";
+import { useEffect, useState, type DragEvent } from "react";
+import { api } from "~/utils/api";
 import { useExperiment, useExperimentAccess, useHandledAsyncCallback } from "~/utils/hooks";
-import { useState } from "react";
+import { type Scenario } from "./types";
 
 import {
   Box,
@@ -12,14 +11,12 @@ import {
   Icon,
   IconButton,
   Spinner,
-  Stack,
+  Text,
   Tooltip,
   VStack,
-  Text,
 } from "@chakra-ui/react";
-import { cellPadding } from "../constants";
 import { BsArrowsAngleExpand, BsX } from "react-icons/bs";
-import { RiDraggable } from "react-icons/ri";
+import { cellPadding } from "../constants";
 import { FloatingLabelInput } from "./FloatingLabelInput";
 import { ScenarioEditorModal } from "./ScenarioEditorModal";
@@ -115,60 +112,44 @@ export default function ScenarioEditor({
       onDrop={onReorder}
       backgroundColor={isDragTarget ? "gray.100" : "transparent"}
     >
-      {canModify && props.canHide && (
-        <Stack
-          alignSelf="flex-start"
-          opacity={props.hovered ? 1 : 0}
-          spacing={0}
-          ml={-cellPadding.x}
-        >
-          <Tooltip label="Hide scenario" hasArrow>
-            {/* for some reason the tooltip can't position itself properly relative to the icon without the wrapping box */}
-            <Button
-              variant="unstyled"
-              color="gray.400"
-              height="unset"
-              width="unset"
-              minW="unset"
-              onClick={onHide}
-              _hover={{
-                color: "gray.800",
-                cursor: "pointer",
-              }}
-            >
-              <Icon as={hidingInProgress ? Spinner : BsX} boxSize={hidingInProgress ? 4 : 6} />
-            </Button>
-          </Tooltip>
-          <Icon
-            as={RiDraggable}
-            boxSize={6}
-            color="gray.400"
-            _hover={{ color: "gray.800", cursor: "pointer" }}
-          />
-        </Stack>
-      )}
       {variableLabels.length === 0 ? (
         <Box color="gray.500">
           {vars.data ? "No scenario variables configured" : "Loading..."}
         </Box>
       ) : (
         <VStack spacing={4} flex={1} py={2}>
-          <HStack justifyContent="space-between" w="100%">
-            <Text color="gray.500">Scenario</Text>
-            <IconButton
-              className="fullscreen-toggle"
-              aria-label="Maximize"
-              icon={<BsArrowsAngleExpand />}
-              onClick={() => setScenarioEditorModalOpen(true)}
-              boxSize={6}
-              borderRadius={4}
-              p={1.5}
-              minW={0}
-              colorScheme="gray"
-              color="gray.500"
-              variant="ghost"
-            />
+          <HStack justifyContent="space-between" w="100%" align="center" spacing={0}>
+            <Text flex={1}>Scenario</Text>
+            <Tooltip label="Expand" hasArrow>
+              <IconButton
+                aria-label="Expand"
+                icon={<Icon as={BsArrowsAngleExpand} boxSize={3} />}
+                onClick={() => setScenarioEditorModalOpen(true)}
+                size="xs"
+                colorScheme="gray"
+                color="gray.500"
+                variant="ghost"
+              />
+            </Tooltip>
+            {canModify && props.canHide && (
+              <Tooltip label="Delete" hasArrow>
+                <IconButton
+                  aria-label="Delete"
+                  icon={
+                    <Icon
+                      as={hidingInProgress ? Spinner : BsX}
+                      boxSize={hidingInProgress ? 4 : 6}
+                    />
+                  }
+                  onClick={onHide}
+                  size="xs"
+                  display="flex"
+                  colorScheme="gray"
+                  color="gray.500"
+                  variant="ghost"
+                />
+              </Tooltip>
+            )}
           </HStack>
           {variableLabels.map((key) => {
             const value = values[key] ?? "";

View File

@@ -1,7 +1,6 @@
 import {
   Button,
   HStack,
-  Icon,
   Modal,
   ModalBody,
   ModalCloseButton,
@@ -14,7 +13,6 @@ import {
   VStack,
 } from "@chakra-ui/react";
 import { useEffect, useState } from "react";
-import { BsFileTextFill } from "react-icons/bs";
 import { isEqual } from "lodash-es";
 
 import { api } from "~/utils/api";
@@ -60,8 +58,6 @@ export const ScenarioEditorModal = ({
     await utils.scenarios.list.invalidate();
   }, [mutation, values]);
 
-  console.log("scenario", scenario);
-
   const vars = api.templateVars.list.useQuery({ experimentId: experiment.data?.id ?? "" });
   const variableLabels = vars.data?.map((v) => v.label) ?? [];
 
@@ -73,12 +69,7 @@ export const ScenarioEditorModal = ({
     >
       <ModalOverlay />
       <ModalContent w={1200}>
-        <ModalHeader>
-          <HStack>
-            <Icon as={BsFileTextFill} />
-            <Text>Scenario</Text>
-          </HStack>
-        </ModalHeader>
+        <ModalHeader />
         <ModalCloseButton />
         <ModalBody maxW="unset">
           <VStack spacing={8}>

View File

@@ -0,0 +1,21 @@
+import { useScenarios } from "~/utils/hooks";
+import Paginator from "../Paginator";
+
+const ScenarioPaginator = () => {
+  const { data } = useScenarios();
+
+  if (!data) return null;
+
+  const { scenarios, startIndex, lastPage, count } = data;
+
+  return (
+    <Paginator
+      numItemsLoaded={scenarios.length}
+      startIndex={startIndex}
+      lastPage={lastPage}
+      count={count}
+    />
+  );
+};
+
+export default ScenarioPaginator;

View File

@@ -1,6 +1,5 @@
-import { Box, GridItem } from "@chakra-ui/react";
+import { GridItem } from "@chakra-ui/react";
 import React, { useState } from "react";
-import { cellPadding } from "../constants";
 import OutputCell from "./OutputCell/OutputCell";
 import ScenarioEditor from "./ScenarioEditor";
 import type { PromptVariant, Scenario } from "./types";
@@ -39,9 +38,7 @@ const ScenarioRow = (props: {
           colStart={i + 2}
           {...borders}
         >
-          <Box h="100%" w="100%" px={cellPadding.x} py={cellPadding.y}>
-            <OutputCell key={variant.id} scenario={props.scenario} variant={variant} />
-          </Box>
+          <OutputCell key={variant.id} scenario={props.scenario} variant={variant} />
         </GridItem>
       ))}
     </>

View File

@@ -47,7 +47,7 @@ export default function VariantEditor(props: { variant: PromptVariant }) {
     return () => window.removeEventListener("keydown", handleEsc);
   }, [isFullscreen, toggleFullscreen]);
 
-  const lastSavedFn = props.variant.constructFn;
+  const lastSavedFn = props.variant.promptConstructor;
 
   const modifierKey = useModifierKeyLabel();
 
@@ -96,7 +96,7 @@ export default function VariantEditor(props: { variant: PromptVariant }) {
     const resp = await replaceVariant.mutateAsync({
       id: props.variant.id,
-      constructFn: currentFn,
+      promptConstructor: currentFn,
       streamScenarios: visibleScenarios,
     });
 
     if (resp.status === "error") {

View File

@@ -43,12 +43,12 @@ export default function VariantStats(props: { variant: PromptVariant }) {
   return (
     <HStack
       justifyContent="space-between"
-      alignItems="center"
+      alignItems="flex-end"
       mx="2"
       fontSize="xs"
       py={cellPadding.y}
     >
-      <HStack px={cellPadding.x}>
+      <HStack px={cellPadding.x} flexWrap="wrap">
       {showNumFinished && (
         <Text>
           {data.outputCount} / {data.scenarioCount}

View File

@@ -35,7 +35,7 @@ export default function OutputsTable({ experimentId }: { experimentId: string |
       pb={24}
       pl={8}
       display="grid"
-      gridTemplateColumns={`250px repeat(${variants.data.length}, minmax(300px, 1fr)) auto`}
+      gridTemplateColumns={`250px repeat(${variants.data.length}, minmax(320px, 1fr)) auto`}
       sx={{
         "> *": {
           borderColor: "gray.300",

View File

@@ -5,15 +5,20 @@ import {
   BsChevronLeft,
   BsChevronRight,
 } from "react-icons/bs";
-import { usePage, useScenarios } from "~/utils/hooks";
+import { usePage } from "~/utils/hooks";
 
-const ScenarioPaginator = () => {
+const Paginator = ({
+  numItemsLoaded,
+  startIndex,
+  lastPage,
+  count,
+}: {
+  numItemsLoaded: number;
+  startIndex: number;
+  lastPage: number;
+  count: number;
+}) => {
   const [page, setPage] = usePage();
-  const { data } = useScenarios();
-
-  if (!data) return null;
-
-  const { scenarios, startIndex, lastPage, count } = data;
 
   const nextPage = () => {
     if (page < lastPage) {
@@ -49,7 +54,7 @@ const Paginator = ({
         icon={<BsChevronLeft />}
       />
       <Box>
-        {startIndex}-{startIndex + scenarios.length - 1} / {count}
+        {startIndex}-{startIndex + numItemsLoaded - 1} / {count}
       </Box>
       <IconButton
         variant="ghost"
@@ -71,4 +76,4 @@ const Paginator = ({
   );
 };
 
-export default ScenarioPaginator;
+export default Paginator;

View File

@@ -20,7 +20,7 @@ import { useHandledAsyncCallback, useVisibleScenarioIds } from "~/utils/hooks";
 import { type PromptVariant } from "@prisma/client";
 import { useState } from "react";
 import CompareFunctions from "./CompareFunctions";
-import { CustomInstructionsInput } from "./CustomInstructionsInput";
+import { CustomInstructionsInput } from "../CustomInstructionsInput";
 import { RefineAction } from "./RefineAction";
 import { isObject, isString } from "lodash-es";
 import { type RefinementAction, type SupportedProvider } from "~/modelProviders/types";
@@ -73,7 +73,7 @@ export const RefinePromptModal = ({
       return;
     await replaceVariantMutation.mutateAsync({
       id: variant.id,
-      constructFn: refinedPromptFn,
+      promptConstructor: refinedPromptFn,
       streamScenarios: visibleScenarios,
     });
     await utils.promptVariants.list.invalidate();
@@ -97,7 +97,7 @@ export const RefinePromptModal = ({
         <ModalCloseButton />
         <ModalBody maxW="unset">
           <VStack spacing={8}>
-            <VStack spacing={4}>
+            <VStack spacing={4} w="full">
               {Object.keys(refinementActions).length && (
                 <>
                   <SimpleGrid columns={{ base: 1, md: 2 }} spacing={8}>
@@ -122,11 +122,11 @@ export const RefinePromptModal = ({
                 instructions={instructions}
                 setInstructions={setInstructions}
                 loading={modificationInProgress}
-                onSubmit={getModifiedPromptFn}
+                onSubmit={() => getModifiedPromptFn()}
               />
             </VStack>
             <CompareFunctions
-              originalFunction={variant.constructFn}
+              originalFunction={variant.promptConstructor}
               newFunction={isString(refinedPromptFn) ? refinedPromptFn : undefined}
               maxH="40vh"
             />

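The onSubmit={() => getModifiedPromptFn()} change is easy to miss: passing the function reference directly would forward whatever argument the input's submit handler supplies. A minimal sketch of the hazard, with hypothetical names standing in for getModifiedPromptFn and its caller:

// `refine` stands in for getModifiedPromptFn, which appears to take an
// optional argument; `fireSubmit` stands in for the input component.
const refine = (action?: unknown) => {
  if (action === undefined) console.log("refining with current instructions");
  else console.log("unexpected argument leaked in:", action);
};

const fireSubmit = (onSubmit: (event: { type: string }) => void) =>
  onSubmit({ type: "submit" });

fireSubmit(refine);         // => unexpected argument leaked in: { type: 'submit' }
fireSubmit(() => refine()); // => refining with current instructions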

@@ -0,0 +1,110 @@
import {
  HStack,
  Icon,
  VStack,
  Text,
  Divider,
  Spinner,
  AspectRatio,
  SkeletonText,
} from "@chakra-ui/react";
import { RiDatabase2Line } from "react-icons/ri";
import { formatTimePast } from "~/utils/dayjs";
import Link from "next/link";
import { useRouter } from "next/router";
import { BsPlusSquare } from "react-icons/bs";
import { api } from "~/utils/api";
import { useHandledAsyncCallback } from "~/utils/hooks";

type DatasetData = {
  name: string;
  numEntries: number;
  id: string;
  createdAt: Date;
  updatedAt: Date;
};

export const DatasetCard = ({ dataset }: { dataset: DatasetData }) => {
  return (
    <AspectRatio ratio={1.2} w="full">
      <VStack
        as={Link}
        href={{ pathname: "/data/[id]", query: { id: dataset.id } }}
        bg="gray.50"
        _hover={{ bg: "gray.100" }}
        transition="background 0.2s"
        cursor="pointer"
        borderColor="gray.200"
        borderWidth={1}
        p={4}
        justify="space-between"
      >
        <HStack w="full" color="gray.700" justify="center">
          <Icon as={RiDatabase2Line} boxSize={4} />
          <Text fontWeight="bold">{dataset.name}</Text>
        </HStack>
        <HStack h="full" spacing={4} flex={1} align="center">
          <CountLabel label="Rows" count={dataset.numEntries} />
        </HStack>
        <HStack w="full" color="gray.500" fontSize="xs" textAlign="center">
          <Text flex={1}>Created {formatTimePast(dataset.createdAt)}</Text>
          <Divider h={4} orientation="vertical" />
          <Text flex={1}>Updated {formatTimePast(dataset.updatedAt)}</Text>
        </HStack>
      </VStack>
    </AspectRatio>
  );
};

const CountLabel = ({ label, count }: { label: string; count: number }) => {
  return (
    <VStack alignItems="center" flex={1}>
      <Text color="gray.500" fontWeight="bold">
        {label}
      </Text>
      <Text fontSize="sm" color="gray.500">
        {count}
      </Text>
    </VStack>
  );
};

export const NewDatasetCard = () => {
  const router = useRouter();
  const createMutation = api.datasets.create.useMutation();
  const [createDataset, isLoading] = useHandledAsyncCallback(async () => {
    const newDataset = await createMutation.mutateAsync({ label: "New Dataset" });
    await router.push({ pathname: "/data/[id]", query: { id: newDataset.id } });
  }, [createMutation, router]);

  return (
    <AspectRatio ratio={1.2} w="full">
      <VStack
        align="center"
        justify="center"
        _hover={{ cursor: "pointer", bg: "gray.50" }}
        transition="background 0.2s"
        cursor="pointer"
        borderColor="gray.200"
        borderWidth={1}
        p={4}
        onClick={createDataset}
      >
        <Icon as={isLoading ? Spinner : BsPlusSquare} boxSize={8} />
        <Text display={{ base: "none", md: "block" }} ml={2}>
          New Dataset
        </Text>
      </VStack>
    </AspectRatio>
  );
};

export const DatasetCardSkeleton = () => (
  <AspectRatio ratio={1.2} w="full">
    <VStack align="center" borderColor="gray.200" borderWidth={1} p={4} bg="gray.50">
      <SkeletonText noOfLines={1} w="80%" />
      <SkeletonText noOfLines={2} w="60%" />
      <SkeletonText noOfLines={1} w="80%" />
    </VStack>
  </AspectRatio>
);

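For context, a plausible consumer of these three exports; the datasets index page is not included in this diff, and api.datasets.list is an assumption (only datasets.create appears above):

import { SimpleGrid } from "@chakra-ui/react";
import { api } from "~/utils/api";
import { DatasetCard, DatasetCardSkeleton, NewDatasetCard } from "./DatasetCard";

export default function DatasetsGrid() {
  const datasets = api.datasets.list.useQuery(); // hypothetical query

  return (
    <SimpleGrid columns={{ base: 1, md: 2, lg: 3 }} spacing={8}>
      {datasets.data ? (
        datasets.data.map((dataset) => <DatasetCard key={dataset.id} dataset={dataset} />)
      ) : (
        <DatasetCardSkeleton />
      )}
      <NewDatasetCard />
    </SimpleGrid>
  );
}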

@@ -0,0 +1,21 @@
import { useDatasetEntries } from "~/utils/hooks";
import Paginator from "../Paginator";

const DatasetEntriesPaginator = () => {
  const { data } = useDatasetEntries();

  if (!data) return null;

  const { entries, startIndex, lastPage, count } = data;

  return (
    <Paginator
      numItemsLoaded={entries.length}
      startIndex={startIndex}
      lastPage={lastPage}
      count={count}
    />
  );
};

export default DatasetEntriesPaginator;


@@ -0,0 +1,31 @@
import { type StackProps, VStack, Table, Th, Tr, Thead, Tbody, Text } from "@chakra-ui/react";
import { useDatasetEntries } from "~/utils/hooks";
import TableRow from "./TableRow";
import DatasetEntriesPaginator from "./DatasetEntriesPaginator";

const DatasetEntriesTable = (props: StackProps) => {
  const { data } = useDatasetEntries();

  return (
    <VStack justifyContent="space-between" {...props}>
      <Table variant="simple" sx={{ "table-layout": "fixed", width: "full" }}>
        <Thead>
          <Tr>
            <Th>Input</Th>
            <Th>Output</Th>
          </Tr>
        </Thead>
        <Tbody>{data?.entries.map((entry) => <TableRow key={entry.id} entry={entry} />)}</Tbody>
      </Table>
      {!data || data.entries.length === 0 ? (
        <Text alignSelf="flex-start" pl={6} color="gray.500">
          No entries found
        </Text>
      ) : (
        <DatasetEntriesPaginator />
      )}
    </VStack>
  );
};

export default DatasetEntriesTable;

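Both DatasetEntriesTable and DatasetEntriesPaginator lean on useDatasetEntries, whose return shape can be read off its consumers. A sketch of the implied contract; the real hook lives in ~/utils/hooks and is not shown here, and the entry field types beyond id are guesses:

// Shape implied by the consumers in this diff; only `id` is directly
// evidenced (it is used as a React key), the other entry fields are assumed.
type DatasetEntry = {
  id: string;
  input: string;  // assumed
  output: string; // assumed
};

type DatasetEntriesData = {
  entries: DatasetEntry[]; // the currently loaded page
  startIndex: number;      // 1-based index of the first loaded entry
  lastPage: number;
  count: number;           // total entries across all pages
};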

@@ -0,0 +1,26 @@
import { Button, HStack, useDisclosure } from "@chakra-ui/react";
import { BiImport } from "react-icons/bi";
import { BsStars } from "react-icons/bs";
import { GenerateDataModal } from "./GenerateDataModal";

export const DatasetHeaderButtons = () => {
  const generateModalDisclosure = useDisclosure();

  return (
    <>
      <HStack>
        <Button leftIcon={<BiImport />} colorScheme="blue" variant="ghost">
          Import Data
        </Button>
        <Button leftIcon={<BsStars />} colorScheme="blue" onClick={generateModalDisclosure.onOpen}>
          Generate Data
        </Button>
      </HStack>
      <GenerateDataModal
        isOpen={generateModalDisclosure.isOpen}
        onClose={generateModalDisclosure.onClose}
      />
    </>
  );
};


@@ -0,0 +1,128 @@
import {
  Modal,
  ModalBody,
  ModalCloseButton,
  ModalContent,
  ModalHeader,
  ModalOverlay,
  ModalFooter,
  Text,
  HStack,
  VStack,
  Icon,
  NumberInput,
  NumberInputField,
  NumberInputStepper,
  NumberIncrementStepper,
  NumberDecrementStepper,
  Button,
} from "@chakra-ui/react";
import { BsStars } from "react-icons/bs";
import { useState } from "react";
import { useDataset, useHandledAsyncCallback } from "~/utils/hooks";
import { api } from "~/utils/api";
import AutoResizeTextArea from "~/components/AutoResizeTextArea";

export const GenerateDataModal = ({
  isOpen,
  onClose,
}: {
  isOpen: boolean;
  onClose: () => void;
}) => {
  const utils = api.useContext();
  const datasetId = useDataset().data?.id;

  const [numToGenerate, setNumToGenerate] = useState<number>(20);
  const [inputDescription, setInputDescription] = useState<string>(
    "Each input should contain an email body. Half of the emails should contain event details, and the other half should not.",
  );
  const [outputDescription, setOutputDescription] = useState<string>(
    `Each output should contain "true" or "false", where "true" indicates that the email contains event details.`,
  );

  const generateEntriesMutation = api.datasetEntries.autogenerateEntries.useMutation();

  const [generateEntries, generateEntriesInProgress] = useHandledAsyncCallback(async () => {
    if (!inputDescription || !outputDescription || !numToGenerate || !datasetId) return;
    await generateEntriesMutation.mutateAsync({
      datasetId,
      inputDescription,
      outputDescription,
      numToGenerate,
    });
    await utils.datasetEntries.list.invalidate();
    onClose();
  }, [
    generateEntriesMutation,
    onClose,
    inputDescription,
    outputDescription,
    numToGenerate,
    datasetId,
  ]);

  return (
    <Modal isOpen={isOpen} onClose={onClose} size={{ base: "xl", sm: "2xl", md: "3xl" }}>
      <ModalOverlay />
      <ModalContent w={1200}>
        <ModalHeader>
          <HStack>
            <Icon as={BsStars} />
            <Text>Generate Data</Text>
          </HStack>
        </ModalHeader>
        <ModalCloseButton />
        <ModalBody maxW="unset">
          <VStack w="full" spacing={8} padding={8} alignItems="flex-start">
            <VStack alignItems="flex-start" spacing={2}>
              <Text fontWeight="bold">Number of Rows:</Text>
              <NumberInput
                step={5}
                defaultValue={15}
                min={0}
                max={100}
                onChange={(valueString) => setNumToGenerate(parseInt(valueString) || 0)}
                value={numToGenerate}
                w="24"
              >
                <NumberInputField />
                <NumberInputStepper>
                  <NumberIncrementStepper />
                  <NumberDecrementStepper />
                </NumberInputStepper>
              </NumberInput>
            </VStack>
            <VStack alignItems="flex-start" w="full" spacing={2}>
              <Text fontWeight="bold">Input Description:</Text>
              <AutoResizeTextArea
                value={inputDescription}
                onChange={(e) => setInputDescription(e.target.value)}
                placeholder="Each input should contain..."
              />
            </VStack>
            <VStack alignItems="flex-start" w="full" spacing={2}>
              <Text fontWeight="bold">Output Description (optional):</Text>
              <AutoResizeTextArea
                value={outputDescription}
                onChange={(e) => setOutputDescription(e.target.value)}
                placeholder="The output should contain..."
              />
            </VStack>
          </VStack>
        </ModalBody>
        <ModalFooter>
          <Button
            colorScheme="blue"
            isLoading={generateEntriesInProgress}
            isDisabled={!numToGenerate || !inputDescription || !outputDescription}
            onClick={generateEntries}
          >
            Generate
          </Button>
        </ModalFooter>
      </ModalContent>
    </Modal>
  );
};

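For reference, the payload the modal sends to datasetEntries.autogenerateEntries, expressed as a sketch of the input validator the router presumably declares; the server side is not part of this diff, zod is assumed, and the numeric constraints are loosely based on the NumberInput's min/max:

import { z } from "zod";

// Field names mirror the mutateAsync call above; constraints are assumptions.
export const autogenerateEntriesInput = z.object({
  datasetId: z.string(),
  inputDescription: z.string(),
  outputDescription: z.string(),
  numToGenerate: z.number().int().min(1).max(100),
});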
Some files were not shown because too many files have changed in this diff.